CN115004258A - Computer-implemented method and system for identification of fur comprising scales - Google Patents

Computer-implemented method and system for identification of fur comprising scales

Info

Publication number
CN115004258A
CN115004258A (application CN202180010672.9A)
Authority
CN
China
Prior art keywords
scale, scales, fur, identification, identified
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202180010672.9A
Other languages
Chinese (zh)
Inventor
谢尔盖·斯塔特奇克
塞尔日·赫赫伯特
Current Assignee
Unocal Systems Co ltd
Original Assignee
Unocal Systems Co ltd
Priority date
Filing date
Publication date
Application filed by Unocal Systems Co ltd
Priority claimed from PCT/EP2021/052337 external-priority patent/WO2021152182A1/en
Publication of CN115004258A
Legal status: Pending

Landscapes

  • Image Analysis (AREA)

Abstract

A computer-implemented method of identifying fur comprising scales (6), in particular reptile fur, comprising the steps of: acquiring at least one image of the fur portion to be identified; detecting features corresponding to the boundaries (69, 70) of scales (6) in the image; establishing a graph of scale positions from the detected repeating pattern of scales; determining the contour of each detected scale (6); representing each detected scale (6) on the basis of its contour (69, 70); and determining identifying characteristic data of the detected scales (6) for traceable identification of the fur comprising those scales, wherein the detection of the scales is based on scan lines (61).

Description

Computer-implemented method and system for identification of fur comprising scales
Technical Field
The present invention relates to a computer-implemented method and system for identification of fur comprising scales, in particular reptile fur, from pictures taken by a smartphone camera or scanner under various environmental conditions.
The invention relates to the identification and authentication of animal skins, in particular reptile teguments, with the possibility of tracking them through global trade channels. The tracking and tracing process covers the complete life cycle of the animal coat, with all relevant data available in real time and globally.
Prior Art
Protection of documents and objects against all forms of counterfeiting is a widespread and growing need of manufacturers of original products, national issuers of identity documents and other organizations. Closely related to identity is the notion of authenticity. In the case of documents, authenticity denotes the genuineness of the associated document. In the field of centralized tracking of objects and living beings, identity and authenticity are closely interrelated. In essence, the tracking of objects (track and trace) in the narrow sense revolves around the identity of those objects, whose uniqueness is determined by a series of special, non-clonable attributes. Tasks requiring traceability of individual objects and subjects have as a common basis the genuineness, and thus the authenticity, of accompanying documents of a physical and digital nature. The subject of the present invention is a method for tracking and tracing the skins of reptiles from the birth of an animal to the final product made from its skin. Identity and authenticity are considered equivalent in the sense of the present invention and, because of their interdependence, they are mentioned as alternatives to each other or in combination. In a broader sense, the terms tracking and tracing include traceability of structurally identical but individually distinct objects. This is, for example, the case in a method that allows tracking of product lines through trade channels by deterministic changes in a product-identifying vector map, as disclosed in EP 2614486 B1. A method by which the identity of individual objects is made available and measurable at the item level is referred to as a fingerprinting method.
For this purpose, natural random microstructures, as in EP 2257909 B1, and artificially introduced (pseudo-random) microstructures, as shown in US 7,684,088 B2, have been tried.
For the tracking and tracing of living beings, the randomness inherent in living beings is formalized in some way. This process is called biometrics, whereby random properties, in particular visually or acoustically accessible ones, are converted into a technically detectable structure, a quasi-template, as a basis for identification. This inevitably involves simplifying a complex combination of random attributes into simpler structural elements in order to make the biometric technically manageable. For example, facial recognition evaluates visual metrics of the face such as eye distance, nose-to-upper-lip distance, and the like. Other biometric techniques rely on fingerprints, the shape of the retina, or voice signatures. The biometric must be adapted to the application: to verify the identity of a reptile, for example, facial recognition is quite unsuitable. In addition, biometric applications must take the life cycle of the organism into account. Successful facial recognition is practically impossible if identity has to be checked over a life cycle from infant to elderly. Even methods based on DNA determination are of little help if the "egg to purse" life cycle of reptiles, reptile skins, or products made therefrom is to be covered. Finally, in biometric tracking and tracing methods, the permitted error rate must be specified, e.g. how many individuals must be distinguished with what reliability. Reliability is quantified by performance criteria such as the False Acceptance Rate (FAR) and the False Rejection Rate (FRR). Where a person's identity is in question, different biometric test results are compared with each other. A fingerprint may thus be sufficient in a first test, but in case of doubt confirmation by a more complex DNA evaluation may be needed. Such redundant biometric checking yields a very low probability of authentication error.
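The trade-off between FAR and FRR can be sketched numerically. The following is an illustrative sketch, not part of the patent: both rates are computed from sets of genuine and impostor match scores at a chosen decision threshold (the score convention, where a higher score means a better match, and the threshold value are assumptions).

```python
def far_frr(genuine_scores, impostor_scores, threshold):
    """Compute False Acceptance Rate and False Rejection Rate at a
    given similarity threshold (higher score = better match).

    Illustrative sketch only; the function name and score convention
    are assumptions, not taken from the patent."""
    # FAR: fraction of impostor comparisons wrongly accepted
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    # FRR: fraction of genuine comparisons wrongly rejected
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    return far, frr

# Example: at threshold 0.7 no impostor is accepted, but one genuine
# comparison (0.65) is rejected.
genuine = [0.92, 0.88, 0.65, 0.95]
impostor = [0.30, 0.55, 0.10, 0.62]
print(far_frr(genuine, impostor, 0.7))  # (0.0, 0.25)
```

Raising the threshold lowers FAR but raises FRR, which is why a single very discriminative biometric (or a redundant second check) is needed when only one method is available.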
If only one biometric method, that is to say one particular set of biometric parameters, is available, the requirements on FAR and FRR must be very high. A method providing traceability of individual reptiles, their fur, or products made therefrom must therefore use identification parameters specifically adapted to the species and must at the same time be extremely powerful in terms of reliability.
EP 1751695 B1 proposes to increase reliability via multiple scans and the deviations between them, via the path of a stored two-dimensional variation code. A common, ubiquitously available data acquisition device, i.e. a smartphone, should ideally need only one recording to capture the biometric parameters, especially because images are taken by hand and several images already contain variability in themselves. Multiple scans or images as proposed there are therefore not expected to provide an advantage.
CA 2825681 C proceeds in the same direction, comparing several feature vectors (FVs). This is quite similar to the methods for object identification described in US 2014/0140570 and US 2015/0371087.
KR 20180061819 A improves an apparatus for evaluating several biometric signals from the same object. The device itself may be mobile and connected to a server, thus meeting market requirements.
US 2007/211921 A1 describes a method for linking a biometric signal to a physical characteristic of an object. These are in turn stored as profiles in a database. The method relies on interactive input by a user, so that e.g. the biometric data of one of his hands suffices to order a suitably sized garment. Using fingerprint data stored in this manner to track a user's history over a large part, let alone the whole, of the user's life span is neither possible nor envisaged by those inventors.
WO 2018/129051 A1 relates to a fabric or tissue of clothing, whereby the clothing is assigned an identification code (UID). One limitation is that the relevant pattern must be distributed over at least 10% of the surface of the selected article. The invention discloses a bar code in the form of a pattern in the substance to be detected. This is interesting for smartphone applications, but it is not applicable, or only conditionally applicable, to tracking and tracing, and if used at all, only on immutable items.
In contrast, the system shown in WO 03/038132 A1 is intended for the tracking of animal fur. In this case, a hole pattern and the detection of that hole pattern with an optical sensor form the core of the invention. The proposed technique is an inadequate solution to the present problem: the physical damage alone that punching a hole pattern inflicts on animal fur and the products made from it is unacceptable.
For the same reason, the solutions according to WO 2018/189002 A1, applied in particular to decorated leather, and WO 2015/179686 A1, applied generally to the life cycle of animals, are not acceptable, since the tattooing method described therein both damages the animal product and prevents counterfeiting only to a very limited extent.
The inventions according to CN 108645797 A, CN 107525873 A and US 2017/243284 A1 consider DNA analysis a reliable tool for tracking and tracing animal skins from the live animal to processed gelatine, meat, leather or fur products. However, DNA analysis is not a good enough solution for rapid and simple reptile fur tracking throughout the value and supply chain.
The authors of WO 2013/084242 A2, EP 1438607 A1 and US 2018/153234 A1 advocate the deployment of radio transponders or transponder cards. The first two patent documents relate to the tracking of herds of livestock along the supply chain, while US 2018/153234 A1 discloses the tagging of animal fur and leather products with RFID chips to prevent loss or counterfeiting. In this case too, the cost of tracing throughout the life and product chain is not trivial, and the method is only feasible if verification is done with special equipment. Such devices are not universally available and therefore represent an undesirable obstacle.
In principle, features of an individually structured surface can be used to track and trace animal fur throughout the value chain up to the processed, commercially available product. Such features should not only be characteristic of the individual product, but should also have a non-clonable form (physically unclonable function, PUF) to ensure protection against counterfeiting. Typically, such non-replicable surface elements are realized by complex random microstructures (physical random functions, PRFs); hence, the terms PUF and PRF are used below as synonyms. Using such a structure as the fingerprint of an object is discussed, for example, in CN 106682912 A, US 10,019,627 B2 and JP 2015/228570 A, where the first two patent documents specifically consider a three-dimensional structure, and the latter specifically considers communication of UIDs (unique identifiers) connected to a random structure via a portable device. None of them discusses feasible embodiments for the tracking and tracing of animal products from birth to the final animal product.
US 2013/277425 A1 describes one method of tracking items throughout the value chain. The motivation of that invention is to prevent theft, counterfeiting and unauthorized use of objects. For authentication, a "querying device" is proposed, which performs its task at various points along the value chain by interrogating a tag. This method is hardly suitable for living creatures and therefore does not provide a solution to the problem of the present invention.
Fumiaki Tomita et al., "Description of Textures by a Structural Analysis", IEEE Transactions on Pattern Analysis and Machine Intelligence, IEEE Computer Society, USA, vol. 30, no. 2, March 1982, pages 183 to 191, XP 011242555, ISSN 0162-8828, describes a computer-implemented method comprising surface identification of scales, applied in particular to the identification of reptile fur.
Disclosure of Invention
Based on the above-mentioned prior art, it is an object of the present invention to provide an improved computer-implemented method for identification of surfaces comprising scales, in particular for reptile fur identification, including creating a database for later comparison of the acquired data. It is a further object of the present invention to provide a method for tracking and tracing animal skins, especially reptile skins, based on such a database, throughout the life cycle and the subsequent processing steps of tanning, dyeing and manufacture of the final product (e.g., handbags, shoes, etc.).
Further advantages of embodiments of the invention include compliance with additional requirements such as authentication with ubiquitous devices (smartphones, tablets, etc.), protection against manipulation, counterfeiting and the trading of endangered species via illegal distribution channels, and high efficiency.
Finally, the method according to the invention must not invade or damage the material, and at the same time it must be highly reliable and robust. Since public sensitivity is high and the value of wildlife trade runs into billions, the trade is highly regulated to better manage and control the survival of wild species. The invention allows efficient and economical management of these regulations.
The present invention provides a method of capturing the appearance of a reptile fur from a camera-based device and allowing the unique identification of the fur or a portion of the fur.
The first-mentioned object is achieved by a computer-implemented method for identification of surfaces comprising scales, in particular reptile fur, comprising the steps of: acquiring at least one image of the surface portion to be identified; an edge feature detection step, in which features corresponding to the boundaries of scales in the image are detected by scanning the acquired image along scan lines over an area assumed to contain a plurality of scales, to acquire an intensity or colour curve; an edge identification step, determining scale edge candidates of one or more scales based on the acquired scan-line image curves; a scale construction voting step, determining which scale edges belong to which particular scale based on the scale edge candidates; a scale vote accumulation step, determining a dataset representative of each identified scale, the dataset optionally comprising one or more data items taken from the group comprising data relating to a fitted ellipse, the major and minor axes of the ellipse, and the central position of the ellipse; using the dataset of each identified scale to build a graph of the repeating pattern of positions of the detected scales; a scale recalculation step, identifying further scales where the established scale pattern exhibits gaps; determining the contour of each detected scale and creating a representative dataset comprising the contour data of each detected scale; determining identifying characteristic data of the detected scales from the plurality of representative datasets of the surface comprising scales; and storing the identifying characteristic data of the surface comprising scales in a database.
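The edge feature detection and edge identification steps above can be sketched as follows. This is a minimal, hypothetical illustration: the patent specifies scanning along scan lines to obtain an intensity or colour curve, but the gradient operator and threshold used here are assumptions.

```python
def scan_line_edges(intensities, grad_threshold=30):
    """Detect scale-edge candidates along one scan line.

    `intensities` is the 1-D brightness profile sampled along the scan
    line; a candidate is reported wherever the local gradient magnitude
    exceeds `grad_threshold`. A minimal sketch of the edge-feature
    detection / edge-identification steps; the central-difference
    gradient and the threshold value are assumptions."""
    edges = []
    for i in range(1, len(intensities) - 1):
        # central-difference gradient of the intensity curve
        g = (intensities[i + 1] - intensities[i - 1]) / 2.0
        if abs(g) >= grad_threshold:
            edges.append((i, g))  # position and signed edge strength
    return edges

# Bright scale (200) interrupted by a darker gap (50): candidates
# cluster at the falling and rising transitions of the curve.
profile = [200, 200, 200, 50, 50, 200, 200]
print(scan_line_edges(profile))
```

In a full implementation these per-line candidates would feed the subsequent scale construction voting step, with many scan lines crossing each scale.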
Preferably, the scale construction voting step is followed by a scale verification step comprising checking the acquired data relating to an identified scale against predetermined scale profile attributes.
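Such a verification step might, for example, check the fitted-ellipse attributes of a candidate against plausibility bounds. The concrete attributes (area and axis ratio) and bound values below are illustrative assumptions; the patent requires only a check against predetermined scale profile attributes.

```python
import math

def verify_scale(major_axis, minor_axis,
                 area_bounds=(10.0, 2000.0), max_axis_ratio=3.0):
    """Check an identified scale candidate against predetermined
    profile attributes. Attributes and bounds are hypothetical."""
    area = math.pi * (major_axis / 2) * (minor_axis / 2)
    if not (area_bounds[0] <= area <= area_bounds[1]):
        return False  # implausibly small or large for a scale
    if major_axis / minor_axis > max_axis_ratio:
        return False  # too elongated to be a scale
    return True

print(verify_scale(12.0, 8.0))  # True: plausible ellipse
print(verify_scale(40.0, 4.0))  # False: axis ratio 10 is rejected
```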
The step of building a graph of the repeating pattern of positions of detected scales may further comprise determining adjacent, non-overlapping scales from the group of identified scale candidates.
The above-mentioned method forms the basis of a method for tracking and tracing animal skins, in particular reptile skins, comprising the steps of: performing the above method on an animal fur sample comprising said surface with scales to obtain identifying characteristic data; comparing the acquired identifying characteristic data of the animal fur sample with previously acquired and stored sets of identifying characteristic data of surfaces comprising scales of reptile fur, in order to identify the surface portion of the animal fur sample within the stored data; and, in the event that the acquired identifying characteristic data of the animal fur sample matches a stored set, updating the database with the comparison result and the newly acquired identifying characteristics.
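A minimal sketch of this comparison-and-update step, under the assumption that the identifying characteristic data can be condensed into fixed-length feature vectors compared by Euclidean distance (the patent does not prescribe a metric, a data layout, or the names used here):

```python
import math

def identify_and_update(acquired, database, max_distance=0.2):
    """Match an acquired identification-feature vector against stored
    feature sets and, on a match, update the stored record.

    Hypothetical sketch: nearest-neighbour search over Euclidean
    distance with an acceptance threshold is an assumption."""
    best_id, best_dist = None, float("inf")
    for fur_id, stored in database.items():
        d = math.dist(acquired, stored["features"])
        if d < best_dist:
            best_id, best_dist = fur_id, d
    if best_id is not None and best_dist <= max_distance:
        # store the freshly acquired features so the record tracks
        # the surface as it evolves through the supply chain
        database[best_id]["features"] = acquired
        database[best_id]["matches"] = database[best_id].get("matches", 0) + 1
        return best_id
    return None  # no traceable identification

db = {"fur-001": {"features": [0.11, 0.52, 0.33]},
      "fur-002": {"features": [0.90, 0.10, 0.70]}}
print(identify_and_update([0.12, 0.50, 0.34], db))  # fur-001
```

At realistic database sizes (millions of furs), the linear scan would of course be replaced by an indexed search.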
Preferably, the step of comparing the acquired identifying characteristics is then followed by the step of updating the database with the comparison result and the newly acquired identifying characteristics, so that the authenticity of the surface can be checked over time.
The step of detecting features corresponding to the boundaries of scales in the image may include: an edge feature detection step of acquiring an intensity or colour curve by scanning the acquired image along scan lines over a region assumed to contain a plurality of scales; an edge identification step of determining scale edge candidates of one or more scales based on the acquired scan-line image curves; a scale construction voting step of determining which scale edges belong to which particular scale based on the scale edge candidates; and a scale vote accumulation step of determining a dataset representative of each identified scale, the dataset optionally comprising one or more data items taken from the group comprising data relating to a fitted ellipse, the major and minor axes of the ellipse, and the central position of the ellipse.
When the same surface is scanned at different times and identified, the acquired identifying characteristics are preferably stored as updated identifying characteristics, so that the evolution of the surface over time can be tracked and smaller changes in surface properties can be recorded.
When the same surface is scanned at such different times, other portions of the same surface may also be scanned to obtain their identifying characteristics, and when the same surface is identified, the identifying characteristics of those other portions are stored as separate datasets of updated surface-portion identifying characteristics. This is particularly helpful when the surface is later separated into smaller portions for separate handling, processing and sale.
The method is preferably performed in a computer system comprising a processor, a computer storage means comprising a computer program product adapted to perform the above mentioned method steps, further comprising a camera adapted to acquire an image of a surface to be identified, and further comprising a computer memory for storing the acquired identification features in a database.
Further embodiments of the invention are set forth in the dependent claims.
Drawings
Preferred embodiments of the present invention will now be described with reference to the accompanying drawings, which are for the purpose of illustrating the presently preferred embodiments of the invention and are not for the purpose of limiting the same. In the drawings, there is shown in the drawings,
FIG. 1 shows visible details on reptile fur in three detailed views in the range of 1 μm to 1000 μm;
FIG. 2 shows a camera shot of microscopic details of three different sized scales and an assembly of scales;
fig. 3 shows three camera photographs of microscopic details of a scale or an assembly of scales;
FIG. 4 illustrates the main steps of the identification method and the interactions between the system and the supply chain;
FIG. 5 shows a first part of the development of a final fur product step by step during the supply chain;
FIG. 6 shows a second part of the development of the final fur product step by step during the supply chain after FIG. 5;
fig. 7A to 7C show in 7A) a photograph of a fur area, in 7B) a simplified graph, and in 7C) the final product in the form of a belt;
fig. 8A and 8B show in 8A) a flow chart giving an overview of the method, and in 8B) the sub-steps of the second method step of 8A);
fig. 9 shows a series of scales with detection of edge features according to the steps mentioned in fig. 8B);
fig. 10 shows elements identified for voting in the voting step as mentioned in fig. 8B);
FIGS. 11A-11C illustrate in three steps how to prepare the voting step from FIG. 8B) based on the scan line results from FIG. 10;
fig. 12A to 12C show three possible voting elements;
FIGS. 13A-13D illustrate method steps relating to adjacency and neighborhood assessment;
FIG. 14 shows three planes that account for the clustering of ellipses;
fig. 15A and 15B show in 15A) a contour construction step for scales and in 15B) a detail view;
FIGS. 16A and 16B illustrate joint contour correction of neighbors in two different representations;
FIGS. 17A and 17B show two representations of a coat prior to identification;
FIGS. 18A-18C show a basic comparison of small pieces of reptile fur;
FIGS. 19A and 19B illustrate a rotation invariant index;
FIG. 20 shows a detailed profile comparison;
fig. 21A-21C show a grid of scales and its graph representation with nodes and graph links in fig. 21A, and surfaces associated with representative data attributes of scales in figs. 21B and 21C;
fig. 22A to 22C show the effect of size on scales in fig. 22A, a missing-scale case in fig. 22B, and the correlation across the gap line in missing-scale detection in fig. 22C.
Detailed Description
The devised solution consists of a portable computer, preferably with a processor, memory and a camera, combined with a broader remote computer system, preferably using serialization, traceability and data management elements, in order to track reptile fur effectively and transparently at item and part level in real time. Remote access is supported, coupled with an authentication system that protects commercially sensitive information and can also manage user-specific privileges. Traceability data can be obtained globally and at any time. The system thus relies on a secure 24/7 database that can easily be accessed in real time using all common connectivity solutions such as cellular telephone standards or other networking technologies. The information stored in the database covers the entire value chain of fur-based products, from upstream hunters and/or farms all the way to the final product (comparable to "from cradle to grave" and beyond).
The invention includes many advantageous elements that can be implemented in different embodiments of the invention. Such features are in particular:
First, a reliably extracted representation of the fur enables identification among a large number of furs, covering a range of 10 to 100 million individual furs, preferably 1'000 to 1 million individual furs. The representation must be resistant to any kind of fur treatment, since it is the key element of a robust tracking process across the entire supply chain, and it must tolerate various impairments of surface, shape or drape. The numbers mentioned above are based on reasonable commercial estimates and on efficient data processing in terms of speed and data volume, without loss of generality. The various parts of the representation are damaged to different extents. The most stable part is the shape and relative position of the scales of the reptile fur. More exposed to damage, and absent over parts of the supply chain, is the wrinkle microstructure, which is a characteristic feature at a certain step of the supply chain but is lost in a deterministic manner on entering the next step. Because the representation is based on biometric elements of the visible scale-type fur texture, it prevents any manipulation attempt by illegal intervention.
In some particular cases, the fur representation may be converted into a unique identifier, the UFI (unique fingerprint identifier). This conversion presupposes approximately the same fur area being represented, similar acquisition conditions and a foreseeable number of furs.
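One plausible way to derive such a UFI is sketched below under stated assumptions: the quantization grid, the SHA-256 hash and the truncated hex encoding are all illustrative choices, not taken from the patent. Quantization is needed because a hash maps only identical inputs to the same UFI, so nearby measurements of the same fur must first snap to the same grid values.

```python
import hashlib

def unique_fingerprint_identifier(features, grid=0.05):
    """Derive a UFI from a fur feature representation by quantizing
    the feature values and hashing them. Hypothetical sketch; the
    grid size and hash choice are assumptions."""
    # snap each feature to a coarse grid so small acquisition noise
    # does not change the identifier
    quantized = tuple(round(v / grid) for v in features)
    digest = hashlib.sha256(repr(quantized).encode()).hexdigest()
    return digest[:16]  # shortened for readability

# Two acquisitions of the same area differing by small noise map to
# the same UFI.
a = unique_fingerprint_identifier([0.401, 0.899, 0.250])
b = unique_fingerprint_identifier([0.399, 0.901, 0.251])
print(a == b)  # True
```

This mirrors the constraint stated above: the conversion only works under similar acquisition conditions, i.e. when noise stays within the quantization grid.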
Second, the combination with value-added features increases the usefulness of traceability, for example: fur quality control, central regulatory management and optimized breeding, species identification, internal tracking, optimal cutting selection, assessment of fur parts that look good to the human eye, and suitability of symmetric fur parts for bags. The traceability characteristics themselves remain stable and, by technical and software means, provide a high level of identification across the whole supply chain or at least part of it.
Third, an open IT platform comprising: multi-server solutions, or correspondingly cloud solutions, i.e. Amazon Web Services (AWS), Microsoft Azure or other clouds from Google, Samsung, Huawei, etc.; and smartphones, i.e. iPhone X, Galaxy S10, Huawei P20, etc., based on all common operating systems, e.g. iOS (Apple), Android (Google) or HongmengOS/HarmonyOS (Huawei), but in principle also operating systems based on proprietary solutions, enabling dedicated scanners for capturing microstructural details of reptile skins.
Fourth, the high integrity of the supply chain, which fulfills formal obligations such as required documentation, tax, customs declaration, CITES certificates, etc., as well as ensuring the authenticity of the fur product at each point of the supply chain.
Fifth, methods and algorithms for reliably extracting the appearance of a fur under any acquisition condition (using video, partial recognition, alignment based on partially overlapping data, composition of fur maps from elements, user guidance under various acquisition conditions, detection of appearance features adaptive to environmental conditions).
Sixth, method steps and algorithms for reliably extracting individual scales regardless of the elements present within the scale (texture, reflections, broken boundaries, marks, salt, invisible edges, etc.).
Seventh, method steps and algorithms for building a grid of scales and measuring their properties.
Eighth, the scale shape is adjusted to accommodate possible deformations or resistance to deformations experienced by the fur during the supply chain.
Ninth, method steps and algorithms for calculating the basis of identification by propagation through the grid of scales.
Tenth, several layer identification methods, both macroscopic and microscopic, that rely on invariant features and variations.
Eleventh, a method for calculating various features (such as the "beauty" of certain parts as perceived by the human eye) from the resulting fur representation.
Figure 1 shows visible details on reptile fur in three detailed views in the range of 1 μm to 1000 μm (microns). This technical approach relies on the macroscopic coat appearance and the visible microscopic surface structure and texture of animal coats. Other disclosures related to paper, plastic, fabric/textile, leather, etc. propose similar concepts. In contrast to other published methods, the present invention analyzes surface structures covering a continuous range from 1 micron to more than a few centimetres, as in fig. 1, with different types of structural elements. In addition, specific geometric structures, such as scales, are extracted.
A first range 1, with sizes from 1 to 100 μm (micrometres), shows characteristics of the surface microstructure on the scales and in the gaps between them. A second range 2, comprising 100 μm to several centimetres, particularly shows characteristics of a plurality of reptile scales.
The smallest-scale coat element 7" in the range of 1 to 5 microns, shown in the first detailed view, is characterized by the edges 4 of the scales, the gaps between the scales and the irregularities of the folds 3 on the scale surface of the original reptile coat 12. The preferred image capture device is a microscope, or correspondingly a hand-held microscope, but may also be a dedicated lens or a dedicated lens-camera combination (counter-coupled lens).
The second detailed view of fig. 1 shows a middle portion or middle size scale coat element 7' of a first size range that is already available from a high resolution image capture device (e.g., a camera of a prior art smart phone). The characteristic parameters of this intermediate portion are the shape of the centre point 5 of the scale and the scale periphery 6 of the scale.
The third detailed view of fig. 1 shows the upper part of the second range 2, where, for the larger-size scale coat elements 7 of the surface, parameters such as the combination of centre points 5 of a plurality of scales are revealed; a multi-dimensional centre-point vector is defined based on the positions of the scale centres 5 and the links 11 between them. The larger size 7 refers to a detailed view of a scale image with sides of 100 micrometres to 10 millimetres, showing a plurality of scales with their scale centre points 5. Reference numeral 11 denotes a point-to-point connection between two scale centres 5. The distribution of the scale centres 5 and of the links 11 is specific to each reptile fur.
The geometric relative positions of the scale centres 5 form an undirected graph (a mathematical structure with nodes connected by lines/links). Nodes (e.g. scale centres 5) may have attributes such as coordinates or size, and connections (i.e. links 11 between scale centres 5) may likewise have attributes. In the present case of a regular grid, such a graph has a meta-layer in which information on the frequency or probability of the scale distribution is stored. It is therefore related to a probability map.
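The undirected graph of scale centres can be sketched as follows. The distance-threshold linking rule is an assumption: the text only requires point-to-point links 11 between scale centres 5, with attributes on nodes and links.

```python
import math

def build_scale_graph(centres, max_link=1.2):
    """Build an undirected graph of scale centres: each node carries
    its coordinates, and a link connects two centres closer than
    `max_link` (in the same units as the coordinates).

    Minimal sketch; the linking rule and threshold are hypothetical."""
    nodes = {i: {"xy": c} for i, c in enumerate(centres)}
    links = []
    for i in range(len(centres)):
        for j in range(i + 1, len(centres)):
            d = math.dist(centres[i], centres[j])
            if d <= max_link:
                # store the link length as a link attribute
                links.append((i, j, round(d, 3)))
    return nodes, links

# Four centres on a rough unit grid: the four grid neighbours get
# linked, the two diagonals (distance ~1.414) do not.
centres = [(0, 0), (1, 0), (0, 1), (1, 1)]
nodes, links = build_scale_graph(centres)
print(len(links))  # 4
```

The meta-layer mentioned above could then be filled with statistics over the link-length attributes (e.g. a histogram of distances between neighbouring scale centres).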
Fig. 2 shows the relationships at the microscopic level in the form of photographs taken with a real camera, corresponding to fig. 1. The camera may be that of the smartphone mentioned above, and the following method may be performed by a processor in the smartphone. Any data obtained as basic data for performing the method steps is stored in the memory of the smartphone; the acquired data may be stored in a further memory of the smartphone. It is also possible that no computation or storage beyond what is necessary to use the camera takes place in the smartphone, the rest being provided by a network service and directed by a remote processor. Wherever the term "method step" is used in the present application, it relates to a method performed by a computer processor through a computer program provided for this purpose in a storage means of a computer device. The computer program may be provided in the mobile device that takes the picture, or the mobile device may host only the image-taking program, transmit the images for all other computing steps to the remote computer, and receive back only the following information: in the case of enrolling a reptile fur, that enrolment and data collection were successful; in the case of examining a portion of reptile fur, that the portion is identified or not.
Identical features carry the same reference numerals, such as the centers 5 of the scales, while the three scales shown are numbered: a first scale 8, a second scale 9 and a third scale 10. Since scale sizes are typically between 5 mm and 4 cm, the image sizes here can be 5 mm by 2 cm up to 4 cm by 12 cm. For a better understanding, fig. 2 shows a non-realistic cut-out of reptile fur at a medium level, as obtainable from a high-resolution camera. The side lengths of several scales are indicated by the scales 8, 9 and 10. The representation of fig. 2 shows a smooth shape of each scale perimeter 6 and the center point 5 of each scale, the scale perimeter 6 varying from scale to scale in a clearly random manner. The unique pattern of a selected arrangement of the plurality of center points defines one fingerprint dimension of the fur of different reptiles.
Another fingerprint dimension evolves from the perimeters of a selected plurality of particular shapes of the corresponding scales. The diagram of fig. 3 shows representations of reptile skins in different production steps, including the original reptile skin 12 prepared shortly after slaughter of the individual animal, the pre-tanned skin 13, and the leather surface 14 cut from the same individual animal. At least a part of the scale structure 15 remains unchanged during all steps of the processing of reptile fur. The invariant structural elements are assigned during enrolment, once the reptile has formed its typical scaly fur. Thus, registration must be done by the reptile breeder. In the case of a legally hunted individual animal, registration needs to be done by the hunter or the distribution station, respectively, where the reptile's body or coat is brought into the supply chain. It is assumed that the region 15 can be recognized by the user performing registration in case the region 15 is a relevant body part. Otherwise, more parts, up to the majority of the surface of the reptile's coat, may be scanned, so that each cut body part can later be identified.
The main steps of the interaction between the identification system and the supply chain are shown in fig. 4. The steps of registration of the main coat portion 16 on the animal, verification 17 of the animal, verification 18 of the main coat portion, registration 19 of the full raw coat, verification 20 of the raw coat, tanning 21, verification 22 of the tanned coat, registration 23 of the whole coat on a macro/micro level, cutting 24, verification 25 of the coat portion and final verification 26 of the end product indicate at which moment a coat or part thereof is to be registered and at which moment it can or should be verified. Any check that does not confirm the current validation step results in exclusion of the corresponding reptile fur portion. In other words, the enrolment process described later as part of the computerized method may occur at any point between the verification 17 of the animal and the verification 26 of the final product. Animal verification 17 refers to the examination of the animal at any stage of its life (of course, after hatching from the egg for oviparous reptiles, or after birth in the case of viviparous reptiles).
In contrast to the block diagram of fig. 4, figs. 5 and 6 visualize the life cycle of reptile fur in two parts: from the egg 27 to the hatchling 28, to the young animal 29, to the coat 30, and on to the different stages of the coat portion, i.e. the first stage 31 of the coat portion (raw), the second stage 32 (tanned) and the third stage 33 (ready for further processing), up to the end product 41. The converting steps include the following conversions: egg to hatchling 36, hatchling to young animal 37, coat preparation 38, coat portioning and painting 39, tanning and painting between the different stages 31, 32 and 33 of the coat portion, and object manufacturing 40. The reptile skin may be registered as early as the hatchling stage 28, shortly after the region 42 of identified invariant elements has formed in the scale structure of the reptile's coat.
The area containing invariant elements (or elements that co-vary with some transitions during the supply chain), to be analyzed at each step by the image-capture device, is referred to as the fingerprint area. The position of the fingerprint area is indicated by a characteristic pattern 15 generated by the arrangement of a plurality of scales of a certain size, which can easily be detected even by a low-resolution camera.
The digital fingerprint, which can be obtained by a medium- to low-resolution image-capturing device, is specified by the size and shape of the individual scales and by the size, shape and arrangement of a plurality of selected individual scales 5, which, owing to their neighborhood properties, provide a vector.
Figures 7A to 7C demonstrate the particular shape 43 produced on the fur by a collection of contiguous scales at a defined location on the reptile fur region in fig. 7A, delimited by the boundary 44 of the region. As reflected in the scale-based coat representation of fig. 7B, those elements correspond to a perimeter-representation area boundary 45 surrounding the represented scale-collection area 46. Such a representation may later be associated with an area 48 on the end product 47, as in fig. 7C. An added value of this technique is that an area on the final product can eventually be identified as a certain portion of a known coat, e.g. as the coat portion of the hatchling 28.
The visual appearance of the coat evolves along the steps of its life: hatchling, young animal, adult animal, fur, salted fur, dried fur, tanned fur, painted fur, treated fur. As outlined in fig. 5, different elements of this appearance are present during the various stages of the supply chain. The scale profiles are the most representative information: they are resistant to injury throughout the life of the coat and undergo only limited degradation. Their disadvantage is that a large area is required to describe the fur. The color or texture of the scales or scale interiors is quite unique information, but it disappears after tanning and cannot usually be relied upon. In contrast, wrinkles along the contours of the scales appear after tanning and carry fairly extensive information about the area of the fur. These folds allow identification of the fur, especially when it is cut into small pieces such as wristbands.
The business process behind using the appearance of a coat for identification involves capturing all or part of the coat at one or more steps in the supply chain and storing or updating information about it in a local or central database. Furthermore, it is verified whether the appearance of a whole coat, or of parts thereof, matches a stored coat. The aim is to identify a whole coat, or a piece as belonging to a coat.
The method according to the invention comprises creating/extracting a robust representation of the coat and using this representation to identify the coat or parts thereof, and to provide value-added services such as quality checks or selection of the correct area for a wristband. The hatchling 28 may initially be registered via a particular body part 15; only after inspection of the treated coat (e.g. coat 30 or a coat in one of the stages 31 to 33), i.e. as an inspection of the whole animal, is the coat cut into pieces and re-registered part by part, i.e. each cut part is re-registered and linked to the initially registered total coat identity. A portion used in the pack 41 or belt 47 can then be tracked, the identification portion 48 of course being different from the identification portion 42.
The technical description of the method is presented below and consists of the following parts: a method overview; image acquisition and initial candidates.
The method should operate over a wide perimeter of acquisition conditions. This includes: any light, any phone, and any (or most) fur conditions.
The method of fur-representation extraction is summarized in figs. 8A and 8B as a method overview. Fig. 8 shows a flow chart of the overall algorithm, steps 49 to 55, in fig. 8A, and the sub-steps of the second step 50 in fig. 8B.
The first step, acquisition 49, is responsible for image or video acquisition. It does not rely on the smartphone's capabilities for selecting parameters. In essence, exposure and focus are set in the app according to a proprietary adaptive focusing algorithm, so that the focus is set when needed without relying on the phone. The goal of focusing is to reveal clear details of each of the coat elements (scale boundaries, folds, smooth changes in depth of the scale inner surface). The focus is directed to reveal those elements in the image on which identification will be based: scale shape, deformation of the fur at several scale levels, clarity of scale boundaries, clarity of folds. Alternatively, instead of one focal position per feature, a range of focal planes may be used, and the characteristics of the folds are then expressed, for example, as a geometry spanning several focal planes. To a lesser extent, white balance and image resolution are selected. The raw image format is preferably used.
In video mode, real-time feedback on the quality of the video being captured is constantly provided to the user as guidance. During video capture, quality parameters are measured from the video frames and image metadata at a rate between 1 and 120 frames per second. The parameters measured from the video include: the focus and thus the distance of the fur from the phone, the sufficiency of the light, the color of the fur, and the presence of reflections, but this is not a limiting list. The result of the first step 49 is a video or image sequence, i.e. at least one image, of sufficient quality to be processed in the next step.
The second step of the algorithm is the scale candidate step 50. The goal of this step is to find initial scale positions based only on the local properties of each scale. "Initial scale position" means that no information outside the scale perimeter contributes to or affects the detection of the scale boundary. In other words, the detection of a scale is done only on the basis of the information available around its boundary (hence "local"). In contrast, at a later stage, if a scale does not have a complete boundary and cannot be detected locally as described above, while its neighbours have all been detected, the information from the neighbours optionally makes it possible to raise the confidence that a scale boundary should appear at that location, and to complete its detection even though some information is locally missing. Thus, contextual information can be used at a later stage to detect scales. For understanding, the opposite approach would be to use the properties of neighbouring scales to detect all scales. As explained above, perhaps only 20% of the boundary of a scale is visible in the image, so that the scale cannot be detected on its own. However, the adjacent scales can be detected, showing their positions, sizes, orientations and shapes (squares, circles, diamonds). From a probabilistic point of view, the relative positions of the scales in a group indicate where the remaining scales should be, and support the hypothesis that a scale with only 20% of its boundary visible is present. Of course, 20% is only an example; it generally refers to the fact that only about one quarter of a rectangular scale boundary can be detected directly. The result of the second step is a collection of individually detected scales, a probability map of the detected scales together with the expected positions and attributes of the still undetected scales, as well as scales newly detected from partial local information enhanced by the confidence obtained from the neighbours.
The third step of the algorithm is the build-up of the grid map, step 51. It corresponds to building a grid of scales and calculating the properties of the grid, such as the repetitiveness of the scales and the distribution of scale sizes and their evolution with direction. The result of this step is a graph in which the scales are nodes and the arcs are adjacency links to neighbouring scales. Each scale is characterized by its geometric properties, by the evolution of those properties along the evolution direction, and by a probability map of the centres. In this context, geometric properties refer to the fact that each scale is represented by its centroid or geometric centre and by shape-description parameters, such as the major and minor axes of an ellipse and the attributes of its contour, which can be represented by simple points at a given resolution, by vector curves approximating the boundary, or by the parameters of various shape-representation techniques. Evolution means that the grid of scales cannot contain a small-large-small-large sequence of scales in one direction: either there is a consistent evolution/propagation of the geometric properties of the scales, corresponding to approximately the same orientation and the same size, or there is a clear property boundary corresponding, for example, to the transition between the belly region and the flank region, where square scales are replaced by round scales. The central probability map relates to the fact that, if a map of scale centres is created, it corresponds to a grid whose step between scale centres is about the same everywhere when the scales are of the same size. Thus, the probability of encountering a scale at a given location can be established from the grid properties. This is labelled the "centre graph" herein.
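As an illustration of the "centre graph" idea above, the sketch below estimates the grid step as the median nearest-neighbour distance between detected scale centres; the function name and the sample coordinates are invented for the example, not specified by the patent:

```python
from math import hypot

def estimate_grid_step(centers):
    """Estimate the grid step of a scale centre map as the median
    nearest-neighbour distance between detected centres (x, y)."""
    dists = []
    for i, (xi, yi) in enumerate(centers):
        nearest = min(hypot(xi - xj, yi - yj)
                      for j, (xj, yj) in enumerate(centers) if j != i)
        dists.append(nearest)
    dists.sort()
    return dists[len(dists) // 2]

# Centres of same-size scales lie on an approximately regular grid.
centers = [(0, 0), (9, 1), (19, 0), (1, 10), (10, 10), (20, 11)]
step = estimate_grid_step(centers)
```

With such a step estimate, the expected position of an undetected scale can be predicted as roughly one grid step from a detected neighbour, which is the basis of the probability map used in step 52.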
The fourth step of the method is the recalculate-scales step 52. It corresponds to trying to find additional scales where the grid probability predicts them. Since the coat of the animal is not known at identification time, the probability of a scale position is based only on the scales present and visible in the image. Thus, the probability map, or grid probability, is constructed from the scales detected in the image, which serve as the basis for building the grid; in other words, a scale that could not be detected, because its boundary is partially missing or unclear, is preferably not used as a starting scale. This step serves to recover scales where the gap in the scale pattern is not too large. In other words, where the second step failed to establish the presence of a scale, but the gap is not a gap between scales but an undetected scale, this step attempts to identify and distinguish additional scales in the image taken in the first step. In the first pass, some scales are not detected because only incomplete local information is available (e.g. only 20% of the boundary). At the level of the grid or probability-map construction, the following prediction or expectation is obtained: given the observed positions of the detected scales, such a grid is likely to be in place, and scales are likely to be present at certain locations even though they were not detected in the first step. Furthermore, the grid provides not only a probability of location, but also probabilities of orientation and size. With this knowledge, a more deterministic re-detection of the missing scales becomes possible. The re-detection step is considered a mandatory step in the method according to the invention, improving the overall scale-detection quality.
The fifth step is the build-profile step 53, which enables the precise profile of each scale to be built based on several criteria and adjacency constraints. These are subsequently described in connection with fig. 15 ff. of the present description. In short, they are criteria (or constraints) consistent with scales on reptile fur, relating to the geometry of the scale (e.g. smoothness) and to the correspondence and adjacency of high gradients in the image. The result of this step is an accurate representation of each scale by its geometry. Here, accurate refers to a geometric representation of the scale shape reflecting the properties of the scale down to details of a defined size, preferably down to 500 micrometres.
Once the representation of the fur has been established, it can be identified by an identification step 54, the identification step 54 consisting of several sub-steps as explained below.
Finally, a verification step 55 is responsible for the detailed verification of relative scale positions. Fur identification is performed with a certain tolerance. Once the coat has been identified, a detailed verification can be made, which can be seen as a more detailed comparison between the two representations at a higher accuracy. While the initial identification is targeted at speed, the verification checks whether the differences between the two representations can be classified as resulting from stretching, cutting or tanning. Thus, verification is the step in which two coat representations are compared and accepted as the same over some area, with the differences classifiable as reasonable.
The determination of scale candidates is shown in fig. 8B; in more detail, the scale candidate step 50 is shown in the lower part of fig. 8B. This step of the method comprises three phases plus a final step: edge features are extracted in an edge-features step 56, votes are defined for the elements of the scale/protrusion structure in an element-votes step 57, votes are cast in a voting step 58, and finally the votes are accumulated in a vote-accumulation step 59.
As shown in figs. 9 and 10, the edge-feature detection step 56 is performed using one-dimensional scans of the image. Without loss of generality, it can be replaced by a standard 2D filter, i.e. an edge-detection operation of image processing with adaptive parameters that can reveal scale properties such as boundaries, wrinkles or texture. In fig. 9, the image, showing three "picture elements" one beside the other, is scanned with a first scan line 61 and a second scan line 62. The intensity or color curves corresponding to those scan lines are a first color curve 63 and a second color curve 64. One, two or more scan lines such as the scan lines 61 and 62 may be provided and executed simultaneously or sequentially. The scan lines 61 and 62 are parallel to each other; non-parallel scan lines are possible but require further calculation.
For each scan line 61 and 62, the processor calculates a maximum envelope and a minimum envelope, defined by the local weighted maxima 65 and local weighted minima 66 shown for the first scan line 61. The envelope may be evaluated for each scan line 61, 62 separately, or information may be shared between the scan lines 61, 62. The values of the envelope can be calculated with a variable sampling density.
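A minimal sketch of such an envelope computation is given below, using an unweighted sliding-window maximum/minimum; the patent's local weighted variant would additionally weight samples (e.g. by distance to the window centre), which is omitted here for brevity:

```python
def envelopes(profile, window=5):
    """Upper and lower envelopes of a 1-D scan-line intensity profile,
    taken as the local max/min over a sliding window of half-width
    `window` (an assumed parameter)."""
    n = len(profile)
    upper, lower = [], []
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        segment = profile[lo:hi]
        upper.append(max(segment))  # maximum envelope 65
        lower.append(min(segment))  # minimum envelope 66
    return upper, lower

# Synthetic scan line: a bright scale interior between darker boundaries.
profile = [10, 12, 11, 60, 62, 61, 12, 11, 10]
up, low = envelopes(profile, window=2)
```

A variable sampling density, as mentioned above, would correspond to evaluating the envelope only at selected indices `i`.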
The edges of objects in the image, such as scales, are first detected as intensity or color transitions relative to the envelopes. Transitions of different magnitudes may be accepted. Without loss of generality, a simple first edge-transition pair (edgelet) 67 and a second edge-transition pair (edgelet) 68 can be detected as two adjacent transitions along the two scan lines, which may be spaced differently, preferably with a parallel distance between them. It should be noted that various combinations of transitions may give multiple hypothesized edges, and that double transitions carry useful information such as edge orientation. Rising and falling transitions are distinguished for the different scan lines 61, 62, and this information is stored.
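The detection of rising and falling transitions along one scan line can be sketched as follows. The threshold `min_step` is a hypothetical parameter (in a full implementation it would be derived from the envelopes described above):

```python
def transitions(profile, min_step=10):
    """Detect rising/falling intensity transitions along one scan line.
    Returns (index, 'rise'|'fall', height) tuples; min_step is an assumed
    threshold on the transition height."""
    found = []
    for i in range(1, len(profile)):
        delta = profile[i] - profile[i - 1]
        if delta >= min_step:
            found.append((i, "rise", delta))
        elif delta <= -min_step:
            found.append((i, "fall", -delta))
    return found

# Same synthetic scan line as before: one rising and one falling edge.
profile = [10, 12, 11, 60, 62, 61, 12, 11, 10]
result = transitions(profile)
print(result)  # [(3, 'rise', 49), (6, 'fall', 49)]
```

Storing the rise/fall direction, as the text notes, is what later allows complementary slopes on opposite scale walls to be matched.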
A first diagonal scan line 261 and a second diagonal scan line 262 are shown to indicate that different scan lines may also provide different answers. Corner information is less preferred.
A first vertical scan line 361 and a second vertical scan line 362 are shown to indicate that scan lines perpendicular to the first scan line 61 may also be used, which would yield similar side information as curves 63 and 64.
Depending on the image acquisition and the type of coat, various configurations are encountered, as shown in fig. 10. An edge 69 of a scale may be shared by two adjacent scales; scales may contain artifacts 69'; weak edges 70 may be present when the contrast is low or when the individual edges of each scale differ; and missing edges 71 may be present when an edge is seen on only one side of the scale.
The first scan line 61 produces the first color curve 63, which exhibits several types of transitions. In the first case, the transition type 72 has clear rising and falling transitions. In the next case along scan line 61, the transition type 73 has a clear double transition, one side of which is less pronounced. In the case of the transition type 74, a clear single transition is followed by no transition at all for the next scale.
Once the edge-feature detection step 56 is complete, the next step of the scale-candidate module 50 is the identification of the elements for the vote step 57. In computer vision there is always a trade-off between grouping features to cast fewer but more certain votes, and casting more numerous votes based on small features that are individually less reliable. Here, several possible voting options are presented. The first stage of voting is to form pairs of edges. As shown in figs. 11A and 11B, edges are identified along the scan lines, and several edges are detected along the scan lines in fig. 11C. Note that two scan lines together give clear information about the edge orientation, which is preferable. Using scan lines instead of edge-detection filters provides a very important speed advantage, although both methods can be used. In other words, using scan lines is one embodiment of detecting scale boundaries, which performs better than the edge-detection filters also mentioned. At this stage, the scales are unknown, and such edges can belong to anything: scale boundaries, reflections, textures or markings, as explained in connection with fig. 10.
Fig. 11A illustrates the selection of single transitions into a transition pair 76, characterized by several properties on the basis of which the pair is selected from the set of transitions. First, if the transitions originate from one edge, their heights should be similar. The color properties of the scale boundary must also be similar: if it is a brown texture, the two curves should show a similar transition from one color to the other. In contrast, transitions from small objects such as wrinkles will exhibit quite different properties between the scan lines and can be discarded. Of course, the transition should be a rising slope, or a falling slope, on both curves. The transition width can, in the simplest embodiment, be measured as the distance between the point where one side reaches its maximum and the point where the other side reaches its minimum; both transitions should have similar widths.
Fig. 11B shows a dual transition pair 176 characterized by similar intensity/color curves, rising or falling slopes, similar shapes, and acceptable widths. The second scale boundary is less distinct than the first scale boundary.
Fig. 11C shows a first edge (point pair) 75 and a second edge (point pair) 77, which are selected/paired if they have certain specific properties. One such property is the distance between them, which should lie within the range of plausible scale sizes for the given type of coat and the distance from the smartphone. The second condition is that the intensity or color slopes, directly in the image color space, should be complementary: generally there should be a falling slope 78 and a rising slope 79, although in some cases two slopes of the same type are selected. The combination of edge pairs along one or more scan lines results in possible edge pairs 80; the selected edge pair has reference numeral 81.
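One possible reading of this pairing rule is sketched below: a falling slope followed by a rising slope within a plausible scale width. The gap bounds `min_gap` and `max_gap` are invented for the example and would in practice depend on coat type and camera distance:

```python
def pair_edges(edges, min_gap=20, max_gap=200):
    """Pair detected transitions into candidate scale cross-sections:
    a falling slope followed by a rising slope, separated by a plausible
    scale width in pixels (assumed bounds min_gap/max_gap)."""
    pairs = []
    for i, (xi, kind_i, _height) in enumerate(edges):
        if kind_i != "fall":
            continue
        for xj, kind_j, _h in edges[i + 1:]:
            if kind_j == "rise" and min_gap <= xj - xi <= max_gap:
                pairs.append((xi, xj))
    return pairs

# Transitions as (position, direction, height) along one scan line.
edges = [(15, "fall", 40), (90, "rise", 38), (160, "fall", 42), (230, "rise", 41)]
pairs = pair_edges(edges)
print(pairs)  # [(15, 90), (160, 230)]
```

The case of two slopes of the same type, mentioned above, would require relaxing the `kind_j == "rise"` condition.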
Fig. 12 illustrates how the voting step 58 is triggered and executed. In section a) at the top, the two initial edges forming the primary pair, the selected edge pair 81, are the basis for the vote. Depending on their orientation, there are two possibilities. If their orientations are similar to each other, as shown in section a), they may come from opposite walls of the scale. Their corresponding secondary edges are then sought along a vertical scan 82, implemented by four secondary scan lines 161, 162, 163, 164, which are perpendicular to the primary scan line 61 (not shown in figs. 12A to 12C) and located approximately midway between the primary edge pair. The secondary edges 83 obtained from the secondary scan lines are subjected to complementary filtering.
As shown in section b) of fig. 12, in case the selected edge pair 81 assumes different orientations, a complementary secondary vertical scan 82 is still performed perpendicular to the original scan line, but at the locations where the primary edges 81 were found, to obtain secondary edges 83. Switching to a different scan line can be done quickly, since a scan line only means that image data from a still image is analyzed along a particular line. The two horizontal scan lines 61, 62 reveal that the edges (transition pairs) have different orientations and are not parallel. At the positions of these pairs, vertical scanning is performed, in particular with the scan lines 161, 162.
As shown in fig. 12C, once three or four sides of a scale have been identified, the vote can be stored. An ellipse 180 is chosen to represent the scale, as the most compact structure and the most useful for the next step. The ellipse hypothesis is stored with its ellipse center 84, ellipse major axis 85 and ellipse minor axis 86. This set of parameters represents one scale hypothesis with the smallest set of data representing the whole. The result of this step is a collection of ellipses 180 representing individual scales, established from the local structure of the scales. Inevitably, many ellipses will be votes for artifacts, and the whole generated set of ellipses must be filtered to keep the votes that correspond to scales.
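The compact ellipse-hypothesis record can be sketched as below. The helper `vote_from_sides` is hypothetical and assumes axis-aligned scale sides for simplicity, which the method itself does not require:

```python
from dataclasses import dataclass

@dataclass
class EllipseVote:
    # One vote for a scale hypothesis, stored as a compact ellipse 180:
    # center 84 (cx, cy), major axis 85, minor axis 86, and orientation.
    cx: float
    cy: float
    major: float
    minor: float
    angle: float  # radians; meaningful only when major != minor

def vote_from_sides(left, right, top, bottom):
    """Build an ellipse hypothesis from four detected scale sides
    (hypothetical helper: each argument is a pixel coordinate)."""
    width, height = right - left, bottom - top
    return EllipseVote(
        cx=(left + right) / 2.0,
        cy=(top + bottom) / 2.0,
        major=max(width, height) / 2.0,
        minor=min(width, height) / 2.0,
        angle=0.0 if width >= height else 1.5707963,  # ~pi/2
    )

v = vote_from_sides(left=10, right=70, top=20, bottom=60)
print(v.cx, v.cy, v.major, v.minor)  # 40.0 40.0 30.0 20.0
```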
An optional step of the method is the following: once a scale hypothesis has been confirmed by several consistent votes, a guided verification process is triggered, in which various scale-profile attributes are verified. A scale hypothesis means that the detected data define a scale in the image taken by the smartphone.
The next step of the method is the vote-accumulation step 59, of which several variants are embodiments according to the invention. Each vote for a scale is stored in the form of an ellipse 180 described by its position (center) and by the two axes 85 and 86 describing the height and width of the scale. If the scale is not square, and the ellipse therefore not circular, the major axis also gives the orientation. As shown in fig. 14, which can be viewed as a 2D map, the center positions serve as x, y keys, and the two axes and the orientation are stored in the map as the payload. The description refers to a data structure (called a map) that stores and accesses values by x, y coordinates. The voted ellipses 180 are first indexed/accessed/selected by the coordinates of their centers 5 (lowest plane). This selection can be performed with a tolerance, i.e. finding all ellipses whose center 5 lies in a given area defined as a rectangle, circle or other shape. Once a set of ellipses 180 has been selected on the basis of a pair of coordinates x, y, they can be clustered (second plane) by their size and orientation for consistency. From each cluster/group, the ellipse can be selected that has sufficiently many voting scale-boundary edges along its perimeter and thus corresponds to the most complete scale. In other words, a first grouping accumulates the ellipse votes 88 that are geographically close to each other, by clustering according to the ellipse centers in the center map 87; this can be done by binning for speed or by finer tree methods. Each set of clustered ellipse votes is then separated by attributes such as the major and minor axes, and orientation filtering is performed by the axes 89. One scale that triggers several ellipse votes will correspond to a voting peak in x, y space at its center position. These votes are filtered for consistency with respect to scale size, giving the votes for a particular scale size.
A certain number of votes for approximately the same center position, with the same major and minor axes, corresponds to a valid hypothesis of a coat scale. Such a hypothesis may in some cases be caused by artifacts that nevertheless meet the criteria of convexity and size. The ellipses are accessed/selected through their x, y centers, which constitute the most representative and spatially important information in 2D: by defining a range x1..x2 and y1..y2, all ellipses whose centers lie within this interval are extracted. In a second step, parameters/dimensions such as the major axis, the minor axis and the orientation of the ellipse are used; they reflect size and orientation. Thus, the ellipses selected by x, y coordinates are further grouped/clustered, e.g. first by their orientation or the size of the major axis. In a third step, if a group of e.g. 100 ellipses have similar x, y centers, similar orientations and similar major and minor axes, the distribution of the edges voting along their perimeters is examined. This, in general, is the order in which the ellipse properties are used for clustering and for the verification of the scale votes. Filtering 90 by orientation provides the remaining ellipses.
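The first two selection planes described above (center rectangle first, then clustering by axes and orientation) might be sketched as follows; the tolerances, vote layout and greedy clustering are assumptions for the example:

```python
def select_by_center(votes, x1, x2, y1, y2):
    """First plane: all ellipse votes whose center lies in the rectangle
    x1..x2, y1..y2. Each vote is (cx, cy, major, minor, angle)."""
    return [v for v in votes if x1 <= v[0] <= x2 and y1 <= v[1] <= y2]

def cluster_by_axes(votes, axis_tol=2.0, angle_tol=0.3):
    """Second plane: greedily group the selected votes by similar axes and
    orientation; a large cluster is a voting peak backing one scale."""
    clusters = []
    for v in votes:
        for cluster in clusters:
            ref = cluster[0]
            if (abs(v[2] - ref[2]) <= axis_tol
                    and abs(v[3] - ref[3]) <= axis_tol
                    and abs(v[4] - ref[4]) <= angle_tol):
                cluster.append(v)
                break
        else:
            clusters.append([v])
    return clusters

votes = [(40, 40, 30, 20, 0.0), (41, 39, 29, 21, 0.1),
         (40, 41, 30, 20, 0.05), (120, 80, 28, 19, 0.0)]
near = select_by_center(votes, 35, 45, 35, 45)
clusters = cluster_by_axes(near)
print(len(near), len(clusters))  # 3 1
```

The third step described above (examining the distribution of voting edges along the perimeter) would then be applied to each cluster.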
An additional check is performed on the elements that voted for an ellipse, namely a preliminary examination of the distribution of the voting edges along the ellipse perimeter. The result of the vote-accumulation step 59 is the set of ellipses for which a sufficient number of consistent votes have been cast.
The next step, the create-grid-map step 51, moves one level up in the hierarchy of the representation and creates a graph whose nodes are scales and whose links define neighborhoods between scales. In the previous step, ellipses were generated based on their local properties, each scale being identified as a convex profile (which may contain any kind of artifact inside it). However, the skin has a very strong scale property: the scales are repetitive and consistently located in both directions (this is what creates the easily recognizable python or alligator pattern). Figs. 13A to 13C show that the neighborhood of one ellipse is first defined by the distance to neighboring ellipses. The criterion used is the distance between centers, with a threshold defined by the size of the central ellipse, about three times its diameter, so that small cells in the vicinity of a large cell can still be reached. For biological reasons, it is rare for a scale to be four times larger than its neighbors; a reasonable (but non-limiting) search radius is therefore three times the diameter. In fig. 13A, the distances from the central ellipse 91 to all possible neighbor ellipses 92 are evaluated to establish an initial adjacency that includes all possibilities.
The next step is to filter the neighbor adjacency by applying two conditions: neighbors should not overlap, and they should share a boundary proximity 93 within a limit proportional to the sizes of the adjoining ellipses. In fig. 13B it can be seen that, after this step, the neighbors of the central ellipse 91 are only those that can correspond to real scales (which cannot physically overlap). The adjacency condition may be evaluated with respect to the full ellipse or only from the boundary portions where votes exist. At this stage the adjacency may still contain ellipses corresponding to true scales as well as ellipses caused by artifacts (such as reflections, texture or double scales), but the adjacency relation only links ellipses that are non-overlapping and whose sizes can reflect scales. Thus, the calculated ellipses 91, 92 on the left of fig. 13B correspond to the true adjacent scales shown on the right of fig. 13B; there are no spaces 94 between the detected scales.
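A minimal sketch of the neighborhood search with the 3x-diameter radius and the non-overlap condition is given below, approximating each ellipse by a circle for simplicity; the data layout is an assumption for the example:

```python
from math import hypot

def neighbours(center_idx, ellipses, radius_factor=3.0):
    """Candidate neighbours of one ellipse: centers within radius_factor
    times its diameter (the text suggests about 3x), filtered so that the
    ellipses do not overlap (circle approximation by the radius)."""
    cx, cy, r0 = ellipses[center_idx]
    search = radius_factor * 2 * r0  # three times the diameter
    found = []
    for j, (x, y, r) in enumerate(ellipses):
        if j == center_idx:
            continue
        d = hypot(x - cx, y - cy)
        if r0 + r <= d <= search:  # close enough, but not overlapping
            found.append(j)
    return found

# (cx, cy, radius) circle approximations of voted ellipses.
ellipses = [(0, 0, 10), (25, 0, 10), (5, 5, 10), (300, 0, 10)]
print(neighbours(0, ellipses))  # [1]
```

The shared-boundary-proximity condition 93 would be an additional filter on the surviving links, comparing `d` against the sum of the radii plus a small tolerance.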
In Fig. 13C, a further adjacency filter is scale attribute consistency. The grid of scales on reptile skin follows a fairly smooth pattern of variation in scale size and orientation. Several options can be applied in the method to check this consistency. One possibility is to analyze the scale grid 97, where the value of an attribute 95, for example the major-axis length, is represented as a 3D surface: taking the major-axis length as the z-value produces a sparse but smooth surface of values wherever the neighbors are good. The condition on overlapping scales is implied in the size consistency, but the key point is the smoothness of the proximity. The gradient 96 of the attribute is evaluated to determine the smoothness of the scale attribute. Another possibility is to create sets of adjacent ellipses 98, also shown in Fig. 13D, and to locally evaluate geometric properties of these sets to verify consistency. The geometric properties that validate a group can be divided into three parts. First, consistency of the scale attributes themselves, such as similar orientation, size, and center position. Second, common attributes of their boundaries: the contours of adjoining cells should share the proximity of a common boundary without too much space or another scale (relative to the scale size) between them. Third, the exact shapes of the contours may exhibit some similarity, for example a less convex shape or a specific rounded corner on all scales, as shown in Fig. 13D. One embodiment of this step groups ellipses in a "cross" pattern or a 3 x 3 pattern and evaluates the smoothness of the ellipse attributes in different directions within each group.
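A minimal sketch of the attribute-consistency filter on links, assuming each scale contributes a single attribute value such as its major-axis length (the 50% relative-jump threshold is illustrative):

```python
def consistent_links(values, links, max_rel=0.5):
    """Attribute-consistency filter: keep a link only if the attribute
    values (e.g. major-axis length) of its two scales differ by less
    than `max_rel` of their mean, i.e. the attribute surface is smooth
    across the link."""
    keep = []
    for i, j in links:
        vi, vj = values[i], values[j]
        if abs(vi - vj) <= max_rel * (vi + vj) / 2.0:
            keep.append((i, j))
    return keep
```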
The above conditions serve as the criteria for filtering the adjacency links between the voted ellipses. Links that do not satisfy the consistency and smoothness of the attributes are removed. After this filtering, links remain only between potential scales that can form a grid. The final filtering at this stage selects the ellipses/scales that provide the greatest coverage of the skin area while exhibiting consistent attributes. This concludes the build grid map step 51.
In some cases, all the previous steps may still fail to detect some scales, usually due to weak illumination, weak incident light, or other reasons. Once the grid is built, it defines a probabilistic surface predicting where scales of a particular size and orientation are expected to occur. A missing scale corresponds to a gap in the grid, and the recalculate scale step 52 can be used to attempt to detect a scale with the expected parameters again, using a parameter set that favors its detection. Thus, if a scale with axes a, b is expected at position x, y, the scale candidate step 50 can be repeated with more forgiving parameters favoring edges along the expected scale contour.
An embodiment of the recalculate scale step 52 is described below with reference to Figs. 21A to 21C and Figs. 22A to 22C. As mentioned above, missed detections typically occur under weak illumination or weak incident light, when the edges of the scales are not visible due to weak ambient lighting, the direction of the light, the flat edges of some scales, or the dark color of the scales and the spaces between them. The recalculate scale step 52 attempts to recover the missing scales, typically by using information about the similarity of adjacent scales and the dissimilarity along the boundaries between skin regions. This information is present in the scale attributes of the newly created grid.
The first part of step 52 upgrades the grid built before this stage with information about the scale attributes. In Fig. 21A, the grid of scales is represented as a graph with nodes 401 corresponding to the centers of the scales and graph links 402 connecting the centers of scales that share a common boundary. For each link, several feature values are calculated and stored with the x, y coordinates of the link midpoint 403 and the value of the link attribute 404 as the z coordinate. This creates sparse value points in 3D space.
In Fig. 21B, three features are reflected. The first feature is the inter-scale distance 405 together with the sum 406 of radius 1 and radius 2, i.e. the two scale radii measured along the direct connection; if the scales are adjacent, the two values should be similar. The second feature is the inter-scale orientation 407, i.e. the direction of the connection rather than its length, together with the first 408 and second 409 principal-axis orientations of the two scales joined by the link; the two principal orientations 408 and 409 are measured relative to the inter-scale orientation 407. Finally, the shapes of the scales are not fully captured by the major and minor axes. The shape is therefore defined relative to the inter-scale orientation 407: treating the inter-scale orientation 407 as an ellipse axis, a measure called the ellipticity 410 describes how elliptical the scale is. One extreme of the measure, the value 0, reflects a perfect circle; the other extreme, the value 1, corresponds to a perfect square with sharp corners. Once these features are calculated, the information about adjacent scales is part of the graph and is associated with each link between the graph nodes (corresponding to the scale centers).
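The distance and orientation features of a link can be computed from two ellipse descriptions as sketched below (illustrative Python; the ellipse tuple layout and the polar-radius formula for a direction are assumptions, not taken from the text):

```python
import math

def link_features(s1, s2):
    """Features of a link between two scales, each given as an ellipse
    (x, y, a, b, theta) with semi-axes a, b and major-axis angle theta.
    Returns the inter-scale distance 405, the radius sum 406, and the
    two principal-axis orientations 408/409 relative to the link 407."""
    dx, dy = s2[0] - s1[0], s2[1] - s1[1]
    dist = math.hypot(dx, dy)
    link_angle = math.atan2(dy, dx)          # inter-scale orientation 407

    def radius_toward(s, ang):
        # polar radius of an ellipse in direction `ang` from its center
        a, b, th = s[2], s[3], s[4]
        c, si = math.cos(ang - th), math.sin(ang - th)
        return (a * b) / math.sqrt((b * c) ** 2 + (a * si) ** 2)

    radius_sum = radius_toward(s1, link_angle) + radius_toward(s2, link_angle)
    o1 = (s1[4] - link_angle) % math.pi      # first principal orientation
    o2 = (s2[4] - link_angle) % math.pi      # second principal orientation
    return dist, radius_sum, o1, o2
```

For two adjoining scales, `dist` and `radius_sum` come out nearly equal, which is exactly the consistency exploited later for gap detection.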
In Fig. 21C, each link in the graph stores information on the first feature as, for example, a pair of values: the inter-scale distance 405 and the radius sum 406 (radius 1 plus radius 2). The sparse points in 3D can be generalized into two surfaces, the inter-scale distance surface 415 and the radius sum surface 416, which approximate the sparse values of these two features. The approximation by a surface may be performed by various fitting methods. The main requirement for fitting the surface is the ability to handle regions with scales of different character (e.g. the belly region with large square scales and the side regions with small round scales). Such a boundary 411 between skin regions is shown in Fig. 22A, with two rows of larger scales from one region 412 and one row of smaller scales from another region 413. The parameterization of the fitted inter-scale distance surface 415 should allow a steeper step along such a boundary between skin regions. After fitting, such a surface represents a probability map of encountering a particular value of a given feature at a particular location of the skin. The smoothness of the surface, which integrates information from the domain of scales, represents the evolution of the features in the x, y directions.
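Any scattered-data interpolator can stand in for the fitted surface; below is a simple inverse-distance-weighted estimate (an assumed choice, since the text leaves the fitting method open):

```python
def expected_value(samples, x, y, p=2):
    """Inverse-distance-weighted estimate of a link feature at (x, y)
    from sparse (xi, yi, value) samples; a simple stand-in for the
    fitted inter-scale distance / radius-sum surfaces 415, 416."""
    num = den = 0.0
    for xi, yi, v in samples:
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 == 0.0:
            return v                      # exactly on a sample point
        w = d2 ** (-p / 2.0)
        num += w * v
        den += w
    return num / den
```

A production fit would rather use a spline or radial-basis surface that tolerates a sharp step along a region boundary; plain IDW smooths such steps.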
Fig. 22B shows a case where one scale is missing. The constructed graph then contains links crossing the missing scale position 420. The feature values in this region are inconsistent with the neighbors; for example, the sum 406 of the scale radii on both sides of the link 402 is much smaller than the inter-scale distance 405. To find such a gap, the value 421 measured at each link (e.g. the inter-scale distance 405) is compared with the expected value 422 on the surface. If the difference is inconsistent with nearby links, a gap is detected and marked as a missing scale position 420. It should be noted that some links cross each other and may provide different values at the same x, y position. For consistency, the pattern is scanned in a first direction 417 (possibly following links at an angle of less than 70 degrees to that direction), and then a second scan is performed in a second direction 418. A link with a deviating value is identified as a link 419 across the gap, which in the case of Fig. 22B is also a graph link 402.
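Gap detection from link values can be sketched as follows, flagging links whose measured distance far exceeds the radius sum (the 1.8 ratio is an assumed threshold):

```python
def gap_links(links, ratio=1.8):
    """Flag links likely to span a missing scale position 420: the
    measured inter-scale distance greatly exceeds the sum of the two
    scale radii along the link."""
    return [i for i, (dist, radius_sum) in enumerate(links)
            if dist > ratio * radius_sum]
```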
In Fig. 22C, three of the four links 419 across the gap are shown. The attributes stored in those links, compared with the expected values 422 on the surface, allow the reconstruction of the scale shape 423 of the scale normally expected in the gap, so that the uniformity of the attributes better follows the surface. The final stage of step 52 therefore follows the expected scale contour (shown here in dotted lines) 423 and analyzes the original image beneath it for evidence, e.g. edges, slight variations in contrast, or texture irregularities along the contour, which would confirm, with the higher degree of certainty required here, the presence of a scale that was not detected in the original step.
Once the grid of scales is established, either directly or via the recalculate scale step 52, the exact contour construction can begin with the establish contour step 53. The nature of reptile scales is such that there is no single obvious path along which the scale contour runs. Supply-chain operations such as creasing, flattening, painting, and tanning change the way the scales appear. Establishing a version of the scale contour is therefore a compromise between several factors: it should be calculated so as to give the most similar results across supply-chain variations of the skin, and the most reliable scale properties should therefore be selected.
The context of contour following is set out in Fig. 15A. The initial scale is detected as an ellipse or a partial contour composed of elements voting for an ellipse. A template 101 can therefore be proposed and built as an initial version of the contour. The tolerance limits 102 provide an envelope within which the resulting contour 103 will be established. These tolerance limits 102 are defined on the inside by the scale size and the surrounding grid, and on the outside by the size tolerance and the neighbors.
Contour following itself is an optimization process that may use templates in the form of points, curves defined by equations, or spline-type curves. The optimization can be done by traversing the template 101 several times and applying adjustments, or by letting all points evolve in parallel. Following the template 101, each next point step 104 is adjusted in turn. The next point location is updated by considering several criteria and reaching a compromise when a weighted combination of the criteria reaches an acceptable minimum.
A simplified set of criteria is shown in Fig. 15B. The next point 106 is updated as the optimization iterates from the previous point 105 to the next point 106. A non-exhaustive list of criteria is: smoothness of the contour, proximity between points, attraction to high gradients at scale edges, edge angle preference (focusing on the contour rather than on wrinkles), closedness of the contour, convexity, and conformance to adjacent boundaries. These criteria may be combined by a weighted combination, for example a first weight for contour smoothness 107 and a second weight for attraction to the gradient 108, or by a logical combination (if a certain scale shape occurs, a conditional weight such as proximity to adjacent boundaries may be favored, whereas if another shape is present, that weight is reduced).
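One relaxation pass of such a weighted compromise can be sketched as below (an active-contour-style update; the two criteria shown, neighbor-midpoint smoothing and an external gradient pull, are a simplification of the full criteria list):

```python
def relax_contour(points, gradient_pull, w_smooth=0.5, w_grad=0.5):
    """One pass of contour optimization: each point is pulled toward the
    midpoint of its two neighbors (contour smoothness 107) and along an
    external gradient vector (attraction to gradient 108)."""
    n = len(points)
    out = []
    for i, (x, y) in enumerate(points):
        px, py = points[(i - 1) % n]          # previous point on the loop
        qx, qy = points[(i + 1) % n]          # next point on the loop
        mx, my = (px + qx) / 2.0, (py + qy) / 2.0
        gx, gy = gradient_pull(x, y)          # external edge attraction
        out.append((x + w_smooth * (mx - x) + w_grad * gx,
                    y + w_smooth * (my - y) + w_grad * gy))
    return out
```

In practice the pass is repeated until the weighted criteria reach an acceptable minimum, with the tolerance limits 102 clamping each update.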
Figs. 16A and 16B show the joint contour adjustment in more detail. In Fig. 16A, one central scale SC and its neighbors are shown, the adjacent scales 110 being numbered N1, ..., N8. For the central scale SC, one can start from the center 111 of the contour and establish the closest curve 112, which is characterized by passing over the disks of the adjacent scales, lying outside the center of the contour, and having no other scale between the two curves. Such a curve also allows the definition of a sector center 114 and a sector neighbor 113, which delimit the sectors in which the corresponding central and adjacent scales share a boundary. The contour of the central scale and the corresponding contour of the adjacent scale can then be shifted jointly. Such a displacement of the contours should comply with the other criteria mentioned above, such as smoothness and gradient following.
In Fig. 16B, the same set of contours is represented in angle-radius (θ-ρ) space. The central contour 115 curve is shown at the bottom together with the adjacent contour 116. The joint displacement of the contours is illustrated by the shifted central contour 117 and the shifted adjacent contour 118.
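Conversion of a contour to the θ-ρ representation is straightforward; a sketch assuming the contour is given as (x, y) points around a known center:

```python
import math

def to_theta_rho(contour, center):
    """Convert contour points (x, y) into the angle-radius (theta-rho)
    representation, sorted by angle around the given scale center."""
    cx, cy = center
    return sorted((math.atan2(y - cy, x - cx) % (2.0 * math.pi),
                   math.hypot(x - cx, y - cy))
                  for x, y in contour)
```

In this space a rotation of the scale becomes a cyclic shift in θ, and a radial displacement of the contour becomes a vertical shift in ρ, which is what makes the joint adjustment and the later comparisons convenient.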
Figs. 17A and 17B show representations of a skin prior to identification. As shown in Fig. 17A, the establish contour step 53 ends with a completely built macroscopic representation of the skin. Each scale representation 119 is characterized by at least its center and a describing ellipse, but also by an exact contour, which can be characterized by a series of points or by a curve. The skin representation also contains adjacency link representations 120 that establish the relation between two adjoining scales.
A further advantage of the present invention is reflected in Fig. 17B. The contour representation can be made more efficient in vector form. Given the shapes existing on the skin and their contours, the vector representation can be expressed by vector curve nodes 121 and vector curve tangents 122 instead of by 50 to 200 points. The approximation of the contour by a curve in vector form is then limited to approximately eight nodes and their tangents, which cover the shape of the curve with sufficient accuracy. Using the vector form thus reduces the macroscopic shape representation by an order of magnitude. Fig. 17B shows two such nodes with their tangent vectors for each of the four predominantly rectangular scale boundaries.
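One way to realize a nodes-plus-tangents curve is a cubic Hermite segment; a sketch (the Hermite form is an assumed choice, since the text does not fix the curve type):

```python
def hermite(p0, t0, p1, t1, u):
    """Point on a cubic Hermite segment between nodes p0 and p1 with
    tangents t0 and t1 (vector curve nodes 121 / tangents 122),
    for parameter u in [0, 1]."""
    h00 = 2 * u ** 3 - 3 * u ** 2 + 1     # weight of p0
    h10 = u ** 3 - 2 * u ** 2 + u         # weight of t0
    h01 = -2 * u ** 3 + 3 * u ** 2        # weight of p1
    h11 = u ** 3 - u ** 2                 # weight of t1
    return tuple(h00 * a + h10 * b + h01 * c + h11 * d
                 for a, b, c, d in zip(p0, t0, p1, t1))
```

With roughly eight such nodes per contour, a chain of these segments replaces the 50 to 200 sampled points while keeping the shape to sufficient accuracy.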
The placement and selection of nodes in the vector curves used in the present invention pursues goals beyond mere compactness. First, it helps the standardized comparison of scales during identification. Second, it simplifies the indexing step of identification. Third, it helps assess certain quality-related characteristics of the skin. Fourth, it aids display. Fifth, it helps estimate the deformation of the skin. Sixth, it serves as a frame of reference for describing the location of the micro-features (see the paragraph below). The placement and selection of nodes is thus a constrained step that generates a set of vector nodes and tangents satisfying particular properties.
In addition to the macro features, each skin part has micro features such as folds and micro creases of the tissue, texture, and color. Most of these features do not persist throughout the supply chain. The main interest lies in the folds, which appear essentially everywhere after tanning and whose rich diversity allows the identification of small parts of the skin down to the size of a watch strap. The detection of such features from the image takes place during the first step of the method or in an additional step after scale detection. Each fold or crease of the surface is then represented as a curve in the coordinate system, defined by two points represented by vectors. In practice, two vector curve nodes 121 and their vector curve tangents 122 are sufficient to describe the exact position of the fold geometrically. During identification, the user does not need to find a particular region, since all regions are stored in the database; the user simply hovers over the strap until the system confirms that the fold has been acquired well, and identification then takes place. Since the folds appear only after tanning, it is advantageous to carry out the identification process from the beginning: a hatchling can be scanned and the data then updated during the life of the animal, and in particular, after its death and the treatment of the skin, the database is updated while verifying that it is the same animal.
Various applications are foreseen based on a reliable representation of the skin, but the following section focuses on the identification step 54. In short, this is the step in which a number of skin representations are stored in a database and, based on a newly presented skin or part thereof, the computer-implemented method can indicate whether the skin or part is known. No a priori knowledge about which skin, which part, the location, the acquisition distance, or the light is available.
Figs. 18A-18C show a basic comparison of small pieces of reptile skin. Fig. 18A summarizes what identification requires. A skin part 123 is presented to the system to be identified; this may be any portion of the skin as shown in Fig. 5. Since no reference information is available, several options exist. First, identification can start from any starting scale 124 and propagate in all propagation directions 125. This method requires the database to have features (discussed below) that allow identification from any point of the skins stored in the database. Second, a skin representation can be created and certain attributes (such as belly or spine orientation) detected, and identification then starts with a limited number of scales near a particular area. However, this method requires some specific features to be visible, which is not always the case, for example with a watch strap. Third, the identification may start from a location indicated by the user.
The uniquely identified patch region 126 is therefore worth mentioning: its surface allows a part of the skin to be uniquely identified. The identification process finds common parts between the skin presented to the system and the skin parts stored in the database. Macroscopic recognition means that in certain areas the shape and relative position of the scales match within the tolerances that may result from the treatments applied to the skin during the supply chain, rather than from two different skins. The performance of the system is measured by the minimum skin area required for identification.
One basic step of identification is given in Fig. 18B. Since no rotational reference is available, a basic implementation must rely on information that can serve as one. One option is to use the starting scale 124 and define certain scales as rotating reference scales 127. The selection of such scales, e.g. N2, N4, N6, N8, may be defined as the scales that share a significant portion of their boundary with the starting scale 124, which is comparable to the central scale SC mentioned above.
Fig. 18C shows a scale patch up to the first circumferential row in the θ-ρ representation. A scale patch in this form is better suited to the two required normalizations. The size of the starting scale 124 is unknown, so a scaling factor may be applied before comparing two patches; various values may be used, such as the average of the major and minor axes of the ellipse, or the scale area may be normalized to one. Since there are several possible rotating reference scales 127, the comparison needs to be made from several comparison start angles 128. In θ-ρ space, however, this corresponds to a simple offset before comparison.
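The size normalization and the angular offset can be sketched on sampled θ-ρ curves: normalize each radius sequence to unit mean (size), then take the best cyclic shift (the rotation offset standing in for the comparison start angles 128):

```python
def match_score(rho_a, rho_b):
    """Compare two equal-length sampled rho sequences: normalize each to
    unit mean (size normalization), then return the smallest summed
    squared difference over all cyclic shifts (rotation offsets)."""
    def unit_mean(r):
        m = sum(r) / len(r)
        return [v / m for v in r]
    a, b = unit_mean(rho_a), unit_mean(rho_b)
    n = len(a)
    return min(sum((a[i] - b[(i + s) % n]) ** 2 for i in range(n))
               for s in range(n))
```

A score of zero means the two patches match up to scale and rotation; in practice a tolerance threshold would account for supply-chain deformations.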
In this particular embodiment, it is necessary to start with each scale and make a comparison for each rotational reference angle.
Fig. 19A outlines one embodiment of a rotation-invariant index. Several scales are used to define several reference-direction base references 129. Since reliance on a starting direction is to be avoided, the ray cross based on those base references rotates as the rotating cross 130. For each of its rays, the location of the first contour outside the central contour is established as a point 131 on the nearest contour, and the distances D1, ..., D4 are calculated as radii 132. As shown in Fig. 19B, a symmetric function of these four values provides the rotation-invariant curve 133 shown in the lower representation of the coordinate system. The horizontal axis is a rotation-independent measure and is not necessarily associated with a physical dimension; in this case it corresponds to one quarter of the angular range, i.e. the span between the orientation from the scale center to the first neighbor and the orientation from the scale center to the second neighbor. If the scale were a perfect rectangle this span would be 90 degrees, but it will usually differ. The vertical axis corresponds to a function of the radii, e.g. the ratio between the radii of the rays in the four quadrants. For rotation invariance, such a curve corresponds to a scan of one quadrant, so 3/4 of the information is discarded. With four quadrants of 90 degrees each and a radius in each quadrant, the aim is independence from the quadrant in which the comparison starts: if the scale is rotated into any of the four fundamental orientations, the curve should be the same. Taking, for example, the ratio between the two opposing quadrant pairs 1-3 and 2-4 yields two curves of one quadrant each; such a representation halves the information and is invariant for two of the four possible rotations (not full rotational invariance).
In the present case, the ratio between the radii in quadrants 1-2-3-4 is used to obtain a curve of one quadrant in length that is constant under quarter-turn rotations. Full rotational invariance is thus obtained, at the cost of 3/4 of the angular range. Another way to look at the problem is to ask what geometric shape is rotationally invariant under the four basic angles of 0, 90, 180, and 270 degrees. The answer is something like a "square" in which all four sides may be arbitrary curves, but all four sides are identical: rotating such a figure by a quarter turn maps it onto itself. The only unique part of such a figure is one of its sides, i.e. 1/4 of its perimeter; the remaining 3/4 is a copy of that 1/4.
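Folding the four quadrants with a symmetric function can be sketched as below; summation is used here as the symmetric function (an assumed choice, the text mentions ratios as one option) so that any quarter-turn of the input leaves the output unchanged:

```python
def quadrant_fold(radii):
    """Fold a full-circle radius scan (4k samples) into one quadrant by
    a symmetric function of the four quadrant values (here: their sum),
    so that any 90-degree rotation of the input gives the same k-sample
    rotation-invariant curve (cf. curve 133)."""
    k = len(radii) // 4
    return [radii[i] + radii[i + k] + radii[i + 2 * k] + radii[i + 3 * k]
            for i in range(k)]
```

Because the sum does not depend on the order of its four arguments, rotating the scan by any whole quadrant permutes the arguments without changing the result, which is exactly the quarter-turn invariance described above.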
Such an index function is established for all the skins and their patches in the database, quantized, and used as a multi-dimensional index vector to select a subset of curves corresponding to skins worth comparing. It should be noted that the curve corresponds to the first contour closest to the central contour; second and third contour waveforms may be used as well.
Fig. 20 shows the detailed contour comparison, i.e. the progressive comparison of the skins selected in the indexing step. A comparison is performed between the compared skin representation curve 134 and a curve 135 in the database. The comparison starts progressively from the first radius range 136 and, if successful, moves to the second radius range 137 and further to the third radius range 138.
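The progressive radius-range comparison amounts to an early-exit loop; a minimal sketch with per-ring scores and a single assumed mismatch threshold:

```python
def staged_compare(rings_a, rings_b, threshold=0.2):
    """Progressive comparison by radius range: check the innermost ring
    first and abort on the first mismatch, so only likely matches pay
    for the outer rings (cf. radius ranges 136, 137, 138)."""
    for ra, rb in zip(rings_a, rings_b):
        if abs(ra - rb) > threshold:
            return False                 # early exit on this radius range
    return True
```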
The comparison is further accelerated by the fact that vector curves need not be compared point by point: the difference between two curves can be calculated in closed form from the node positions. The special mandatory node locations also facilitate this method and the comparison by radius range.
Other applications of the skin representation are possible, such as quality analysis of the scale shapes or selection of the most visually attractive segment for a watch strap.
List of reference numerals
1 first range
2 second range
3 gap between scales
4 scale edge
5 center of scale
6 shape of scale perimeter
7 large-size scale skin element
7' medium-size scale skin element
7'' smallest scale skin element
8 first scale
9 second scale
10 third scale
11 link between the centers of the scales
12 original reptile skin
13 pre-tanned skin
14 leather surface
15 scale structure
16 primary skin parts on the animal
17 animal
18 main skin part
19 registration of intact original skin
20 verification of original skin
21 tanning and wrinkles
22 verification of tanned skin
23 registration of whole skin at macro/micro scale
24 cutting
25 verification of skin part
26 verification of end product
27 egg
28 hatchling
29 young animal
30 skin
31 first stage of the skin part
32 second stage of the skin part
33 third stage of the skin part
34 area on the final product
35 end product
36 registration from egg to hatchling
37 verification from hatchling to young animal
38 verification of prepared skin
39 skin part and painting
40 object fabrication
41 end product
42 shared-edge scale
43 area on skin
44 boundary of region
45 represented region boundary
46 represented region
47 end product
48 area on the final product
49 video and image acquisition step
50 scale candidate step
51 build grid map step
52 recalculate scale step
53 establish contour step
54 identification step
55 verification step
56 edge feature step
57 voting element step
58 voting step
59 vote accumulation step
60 boundary of scale
61 first scan line
62 second scan line
63 first color curve
64 second color curve
65 local weighted maximum
66 local weighted minimum
67 first transition pair (small edge feature)
68 second transition pair (small edge feature)
69' artifact
70 weak edge
71 missing edge
72 first transition type
73 second transition type
74 third transition type
75 first edge (point pair)
76 transition pair
77 second edge (point pair)
78 descending slope
79 ascending slope
80 possible pairs
81 selected edge pair
82 vertical scan
83 secondary edge
84 ellipse center
85 ellipse major axis
86 ellipse minor axis
87 center
88 ellipse vote
89 filtered by axis
90 filtered by orientation
91 central ellipse
92 neighbor ellipse
93 shared boundary
94 no space
95 attribute value
96 gradient of attribute
97 scale grid
98 set of adjacent ellipses
101 template
102 tolerance limit
103 contour
104 next point step
105 previous point
106 next point
107 first weight: contour smoothness
108 second weight: attraction to gradient
109 third weight: angle of gradient
110 adjacent scale
111 center of contour
112 closest curve
113 sector neighbor
114 sector center
115 central contour
116 adjacent contour
117 shifted central contour
118 shifted adjacent contour
119 scale representation
120 adjacency link representation
121 vector curve node
122 vector curve tangent
123 skin part
124 starting scale
125 propagation direction
126 uniquely identified region
127 rotating reference scale
128 comparison start angle
129 base reference
130 rotating cross
131 point on the nearest contour
132 radius
133 rotation-invariant curve
134 compared curve
135 curve in the database
136 first radius range
137 second radius range
138 third radius range
139 identified region
161 first secondary scan line
162 second secondary scan line
163 third secondary scan line
164 fourth secondary scan line
176 double transition pair
180 ellipse
261 first diagonal scan line
262 second diagonal scan line
361 first vertical scan line
362 second vertical scan line
401 node
402 graph link
403 link midpoint
404 link attribute
405 inter-scale distance
406 sum of two scale radii
407 inter-scale orientation
408 first principal axis orientation
409 second principal axis orientation
410 ellipticity
411 boundary between skin regions
412 large-scale region
413 small-scale region
415 inter-scale distance surface
416 radius sum surface
417 first scan direction
418 second scan direction
419 link across the gap
420 missing scale position
421 link value
422 expected value
423 expected scale shape

Claims (7)

1. A computer-implemented method of identification of a surface comprising scales, in particular reptile skin identification, the method comprising the steps of:
-acquiring (49) at least one image of a surface portion to be identified,
-an edge feature detection step (50, 56) detecting features corresponding to the boundaries of scales in the image by scanning the acquired image along scan lines (61, 62, 161, ...) over an area assumed to comprise a plurality of scales, to acquire intensity or color curves,
-an edge identification step (57) determining scale edge candidates for one or more scales based on the acquired scan line image curves,
-a scale construction voting step (58) which determines an appropriate scale edge as part of a particular scale based on the scale edge candidates,
-a scale vote accumulation step (59) determining a dataset representative of each identified scale, said dataset optionally comprising one or more data taken from a group comprising data relating to an ellipse (180), the major (85) and minor (86) axes of said ellipse, and the central position of said ellipse,
-using the dataset of each identified scale to establish (51) a graph of the repeating pattern of scale positions of the detected scales,
-introducing a recalculate scale step (52) identifying further scales where gaps exist in the established pattern of scales,
-determining a contour (103) of each detected scale and creating (53), for each detected scale, a representative dataset comprising data of said contour (103),
-determining identification feature data (54) from a plurality of representative datasets of detected scales of the surface comprising scales, and
-storing the identifying characteristic data (54) for the surface comprising scales in a database.
2. The method of claim 1, wherein the scale construction voting step (58) is followed by:
-a scale verification step comprising checking the acquired data relating to the identified scales against predetermined scale contour attributes.
3. The method according to claim 1 or 2, wherein the step of establishing (51) the pattern of scale positions of the repeating pattern of detected scales comprises:
-determining adjacent non-overlapping scales from the identified set of scale candidates.
4. A method of tracking and tracing animal skins, in particular reptile skins, comprising the steps of: performing the method according to any one of claims 1 to 3 on an animal skin sample comprising said surface comprising scales to obtain identification feature data (54), followed by the steps of:
-comparing the acquired identification feature data (54) of the animal skin sample with previously acquired and stored sets of identification feature data of surfaces comprising scales from reptile skins, to identify within the stored identification feature data a surface portion of the animal skin sample, and
-in case the acquired identification feature data of the animal skin sample matches a stored set of identification features, updating the database with the comparison result and the updated acquired identification features (54).
5. The method of claim 4, wherein, when the same surface is scanned at different times and a match of the surface is identified, the acquired identification feature data (54) is stored as updated identification feature data (54).
6. The method according to claim 4 or 5, wherein, when the same surface is scanned at different times, other surface portions of the same surface are scanned to obtain identification feature data (54) of those other surface portions, and, when a match of the same surface is identified, this newly obtained identification feature data (54) is stored as a separate dataset of updated surface portion identification feature data.
7. A computer system comprising a processor and computer storage comprising a computer program product adapted to perform the method steps of any one of claims 1 to 6, the system further comprising a camera adapted to acquire an image of a surface to be identified, and further comprising a computer memory storing the acquired identification features in a database.
CN202180010672.9A 2020-01-31 2021-02-01 Computer-implemented method and system including coat identification of scales Pending CN115004258A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
EP201550423 2020-01-31
EP15050423 2020-01-31
PCT/EP2021/052337 WO2021152182A1 (en) 2020-01-31 2021-02-01 A computer implemented method and system of skin identification comprising scales

Publications (1)

Publication Number Publication Date
CN115004258A true CN115004258A (en) 2022-09-02

Family

ID=83023446

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180010672.9A Pending CN115004258A (en) 2020-01-31 2021-02-01 Computer-implemented method and system including coat identification of scales

Country Status (1)

Country Link
CN (1) CN115004258A (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170116744A1 (en) * 2015-10-23 2017-04-27 International Business Machines Corporation Imaging segmentation using multi-scale machine learning approach
CN108604293A (en) * 2016-01-30 2018-09-28 三星电子株式会社 The device and method for improving picture quality
CN109313802A (en) * 2016-06-03 2019-02-05 皇家飞利浦有限公司 Biological object detection

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
FUMIAKI TOMITA ET AL: "Description of Textures by a Structural Analysis", IEEE, 1 March 1982 (1982-03-01), pages 183 - 191 *
SCHAFFALITZKY F. ET AL: "Geometric Grouping of Repeated Elements within Images", PROCEEDINGS OF THE NINTH BRITISH MACHINE VISION CONFERENCE UNIV, 1 January 1998 (1998-01-01), pages 1 - 10 *
YIP R KK ET AL: "MODIFICATION OF HOUGH TRANSFORM FOR CIRCLES AND ELLIPSES DETECTION USING A 2-DIMENSIONAL ARRAY", ELSEVIER, 1 September 1992 (1992-09-01), pages 1007 - 1022, XP000330366, DOI: 10.1016/0031-3203(92)90064-P *
YANG JIANZHONG, WANG RONGWU: "Image Processing and Recognition of the Surface Morphology of Cashmere and Wool Fibers" (羊绒与羊毛纤维表面形态的图像处理与识别), 毛纺科技 (Wool Textile Journal), no. 05, 30 October 2002 (2002-10-30), pages 12 - 15 *

Similar Documents

Publication Publication Date Title
US11645875B2 (en) Multispectral anomaly detection
US20220215686A1 (en) Systems and methods for performing fingerprint based user authentication using imagery captured using mobile devices
US9361507B1 (en) Systems and methods for performing fingerprint based user authentication using imagery captured using mobile devices
US9076048B2 (en) Biometric identification, authentication and verification using near-infrared structured illumination combined with 3D imaging of the human ear
CN102844766B (en) Human eyes images based multi-feature fusion identification method
Deng et al. Retinal fundus image registration via vascular structure graph matching
Haindl et al. Unsupervised detection of non-iris occlusions
WO2016086341A1 (en) System and method for acquiring multimodal biometric information
CN105654035B (en) Three-dimensional face identification method and the data processing equipment for applying it
Labati et al. Touchless fingerprint biometrics
Agarwal et al. Enhanced binary hexagonal extrema pattern (EBH X EP) descriptor for iris liveness detection
Polewski et al. A voting-based statistical cylinder detection framework applied to fallen tree mapping in terrestrial laser scanning point clouds
Johar et al. Iris segmentation and normalization using Daugman’s rubber sheet model
Bača et al. Basic principles and trends in hand geometry and hand shape biometrics
CN110555348A (en) Fingerprint identification method and device and computer readable storage medium
Barra et al. Unconstrained ear processing: What is possible and what must be done
US20230351722A1 (en) Computer Implemented Method and System of Skin Identification Comprising Scales
CN115004258A (en) Computer-implemented method and system including coat identification of scales
Gunasekaran et al. Hierarchical convolutional neural network based iris segmentation and recognition system for biometric authentication
Cadoni et al. Large scale face identification by combined iconic features and 3d joint invariant signatures
Shaikh et al. An adaptive central force optimization (ACFO) and feed forward back propagation neural network (FFBNN) based iris recognition system
Dong et al. Synthesis of multi-view 3D fingerprints to advance contactless fingerprint identification
Malik et al. An efficient retinal vessels biometric recognition system by using multi-scale local binary pattern descriptor
Mohammed et al. Conceptual analysis of Iris Recognition Systems
Poosarala Uniform classifier for biometric ear and retina authentication using smartphone application

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination