WO2018094532A1 - Visual pattern recognition system, method and/or computer-readable medium - Google Patents


Info

Publication number
WO2018094532A1
WO2018094532A1 (PCT/CA2017/051416)
Authority
WO
Grant status
Application
Prior art keywords
pattern
features
ratio
set
step
Prior art date
Application number
PCT/CA2017/051416
Other languages
French (fr)
Inventor
Arash Abadpour
Original Assignee
Fio Corporation
Priority date
Filing date
Publication date

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06K RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 9/00 Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K 9/62 Methods or arrangements for recognition using electronic means
    • G06K 9/6201 Matching; Proximity measures
    • G06K 9/6202 Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • G06K 9/6203 Shifting or otherwise transforming the patterns to accommodate for positional errors
    • G06K 9/6211 Matching configurations of points or features, e.g. constellation matching
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06K RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 9/00 Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K 9/36 Image preprocessing, i.e. processing the image information without deciding about the identity of the image
    • G06K 9/46 Extraction of features or characteristics of the image
    • G06K 9/4671 Extracting features based on salient regional features, e.g. Scale Invariant Feature Transform [SIFT] keypoints

Abstract

The present invention is directed to a system, method and/or computer readable medium for visual pattern recognition using a binary operator. Patterns are recognized by their overlap with identified distinctive and/or prominent regions found in a pattern library generated through analysis of multiple samples of reference patterns.

Description

VISUAL PATTERN RECOGNITION SYSTEM, METHOD AND/OR COMPUTER-READABLE MEDIUM

RELATED APPLICATIONS

[0001] The present application claims the benefit of the earlier-filed United States Provisional Patent Application No. 62/426,515, filed on November 26, 2016.

FIELD OF THE INVENTION

[0002] The present invention relates generally to the field of pattern recognition, and more particularly to a system, method and/or computer readable medium for visual pattern recognition using a binary operator.

BACKGROUND OF THE INVENTION

[0003] In the field of pattern recognition, the ability to accurately recognize the appearance of specific feature sets may be used to convey information. For example, in the medical diagnostics industry, it may be desirable to provide for the recognition of rapid diagnostic tests ("RDTs") based on their appearance. The correct identification of distinctive feature sets, preferably with a high degree of accuracy and/or with high sensitivity and/or specificity, may be desirable as it may facilitate, among other things, the diagnosis of a disease state, the presence or absence of a biomarker, the presence or absence of environmental agents and/or other distinctive feature sets as desired (e.g., road signs, logos, hazard signs, etc.).

[0004] As may be appreciated by persons having ordinary skill in the art, some of the challenges of accurate pattern recognition may arise from the features comprising an image being highly variable, perhaps due to variability in manufacturing tolerance (e.g., variability in the manufacturing of an RDT cassette, which may affect its position during image capture), lighting (e.g., variability in ambient lighting during image capture may affect image contrast), feature state, and/or high degrees of similarity between features of unique patterns. Similarity between unique patterns may have been particularly problematic in prior art RDT image recognition, as only a subset of features may be indicative of a unique RDT. For example, in some instances, it may be critical and/or preferable to be able to detect an RDT based solely on those unique features.

[0005] Template matching may be a well-established fundamental approach to localize objects within an image [see, for example, W.K. Pratt, "Digital Image Processing, 3rd Ed.", John Wiley & Sons, Inc., New York, 2001, pp. 613-625]. As may be appreciated by persons having ordinary skill in the art, template matching may have been used, more or less extensively, in computer vision applications, such as facial recognition, medical image processing, and/or image registration. Perhaps in its simplest form, template matching may be performed by taking a sub-image and sliding it across an entire image, preferably while calculating one or more types of scoring functions (e.g., absolute difference, cross-correlation, etc.). The areas of the image that return the highest score(s) may be considered as possible matches.
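By way of illustrative, non-limiting example, the sliding-window procedure described above may be sketched as follows. The sketch uses a sum-of-squared-differences score (one of the absolute-difference-style scoring functions contemplated above; under this score lower is better, so the minimum rather than the maximum is taken), and the function and variable names are illustrative assumptions only:

```python
import numpy as np

def template_match(image, template):
    """Slide `template` across `image`, scoring each position by the
    sum of squared differences (SSD); lower scores indicate better
    matches.  Returns the (row, col) of the best top-left corner."""
    ih, iw = image.shape
    th, tw = template.shape
    best_score, best_pos = np.inf, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            window = image[r:r + th, c:c + tw]
            score = np.sum((window - template) ** 2)
            if score < best_score:
                best_score, best_pos = score, (r, c)
    return best_pos

# Embed a known template in a larger image and recover its location.
image = np.zeros((8, 8))
template = np.array([[1.0, 2.0], [3.0, 4.0]])
image[3:5, 5:7] = template
print(template_match(image, template))  # → (3, 5)
```

In practice a cross-correlation score computed in the frequency domain, or a library routine, would replace the explicit double loop; the sketch is only meant to make the scoring-and-sliding idea concrete.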

[0006] In practice, persons having ordinary skill in the art may appreciate that image features may possess one or more complicating factors which may impact performance, possibly including one or more of the following: noise (e.g., a random variation of brightness or colour information in images); affine transformations (e.g., translation, rotation); lighting differences (e.g., contrast); feature variability; and/or other distortions.

[0007] Prior art solutions may have previously failed to consider using approaches for image registration in conjunction with the decomposition of feature sets to address pattern similarity when matching feature sets of similar patterns. These approaches may have extracted and/or matched specific sets of features in (and may be invariant to) surrounding regions. Popular approaches in the prior art, such as the scale-invariant feature transform ("SIFT") algorithm [see, for example, D. Lowe, "Method and apparatus for identifying scale invariant features in an image and use of same for locating an object in an image", U.S. Patent No. 6,711,293] or the histogram of oriented gradients ("HOG") technique [see, for example, N. Dalal, "Histograms of Oriented Gradients for Human Detection", Computer Vision and Pattern Recognition, 2005. CVPR 2005. IEEE Computer Society Conference, June 2005], may have extracted feature descriptors from an image, which then may have been compared to a set of known descriptors. One of the limitations of this prior art approach may have been that, when comparing extracted features to those of a known feature set, it was unknown which features were unique or common to the known feature set.

[0008] Even in view of the above prior art approaches, persons having ordinary skill in the art may have previously failed to recognize patterns by their overlap with identified distinctive and/or prominent regions found in a pattern library generated through analysis of multiple samples of reference patterns, though it may be desirable to do so. Such identified patterns may include only those features which are desired for matching, preferably allowing and/or facilitating selective exclusion of areas with high variability, and/or preferably to provide selectable regional invariance, prominence, and distinctiveness when compared to other patterns in the library.

[0009] As a result, there may be a need for, or it may be desirable to provide, one or more systems, methods, computer readable media, and/or cooperating environments that overcome one or more of the limitations associated with the prior art. It may be advantageous to provide a system, method and/or computer readable medium that preferably facilitates visual pattern recognition and/or enables determinations based on the pattern. There may also be some advantage to providing a method, system and/or computer readable medium that preferably provides for visual pattern recognition with a high degree of accuracy and/or with high sensitivity and/or specificity.

[0010] It may be an object of one preferred embodiment according to the invention to compare a feature of an image with a reference feature.

[0011] It may be an object of one preferred embodiment according to the invention to identify prominent and/or distinctive features of an image.

[0012] It may be an object of one preferred embodiment according to the invention to identify certain features of a reference pattern to be stored in a database.

[0013] It may be an object of one preferred embodiment according to the invention to identify prominent and/or distinctive features of a reference pattern in an image.

[0014] It may be an object of one preferred embodiment according to the invention to determine the quality of a match with a reference image based on a subset of features.

[0015] It may be an object of the present invention to obviate or mitigate one or more disadvantages and/or shortcomings associated with the prior art, to meet or provide for one or more needs and/or advantages, and/or to achieve one or more objects of the invention— one or more of which may preferably be readily appreciable by and/or suggested to those skilled in the art in view of the teachings and/or disclosures hereof.

SUMMARY OF THE INVENTION

[0016] The present disclosure provides a system, method and/or computer-readable medium for visual pattern recognition. More specifically, embodiments of the present invention are directed to a system, method, and/or non-transitory computer-readable medium for matching a first pattern against a second pattern. The system, method, and/or non-transitory computer-readable medium includes a first pattern and a second pattern. Feature detection may be conducted by one or more processors and includes: (i) generating a set of first features associated with the first pattern, the first features comprising first feature locations and first feature descriptors; and (ii) generating a set of second features associated with the second pattern, the second features comprising second feature locations and second feature descriptors. Pattern registration may be conducted by the one or more processors and includes: (1) matching the set of first features with the set of second features to generate a set of matching points; and (2) determining, based on the set of matching points, a match ratio, a localization ratio, and registration data comprising a rotation angle and a translation vector. Pattern comparison may be conducted by the one or more processors and includes: (1) decomposing the second pattern into a prominent component of the second pattern and a distinct component of the second pattern; (2) applying the registration data to the first pattern to generate a registered first pattern; (3) determining a prominence ratio based on the registered first pattern and the prominent component of the second pattern; and (4) determining a distinction ratio based on the registered first pattern and the distinct component of the second pattern. An evaluation comprises a comparison of the match ratio, the localization ratio, the prominence ratio and the distinction ratio with a predetermined match ratio, a predetermined localization ratio, a predetermined prominence ratio and a predetermined distinction ratio. Thus, according to the invention, the system, method and/or non-transitory computer readable medium matches the first pattern and the second pattern if each of the match, localization, prominence and distinction ratios exceeds the corresponding predetermined ratio.
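By way of illustrative, non-limiting example, the ratio-based evaluation described above may be sketched as follows for binary patterns. The overlap-ratio formula used here for the prominence and distinction ratios is an illustrative assumption (the summary above does not fix a particular formula), as are all function and variable names:

```python
import numpy as np

def overlap_ratio(registered_pattern, component):
    """Fraction of the component's pixels that the registered binary
    pattern covers; used here for both the prominence ratio and the
    distinction ratio."""
    covered = np.logical_and(registered_pattern, component).sum()
    total = component.sum()
    return covered / total if total else 0.0

def patterns_match(ratios, thresholds):
    """Declare a match only if every ratio (match, localization,
    prominence, distinction) exceeds its predetermined counterpart."""
    return all(r > t for r, t in zip(ratios, thresholds))

registered = np.array([[1, 1, 0], [0, 1, 0]], dtype=bool)  # registered first pattern
prominent = np.array([[1, 1, 0], [0, 0, 0]], dtype=bool)   # prominent component
distinct = np.array([[0, 0, 1], [0, 1, 0]], dtype=bool)    # distinct component

prominence_ratio = overlap_ratio(registered, prominent)   # 1.0: both pixels covered
distinction_ratio = overlap_ratio(registered, distinct)   # 0.5: one of two pixels covered
print(patterns_match((0.9, 0.8, prominence_ratio, distinction_ratio),
                     (0.5, 0.5, 0.5, 0.4)))               # → True
```

Because all four ratios must exceed their thresholds, a single failing ratio (for example, a low distinction ratio when two library patterns share most of their features) is sufficient to reject the match.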

[0017] According to an aspect of one preferred embodiment of the invention, the determination of the prominent component of the second pattern includes: providing a plurality of sample patterns associated with a predetermined reference sample pattern; registering each sample pattern with the predetermined reference sample pattern to generate transformed sample patterns comprising transformed prominent features; adding the transformed prominent features for each transformed sample pattern to a prominent features set; and determining the prominence of each prominent feature in the prominent features set and selecting the prominent features with prominence exceeding a predetermined prominence threshold.
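By way of illustrative, non-limiting example, the selection of prominent features described above may be sketched as follows for binary patterns, under the illustrative assumption that the prominence of a feature (here, a pixel) is the fraction of registered sample patterns in which it appears:

```python
import numpy as np

def prominent_component(aligned_samples, prominence_threshold):
    """Score each pixel by how often it is set across sample patterns
    already registered to the reference, and keep the pixels whose
    frequency exceeds the predetermined prominence threshold."""
    stack = np.stack(aligned_samples).astype(float)
    prominence = stack.mean(axis=0)  # fraction of samples containing each pixel
    return prominence > prominence_threshold

# Three registered binary samples of the same reference pattern.
samples = [np.array([[1, 1, 0], [0, 1, 0]]),
           np.array([[1, 1, 0], [0, 0, 0]]),
           np.array([[1, 0, 0], [0, 1, 0]])]
print(prominent_component(samples, 0.6).astype(int))
# → [[1 1 0]
#    [0 1 0]]
```

Pixels that appear only sporadically across the samples (for example, due to manufacturing or lighting variability) fall below the threshold and are excluded from the prominent component.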

[0018] According to an aspect of one preferred embodiment of the invention, the determination of the distinct component of the second pattern includes: providing one or more predetermined reference sample patterns; providing a plurality of sample patterns for the one or more predetermined reference sample patterns; registering each sample pattern with the respective predetermined reference sample pattern to generate transformed sample patterns comprising transformed distinct features; adding the transformed distinct features to a distinctive features set; and determining the distinctiveness of each feature in the distinctive features set and selecting the distinct features with distinctiveness below a predetermined distinctive threshold which are also in the one or more predetermined reference sample patterns.
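By way of illustrative, non-limiting example, the selection of distinct features described above may be sketched as follows for binary patterns, under the illustrative assumption that the distinctiveness score of a feature (here, a pixel) is its frequency across the other reference patterns in the library, so that a low score indicates a feature largely unique to the target reference:

```python
import numpy as np

def distinct_component(target, other_references, distinctive_threshold):
    """Keep the pixels of the target reference pattern that occur in
    fewer than `distinctive_threshold` (as a fraction) of the other
    reference patterns in the library."""
    frequency = np.stack(other_references).astype(float).mean(axis=0)
    return np.logical_and(target.astype(bool), frequency < distinctive_threshold)

# A target reference compared against two other library references.
target = np.array([[1, 1], [1, 0]])
others = [np.array([[1, 0], [0, 0]]),
          np.array([[1, 0], [1, 0]])]
print(distinct_component(target, others, 0.6).astype(int))
# → [[0 1]
#    [1 0]]
```

The top-left pixel is excluded because every other reference also contains it; matching against the remaining pixels addresses the pattern-similarity problem discussed in the background above.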

[0019] According to an aspect of one preferred embodiment of the invention, the second pattern is a reference pattern stored in a database.

[0020] According to an aspect of one preferred embodiment of the invention, the first pattern and the second pattern are binary.

[0021] According to an aspect of one preferred embodiment of the invention, the registration step is two-dimensional.

[0022] According to an aspect of one preferred embodiment of the invention, the registration step comprises random sample consensus (RANSAC).
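By way of illustrative, non-limiting example, a RANSAC loop for a two-dimensional registration producing a rotation angle and a translation vector (as recited in the summary above) may be sketched as follows. The closed-form rigid fit uses the Kabsch/Procrustes method, which is an illustrative choice rather than a requirement of the invention, and all names are assumptions:

```python
import numpy as np

def estimate_rigid(src, dst):
    """Closed-form 2-D rigid fit (rotation angle + translation vector)
    of paired points via the Kabsch/Procrustes method: dst ~ R @ src + t."""
    sc, dc = src.mean(axis=0), dst.mean(axis=0)
    H = (src - sc).T @ (dst - dc)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    angle = np.arctan2(R[1, 0], R[0, 0])
    t = dc - R @ sc
    return angle, t

def ransac_rigid(src, dst, iters=200, tol=1.0, seed=0):
    """RANSAC: fit minimal samples (two point pairs), keep the model
    with the most inliers, then refit on the best model's inliers."""
    rng = np.random.default_rng(seed)
    best_inliers = None
    for _ in range(iters):
        idx = rng.choice(len(src), size=2, replace=False)
        angle, t = estimate_rigid(src[idx], dst[idx])
        R = np.array([[np.cos(angle), -np.sin(angle)],
                      [np.sin(angle),  np.cos(angle)]])
        residuals = np.linalg.norm(src @ R.T + t - dst, axis=1)
        inliers = residuals < tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return estimate_rigid(src[best_inliers], dst[best_inliers])

# Matched feature points related by a 30-degree rotation and a
# translation, with one deliberately corrupted correspondence.
src = np.array([[0.0, 0.0], [2.0, 0.0], [0.0, 2.0], [2.0, 2.0], [1.0, 3.0]])
theta = np.pi / 6
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
dst = src @ R_true.T + np.array([5.0, -1.0])
dst[4] = [100.0, 100.0]               # an outlier (a mismatched feature pair)
angle, t = ransac_rigid(src, dst)     # recovers angle ~ pi/6 and t ~ (5, -1)
```

The minimal-sample-and-inlier-count structure is what makes the registration robust to the mismatched feature correspondences that the feature-matching step inevitably produces.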

[0023] Other advantages, features and characteristics of the present invention, as well as methods of operation and functions of the related elements of the system, method and computer readable medium, and the combination of steps, parts and economies of manufacture, will become more apparent upon consideration of the following detailed description and the appended claims with reference to the accompanying drawings, the latter of which are briefly described herein below.

BRIEF DESCRIPTION OF THE DRAWINGS

[0024] The novel features which are believed to be characteristic of the system, device and methods according to the present invention, as to their structure, organization, use, and method of operation, together with further objectives and advantages thereof, may be better understood from the following drawings in which presently preferred embodiments of the invention may now be illustrated by way of example. It is expressly understood, however, that the drawings are for the purpose of illustration and description only, and are not intended as a definition of the limits of the invention. In the accompanying drawings:

[0025] FIG. 1 is an illustration of labels printed on RDTs;

[0026] FIG. 2 is a flow chart of an embodiment of the present invention;

[0027] FIGS. 3A-F are illustrations of an embodiment of the present invention;

[0028] FIGS. 4A-B are illustrations of a further embodiment of the present invention;

[0029] FIGS. 5A-C are illustrations of various embodiments of the present invention;

[0030] FIGS. 6A-C are illustrations of yet another embodiment of the present invention;

[0031] FIG. 7 is an illustration of a further embodiment of the present invention;

[0032] FIGS. 8A-C are illustrations of yet another embodiment of the present invention;

[0033] FIGS. 9A-B are illustrations of a further embodiment of the present invention;

[0034] FIG. 10 is an illustration of yet another embodiment of the present invention;

[0035] FIG. 11 is an illustration of yet another embodiment of the present invention;

[0036] FIG. 12 is an illustration of a further embodiment of the present invention;

[0037] FIGS. 13A-B are a summary of various embodiments of the present invention;

[0038] FIG. 14 is an illustration of other visual patterns;

[0039] FIG. 15 is a flow chart of an embodiment of the present invention;

[0040] FIG. 16 is a flow chart of a further embodiment of the present invention;

[0041] FIG. 17 is a flow chart of yet another embodiment of the present invention;

[0042] FIG. 18 is a flow chart of a further embodiment of the present invention;

[0043] FIG. 19 is a flow chart of a further embodiment of the present invention;

[0044] FIG. 20 is an illustration of yet another embodiment of the present invention;

[0045] FIG. 21 is an illustration of a further embodiment of the present invention;

[0046] FIG. 22 is an illustration of yet another embodiment of the present invention;

[0047] FIG. 23 is an illustration of yet another embodiment of the present invention; and

[0048] FIG. 24 is a flow chart of an embodiment of the present invention.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

[0049] The description that follows, and the embodiments described therein, may be provided by way of illustration of an example, or examples, of particular embodiments of the principles of the present invention. These examples are provided for the purposes of explanation, and not of limitation, of those principles and of the invention. In the description, like parts are marked throughout the specification and the drawings with the same respective reference numerals. The drawings are not necessarily to scale and in some instances proportions may have been exaggerated in order to more clearly depict certain embodiments and features of the invention.

[0050] The present disclosure may be described herein with reference to system architecture, block diagrams and flowchart illustrations of methods, and computer program products according to various aspects of the present disclosure. It may be understood that each functional block of the block diagrams and the flowchart illustrations, and combinations of functional blocks in the block diagrams and flowchart illustrations, respectively, can be implemented by computer program instructions.

[0051] These computer program instructions may be loaded onto a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions that execute on the computer or other programmable data processing apparatus create means for implementing the functions specified in the flowchart block or blocks. These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart block or blocks. The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart block or blocks.

[0052] Accordingly, functional blocks of the block diagrams and flow diagram illustrations support combinations of means for performing the specified functions, combinations of steps for performing the specified functions, and program instruction means for performing the specified functions. It may also be understood that each functional block of the block diagrams and flowchart illustrations, and combinations of functional blocks in the block diagrams and flowchart illustrations, can be implemented by either special purpose hardware-based computer systems which perform the specified functions or steps, or suitable combinations of special purpose hardware and computer instructions.

[0053] The present disclosure may now be described in terms of an exemplary system in which the present disclosure, in various embodiments, would be implemented. This may be for convenience only and may not be intended to limit the application of the present disclosure. It may be apparent to one skilled in the relevant art(s) how to implement the present disclosure in alternative embodiments.

[0054] In this disclosure, a number of terms and abbreviations may be used. The following definitions and descriptions of such terms and abbreviations are provided in greater detail.

[0055] As used herein, a person skilled in the relevant art may generally understand the term "comprising" to generally mean the presence of the stated features, integers, steps, or components as referred to in the claims, but that it does not preclude the presence or addition of one or more other features, integers, steps, components or groups thereof.

[0056] It should also be appreciated that the present invention can be implemented in numerous ways, including as a method, a system, or a computer readable medium such as a computer readable storage medium or a computer network wherein program instructions are sent over a network (e.g. optical or electronic communication links). In this specification, these implementations, or any other form that the invention may take, may be referred to as processes. In general, the order of the steps of the disclosed processes may be altered within the scope of the invention.

[0057] Preferred embodiments of the present invention can be implemented in numerous configurations depending on implementation choices based upon the principles described herein. Various specific aspects are disclosed, which are illustrative embodiments not to be construed as limiting the scope of the disclosure. Although the present specification describes components and functions implemented in the embodiments with reference to standards and protocols known to a person skilled in the art, the present disclosures as well as the embodiments of the present invention are not limited to any specific standard or protocol. Each of the standards for non-mobile and mobile computing, including the Internet and other forms of computer network transmission (e.g., TCP/IP, UDP/IP, HTML, and HTTP) represent examples of the state of the art. Such standards are periodically superseded by faster or more efficient equivalents having essentially the same functions. Accordingly, replacement standards and protocols having the same functions are considered equivalents.

[0058] As those of ordinary skill in the art would generally understand, the Internet is a global computer network which comprises a vast number of computers and computer networks which are interconnected through communication links. A person skilled in the relevant art may understand that an electronic communications network of the present invention may include, but is not limited to, one or more of the following: a local area network, a wide area network, peer to peer communication, an intranet, or the Internet. The interconnected computers exchange information using various services, including, but not limited to, electronic mail, Gopher, web services, application programming interfaces (APIs), and the File Transfer Protocol (FTP). This network allows a server computer system (a Web server) to send graphical Web pages of information to a remote client computer system. The remote client computer system can then display the Web pages via its web browser. Each Web page (or link) of the "world wide web" ("WWW") is uniquely identifiable by a Uniform Resource Locator (URL). To view a specific Web page, a client computer system specifies the URL for that Web page in a request (e.g., a Hypertext Transfer Protocol ("HTTP") request). The request is forwarded to the Web server that supports the Web page. When the Web server receives the request, it sends the Web page to the client computer system. When the client computer system receives the Web page, it typically displays the Web page using a browser. A web browser or a browser is a special-purpose application program that effects the requesting of web pages and the displaying of web pages and the use of web-based applications. Commercially available browsers include Microsoft Internet Explorer, Firefox, and Google Chrome, among others. It may be understood that with embodiments of the present invention, any browser would be suitable.

[0059] Web pages are typically defined using HTML. HTML provides a standard set of tags that define how a Web page is to be displayed. When a provider indicates to the browser to display a Web page, the browser sends a request to the server computer system to transfer to the client computer system an HTML document that defines the Web page. When the requested HTML document is received by the client computer system, the browser displays the Web page as defined by the HTML document. The HTML document contains various tags that control the displaying of text, graphics, controls, and other features. The HTML document may contain URLs of other Web pages available on that server computer system or other server computer systems.

[0060] A person skilled in the relevant art may generally understand a web-based application refers to any program that is accessed over a network connection using HTTP, rather than existing within a device's memory. Web-based applications often run inside a web browser or web portal. Web-based applications also may be client-based, where a small part of the program is downloaded to a user's desktop, but processing is done over the Internet on an external server. Web-based applications may also be dedicated programs installed on an internet-ready device, such as a smart phone or tablet. A person skilled in the relevant art may understand that a web site may also act as a web portal. A web portal may be a web site that provides a variety of services to users via a collection of web sites or web based applications. A portal is most often one specially designed site or application that brings information together from diverse sources in a uniform way. Usually, each information source gets its dedicated area on the page for displaying information (a portlet); often, the user can configure which ones to display. Portals typically provide an opportunity for users to input information into a system. Variants of portals include "dashboards". The extent to which content is displayed in a "uniform way" may depend on the intended user and the intended purpose, as well as the diversity of the content. Very often design emphasis is on a certain "metaphor" for configuring and customizing the presentation of the content and the chosen implementation framework and/or code libraries. In addition, the role of the user in an organization may determine which content can be added to the portal or deleted from the portal configuration.

[0061] It may be generally understood by a person skilled in the relevant art that the term "mobile device" or "portable device" refers to any portable electronic device that can be used to access a computer network such as, for example, the internet. Typically a portable electronic device comprises a display screen, at least one input/output device, a processor, memory, a power module and a tactile man-machine interface, as well as other components common to the portable electronic devices that individuals or members carry with them on a daily basis. Examples of portable devices suitable for use with the present invention include, but are not limited to, smart phones, cell phones, wireless data/email devices, tablets, PDAs, MP3 players, test devices, etc.

[0062] It may be generally understood by a person skilled in the relevant art that the term "network ready device" or "internet ready device" refers to devices that are capable of connecting to and accessing a computer network, such as, for example, the Internet, including but not limited to an IoT device. A network ready device may access the computer network through well-known methods, including, for example, a web-browser. Examples of internet-ready devices include, but are not limited to, mobile devices (including smart-phones, tablets, PDAs, etc.), gaming consoles, and smart-TVs. It may be understood by a person skilled in the relevant art that embodiments of the present invention may be expanded to include applications for use on a network ready device (e.g. a cellphone). In a preferred embodiment, the network ready device version of the applicable software may have a similar look and feel as a browser version, but may be optimized to the device. It may be understood that other "smart" devices (devices that are capable of connecting to and accessing a computer network, such as, for example, the internet), such as medical or test devices, including but not limited to smart blood pressure monitors, smart glucometers, IoT devices, etc., may also be used with embodiments of the present invention.

[0063] It may be further generally understood by a person skilled in the relevant art that the term "downloading" refers to receiving datum or data at a local system (e.g. a mobile device) from a remote system (e.g. a client), or to initiating such a datum or data transfer. Examples of remote systems or clients from which a download might be performed include, but are not limited to, web servers, FTP servers, email servers, or other similar systems. A download can mean either any file that may be offered for downloading or that has been downloaded, or the process of receiving such a file. A person skilled in the relevant art may understand that the inverse operation, namely the sending of data from a local system (e.g. a mobile device) to a remote system (e.g. a database), may be referred to as "uploading". The data and/or information used according to the present invention may be updated constantly, hourly, daily, weekly, monthly, yearly, etc. depending on the type of data and/or the level of importance inherent in, and/or assigned to, each type of data. Some of the data may preferably be downloaded from the Internet, by satellite networks or other wired or wireless networks.

[0064] Elements of the present invention may be implemented with computer systems which are well known in the art. Generally speaking, computers include a central processor, system memory, and a system bus that couples various system components including the system memory to the central processor. A system bus may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. The structure of a system memory may be well known to those skilled in the art and may include a basic input/output system ("BIOS") stored in a read only memory ("ROM") and one or more program modules such as operating systems, application programs and program data stored in random access memory ("RAM"). Computers may also include a variety of interface units and drives for reading and writing data. A user of the system can interact with the computer using a variety of input devices, all of which are known to a person skilled in the relevant art.

[0065] One skilled in the relevant art would appreciate that the device connections mentioned herein are for illustration purposes only and that any number of possible configurations and selection of peripheral devices could be coupled to the computer system.

[0066] Computers can operate in a networked environment using logical connections to one or more remote computers or other devices, such as a server, a router, a network personal computer, a peer device or other common network node, a wireless telephone or wireless personal digital assistant. The computer of the present invention may include a network interface that couples the system bus to a local area network ("LAN"). Networking environments are commonplace in offices, enterprise-wide computer networks and home computer systems. A wide area network ("WAN"), such as the Internet, can also be accessed by the computer or mobile device.

[0067] It may be appreciated that the types of connections contemplated herein are exemplary and other ways of establishing a communications link between computers may be used in accordance with the present invention, including, for example, mobile devices and networks. The existence of any of various well-known protocols, such as TCP/IP, Frame Relay, Ethernet, FTP, HTTP and the like, may be presumed, and the computer can be operated in a client-server configuration to permit a user to retrieve and send data to and from a web-based server. Furthermore, any of various conventional web browsers can be used to display and manipulate data in association with a web-based application.

[0068] The operation of the network ready device (i.e., a mobile device) may be controlled by a variety of different program modules, engines, etc. Examples of program modules are routines, algorithms, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. It may be understood that the present invention may also be practiced with other computer system configurations, including multiprocessor systems, microprocessor-based or programmable consumer electronics, network PCs, personal computers, minicomputers, mainframe computers, and the like. Furthermore, the invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.

[0069] Embodiments of the present invention can be implemented by a software program for processing data through a computer system. It may be understood by a person skilled in the relevant art that the computer system can be a personal computer, mobile device, notebook computer, server computer, mainframe, networked computer (e.g., router), workstation, and the like. In one embodiment, the computer system includes a processor coupled to a bus and memory storage coupled to the bus. The memory storage can be volatile or non-volatile (i.e. transitory or non-transitory) and can include removable storage media. The computer can also include a display, provision for data input and output, etc. as may be understood by a person skilled in the relevant art.

[0070] Some portions of the detailed descriptions that follow are presented in terms of procedures, steps, logic blocks, processing, and other symbolic representations of operations on data bits that can be performed on computer memory. These descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. A procedure, computer executed step, logic block, process, etc. is here, and generally, conceived to be a self-consistent sequence of operations or instructions leading to a desired result. The operations are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated in a computer system. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers or the like.

[0071] FIG. 22 illustrates a more detailed diagram of an example computing device 800 within which a set of instructions, for causing the computing device to perform any one or more of the methods discussed herein, may be executed. The computing device 800 may include additional or different components, some of which may be optional and not necessary to provide aspects of the present disclosure. The computing device may be connected to other computing devices in a LAN, an intranet, an extranet, or the Internet. The computing device 800 may operate in the capacity of a server or a client computing device in a client-server network environment, or as a peer computing device in a peer-to-peer (or distributed) network environment. The computing device 800 may be provided by a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, or any computing device 800 capable of executing a set of instructions (sequential or otherwise) that specify operations to be performed by that computing device 800. Further, while only a single computing device 800 is illustrated, the term "computing device" shall also be taken to include any collection of computing devices that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.

[0072] Exemplary computing device 800 includes a processor 802, a main memory 804 (e.g., read-only memory (ROM) or dynamic random access memory (DRAM)), and a data storage device 814, which communicate with each other via a bus 826.

[0073] Processor 802 may be represented by one or more general-purpose processing devices such as a microprocessor, central processing unit, or the like. More particularly, processor 802 may be a complex instruction set computing (CISC) microprocessor, reduced instruction set computing (RISC) microprocessor, very long instruction word (VLIW) microprocessor, or a processor implementing other instruction sets or processors implementing a combination of instruction sets. Processor 802 may also be one or more special-purpose processing devices such as an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a digital signal processor (DSP), network processor, or the like. Processor 802 is configured to execute instructions 824 for performing the operations and functions discussed herein.

[0074] Computing device 800 may further include a network interface device 806, an image capture device 810, a video display unit 820, a character input device 818 (e.g., a keyboard), and a touch screen input device 816.

[0075] Data storage device 814 may include a computer-readable storage medium 812 on which is stored one or more sets of instructions 824 embodying any one or more of the methodologies, functions or processes described herein. Instructions 824 may also reside, completely or at least partially, within main memory 804 and/or within processor 802 during execution thereof by computing device 800, main memory 804 and processor 802 also constituting computer-readable storage media. Instructions 824 may further be transmitted or received over network 808 via network interface device 806.

[0076] Data storage device 814 may also include a database 822 on which is stored one or more pattern libraries 120. Pattern libraries 120 may also reside, completely or at least partially, within main memory 804 and/or within processor 802 during manipulation thereof by computing device 800, main memory 804 and processor 802 also constituting computer-readable storage media. Pattern libraries 120 may further be transmitted or received over network 808 via network interface device 806.

[0077] It may be generally understood that in establishing a user interface, a task bar may be preferably positioned at the top of a screen to provide a user interface. Preferably, a textual representation of a task's name is presented in this user interface, preferably as a button, and the task names may be shortened as necessary if display space of the button is constrained. The labelled button having the task's name preferably operates as a type of hyperlink, whereby the user/viewer can immediately switch to the activity, view, etc. of each of the tasks by selecting the button containing the applicable name from the task bar. In other words, the user or viewer is redirected by the application to the function represented by the task button by selecting the labelled hyperlink. Preferably, the task entry associated with the currently-displayed work unit view may be shown in a different graphical representation (e.g., using a different color, font, or highlighting). In preferred embodiments, there may be provided a display having a selectable "X" in the task bar entry for each task: if the user clicks on the "X", then its associated task may be ended and the view of its work unit may be removed. A user interface may be web-based, application-based, or a combination.

[0078] In accordance with a preferred aspect of the present invention, a person skilled in the relevant art would generally understand the term "application" or "application software" to refer to a program or group of programs designed for end users. While system software typically comprises, but is not limited to, lower-level programs (e.g. programs that interact with the computer at a basic level), application software resides above system software and may include, but is not limited to, database programs, word processors, spreadsheets, etc. Application software may be grouped along with system software or published alone. Application software may simply be referred to as an "application".

[0079] It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussions, it is appreciated that throughout the present invention, discussions utilizing terms such as "receiving", "creating", "providing", "communicating" or the like refer to the actions and processes of a computer system, or similar electronic computing device, including an embedded system, that manipulates and transfers data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices. As used herein, reference to the "transmission", "processing", "interpretation" or the like of data associated with a cloud may refer to advancing through logic contained in the guideline. This may be accomplished, among other methods, by running on a processor one or more computer programs representative of the algorithms, processes, etc.

[0080] According to the invention, one or more visual pattern recognition systems, methods, computer-readable media, and/or cooperating environments may be disclosed.

[0081] The invention is contemplated for use in association with one or more cooperating environments, to afford increased functionality and/or advantageous utilities in association with same. The invention, however, is not so limited.

[0082] Certain novel features which are believed to be characteristic of a visual pattern recognition system, method, computer readable medium, and/or certain features of the system, method, computer readable medium which are novel in conjunction with the cooperating environment, according to the present invention, as to their organization, use, and/or method of operation, together with further objectives and/or advantages thereof, may be better understood from the accompanying disclosure in which presently preferred embodiments of the invention are disclosed by way of example. It is expressly understood, however, that the accompanying disclosure is for the purpose of illustration and/or description only, and is not intended as a definition of the limits of the invention.

[0083] Naturally, in view of the teachings and disclosures herein, persons having ordinary skill in the art may appreciate that alternate designs and/or embodiments of the invention may be possible (e.g., with substitution of one or more steps, algorithms, processes, features, structures, parts, components, modules, utilities, etc. for others, with alternate relations and/or configurations of steps, algorithms, processes, features, structures, parts, components, modules, utilities, etc.).

[0084] Although some of the steps, algorithms, processes, features, structures, parts, components, modules, utilities, relations, configurations, etc. according to the invention are not specifically referenced in association with one another, they may be used, and/or adapted for use, in association therewith.

[0085] One or more of the disclosed steps, algorithms, processes, features, structures, parts, components, modules, utilities, relations, configurations, and the like may be implemented in and/or by the invention, on their own, and/or without reference, regard or likewise implementation of one or more of the other disclosed steps, algorithms, processes, features, structures, parts, components, modules, utilities, relations, configurations, and the like, in various permutations and combinations, as may be readily apparent to those skilled in the art, without departing from the pith, marrow, and spirit of the disclosed invention.

[0086] In certain implementations, instructions 824 may include instructions for method 100 for visual pattern recognition shown in FIG. 2. While computer-readable storage medium 812 is shown in the example of FIG. 22 to be a single medium, the term "computer-readable storage medium" should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term "computer-readable storage medium" shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure. The term "computer-readable storage medium" shall accordingly be taken to include, but not be limited to, solid-state memories, optical media, and magnetic media.

[0087] The methods, components, and features described herein may be implemented by discrete hardware components or may be integrated in the functionality of other hardware components such as ASICs, FPGAs, DSPs or similar devices. In addition, the methods, components, and features may be implemented by firmware modules or functional circuitry within hardware devices. Further, the methods, components, and features may be implemented in any combination of hardware devices and software components, or only in software.

[0088] In the foregoing description, numerous details are set forth. It will be apparent, however, to one of ordinary skill in the art having the benefit of this disclosure, that the present disclosure may be practiced without these specific details. In some instances, well-known structures and devices are shown in block diagram form, rather than in detail, in order to avoid obscuring the present disclosure.

[0089] The present disclosure also relates to an apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, or it may comprise a general purpose computer selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, or any type of media suitable for storing electronic instructions.

[0090] It is to be understood that the above description is intended to be illustrative, and not restrictive. Various other implementations will be apparent to those of skill in the art upon reading and understanding the above description. The scope of the disclosure should, therefore, be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.

[0091] In a preferred embodiment, the present invention is adapted to recognize visual patterns and/or to make a determination based on the pattern. In particular, the present invention is preferably adapted to recognize and/or determine the type of an RDT based on the label and/or other printed materials present on the RDT body or cassette. According to the present invention, a local image feature is preferably utilized for the purpose of two-dimensional ("2D") pattern registration. Registered patterns are preferably compared using a novel binary operator. The specificity and accuracy of this comparison is preferably enhanced by detecting the prominent and distinct sections of the target patterns using an offline training procedure.

[0092] The present invention may also preferably be adapted to provide a training procedure which utilizes real samples provided by an inexperienced user and/or recognizes instances of trained patterns.

[0093] FIG. 1 depicts examples of labels printed on target RDTs.

[0094] As shown in FIG. 2, the method 100 preferably includes the following steps, performed and/or provided by a system, method and/or computer readable medium according to the invention, among others: an input pattern step 102; a feature detection step 104; a local descriptors step 106; a RANSAC 2D registration step 108 producing a 2D registration 110 comprising (Θ, t) 112, a localization ratio 114 and a match ratio 116; a pattern library step 120 comprising steps for accessing pattern n 122, local descriptors 124, prominent sections 126, and distinct sections 128; a registered pattern step 130; a pattern comparison step 132; a prominence ratio step 134; a distinction ratio step 136; and a decision making step 140.

[0095] A. Theory

[0096] A.1. Local Image Feature Detection

[0097] According to the present invention, the RANSAC 2D registration step 108 preferably comprises the comparison of two binary patterns: a first binary pattern ("P1") and a second binary pattern ("P2"). The two binary patterns (P1 and P2) are preferably registered simultaneously. The registration process preferably comprises determination of a rotation angle Θ and a translation vector t. The rotation angle Θ and the translation vector t are preferably applicable in a two-dimensional ("2D") plane.

[0098] P1 and P2 are preferably provided. A binary pattern or image ("Pn") is preferably provided from a pattern library stored in a database 822 in the pattern library step 120. Persons skilled in the art will understand that P1 and P2 may comprise different dimensions. As described in more detail in Section A.2, the registration step 110 preferably estimates the rotation angle Θ and the translation vector t during the determination of rotation angle Θ and translation vector t step 112. Persons skilled in the art will additionally appreciate that P1 and P2 may not contain the same pattern. FIG. 3 depicts patterns that may be used by the 2D registration step 110 to find a geometrical transformation which maps a first binary pattern ("P1") onto a second binary pattern ("P2"), taking into account that the first and second patterns may not be identical. For example, as shown in FIGS. 3A and 3B, the 2D registration step 110 is preferably adapted to account for variation between P1 and P2 (e.g., differences in dimensions, patterns, etc.). The two patterns shown in FIG. 3 are 231 x 417 pixels in dimensions for P1 and 223 x 374 pixels in dimensions for P2, with 92 feature points detected on P1 and 104 feature points detected on P2.

[0099] FIGS. 3A and 3B depict input patterns P1 and P2 respectively. FIGS. 3C and 3D depict the detected features of P1 and P2 respectively. FIG. 3E depicts the matching features between P1 and P2. FIG. 3F depicts registration.

[00100] The present invention is preferably adapted for use with the scale-invariant feature transform ("SIFT"), a prior art algorithm for detecting and describing local features in images. Persons skilled in the art will appreciate that SIFT implementations are available in OpenCV, IVT, and VLFeat, among other free or open-source packages. Skilled readers may also appreciate that the present invention may be adapted for use with alternate (i.e., non-SIFT) algorithms for detecting and describing local features in images. Examples of alternatives for SIFT may include, but are not limited to, Speeded-Up Robust Features ("SURF"), Binary Robust Independent Elementary Features ("BRIEF"), and Oriented FAST and Rotated BRIEF ("ORB").

[00101] FIG. 4 depicts SIFT feature points detected on a sample pattern. FIG. 4A depicts an input image and FIG. 4B depicts the detected features. The input pattern shown in FIG. 4A is 231 x 417 pixels in dimensions and 92 feature points are detected on the pattern. Persons skilled in the art may appreciate that the feature points are detected on the binarized version of the pattern and are, additionally, confined to the foreground. The following Algorithm 1, depicted in FIG. 15 as a feature extraction process 300, preferably provides a process 300 of generating pattern features 316 for an arbitrary pattern provided as an input pattern "P" 102. The process 300 is preferably implemented, for example, as part of a library of functions stored either locally or on a network, which are capable of receiving an image comprising a pattern from a device.

Algorithm 1: Feature extraction algorithm

Input: Input Pattern P.

Output: Set of features F = {(p1, d1), · · · , (pF, dF)}

1 if P contains color data then convert P to grayscale.

2 if P is not a binary image then convert P to binary.

3 Detect features on P and denote the set of features as F.

4 Update F so that it only contains feature points which are on the foreground of P.

5 return F

[00102] Algorithm 1 (above and with reference to the feature extraction process 300 depicted in FIG. 15) preferably comprises an input pattern or image step "P" 102, which may contain color data and may not be a binary image. The process 300 includes a step to determine if P is a binary image 308. In preferable embodiments, if the input image comprising the pattern is not binary then a step is applied to convert the input image to a binary image 310. In preferred embodiments, the process 300 may include a step to determine if P contains color data 304. If the input image contains color data, a min() operator as provided in MatLab (for example) may be used to convert a color image to grayscale 306. Additionally, in order to convert an image to binary 310, preferable embodiments of the invention may use adaptive thresholding as implemented, for example, in the OpenCV function cv::adaptiveThreshold(). The process 300 further comprises a step to detect features on P and denote the set of features as "F" 312. A step to update "F" to contain feature points on the foreground of P 314 is also included in the process 300. As previously noted, the present invention preferably utilizes SIFT features. Preferably, the function vl_sift() from VLFeat is used to detect SIFT features. One feature point is preferably represented as a pair (pf, df). In the present invention, pf is the location of the f-th feature point on the image and df is the descriptor corresponding to this feature point. It is expected, and may be preferable, that the vector pf contains non-integer values. In other words, the feature detection algorithm preferably performs a sub-pixel calculation, in which case the elements of pf are non-integer. The structure and length of df is determined by the underlying feature model and the particular implementation. In the case of vl_sift(), df is composed of a 128-dimensional vector of 8-bit unsigned integer array elements. In accordance with preferable embodiments of the invention, the process 300 yields a set of features "F" 316.

[00103] In preferable embodiments of the invention, df may preferably be used to compare two vectors d1 and d2 which belong to two different images taken under different imaging conditions, and determine whether or not the two descriptor vectors describe locally similar patterns. Skilled readers may appreciate that two vectors d1 and d2 that appear to describe similar patterns may not be guaranteed to be related. FIG. 5 depicts a pattern registration example, with FIGS. 5A and 5B depicting input patterns and FIG. 5C depicting registration. As best shown in FIGS. 5A and 5B, the three letters "A" present in "MALARIA" may preferably be distinguished based on global geometrical assessment using a Random Sample Consensus (RANSAC) based method. In accordance with the present invention, the algorithm that results in the df vector is also preferably adapted to provide a comparison between any two descriptor vectors d1 and d2. The function vl_ubcmatch() may also

be utilized, as may be provided by VLFeat. Given the sets of features F1 and F2, this function produces a list of pairs (i1, i2), wherein the feature points (pi1, di1) ∈ F1 and (pi2, di2) ∈ F2 are similar. This result preferably provides an indication that the registration between the two patterns P1 and P2 may in fact map pi1 to pi2. Persons skilled in the art will appreciate that this result may be utilized in a RANSAC method.

[00104] A.2. RANSAC 2D Registration

[00105] The registration step 108 between two patterns, as shown in FIG. 2, may preferably be modeled as the pair of a rotation angle Θ and a 2 × 1 translation vector t. In accordance with the present invention, the origin of each pattern is preferably a top corner (e.g., the left corner) of an image.

[00106] As stated in Section A.1 (above), feature detection 104 is preferably conducted for the two patterns P1 and P2 and, subsequently, a feature matching step 132 is performed. The result of this process is preferably the set of M pairs of points (pm1, pm2), m = 1, · · · ,

M. This set is preferably denoted as P, and every (pm1, pm2) ∈ P provides a suggestion that the point pm1 in P1 is related to the point pm2 in P2. The suggestion preferably indicates that the points may have the same appearance but belong to different sections of the two patterns.
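For illustration only, a vl_ubcmatch-style matching step that produces such a set P of paired points can be sketched in plain NumPy; the 0.8 ratio-test threshold and the function name match_descriptors are assumptions rather than part of the disclosure:

```python
import numpy as np

def match_descriptors(F1, F2, ratio=0.8):
    """For each (p, d) in F1, find its nearest descriptor in F2 and accept
    the match only if the nearest distance clearly beats the second nearest
    (a ratio test, in the style of vl_ubcmatch)."""
    D2 = np.array([d for _, d in F2], dtype=float)
    pairs = []  # list of (i1, i2) index pairs into F1 and F2
    for i1, (_, d1) in enumerate(F1):
        dists = np.linalg.norm(D2 - np.asarray(d1, dtype=float), axis=1)
        order = np.argsort(dists)
        # accept only unambiguous matches: best distance well below runner-up
        if len(order) >= 2 and dists[order[0]] < ratio * dists[order[1]]:
            pairs.append((i1, int(order[0])))
    return pairs
```

The resulting index pairs correspond to the point pairs (pm1, pm2) ∈ P consumed by the registration step.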

[00107] The purpose of the registration step 110 is to determine Θ and t, for which,

[00108] ∀ (pm1, pm2) ∈ P, RΘ pm1 + t ≈ pm2, (2.1)

[00109] where RΘ is a 2 × 2 rotation matrix corresponding to an angle Θ. Which is to say that, for all pairs of points from P1 and P2 in P, rotating a point in P1 by Θ and translating it by t will result in an output point close to the corresponding point in P2. It is preferably assumed that the members of P are reliable and provide a solution to Equation (2.1) using Singular Value Decomposition ("SVD"). The method of the present invention may be adapted from Reference [9] in the Bibliography Section (below).

[00110] The model provided in Equation (2.1) denotes a Rigid Transform model, for which the cost function can be written as,

[00111] Δ = Σm=1,···,M || RΘ pm1 + t − pm2 ||². (2.2)

[00112] The mean of all pm1 and also the mean of all pm2 may be calculated as follows,

[00113] p̄1 = (1/M) Σm=1,···,M pm1, (2.3)

[00114] p̄2 = (1/M) Σm=1,···,M pm2. (2.4)

[00115] The following 2 × 2 matrix may be calculated as follows,

[00116] C = (1/M) Σm=1,···,M (pm2 − p̄2)(pm1 − p̄1)T. (2.5)

[00117] C may be decomposed using SVD to yield,

[00118] C = USVT. (2.6)

[00119] The 2 × 2 rotation matrix RΘ may be produced as follows,

[00120] RΘ = U diag(1, |UVT|) VT. (2.7)
[00121] In the present invention, |A| denotes the determinant of the matrix A. Preferably, the translation vector t may be calculated as,

[00122] t = p̄2 − RΘ p̄1. (2.8)
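For illustration only, the closed-form solution of Equations (2.2)-(2.8) can be sketched with NumPy's SVD; the function name fit_rigid is hypothetical and not part of the disclosure:

```python
import numpy as np

def fit_rigid(P1, P2):
    """Least-squares rigid transform (theta, t) minimizing the cost of
    Equation (2.2): sum over m of || R_theta p_m1 + t - p_m2 ||^2."""
    P1 = np.asarray(P1, dtype=float)   # M x 2 points from the first pattern
    P2 = np.asarray(P2, dtype=float)   # M x 2 matched points from the second
    mean1, mean2 = P1.mean(axis=0), P2.mean(axis=0)       # (2.3) and (2.4)
    C = (P2 - mean2).T @ (P1 - mean1) / len(P1)           # (2.5)
    U, _, Vt = np.linalg.svd(C)                           # (2.6)
    # (2.7): the diag(1, |U V^T|) factor guards against a reflection,
    # forcing det(R_theta) = +1.
    R = U @ np.diag([1.0, np.linalg.det(U @ Vt)]) @ Vt
    theta = np.arctan2(R[1, 0], R[0, 0])
    t = mean2 - R @ mean1                                 # (2.8)
    return theta, t, R
```

When the matched pairs are exact, the recovered (theta, t) reproduces the transform that maps P1 onto P2.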

[00123] The foregoing process is preferably implemented in a method which utilizes a general-purpose 2D registration function. If conducted in non-RANSAC mode, the 2D registration function preferably redirects to an XY transformation fitting function; otherwise, it applies a RANSAC method on the cost function for an XY transformation fitting function using an XY transformation distance function. The pair (Θ, t) 112 is preferably referred to as the registration information or data.

[00124] Persons skilled in the art may appreciate that Random Sample Consensus ("RANSAC") (See, for example, Reference [3] in the Bibliography Section below) is a general purpose method adapted for fitting a model to data which is perturbed with noise and wherein association may be error-prone. The present invention uses an open source implementation of RANSAC which is posted on Matlab Central (See, for example, Reference [11] in the Bibliography Section below). RANSAC preferably locates and/or determines a subset of the data for which an acceptable fit is possible. This process is best shown by the data depicted in FIG. 6, which depicts input patterns wherein two points are altered (FIG. 6A), a conventional fit (FIG. 6B) and the results of utilizing RANSAC (FIG. 6C).

[00125] In FIG. 6A a polygon with twelve nodes is shown. This polygon is rotated by -25.42 degrees and two of the nodes are perturbed significantly. This perturbation is an example of a model of incorrect association. There is no translation between the two polygons seen in FIG. 6A. When a registration model is fit onto these two sets, the result is -25.42 degrees of rotation and translation by the vector [0.29, 0.02]T. This fit results in a mean registration error of 0.33, with the registration involving all points. When RANSAC is employed in accordance with the present invention, however, the algorithms recognize that points #3 230 and #7 232 carry perturbation to the extent that they are no longer useful for registration. In accordance with the present invention, the algorithm automatically discards these two perturbed points and generates a rotation of -25.42 degrees and a translation of [-0.00, -0.00]T. The mean registration error in this case is 0.00. Hence, the utilization of RANSAC allows for recognizing and discarding perturbed points; in addition, the RANSAC-enabled registration method is adapted to more accurately estimate translation and decrease the error. Visual comparison of FIGS. 6B and 6C also shows that the utilization of RANSAC results in better registration for the two sets of points.
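The RANSAC behaviour illustrated in FIG. 6 can be sketched, for illustration only, as the following self-contained loop; the iteration count, inlier tolerance, and function names are assumptions rather than the patent's parameters:

```python
import numpy as np

def ransac_rigid(P1, P2, n_iter=200, tol=1.0, seed=0):
    """Illustrative RANSAC loop for a 2D rigid (rotation + translation) fit:
    repeatedly fit on a random minimal sample of matched point pairs, keep
    the hypothesis with the most inliers, then refit on those inliers so
    that heavily perturbed pairs are discarded."""
    P1, P2 = np.asarray(P1, dtype=float), np.asarray(P2, dtype=float)
    rng = np.random.default_rng(seed)

    def fit(A, B):
        # SVD-based rigid fit (cf. Equations (2.3)-(2.8)).
        ma, mb = A.mean(axis=0), B.mean(axis=0)
        U, _, Vt = np.linalg.svd((B - mb).T @ (A - ma))
        R = U @ np.diag([1.0, np.linalg.det(U @ Vt)]) @ Vt
        return R, mb - R @ ma

    best_inliers = np.zeros(len(P1), dtype=bool)
    for _ in range(n_iter):
        idx = rng.choice(len(P1), size=2, replace=False)  # minimal sample
        R, t = fit(P1[idx], P2[idx])
        err = np.linalg.norm(P1 @ R.T + t - P2, axis=1)
        inliers = err < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    R, t = fit(P1[best_inliers], P2[best_inliers])   # refit on inliers only
    return np.arctan2(R[1, 0], R[0, 0]), t, best_inliers
```

On FIG. 6-style data (twelve nodes, two of them heavily perturbed), the perturbed nodes end up outside the inlier set and the refit recovers the unperturbed rotation and translation.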

[00126] Algorithm 2 (as shown in FIG. 16, depicting the RANSAC-based registration process 400) outlines the RANSAC-based registration algorithm utilized in the present invention. This algorithm is preferably adapted to generate the registration data between P1 and P2.

[00127] Algorithm 2 also preferably calculates a Match Ratio 116, denoted as τm, and the Localization Ratio 114, denoted as τl. The Match Ratio step 406 and Localization Ratio step 414 are included in the registration process 400 and depicted in FIG. 16. The registration process 400 may include a failure report step, in which case neither the registration data nor any of the two ratios are valid. For example, if the calculated Match Ratio 116 is determined to be too small 408, the process 400 will report a failure 410, and if the calculated Localization Ratio 114 is determined to be too small 416, the process 400 will report a failure 410.

[00128] The set of features F1 contains F1 feature points and the set of features F2 contains F2 feature points. The process 400 includes a step 404 of matching F1 with F2 to produce a set of matching points P. It may also be assumed that the corresponding feature matching algorithm has found counterparts in F2 for Fm of the members of F1. The match ratio 116 between F1 and F2 may be defined as,

[00129] τm = Fm / F1. (2.9)

[00130] Persons skilled in the art may appreciate that as 0 < Fm < F1, then 0 < τm < 1. Additionally, skilled readers may appreciate that a small τm is an indication that Algorithm 2 must abort. Moreover, skilled persons may assume that when RANSAC-based registration is applied on the matching features - a RANSAC-based registration on P to produce (Θ, t) step 412 - Fl of them are marked as valid. The localization ratio 114 is defined between F1 and F2 as,

[00131] τl = Fl / F1 . (2.10)

Algorithm 2: RANSAC-based registration algorithm

Input: Input Patterns P1 and P2 and their corresponding sets of features F1 and F2.

Output: Registration Data (θ, t), Match Ratio τm, Localization Ratio τl

1 Match F1 with F2 and produce the set of matching points P = {(p1,f , p2,f )}, f = 1, · · · , F.

2 Calculate τm using Equation (2.9).

3 if τm is too small then return failed.

4 Apply RANSAC-based registration on P and produce (θ, t).

5 Calculate τl using Equation (2.10).

6 if τl is too small then return failed

7 return (θ, t), τm, and τl
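The flow of Algorithm 2 can be sketched as follows. The helper callables `match_features` and `ransac_rigid`, and the two minimum-ratio thresholds, are illustrative placeholders standing in for a descriptor matcher and a RANSAC estimator, not part of the disclosed method:

```python
TAU_M_MIN = 0.5   # assumed minimum match ratio
TAU_L_MIN = 0.5   # assumed minimum localization ratio

def register(f1, f2, match_features, ransac_rigid):
    """Algorithm 2 sketch: return (theta, t, tau_m, tau_l), or None on failure."""
    matches = match_features(f1, f2)          # list of (p1, p2) matched pairs
    tau_m = len(matches) / len(f1)            # Equation (2.9): F_m / F_1
    if tau_m < TAU_M_MIN:
        return None                           # match ratio too small: abort
    (theta, t), inliers = ransac_rigid(matches)
    tau_l = len(inliers) / len(f1)            # Equation (2.10): F_l / F_1
    if tau_l < TAU_L_MIN:
        return None                           # localization ratio too small: abort
    return theta, t, tau_m, tau_l
```

Because the RANSAC stage is non-deterministic, repeated calls on the same pair of feature sets may return slightly different (θ, t) and τl values, as noted in paragraph [00133].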

[00132] Persons skilled in the art may understand that as 0 ≤ Fl ≤ F1, then 0 ≤ τl ≤ 1. Additionally, skilled readers may appreciate that Fl ≤ Fm and therefore τl ≤ τm.

[00133] Persons skilled in the art may appreciate that Algorithm 2 is not transitive. In other words, the match and localization ratios corresponding to registering P1 on P2 may not

be equal to the match and localization ratios corresponding to registering P2 on P1. Additionally, the RANSAC stage utilized in Algorithm 2 is non-deterministic. Accordingly, skilled readers may appreciate that repeated executions of Algorithm 2 on the same pairs of patterns may not produce identical results. In accordance with the present invention, the registration process 400 yields a result (θ, t), τm, and τl at step 418. Skilled readers may also appreciate that the present invention may be adapted for use with alternate (i.e., non-RANSAC) algorithms or processes for registering noisy and/or perturbed patterns, and as such a variety of cost functions may be adapted to determine the registration between patterns yielding the same output variables (θ, t), τm, and τl.

[00134] A.3. Pattern Comparison

[00135] As shown in FIG. 3F, the registration data generated through the registration process 400 described in Section A.2. (above) is adapted to compare two patterns. The first pattern is preferably a "Reference", denoted as R, and the second pattern is preferably a "Query", denoted as Q.

[00136] The decomposition of a Reference (R) 200 is depicted in FIG. 20. R 202 is preferably decomposed into the two patterns RP 204 and RD 206, which are in turn addressed as a prominent part of R and a distinct part of R, respectively. While the method for dividing R into RP and RD is described in Section A.4. (below), RP contains the parts of R - features fP1 214a, fP2 214b, ... fPX 214x - which are prominent in different samples of R. Moreover, RD is what makes a particular pattern different from other patterns and similarly comprises features fD1 216a, fD2 216b, ... fDX 216x.

[00137] Algorithm 3 (depicted in FIG. 17 as a pattern comparison process 500) preferably denotes the pattern comparison algorithm of the present invention. This process 500 employs

Algorithm 2 for registration (the registration process 400) and may return a failure if either registration fails, as shown in step 506, or if its own two metrics, i.e. the Prominence Ratio 134 and the Distinction Ratio 136, are too small, as shown in steps 512 and 516 respectively. These two metrics are formally defined and denoted by τp and τd, respectively. Both τp and τd are defined between two binary patterns. As stated in Algorithm 3, these two metrics are calculated between the prominent and distinct sections of R and the registered version of Q, which may be denoted as Q̄, respectively.

Algorithm 3: Pattern comparison algorithm

Input: Reference pattern R and query pattern Q and their corresponding sets of features FR and FQ.

Output: Registration Data (θ, t), Match Ratio τm, Localization Ratio τl, Prominence Ratio τp, Distinction Ratio τd

1 Utilize Algorithm 2 on Q and R and return failed if failed.

2 Apply (θ, t) generated by Algorithm 2 on Q and address the updated pattern as Q̄.

3 Calculate τp between Q̄ and RP using Equation (2.11).

4 if τp is too small then return failed

5 Calculate τd between Q̄ and RD using Equation (2.12).

6 if τd is too small then return failed

7 return (θ, t), τm, τl, τp, and τd

[00138] Accordingly, τp and τd may be defined as,

[00139] τp = ||Q̄ ∩ RP|| / ||RP|| . (2.11)

[00140] τd = ||Q̄ ∩ RD|| / ||RD|| . (2.12)

[00141] In the foregoing equations, A ∩ B denotes the result of applying the binary and operator on A and B. Moreover, the weight of A, denoted as ||A||, defines the number of elements in the binary pattern A which are one.
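Paragraph [00141] defines A ∩ B as the binary and, and ||A|| as the count of ones. Representing a binary pattern as the set of its one-valued positions, the two ratios reduce to set intersection and counting. A minimal sketch follows; the denominators ||RP|| and ||RD|| are an assumption about the partially garbled Equations (2.11) and (2.12):

```python
def weight(pattern):
    """||A||: the number of elements that are one (pattern as a set of on-pixels)."""
    return len(pattern)

def prominence_ratio(q_reg, r_prominent):
    """tau_p: fraction of the prominent section R_P covered by the registered query."""
    return weight(q_reg & r_prominent) / weight(r_prominent)

def distinction_ratio(q_reg, r_distinct):
    """tau_d: fraction of the distinct section R_D covered by the registered query."""
    return weight(q_reg & r_distinct) / weight(r_distinct)
```

A registered query that overlaps all of RP but little of RD would thus score a high τp and a low τd, the situation described for the Pf/Pv comparison in Section B.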

[00142] As depicted in FIG. 17, the pattern comparison process 500 comprises the following: a step to input Reference Pattern "R", Query Pattern "Q", FR and FQ 502; a step to apply the registration algorithm on Q and R 504; a step to determine if the registration failed 506; a step to report a failure 410; a step to apply (θ, t) on Q to produce Q̄ 508; a step to calculate τp between Q̄ and RP 510; a step to determine if τp is too small 512; a step to report a failure 410; a step to calculate τd between Q̄ and RD 514; a step to determine if τd is too small 516; a step to report a failure 410; and a step to an output comprising (θ, t), τm, τl, τp, and τd 518.

[00143] A.4. Training

[00144] According to the present invention, a training method is adapted to produce a set of reference patterns, which is preferably denoted as R1, · · · , RL. Each reference pattern Ri is preferably composed of the corresponding features, denoted as Fi, and the prominent and distinct sections RiP and RiD. The training method of the present invention makes use of multiple samples for each reference pattern. The number of samples for each training pattern may be denoted as S.

[00145] As shown in FIG. 24, the training process preferably includes the following steps, among others: an input reference pattern samples step 1002; an input reference target pattern sample step 1004, which for convenience in the following will be assumed to be the first sample of the reference pattern on which we wish to determine the relevant sections; a feature detection step 104; a local descriptors step 106; a RANSAC 2D registration step 108

producing a 2D registration 110 comprising (θ, t) 112, localization ratio 114 and match ratio 116; a registered pattern step 130; a pattern comparison step 132; a prominent section selection step 1006; a distinctive section selection step 1008; and a library entry generation step which, among other things, may store the prominent and distinctive sections and local descriptors of the relevant reference pattern in a pattern library 120.

[00146] FIG. 7 depicts the training samples used in the current implementation of the developed algorithm. This set contains six reference patterns 220, 221, 222, 223, 224, 225, i.e., L = 6, and five samples per reference pattern 220a-e, 221a-e, 222a-e, 223a-e, 224a-e and 225a-e, i.e., S = 5. These patterns correspond to six variants of Malaria RDTs, i.e., Ag Pf/Pan, Ag Pf/Pv, Pf/Pan (two styles), and Pf/Pv (two styles).

[00147] Algorithm 4 (also shown in FIG. 18 depicting a prominent sections estimation process 600) describes a process of estimating the prominent section of a reference pattern.

[00148] In Algorithm 4, λP denotes the Prominence Scale. In accordance with the present invention, λP = 2. As such, any section of the reference pattern which appears in at least half of the samples is designated a prominence of 1. Persons skilled in the art, however, may appreciate that a user may apply different values to λP.

[00149] As depicted in FIG. 18, the prominent sections estimation process 600 comprises the following: a step to input S samples of Reference Pattern "R1", denoted as R11 to R1S 602; a step to initialize a prominent features set, denoted R1P, with all the features in R11 604; a step to loop over all the samples but R11 606; a step to register a sample of the pattern with R11 and return the transformed pattern R̄1s 608; a step to add the transformed features of R̄1s to R1P 610; a step to iterate back to step 606 if more samples remain 612; a step to determine the prominence of each feature and only select those features with a prominence above a defined threshold 614; and a step to an output comprising the pattern R1P comprising the remaining features of

[00150] In accordance with the present invention, as shown in FIG. 21, the prominent sections 204a, 204b, 204c, 204L of a reference pattern 202a, 202b, 202c, 202L, respectively, are preferably estimated based on an inspection of multiple samples corresponding to that pattern 210a, 210b, 210c, 210L. The estimation of the distinct sections 206a, 206b, 206c, 206L of the reference pattern 202a, 202b, 202c, 202L, respectively, is preferably performed by comparing samples corresponding to a particular pattern with samples related to other patterns; for example, comparing Sample S1 210a with Sample S2 210b, Sample S3 210c ... Sample SL 210L. As such, a primary query when estimating the prominent sections 204a, 204b, 204c, 204L of a reference pattern 202a, 202b, 202c, 202L, respectively, is to determine a prominent feature (for example, features fP1 214a, fP2 214b, ... fPX 214x, as shown in FIG. 20) between the different samples corresponding to the pattern. The estimation of the distinct sections 206a, 206b, 206c, 206L of a given pattern includes a query to determine the distinguishing features of the pattern (for example, features fD1 216a, fD2 216b, ... fDX 216x, as shown in FIG. 20) from other patterns. According to the present invention, samples corresponding to every other reference pattern are utilized to estimate the distinct sections 206a, 206b, 206c, 206L of a particular reference pattern 202a, 202b, 202c, 202L, respectively. This process is described in Algorithm 5 (FIG. 19 depicting a distinct sections estimation process 700).

[00151] As depicted in FIG. 19, the distinct sections estimation process 700 comprises the following: a step to input L Reference Patterns R, denoted as R1 to RL, each comprising S sample patterns Rl1 to RlS, and of which we wish to find the distinctive sections of R1 702; a step to initialize a distinctive features set, denoted R1D, with the features in R11 × 0 704; a step initializing a counter at 0 706; a step to loop over all the Reference Patterns 708; a step to determine if the reference pattern is R1 710; a step to loop over all the samples for a reference pattern 712; a step to register a sample of a reference pattern with R11 and return the transformed pattern R̄ls 714; a step to add the transformed features of R̄ls to R1D 716; a step to increment the counter 718; a step to iterate back to step 712 if more samples of a given reference pattern remain 720; a step to iterate back to step 708 if more reference patterns remain 722; a step to determine the distinctiveness of each feature and only select those features with a distinctiveness below a defined threshold and which are also found in R11 724; and a step to an output comprising the pattern R1D comprising those features with a distinctiveness below a defined threshold and which are in R11 726.

[00152] In accordance with the present invention, the distinct sections estimation process 700 locates and/or determines sections of a first sample for pattern "1", i.e. R11, which may be present in samples corresponding to other patterns. A configuration parameter λD in this algorithm denotes the Distinction Scale, and λD = 2 may preferably be used in the current implementation. Persons skilled in the art, however, may appreciate that a user may apply different values to λD. In some embodiments of the invention, an area in the reference pattern may exist in half of the samples corresponding to other patterns in order for that area to be designated as non-distinct.

Algorithm 4: Prominent sections estimation algorithm

Input: S samples of the reference pattern R1: {R11, · · · , R1S}.

Output: Prominent sections of R1 given as R1P.

1 R1P = R11.

2 for s = 2 · · · S do

3 | Register R1s on R11 and denote the transformed version of R1s as R̄1s.

4 | R1P = R1P ∪ R̄1s.

5 R1P = min(1, R1P / S × λP).

6 return R1P
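A compact reading of Algorithm 4, with each sample reduced to a set of feature points and the registration step abstracted behind a hypothetical callable, might look like this. Selecting the features whose prominence value reaches 1 is an interpretation of the thresholding in step 614 of process 600:

```python
LAMBDA_P = 2  # Prominence Scale: features in at least half the samples are prominent

def prominent_sections(samples, register_onto):
    """Algorithm 4 sketch: samples[0] plays the role of R_11.

    register_onto(sample, reference) is a placeholder that returns the sample's
    features transformed into the reference frame (Algorithm 2 in the patent).
    """
    ref = samples[0]
    counts = {f: 1 for f in ref}                 # step 1: seed with R_11
    for sample in samples[1:]:                   # steps 2-4: accumulate occurrences
        for f in register_onto(sample, ref):
            counts[f] = counts.get(f, 0) + 1
    s = len(samples)
    # step 5: prominence = min(1, count / S * lambda_P); keep fully prominent features
    return {f for f, c in counts.items() if min(1.0, c / s * LAMBDA_P) >= 1.0}
```

With λP = 2 and S = 5, a feature must occur in at least three of the five samples to be retained, matching the "at least half of the samples" description of paragraph [00148].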

Algorithm 5: Distinct sections estimation algorithm

Input: S samples for each of the reference patterns R1, · · · , RL.

Output: Distinct sections of R1 given as R1D.

1 R1D = R11 × 0.

2 i = 0

3 for l = 1 · · · L do

4 | if l ≠ 1 then

5 | | for s = 1 · · · S do

6 | | | Register Rls on R11 and denote the transformed version of Rls as R̄ls.

7 | | | R1D = R1D + R̄ls.

8 | | | i = i + 1

9 R1D = max(0, 1 - R1D / i × λD) ∩ R11.

10 return R1D
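Algorithm 5 admits a similar sketch. Here `references[0]` plays the role of R1, the registration callable is again a placeholder, and keeping features whose distinctiveness value is strictly positive is an interpretation of the threshold in step 724:

```python
LAMBDA_D = 2  # Distinction Scale (lambda_D = 2 in the described implementation)

def distinct_sections(references, register_onto):
    """Algorithm 5 sketch: references[l] is the list of sample feature sets
    for reference pattern l; the distinct section of references[0] is estimated
    against the samples of every other reference pattern."""
    ref = references[0][0]                        # R_11, the target sample
    counts = {}                                   # step 1: accumulator starts at zero
    i = 0
    for l, samples in enumerate(references):      # step 3: loop over references
        if l == 0:
            continue                              # step 4: skip reference 1 itself
        for sample in samples:                    # step 5: loop over its samples
            for f in register_onto(sample, ref):  # steps 6-7: accumulate occurrences
                counts[f] = counts.get(f, 0) + 1
            i += 1                                # step 8: count registered samples
    # step 9: distinctiveness = max(0, 1 - count / i * lambda_D), restricted to R_11
    return {f for f in ref if max(0.0, 1.0 - counts.get(f, 0) / i * LAMBDA_D) > 0.0}
```

With λD = 2, a feature of R11 that appears in at least half of the other patterns' samples is scored 0 and dropped as non-distinct, in line with paragraph [00152].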

[00153] FIG. 8 depicts training results for a Pf/Pan sample pattern, where FIG. 8A depicts a sample image 222b corresponding to this pattern (as shown in FIG. 7). The remaining

samples used for this pattern 222a, c-e are depicted in the third row of FIG. 7. FIGS. 8B and 8C depict the results of the training procedure for the samples 222a-e, wherein the prominent sections and the distinct sections are shown respectively. The circles overlaid on these two images denote the position and scale of the local image features (as determined by Algorithm 1 or the feature extraction process 300 shown in FIG. 15) which have been used during the training procedure and will also be used during the matching process.

[00154] FIG. 9 depicts screenshots from the training process carried out by the implementation of the developed method on the samples 220a-e, 221a-e, 222a-e, 223a-e, 224a-e, and 225a-e shown in FIG. 7. FIG. 9A depicts prominent estimation wherein each sample of the reference pattern has been registered onto the first reference sample. The prominent estimation is preferably performed iteratively for each reference to determine the prominent sections of all reference patterns in the reference library; this process is depicted in FIG. 9A for a subset of the reference pattern samples in FIG. 7. FIG. 9B depicts distinct estimation wherein each sample of each reference pattern but for the samples of the reference pattern of interest has been registered onto the first reference sample. In accordance with the present invention, distinct estimation is preferably performed iteratively for each reference pattern to determine the distinct sections of all reference patterns in the reference library; this process is depicted in FIG. 9B for a subset of the reference pattern samples in FIG. 7. Skilled readers may appreciate that unlike the prominent estimation shown in FIG. 9A, the type of malaria test (e.g., Ag Pf/Pan, Ag Pf/Pv, Pf/Pan, and Pf/Pv) provided by the input images in the distinct estimation of FIG. 9B does not appear to provide a strong registration with target reference patterns compared with the prominent estimation process, since the samples used in the distinct estimation process are from different reference patterns. In accordance with a preferred embodiment, these unmatched features may comprise distinctive

sections as the thresholding applied in Algorithm 5 (or process 700) selects for features that are uncommon between patterns.

[00155] A.5. Decision Making

[00156] The decision making step 140 (as seen in FIG. 2) employed in the present invention is a combination of the four checks carried out in Algorithm 2 (or the registration process 400) and Algorithm 3 (or the pattern comparison process 500). Hence, for the query pattern Q to match the reference pattern R, it is required that the four metrics τm, τl, τp, and τd are all acceptably large.
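The decision step therefore reduces to checking that all four ratios clear their thresholds. A sketch follows; the 0.50 localization minimum is taken from Section B, while the other three threshold values are assumptions for illustration:

```python
# Assumed thresholds; only the 0.50 localization minimum is stated in the text.
THRESHOLDS = {"match": 0.5, "localization": 0.5, "prominence": 0.5, "distinction": 0.5}

def accept(tau_m, tau_l, tau_p, tau_d, thresholds=THRESHOLDS):
    """Query Q matches reference R only if all four metrics are acceptably large."""
    ratios = {"match": tau_m, "localization": tau_l,
              "prominence": tau_p, "distinction": tau_d}
    return all(ratios[k] >= thresholds[k] for k in thresholds)
```

Applied to the worked example of Section B, the Pf/Pan candidate (0.72, 0.64, 0.79, 0.80) passes all four checks, while the Pf/Pv candidate fails on its 0.30 distinction ratio despite a high prominence ratio.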

[00157] B. Evaluation

[00158] According to the present invention, preliminary evaluation results for the developed algorithms (1, 2, 3, 4 and 5, corresponding to processes 300, 400, 500, 600 and 700, respectively) have been collected. FIG. 10 depicts a screenshot from a query process. The query RDT type is Pf/Pan and the library contains four RDT types, i.e., Pf/Pan, Ag Pf/Pan, Pf/Pv, and Ag Pf/Pv. The database corresponding to this library can be found in the database 822 (e.g., //ITWKS/SW_Team/GPLF-1028/Data.Gen3). Preliminary results of the sample evaluation indicated that, first, Ag Pf/Pan and Ag Pf/Pv fail at the match ratio examination, because of a low match ratio of 0.37 and 0.29, respectively (as seen in FIG. 10). The two RDT types Pf/Pan and Pf/Pv, however, pass this stage with a match ratio of 0.72 and 0.72, respectively (as depicted in FIG. 10). These two types also pass the localization ratio determination with a localization ratio of 0.64 and 0.53, respectively (as shown in FIG. 10). Persons skilled in the art may appreciate that Pf/Pv approaches the threshold for being dropped, as the minimum required localization ratio is 0.50. As the two remaining patterns pass through prominence and distinction determinations, the importance of the comparison strategy devised in this work, and described in Section A.3., becomes evident. As such, the

input pattern is assigned a prominence ratio of 0.65 in comparison with Pf/Pv (as shown in FIG. 10), i.e., this pattern contains many of the prominent sections of a Pf/Pv pattern, but its distinction ratio for Pf/Pv is 0.30 (as seen in FIG. 10). Therefore, the algorithms determine that the input pattern is not a Pf/Pv. In comparison with Pf/Pan, however, the prominence ratio is 0.79 and the distinction ratio is 0.80, and, therefore, the input pattern is discovered to correspond to a Pf/Pan pattern (as shown in FIG. 10). Visual inspection may preferably be used to verify the determination.

[00159] FIG. 11 depicts recognition of a Pf/Pv sample by the developed algorithm. In this example, handwriting may occlude the RDT label. The algorithm, however, is preferably adapted to recognize this RDT type with a prominence ratio of 0.73 and a distinction ratio of 0.71 (as depicted in FIG. 11). FIG. 12 depicts three additional samples. Skilled readers may appreciate that in both FIGS. 11 and 12, the developed algorithms 1, 2, 3, 4 and 5 and processes 300, 400, 500, 600 and 700 are adapted to recognize the RDT type and also recognize the position on the RDT where the label is present. This determination is the (θ, t) 112 pair, which is used by the consecutive stages of the algorithms in order to locate a membrane.

[00160] In a further example, the developed algorithms 1, 2, 3, 4 and 5 and processes 300, 400, 500, 600 and 700 were applied on 2,373 Malaria samples of different variants. These sample files can be found in the database 822 (e.g., //ITWKS/SW_Team/GPLF-1028/Evaluation/). In this present example, the samples were not available for public inspection and there was no information on whether the labels were legible on all these samples.

[00161] FIG. 13 summarizes the results of this evaluation effort. Based on FIG. 13A, which depicts hypothesis count, the developed algorithms 1, 2, 3, 4 and 5 and processes 300, 400, 500, 600 and 700 are adapted to generate a result for 79% of the Malaria samples. As the samples were not available for inspection, it could not be visually determined if the cases were correctly identified. Additionally, without physical inspection of the samples, it was not possible to determine if the other 21% of cases were legible and whether the algorithm would be expected to generate an identification for them. The same is applicable to the results shown in FIG. 13B, which depicts identification statistics, wherein the results generated by the developed algorithms 1, 2, 3, 4 and 5 and processes 300, 400, 500, 600 and 700 are segregated into the different RDT types. The nature of the input data utilized in this example did not allow for visual inspection and confirmation of the output.

[00162] C. Summary

[00163] As depicted in FIG. 14, the present invention is adapted to identify visual patterns. The present invention describes a method, system and/or computer readable medium for the identification of visual patterns. Preferably, the present invention may be adapted for the identification of different RDT types. In addition to RDT type and other medical indicators, the developed method, system and/or computer readable medium can be applied to logos and signage, including information-bearing symbols, traffic signs, and other visual icons.

[00164] The method, system and/or computer readable medium of the present invention utilizes a local image feature in order to perform registration. The present invention also comprises a training mechanism which utilizes multiple samples of the patterns that it is required to identify. The training mechanism preferably decomposes every pattern into its prominent and distinct sections. Then, as a query is executed, four metrics are calculated, i.e. match ratio 116, localization ratio 114, prominence ratio 134, and distinction ratio 136, and

the method, system and/or computer readable medium either generates a verdict for the input pattern or issues a "Failure".

[00165] The present invention preferably utilizes a number of prior art image processing and mathematical operations. The list of references in the bibliography below provides relevant citations.

[00166] The foregoing description has been presented for the purpose of illustration and is not intended to be exhaustive or to limit the invention to the precise form disclosed. Other modifications, variations and alterations are possible in light of the above teaching and may be apparent to those skilled in the art, and may be used in the design and manufacture of other embodiments according to the present invention without departing from the spirit and scope of the invention. It is intended that the scope of the invention be limited not by this description but only by the claims forming a part of this application and/or any patent issuing herefrom.

[00167] Data Store

[00168] A preferred embodiment of the present invention provides a system comprising data storage (e.g. databases 822 in FIG. 22) that may be used to store all necessary data required for the operation of the system. A person skilled in the relevant art may understand that a "data store" refers to a repository for temporarily or persistently storing and managing collections of data which include not just repositories like databases (a series of bytes that may be managed by a database management system (DBMS)), but also simpler store types such as simple files, emails, etc. A data store in accordance with the present invention may be one or more databases, co-located or distributed geographically. The data being stored may be in any format that may be applicable to the data itself, but may also be in a format that also encapsulates the data quality.

[00169] As shown in FIGS. 22 and 23, various data stores or databases 822 may interface with the system of the present invention, preferably including, without limitation, proprietary databases 902, epidemiologic databases 904, medical records databases 906, UN and major/international healthcare institution databases 908, healthcare and emergency infrastructure databases 910, education and economic databases 912, news databases 924, demographic databases 916, communication and military infrastructure databases 918, image databases 920, and weather 926, travel 928 and topographic 930 databases, over a bus 826 directly connected to a data storage device 814 or a network interface connected to a LAN 914 or WAN 922.

[00170] A clinical and healthcare database may preferably contain, among other things, diagnostic and medical data (clinical information), such as, for example, one or more of the following, which may or may not be related to medical events: (a) test results from diagnostic devices equipped with remote data transfer systems and/or global positioning or localization features; (b) information from UN databases and major healthcare international institutions; and/or (c) scenarios and knowledge data.

[00171] A sociological database may preferably contain, among other things, sociological data (human information), such as, for example, one or more of the following: (a) population information from local and/or international demographic databases; (b) political and/or organization systems in the area and/or from international databases; (c) education and/or economic systems in the area and/or from international databases; and/or (d) information from news and/or newspapers, drawn from the Internet or elsewhere.

[00172] An infrastructure database may preferably contain, among other things, infrastructure data or information, such as, for example, one or more of the following: (a) information concerning healthcare infrastructure; (b) information concerning communication

infrastructures; and/or (c) information concerning emergency and/or military infrastructure; all preferably drawn from local and/or international databases.

[00173] A geophysics database may preferably contain, among other things, geophysics data or information, such as, for example, one or more of the following: (a) weather and/or climatic information from local databases; and/or (b) topographic information from local and/or international databases.

[00174] Bibliography

[00175] [1] Herbert Bay et al. "Speeded-Up Robust Features (SURF)". In: Computer Vision and Image Understanding 110.3 (2008), pp. 346-359.

[00176] [2] Michael Calonder et al. "BRIEF: Binary Robust Independent Elementary Features". In: Computer Vision - ECCV 2010: 11th European Conference on Computer Vision, Heraklion, Crete, Greece, September 5-11, 2010, Proceedings, Part IV. Ed. by Kostas Daniilidis, Petros Maragos, and Nikos Paragios. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010, pp. 778-792.

[00177] [3] Martin A. Fischler and Robert C. Bolles. "Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography". In: Communications of the ACM 24.6 (1981), pp. 381-395.

[00178] [4] Integrating Vision Toolkit (IVT). url: http://ivt.sourceforge.net/

[00179] [5] David G. Lowe. "Object Recognition from Local Scale-Invariant Features". In: Proceedings of the International Conference on Computer Vision - Volume 2. ICCV '99. Washington, DC, USA: IEEE Computer Society, 1999, pp. 1150-1157. isbn: 0-7695-0164-8.

[00180] [6] Krystian Mikolajczyk and Cordelia Schmid. "A Performance Evaluation of Local Descriptors". In: IEEE Transactions on Pattern Analysis and Machine Intelligence 27.10 (Oct. 2005), pp. 1615-1630. issn: 0162-8828.

[00181] [7] Open Source Computer Vision (OpenCV). url: http://opencv.org/

[00182] [8] Ethan Rublee et al. "ORB: An Efficient Alternative to SIFT or SURF". In: Proceedings of the 2011 International Conference on Computer Vision. ICCV 2011. Washington, DC, USA: IEEE Computer Society, 2011, pp. 2564-2571.

[00183] [9] Inge Soderkvist. Using SVD for some fitting problems. Accessed September 2016. url: https://goo.gl/qifZzT

[00184] [10] VLFeat. url: http://www.vlfeat.org/

[00185] [11] Ke Yan. RANSAC algorithm with example of finding homography. 20 March 2011. url: https://goo.gl/nlPuV4.


Claims

WHAT IS CLAIMED IS:
1. A method of matching a first pattern against a second pattern, wherein the method comprises:
(a) a first pattern input step for providing the first pattern;
(b) a second pattern input step for providing the second pattern;
(c) a feature detection step comprising:
(i) a first pattern feature substep for generating a set of first features associated with the first pattern, the first features comprising first feature locations and first feature descriptors; and
(ii) a second pattern feature substep for generating a set of second features associated with the second pattern, the second features comprising second feature locations and second feature descriptors;
(d) a pattern comparison step comprising:
(i) a registration substep of: (1) matching the set of first features with the set of second features to generate a set of matching points; and (2) determining, based on the set of matching points, a match ratio, a localization ratio, and registration data comprising a rotation angle and a translation vector; and
(ii) a comparison substep of: (1) decomposing the second pattern into a prominent component of the second pattern and a distinct component of the second pattern; (2) applying the registration data to the first pattern to generate a registered first pattern; (3) determining a prominence ratio based on the registered first pattern and the prominent component of the second pattern; and (4) determining a distinction ratio based on the registered first pattern and the distinct component of the second pattern; and
(e) an evaluation step comprising a comparison of the match ratio, the localization ratio, the prominence ratio and the distinction ratio with a predetermined match ratio, a
predetermined localization ratio, a predetermined prominence ratio and a predetermined distinction ratio;
whereby the first pattern matches the second pattern if each of the match, localization, prominence and distinction ratios exceed the predetermined match, localization, prominence and distinction ratios.
2. The method according to claim 1, further comprising a step of determining the prominent component of the second pattern comprising: (i) a sample pattern input step for providing a plurality of sample patterns associated with a predetermined reference sample pattern; (ii) a registration step to register each sample pattern with the predetermined reference sample pattern and generate transformed sample patterns comprising transformed prominent features; (iii) a step to add the transformed prominent features for each transformed sample pattern to a prominent features set; and (iv) a prominent component determination step of determining the prominence of each prominent feature in the prominent features set and selecting the prominent features with prominence exceeding a predetermined prominence threshold.
3. The method according to claim 1, further comprising a step of determining the distinct component of the second pattern comprising: (i) a reference sample pattern input step for providing one or more predetermined reference sample patterns; (ii) a sample pattern input step for providing a plurality of sample patterns for the one or more predetermined reference sample patterns; (iii) a registration step to register each sample pattern with the respective one or more predetermined reference sample pattern and generate transformed sample patterns comprising transformed distinct features; (iv) a step to add the transformed distinct features to a distinctive features set; and (v) a distinct component determination step of determining the distinctiveness of each feature in the distinctive features set and selecting the distinct
features with distinctiveness below a predetermined distinctive threshold which are also in one or more predetermined reference sample patterns.
4. The method according to claims 1 to 3, wherein the second pattern is a reference pattern stored in a database.
5. The method according to claims 1 to 4, wherein the first pattern and the second pattern are binary.
6. The method according to claims 1 to 5, wherein the registration step is two-dimensional.
7. The method according to claims 1 to 6, wherein the registration step comprises random sample consensus (RANSAC).
8. A system for matching a first pattern against a second pattern, wherein the system comprises:
(a) a first pattern;
(b) a second pattern; and
(c) one or more processors encoded to:
(i) generate a set of: (A) first features associated with the first pattern comprising first feature locations and first feature descriptors; and (B) second features associated with the second pattern comprising second feature locations and second feature descriptors;
(ii) register a pattern comprising: (A) matching the set of first features with the set of second features to generate a set of matching points; and (B) determining, based on the set of matching points, a match ratio, a localization ratio, and registration data comprising a rotation angle and a translation vector;
(iii) compare a pattern comprising: (A) decomposing the second pattern into a prominent component of the second pattern and a distinct component of the second pattern; (B) applying the registration data to the first pattern to generate a registered first pattern; (C) determining a prominence ratio based on the registered first pattern and the prominent component of the second pattern; and (D) determining a distinction ratio based on the registered first pattern and the distinct component of the second pattern; and
(iv) compare the match ratio, the localization ratio, the prominence ratio and the distinction ratio with a predetermined match ratio, a predetermined localization ratio, a predetermined prominence ratio and a predetermined distinction ratio;
whereby the system is operative to facilitate a match between the first pattern and the second pattern if each of the match, localization, prominence and distinction ratios exceeds the predetermined match, localization, prominence and distinction ratios.
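The feature-matching sub-step (claim 8, element (ii)(A)) pairs first features with second features by descriptor similarity, and a match ratio can then be read as the fraction of first features that found a partner. The sketch below assumes plain Euclidean nearest-neighbour matching on small descriptor vectors; the function names, the `max_dist` cutoff, and this particular reading of "match ratio" are all assumptions for illustration.

```python
import numpy as np

def match_features(desc1, desc2, max_dist=0.5):
    """Greedy nearest-neighbour matching: for each first-pattern descriptor,
    find the closest second-pattern descriptor and keep the pair if the
    distance is under max_dist. Returns (i, j) index pairs."""
    pairs = []
    for i, d in enumerate(desc1):
        dists = np.linalg.norm(desc2 - d, axis=1)
        j = int(dists.argmin())
        if dists[j] < max_dist:
            pairs.append((i, j))
    return pairs

def match_ratio(pairs, n_first_features):
    """One plausible reading of the claimed match ratio: the fraction of
    first-pattern features that found a matching second-pattern feature."""
    return len(pairs) / n_first_features

# Toy descriptors: the first two have close counterparts, the third does not.
desc1 = np.array([[0.0, 0.0], [1.0, 0.0], [5.0, 5.0]])
desc2 = np.array([[0.1, 0.0], [1.0, 0.1]])
pairs = match_features(desc1, desc2)      # [(0, 0), (1, 1)]
ratio = match_ratio(pairs, len(desc1))    # 2 of 3 features matched
```

The matched point locations (not shown here) would then feed the registration sub-step that estimates the rotation angle and translation vector.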
9. The system according to claim 8, wherein the one or more processors is further encoded to: (v) determine the prominent component of the second pattern comprising: (A) providing a plurality of sample patterns associated with a predetermined reference sample pattern; (B) registering each sample pattern with the predetermined reference sample pattern and generating transformed sample patterns comprising transformed prominent features; (C) adding the transformed prominent features for each transformed sample pattern to a prominent features set; and (D) selecting the prominent features with prominence exceeding a predetermined prominence threshold.
10. The system according to claim 8, wherein the one or more processors is further encoded to: (v) determine the distinct component of the second pattern comprising: (A) providing one or more predetermined reference sample patterns; (B) providing a plurality of sample patterns for the one or more predetermined reference sample patterns; (C) registering each sample pattern with the respective one or more predetermined reference sample patterns and generating transformed sample patterns comprising transformed distinct features; (D) adding the transformed distinct features to a distinctive features set; and (E) determining the distinctiveness of each feature in the distinctive features set and selecting the distinct features with distinctiveness below a predetermined distinctive threshold which are also in one or more predetermined reference sample patterns.
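Claims 9 and 10 decompose the second pattern by accumulating features over registered sample patterns: features that recur in most samples are prominent, while features that are rare yet present in the reference are distinct. A compact sketch of that frequency-based split follows; the representation of features as hashable tokens, the function name, and both threshold values are assumptions, not the specification's implementation.

```python
from collections import Counter

def decompose(sample_feature_sets, reference_features,
              prominence_threshold=0.8, distinctiveness_threshold=0.3):
    """Split features into a prominent set (present in most registered samples)
    and a distinct set (rarely present, but found in the reference pattern)."""
    n = len(sample_feature_sets)
    # Occurrence count of each feature across the registered sample patterns.
    counts = Counter(f for s in sample_feature_sets for f in set(s))
    frequency = {f: c / n for f, c in counts.items()}
    prominent = {f for f, v in frequency.items() if v > prominence_threshold}
    distinct = {f for f, v in frequency.items()
                if v < distinctiveness_threshold and f in reference_features}
    return prominent, distinct

# Ten hypothetical registered samples: "a" appears in all ten, "b" in two.
samples = [{"a", "b"} if i < 2 else {"a"} for i in range(10)]
prominent, distinct = decompose(samples, reference_features={"a", "b"})
```

Here "a" (frequency 1.0) lands in the prominent set, while "b" (frequency 0.2, and present in the reference) lands in the distinct set, mirroring the two selection rules in claims 9 and 10.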
11. The system according to claims 8 to 10, wherein the second pattern is a reference pattern stored in a database.
12. The system according to claims 8 to 11, wherein the first pattern and the second pattern are binary.
13. The system according to claims 8 to 12, wherein the registration step is two-dimensional.
14. The system according to claims 8 to 13, wherein the registration step comprises random sample consensus (RANSAC).
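Claims 13 and 14 recite a two-dimensional registration step using random sample consensus (RANSAC). For a 2D rigid transform (the claimed rotation angle and translation vector), two point correspondences suffice to form a hypothesis, and RANSAC keeps the hypothesis with the most inliers. The following is a minimal NumPy sketch under those assumptions; the function names, tolerance, and iteration count are illustrative, not from the specification.

```python
import numpy as np

def estimate_rigid(p, q):
    """Closed-form 2D rigid transform (rotation angle, rotation matrix,
    translation vector) mapping the point pair p onto q, computed about
    the pair centroids."""
    cp, cq = p.mean(axis=0), q.mean(axis=0)
    a, b = (p - cp)[0], (q - cq)[0]
    ang = np.arctan2(b[1], b[0]) - np.arctan2(a[1], a[0])
    R = np.array([[np.cos(ang), -np.sin(ang)],
                  [np.sin(ang),  np.cos(ang)]])
    return ang, R, cq - R @ cp

def ransac_rigid(src, dst, iters=200, tol=2.0, seed=0):
    """RANSAC: repeatedly fit a rigid transform to a random 2-point sample
    of the correspondences and keep the hypothesis with the most inliers."""
    rng = np.random.default_rng(seed)
    best_ang, best_t, best_inliers = 0.0, np.zeros(2), -1
    for _ in range(iters):
        idx = rng.choice(len(src), size=2, replace=False)
        ang, R, t = estimate_rigid(src[idx], dst[idx])
        resid = np.linalg.norm(src @ R.T + t - dst, axis=1)
        n_in = int((resid < tol).sum())
        if n_in > best_inliers:
            best_ang, best_t, best_inliers = ang, t, n_in
    return best_ang, best_t, best_inliers

# Synthetic correspondences: 20 inliers under a known rotation/translation,
# contaminated with 5 random outlier pairs.
rng = np.random.default_rng(1)
pts = rng.uniform(0, 100, size=(20, 2))
ang_true = 0.3
R_true = np.array([[np.cos(ang_true), -np.sin(ang_true)],
                   [np.sin(ang_true),  np.cos(ang_true)]])
src_all = np.vstack([pts, rng.uniform(0, 100, size=(5, 2))])
dst_all = np.vstack([pts @ R_true.T + np.array([5.0, -3.0]),
                     rng.uniform(0, 100, size=(5, 2))])
ang, t, n_in = ransac_rigid(src_all, dst_all)
```

Because any hypothesis drawn from two true correspondences reproduces the exact transform, the outlier pairs are rejected and the recovered angle and translation match the ground truth.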
15. A non-transitory computer readable medium on which is physically stored executable instructions which, upon execution, will determine a match for a first pattern against a second pattern; wherein the executable instructions comprise processor instructions for one or more processors to automatically:
(i) generate a set of: (A) first features associated with the first pattern comprising first feature locations and first feature descriptors; and (B) second features associated with the second pattern comprising second feature locations and second feature descriptors;
(ii) register a pattern comprising: (A) matching the set of first features with the set of second features to generate a set of matching points; and (B) determining, based on the set of matching points, a match ratio, a localization ratio, and registration data comprising a rotation angle and a translation vector;
(iii) compare a pattern comprising: (A) decomposing the second pattern into a prominent component of the second pattern and a distinct component of the second pattern; (B) applying the registration data to the first pattern to generate a registered first pattern; (C) determining a prominence ratio based on the registered first pattern and the prominent component of the second pattern; and (D) determining a distinction ratio based on the registered first pattern and the distinct component of the second pattern; and
(iv) compare the match ratio, the localization ratio, the prominence ratio and the distinction ratio with a predetermined match ratio, a predetermined localization ratio, a predetermined prominence ratio and a predetermined distinction ratio;
to thus operatively facilitate a match between the first pattern and the second pattern if each of the match, localization, prominence and distinction ratios exceeds the predetermined match, localization, prominence and distinction ratios.
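The prominence and distinction ratios of step (iii) are each determined from the registered first pattern and one component of the second pattern. For binary patterns (claims 5 and 12), one plausible reading is the fraction of a component's ON pixels that are also ON in the registered first pattern; the sketch below implements that reading, with the function name and this interpretation of "based on" being assumptions for illustration.

```python
import numpy as np

def component_ratio(registered_pattern, component):
    """Fraction of the component's ON pixels that are also ON in the
    registered first pattern (an illustrative reading of a ratio
    determined 'based on' the two binary patterns)."""
    comp = component.astype(bool)
    if not comp.any():
        return 0.0
    overlap = registered_pattern.astype(bool) & comp
    return float(overlap.sum() / comp.sum())

# Toy 2x2 binary patterns:
registered = np.array([[1, 1], [0, 0]])
prominent_component = np.array([[1, 1], [0, 0]])  # fully covered: ratio 1.0
distinct_component = np.array([[1, 1], [1, 1]])   # half covered:  ratio 0.5
prominence_ratio = component_ratio(registered, prominent_component)
distinction_ratio = component_ratio(registered, distinct_component)
```

Each ratio would then be compared against its predetermined threshold in step (iv) to decide whether the patterns match.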
PCT/CA2017/051416 2016-11-26 2017-11-24 Visual pattern recognition system, method and/or computer-readable medium WO2018094532A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US201662426515 2016-11-26 2016-11-26
US62/426,515 2016-11-26

Publications (1)

Publication Number Publication Date
WO2018094532A1 (en) 2018-05-31

Family

ID=62194623

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2017/051416 WO2018094532A1 (en) 2016-11-26 2017-11-24 Visual pattern recognition system, method and/or computer-readable medium

Country Status (1)

Country Link
WO (1) WO2018094532A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100322522A1 (en) * 2009-06-17 2010-12-23 Chevron U.S.A., Inc. Image matching using line signature
US20140044362A1 (en) * 2011-03-02 2014-02-13 Centre National De La Recherche Scientifique Method and system for estimating a similarity between two binary images
