AU2006201849A1 - Gaming object position analysis and tracking - Google Patents

Gaming object position analysis and tracking

Info

Publication number
AU2006201849A1
Authority
AU
Australia
Prior art keywords
card
game
cards
gaming
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
AU2006201849A
Inventor
Maulin Gandhi
Prem Gururajan
Jason Jackson
Alex Levinshtein
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tangam Gaming Technology Inc
Original Assignee
Tangam Gaming Technology Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tangam Gaming Technology Inc filed Critical Tangam Gaming Technology Inc
Publication of AU2006201849A1 publication Critical patent/AU2006201849A1/en
Abandoned legal-status Critical Current

Classifications

    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07FCOIN-FREED OR LIKE APPARATUS
    • G07F17/00Coin-freed apparatus for hiring articles; Coin-freed facilities or services
    • G07F17/32Coin-freed apparatus for hiring articles; Coin-freed facilities or services for games, toys, sports, or amusements
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07FCOIN-FREED OR LIKE APPARATUS
    • G07F17/00Coin-freed apparatus for hiring articles; Coin-freed facilities or services
    • G07F17/32Coin-freed apparatus for hiring articles; Coin-freed facilities or services for games, toys, sports, or amusements
    • G07F17/3202Hardware aspects of a gaming system, e.g. components, construction, architecture thereof
    • G07F17/3216Construction aspects of a gaming system, e.g. housing, seats, ergonomic aspects
    • G07F17/322Casino tables, e.g. tables having integrated screens, chip detection means
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07FCOIN-FREED OR LIKE APPARATUS
    • G07F17/00Coin-freed apparatus for hiring articles; Coin-freed facilities or services
    • G07F17/32Coin-freed apparatus for hiring articles; Coin-freed facilities or services for games, toys, sports, or amusements
    • G07F17/3225Data transfer within a gaming system, e.g. data sent between gaming machines and users
    • G07F17/3232Data transfer within a gaming system, e.g. data sent between gaming machines and users wherein the operator is informed

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Image Analysis (AREA)

Description

03-May-2006 05:54 PM WATERMARK 61398196010 4/59 Pool) Section 29 Regulation 3.2(2)
AUSTRALIA
Patents Act 1990 COMPLETE SPECIFICATION STANDARD PATENT

Application Number: Lodged:

Invention Title: Gaming object position analysis and tracking

The following statement is a full description of this invention, including the best method of performing it known to me/us:

TITLE: Gaming Object Position Analysis and Tracking

[0001] This application claims priority of U.S. provisional applications 60/723,481 filed October 5th 2005, 60/723,452 filed October 5th 2005, 60/736,334 filed November 15th 2005, 60/760,365 filed January 20th 2005 and 60/771,058 filed February 8th, 2006.
BACKGROUND
[0002] Casinos propose a wide variety of gambling activities to accommodate players and their preferences. Some of those activities reward strategic thinking while others are impartial, but each one of them obeys a strict set of rules that favours the casino over its clients.

[0003] The success of a casino relies partially on the efficiency and consistency with which those rules are applied by the dealer. A pair of slow dealing hands or an undeserved payout may have substantial consequences on profitability.
[0004] Another critical factor is the consistency with which those rules are respected by the player. Large sums of money travel through the casino, tempting players to bend the rules. Again, an undetected card switch or complicity between a dealer and a player may be highly detrimental to profitability.
[0005] For those reasons among others, casinos have traditionally invested tremendous efforts in monitoring gambling activities. Initially, the task was performed manually, a solution that was both expensive and inefficient. However, technological innovations have been offering advantageous alternatives that reduce costs while increasing efficiency.
[0006] In the gaming industry, there is a need for automation of the tracking of activities happening at table games. Casino chips, playing cards, card hands, wagers, payouts, chip tray float, currency transactions, game outcomes and players are examples of items and activities that are tracked and monitored at casino table games. Amongst other applications, automated tracking would improve player tracking, game security and operational efficiency monitoring, and can enable the development of new table games involving concepts such as bonusing, jackpots, progressive jackpots and side betting.
[0007] There are several issues and challenges with overhead video camera based game monitoring. One challenge is that performing repetitive optical recognition on consecutive images in a video stream can be processing intensive. Another challenge is that gaming objects might occasionally be partially or entirely occluded from an overhead camera view. A playing card can be occluded because of the dealer's clothing, hands or other gaming objects. Yet another issue is that cards and card hands that are moved on the table can result in blurred images. Sometimes, due to space constraints, a dealer may place playing card hands such that two or more playing card hands have some overlap, even though ideally there should not be any overlap between distinct playing card hands. There could be other objects on the table, such as patterns on dealer clothing, that may appear somewhat similar to a playing card shape and consequently result in erroneous playing card detection ("false positives"). The disclosed invention seeks to alleviate some of these problems and challenges with respect to overhead video camera based game monitoring.
[0008] One of the most important aspects of table game monitoring consists in recognizing playing cards, or at the very least, their value with respect to the game being played. Such recognition is particularly challenging when the card corner or the central region of a playing card is undetectable within an overhead image of a card hand, or more generally, within that of an amalgam of overlapping objects. Current solutions for achieving such recognition bear various weaknesses, especially when confronted with those particular situations.
[0009] U.S. Pat. App. No. 11/052,941, titled "Automated Game Monitoring", by Tran, discloses a method of recognizing a playing card positioned on a table within an overhead image. The method consists in detecting the contour of the card, validating the card from its contour, detecting adjacent corners of the card, projecting the boundary of the card based on the adjacent corners, binarizing pixels within the boundary, and counting the number of pips to identify the value of the card. While such a method is practical for recognizing a solitary playing card, or at least one that is not significantly overlapped by other objects, it may not be applicable in cases where the corner or central region of the card is undetectable due to the presence of overlapping objects. It also does not provide a method of distinguishing face cards. Furthermore, it does not provide a method of extracting a region of interest encompassing a card identifying symbol when only a partial card edge is available or when card corners are not available.
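The pip-counting step of such a contour-based approach can be pictured with a short sketch. This is not code from the Tran application: it is a toy, numpy-only stand-in that assumes the card has already been rectified and binarized from its projected corner boundary, and simply counts connected dark blobs; the function and image names are hypothetical.

```python
import numpy as np

def count_pips(binary_card):
    """Count pip blobs in a binarized card image (1 = pip pixel).

    A toy stand-in for the 'binarize pixels within the boundary and
    count the number of pips' step; real systems would first rectify
    the card using its detected corners."""
    h, w = binary_card.shape
    seen = np.zeros_like(binary_card, dtype=bool)
    blobs = 0
    for y in range(h):
        for x in range(w):
            if binary_card[y, x] and not seen[y, x]:
                blobs += 1
                stack = [(y, x)]  # flood-fill one 4-connected blob
                while stack:
                    cy, cx = stack.pop()
                    if 0 <= cy < h and 0 <= cx < w and binary_card[cy, cx] and not seen[cy, cx]:
                        seen[cy, cx] = True
                        stack += [(cy + 1, cx), (cy - 1, cx), (cy, cx + 1), (cy, cx - 1)]
    return blobs

# Synthetic "five": five 2x2 pip blobs on an otherwise blank card.
card = np.zeros((40, 30), dtype=np.uint8)
for cy, cx in [(8, 8), (8, 20), (20, 14), (32, 8), (32, 20)]:
    card[cy:cy + 2, cx:cx + 2] = 1
print(count_pips(card))  # 5
```

As the paragraph above notes, this step breaks down as soon as another object overlaps the central pip region, which is precisely the failure mode the disclosure targets.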
[0010] A paper titled "Introducing Computers to Blackjack: Implementation of a Card Recognition System Using Computer Vision Techniques", written by G. Hollinger and N. Ward, proposes the use of neural networks to distinguish face cards. The method proposes determining a central moment of individual playing cards to determine a rotation angle. This approach of determining a rotation angle is not appropriate for overlapping cards forming a card hand. They propose counting the number of pips in the central region of the card to identify number cards. This approach of pip counting will not be feasible when a card is significantly overlapped by another object. They propose training three neural networks to recognize face card symbols extracted from an upper left region of a face card, where each of the networks would be dedicated to a distinct face card symbol. The neural network is trained using a scaled image of the card symbol. A possible disadvantage of trying to directly recognize images of a symbol using a neural network is that it may have insufficient recognition accuracy, especially under conditions of stress such as image rotation and noise.
[0011] Several references propose to achieve such recognition by endowing each playing card with an easily detectable and identifiable marker. For instance, U.S. Pat. App. No. 10/823,051, titled "Wireless monitoring of playing cards and/or wagers in gaming", by Soltys, discloses playing cards bearing a conductive material that may be wirelessly interrogated to achieve recognition in any plausible situation, regardless of visual obstructions. One disadvantage of this implementation is that such cards are more expensive than normal playing cards. Furthermore, adhering casinos would be restricted to dealing such special playing cards instead of those of their liking.
[0012] Card recognition is particularly instrumental in detecting inconsistencies on a game table, particularly those resulting from illegal procedures. However, such detection is yet to be entirely automated and seamless, as it still requires some form of human intervention.
[0013] MP Bacc, a product marketed by Bally Gaming for detecting an inconsistency within a game of Baccarat, consists of a card shoe reader for reading bar-coded cards as they are being dealt, a barcode reader built into a special table for reading cards that were dealt, as well as a software module for comparing data provided by the card reader and the discard rack.
[0014] The software module verifies that the cards that have been removed from the shoe correspond to those that have been inserted into the barcode reader on the table. It also verifies that the order in which the cards have been removed from the shoe corresponds to the order in which they were placed in the barcode reader. One disadvantage of this system is that it requires the use of bar-coded cards and barcode readers to be present in the playing area. The presence of such devices in the playing area may be intrusive to players. Furthermore, dealers may need to be trained to use the special devices, and therefore the system does not appear to be seamless or natural to the existing playing environment.
[0015] It is unreasonable to expect any gaming object positioning and identification system to be perfect. There are often scenarios where a game tracking method must analyze ambiguous gaming object data in determining the game state and game progress. For instance, an overhead video camera based recognition system can produce ambiguous or incomplete data caused by playing card occlusion, movement, false positives, dealer mistakes and overlapping of card hands. Other systems involving RFID embedded playing cards could produce similar ambiguity relating to position, movement, distinction of separate card hands, dealer mistakes, false positives, etc. The disclosed invention seeks to alleviate some of the challenges of ambiguous data by providing methods to improve the robustness of game tracking.
SUMMARY
[0016] It would be desirable to be provided with a system for recognizing playing cards positioned on a game table in an accurate and efficient manner.
[0017] It would be desirable to be provided with a method of recognizing standard playing cards positioned on a game table without having to detect their corners.
[0018] It would also be desirable to be provided with a seamless, automated, and reliable system for detecting inconsistencies on a game table and providing an accurate description of the context in which detected inconsistencies occurred.
[0019] An exemplary embodiment is directed to a method of tracking a gaming object on a gaming table, the method comprising obtaining a position profile for a gaming object and resolving the position profile on the gaming table.
[0020] Another embodiment is directed to a method of tracking gaming objects on a gaming table comprising: recording temporally sequential data relating to a plurality of the gaming objects; determining an identity and a position profile of a tracked one of the gaming objects at a first instant in time from the data; determining a position profile of an investigated one of the objects at a second instant in time from the data; identifying a compatibility between the position profile of the investigated one and the position profile of the tracked one; and assigning the identity to the investigated one of the objects according to the compatibility.
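As a rough illustration of this identity-assignment idea, the sketch below treats a position profile as a simple table-plane centre point and calls two profiles compatible when their centres are close. The function, the data layout and the distance threshold are hypothetical choices, not part of the claimed method:

```python
def assign_identities(tracked, investigated, max_dist=15.0):
    """Propagate identities from tracked objects at time t1 to
    investigated objects at time t2 when their position profiles are
    compatible (here: centre distance below a threshold).

    tracked: {identity: (x, y)}; investigated: list of (x, y).
    Returns {index: identity or None}."""
    assignments = {}
    unused = dict(tracked)
    for i, (x, y) in enumerate(investigated):
        best, best_d = None, max_dist
        for ident, (tx, ty) in unused.items():
            d = ((x - tx) ** 2 + (y - ty) ** 2) ** 0.5
            if d < best_d:
                best, best_d = ident, d
        assignments[i] = best
        if best is not None:
            del unused[best]  # each identity is assigned at most once
    return assignments

tracked = {"hand-A": (100.0, 40.0), "hand-B": (200.0, 42.0)}
investigated = [(203.0, 45.0), (101.0, 38.0), (300.0, 90.0)]
print(assign_identities(tracked, investigated))
# {0: 'hand-B', 1: 'hand-A', 2: None}
```

An object with no compatible predecessor (index 2 above) would be treated as newly appeared, which is where the game rules and further data come into play in the embodiments that follow.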
[0021] Yet another embodiment is directed to a system for tracking a gaming object on a gaming table, the system comprising means for obtaining a position profile for a gaming object and means for resolving the position profile on the gaming table.

[0022] Yet another embodiment is directed to a system for tracking gaming objects on a gaming table comprising: means for recording temporally sequential data relating to a plurality of the gaming objects; means for determining an identity and a position profile of a tracked one of the gaming objects at a first instant in time from the data; means for determining a position profile of an investigated one of the objects at a second instant in time from the data; means for identifying a compatibility between the position profile of the investigated one and the position profile of the tracked one; and means for assigning the identity to the investigated one of the objects according to the compatibility.
[0023] Yet another embodiment is directed to a method of tracking the progress of a game on a gaming table comprising: recording data frames and game states as data while the game is in progress; establishing a first state of the game from the data; identifying an occurrence of a game event that follows the first state; evaluating whether the game event and a set of rules of the game provide sufficient information to accurately create a second state; determining that further information is required to accurately create the second state according to the results of the evaluating; obtaining the further information from the data; and creating a second state according to the game event, the set of rules and the further information.
[0024] Yet another embodiment is directed to a method of tracking the progress of a game on a gaming table comprising: recording data relating to the game while the game is in progress; establishing a plurality of potential game states of the game; identifying an occurrence of a game event that follows the plurality of potential game states; applying the game event to at least two of the plurality of potential game states to establish at least one new potential game state; adding the at least one new potential game state to the plurality of potential game states to establish an updated plurality of potential states; evaluating a likelihood of each potential game state; and identifying at least one likely potential game state of the updated plurality based on the evaluating.

[0025] Yet another embodiment is directed to a system for tracking the progress of a game on a gaming table comprising: means for recording data frames and game states as data while the game is in progress; means for establishing a first state of the game from the data; means for identifying an occurrence of a game event that follows the first state; means for evaluating whether the game event and a set of rules of the game provide sufficient information to accurately create a second state; means for determining that further information is required to accurately create the second state according to the results of the evaluating; means for obtaining the further information from the data; and means for creating a second state according to the game event, the set of rules and the further information.
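The multiple-potential-game-state idea can be pictured with a toy beam-search sketch. Nothing here is from the specification itself: the state representation (a mapping of hand name to card values), the event format and the weights are all invented for illustration.

```python
def apply_event(state, event):
    """Apply one game event to one potential state. An ambiguous event
    (e.g. a card whose owning hand is uncertain) branches into several
    successor states, each carrying the event's own weight for it."""
    successors = []
    for hand in event["possible_hands"]:
        new_state = {h: list(cards) for h, cards in state.items()}
        new_state.setdefault(hand, []).append(event["card"])
        successors.append((new_state, event["weights"][hand]))
    return successors

def step(hypotheses, event, keep=2):
    """Advance every potential game state by one event, then keep only
    the `keep` most likely hypotheses (a simple beam search)."""
    expanded = []
    for state, score in hypotheses:
        for new_state, w in apply_event(state, event):
            expanded.append((new_state, score * w))
    expanded.sort(key=lambda sw: sw[1], reverse=True)
    return expanded[:keep]

hypotheses = [({"player": [10], "dealer": [6]}, 1.0)]
# A card appears near the boundary of two hands: 70/30 ambiguity.
event = {"card": 11, "possible_hands": ["player", "dealer"],
         "weights": {"player": 0.7, "dealer": 0.3}}
hypotheses = step(hypotheses, event)
print([(s, round(p, 2)) for s, p in hypotheses])
# [({'player': [10, 11], 'dealer': [6]}, 0.7), ({'player': [10], 'dealer': [6, 11]}, 0.3)]
```

Evaluating a likelihood per hypothesis and pruning, as in the method, keeps the set of potential states from growing exponentially as ambiguous events accumulate.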
[0026] Yet another embodiment is directed to a system for tracking the progress of a game on a gaming table comprising: means for recording data relating to the game while the game is in progress; means for establishing a plurality of potential game states of the game; means for identifying an occurrence of a game event that follows the plurality of potential game states; means for applying the game event to at least two of the plurality of potential game states to establish at least one new potential game state; means for adding the at least one new potential game state to the plurality of potential game states to establish an updated plurality of potential states; means for evaluating a likelihood of each potential game state; and means for identifying at least one likely potential game state of the updated plurality based on the evaluating.
[0027] Yet another embodiment is directed to a system for identifying a gaming object on a gaming table comprising at least one overhead camera for capturing an image of the table; a detection module for detecting a feature of the object on the image; a search module for extracting a region of interest of the image that describes the object from the feature; a feature space module for transforming a feature space of the region of interest to obtain a transformed region of interest; and an identity module trained to recognize the object from the transformed region.
[0028] According to another embodiment, at least one factor attributable to casino and table game environments impedes reliable recognition of said object by said statistical classifier when trained to recognize said object from said region of interest without transformation by said feature space module.
[0029] Yet another embodiment is directed to a method of identifying a value of a playing card placed on a game table comprising: capturing an image of the table; detecting at least one feature of the playing card on the image; delimiting a target region of the image according to the feature, wherein the target region overlaps a region of interest, and the region of interest describes the value; scanning the target region for a pattern of contrasting points; detecting the pattern; delimiting the region of interest of the image according to a position of the pattern; and analyzing the region of interest to identify the value.
[0030] Yet another embodiment is directed to a system for detecting an inconsistency with respect to playing cards dealt on a game table comprising: a card reader for determining an identity of each playing card as it is being dealt on the table; an overhead camera for capturing images of the table; a recognition module for determining an identity of each card positioned on the table from the images; and a tracking module for comparing the identity determined by the card reader with the identity determined by the recognition module, and detecting the inconsistency.
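A minimal sketch of what such a tracking module might do, assuming both the card reader and the recognition module emit card identities in dealing order. The function name, card codes and return format are illustrative only, not the claimed system:

```python
def detect_inconsistencies(shoe_sequence, vision_sequence):
    """Compare the identities reported by a card shoe reader with
    those recognized on the table by the overhead system, in dealing
    order, and report every mismatch as (position, expected, observed).
    A None in the vision sequence stands for an occluded/unread card."""
    issues = []
    for i, (dealt, seen) in enumerate(zip(shoe_sequence, vision_sequence)):
        if seen is not None and dealt != seen:
            issues.append((i, dealt, seen))
    if len(shoe_sequence) != len(vision_sequence):
        issues.append(("count", len(shoe_sequence), len(vision_sequence)))
    return issues

shoe = ["KH", "7S", "2D", "9C"]
vision = ["KH", "7S", "2H", "9C"]   # a possible card switch at position 2
print(detect_inconsistencies(shoe, vision))  # [(2, '2D', '2H')]
```

A real tracking module would additionally attach the surrounding images and game state so that the context of each detected inconsistency can be reviewed, as the later embodiments describe.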
BRIEF DESCRIPTION OF THE DRAWINGS

[0031] For a better understanding of embodiments of the present invention, and to show more clearly how it may be carried into effect, reference will now be made, by way of example, to the accompanying drawings which aid in understanding and in which:

[0032] Figure 1 is an overhead view of a card game;

[0033] Figure 2 is a side plan view of an imaging system;

[0034] Figure 3 is a side plan view of an overhead imaging system;

[0035] Figure 4 is a top plan view of a lateral imaging system;

[0036] Figure 5 is an overhead view of a gaming table containing RFID detectors;

[0037] Figure 6 is a block diagram of the components of an exemplary embodiment of a system for tracking gaming objects;

[0038] Figure 7 is a plan view of card hand representations;

[0039] Figure 8 is a flowchart of a first embodiment of an H' module;

[0040] Figure 9 is an overhead view of a gaming table with proximity detection sensors;

[0041] Figure 10 is a plan view of a card position relative to proximity detection sensors;

[0042] Figure 11 is a plan view of card hand representations with positioning features;

[0043] Figure 12 is an illustrative example of applying corners in a contour test;

[0044] Figure 13 is an illustrative example of matching vertical and horizontal card orientations;

[0045] Figure 14 is a flowchart of a points in contour test;

[0046] Figure 15 is an illustrative example of detecting frame differencing through motion detection;

[0047] Figure 16 is an illustrative example of the hand of a dealer occluding the contour of a card hand;

[0048] Figure 17 is an illustrative example of changes in blob properties;

[0049] Figure 18 is an illustrative example of applying erosion;

[0050] Figure 19 is an illustrative example of the separation of two card hands;

[0051] Figure 20 is an illustrative example of pair-wise rotation and analysis;

[0052] Figures 21a and 21b are flowcharts of a card hand separation process;

[0053] Figure 22 is an illustrative example of the front and back buffer of data frames;

[0054] Figure 23 is an illustrative example of states with backward tracking;

[0055] Figure 24 is an illustrative example of states with forward tracking;

[0056] Figures 25a and 25b are flowcharts of the process of single state tracking;

[0057] Figure 26 is a flowchart of the process of backward tracking;

[0058] Figure 27 is an illustrative example of backward tracking;

[0059] Figure 28 is a flowchart of the process of forward tracking;

[0060] Figure 29 is an illustrative example of forward tracking;

[0061] Figure 30 is an illustrative example of multi state tracking;

[0062] Figures 31a and 31b are illustrative examples of multiple game states;

[0063] Figure 32 is a flowchart of the process of player tracking;

[0064] Figure 33 is a flowchart of the process of surveillance;

[0065] Figure 34 is a flowchart of the process of utilizing surveillance data;

[0066] Figure 35 illustrates an overhead image of a card hand where the corners of a card are undetectable;

[0067] Figure 36 is a flowchart describing the preferred method of extracting a region of interest from a card edge;

[0068] Figure 37 illustrates an application of the preferred method of extracting a region of interest from a card edge;

[0069] Figure 38 is a flowchart describing another method for extracting a region of interest from a card edge;

[0070] Figure 39 illustrates an application of another method of extracting a region of interest from a card edge;

[0071] Figure 40 is a block diagram of the preferred system for identifying a gaming object on a gaming table;

[0072] Figure 41 illustrates … recognition purposes;

[0073] Figure 42 is a flowchart describing a method of detecting inconsistencies with respect to playing cards dealt on a game table;

[0074] Figure 43 illustrates a first application of the method of detecting inconsistencies with respect to playing cards dealt on a game table;

[0075] Figure 44 illustrates a second application of a method of detecting inconsistencies with respect to playing cards dealt on a game table;

[0076] Figure 45 illustrates a third application of a method of detecting inconsistencies with respect to playing cards dealt on a game table;

[0077] Figure 46 illustrates a Feed Forward Neural Network;

[0078] Figure 47 illustrates Haar feature classifiers;

[0079] Figure 48 is a flowchart describing a method of calibrating an imaging system within the context of table game tracking; and

[0080] Figure 49 illustrates a combination of weak classifiers into one strong classifier as achieved through a boosting module.
DETAILED DESCRIPTION

[0081] In the following description of exemplary embodiments we will use the card game of blackjack as an example to illustrate how the embodiments may be utilized.
[0082] Referring now to Figure 1, an overhead view of a card game is shown generally as 10. More specifically, Figure 1 is an example of a blackjack game in progress. A gaming table is shown as feature 12. Feature 14 is a single player and feature 16 is the dealer. Player 14 has three cards 18 dealt by dealer 16 within dealing area 20. The dealer's cards are shown as feature 22. In this example dealer 16 utilizes a card shoe 24 to deal cards 18 and 22 and places them in dealing area 20. Within gaming table 12 there are a plurality of betting regions 26 in which a player 14 may place a bet. A bet is placed through the use of chips 28. Chips 28 are wagering chips used in a game, examples of which are plaques, jetons, wheelchecks, Radio Frequency Identification Device (RFID) embedded wagering chips and optically encoded wagering chips.
[0083] An example of a bet being placed by player 14 is shown as chips 28a within betting region 26a. Dealer 16 utilizes chip tray 30 to receive and provide chips 28. Feature 32 is an imaging system, which is utilized by the present invention to provide overhead imaging and optional lateral imaging of game 10. An optional feature is a player identity card 34, which may be utilized by the present invention to identify a player 14.
[0084] At the beginning of every game, players 14 that wish to play place their wager, usually in the form of gaming chips 28, in a betting region 26 (also known as a betting circle or wagering area). Chips 28 can be added to a betting region 26 during the course of the game as per the rules of the game being played. The dealer 16 then initiates the game by dealing the playing cards 18, 22. Playing cards can be dealt either from the dealer's hand or from a card dispensing mechanism such as a shoe 24. The shoe 24 can take different embodiments, including non-electromechanical types and electromechanical types. The shoe 24 can be coupled to an apparatus (not shown) to read, scan or image cards being dealt from the shoe 24. The dealer 16 can deal the playing cards 18, 22 into dealing area 20. The dealing area 20 may have a different shape or a different size than shown in Figure 1. The dealing area 20, under normal circumstances, is clear of foreign objects and usually only contains playing cards 18, 22, the dealer's body parts and predetermined gaming objects such as chips, currency, player identity card 34 and dice. A player identity card 34 is an identity card that a player 14 may possess, which is used by the player to provide identity data and assist in obtaining complimentary ("comps") points from a casino. A player identity card 34 may be used to collect comp points, which in turn may be redeemed later on for comps.
[0085] During the progression of the game, playing cards 18, 22 may appear, move, or be removed from the dealing area 20 by the dealer 16. The dealing area 20 may have specific regions outlined on the table 12 where the cards 18, 22 are to be dealt in a certain physical organization, otherwise known as card sets or "card hands", including overlapping and non-overlapping organizations.
[0086] For the purpose of this disclosure, chips, cards, card hands, currency bills, player identity cards and dice are collectively referred to as gaming objects. In addition, the term "gaming region" is meant to refer to any section of gaming table 12, including the entire gaming table 12.
[0087] Referring now to Figure 2, a side plan view of an imaging system is shown.
This is imaging system 32 of Figure 1. Imaging system 32 comprises overhead imaging system 40 and optional lateral imaging system 42. Imaging system 32 can be located on or beside the gaming table 12 to image a gaming region from a top view and/or from a lateral view. Overhead imaging system 40 can periodically image a gaming region from a planar overhead perspective. The overhead imaging system 40 can be coupled to the ceiling or to a wall or any location that would allow an approximate top view of the table 12. The optional lateral imaging system 42 can image a gaming region from a lateral perspective. Imaging systems 40 and 42 are connected to a power supply and a processor (not shown) via wiring 44 which runs through tower 46.
[0088] The imaging system 32 utilizes periodic imaging to capture a video stream at a specific number of frames over a specific period of time, such as, for example, thirty frames per second. Periodic imaging can also be used by an imaging system 32 when triggered via software or hardware means to capture an image upon the occurrence of a specific event. An example of a specific event would be if a stack of chips were placed in a betting region 26. An optical chip stack or chip detection method utilizing overhead imaging system 40 can detect this event and can send a trigger to lateral imaging system 42 to capture an image of the betting region 26. In an alternative embodiment, overhead imaging system 40 can trigger an RFID reader to identify the chips. Should there be a discrepancy between the two means of identifying chips, the discrepancy will be flagged.
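The discrepancy-flagging idea in the alternative embodiment can be sketched as follows. The function and its inputs (chip counts reported by the two independent readers) are hypothetical simplifications, not part of the described system:

```python
def reconcile_chip_reads(optical_count, rfid_count):
    """Cross-check two independent chip reads triggered by the same
    betting event (optical detection vs. RFID interrogation); flag a
    discrepancy when they disagree. Purely illustrative."""
    if optical_count != rfid_count:
        return {"flag": True,
                "detail": f"optical saw {optical_count} chips, RFID saw {rfid_count}"}
    return {"flag": False, "detail": "reads agree"}

print(reconcile_chip_reads(5, 5)["flag"])  # False
print(reconcile_chip_reads(5, 4)["flag"])  # True
```

In practice the flag would carry the triggering images and timestamps so surveillance staff can review the event rather than just a boolean.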
[0089] Referring now to Figure 3, a side plan view of an overhead imaging system is shown. Overhead imaging system 40 comprises one or more imaging devices 50 and, optionally, one or more lighting sources 52 (if required), which are each connected to wiring 44. Each imaging device 50 can periodically produce images of a gaming region. Charge-Coupled Device (CCD) sensors, Complementary Metal Oxide Semiconductor (CMOS) sensors, line scan imagers, area-scan imagers and progressive scan imagers are examples of imaging devices 50. Imaging devices 50 may be selective to any frequency of light in the electromagnetic spectrum, including ultraviolet, infrared and wavelength selective. Imaging devices 50 may be color or grayscale. Lighting sources 52 may be utilized to improve lighting conditions for imaging. Incandescent, fluorescent, halogen, infrared and ultraviolet light sources are examples of lighting sources 52.
[0090] An optional case 54 encloses overhead imaging system 40 and, if so provided, includes a transparent portion 56, as shown by the dotted line, so that imaging devices 50 may view a gaming region.
[0091] Referring now to Figure 4, a top plan view of a lateral imaging system is shown. Lateral imaging system 42 comprises one or more imaging devices 50 and optional lighting sources 52 as described with reference to Figure 3.

[0092] An optional case 60 encloses lateral imaging system 42 and, if so provided, includes a transparent portion 62, as shown by the dotted line, so that imaging devices 50 may view a gaming region.
[0093] The examples of overhead imaging system 40 and lateral imaging system 42 are not meant by the inventors to restrict the configuration of the devices to the examples shown. Any number of imaging devices 50 may be utilized and if a case is used to house the imaging devices 50, the transparent portions 56 and 62 may be configured to scan the desired gaming regions.
[0094] According to one embodiment of the present invention, a calibration module assigns parameters for visual properties of the gaming region. Figure 48 is a flowchart describing the operation of the calibration module as applied to the overhead imaging system. The calibration process can be: manual, with human assistance; fully automatic; or semi-automatic.
[0095] Referring back to Figure 48, a first step 4800 consists in waiting for an image of the gaming region from the overhead imager(s). The next step 4802 consists in displaying the image to allow the user to select the area of interest where gaming activities occur. For instance, within the context of blackjack gaming, the area of interest can be a box encompassing the betting boxes, the dealing area, and the dealer's chip tray.
[0096] In step 4804, coefficients for perspective correction are calculated. Such correction consists in an image processing technique whereby an image can be warped to any desired view point. Its application is particularly useful if the overhead imagers are located in the signage and the view of the gaming region is slightly warped. A perfectly overhead view point would be best for further image analysis. A checkerboard or markers on the table may be utilized to assist with calculating the perspective correction coefficients.
[0097] Subsequently, in step 4806, the resulting image is displayed to allow the user to select specific points or regions of interest within the gaming area. For instance, the user may select the position of betting spots and the region encompassing the dealer's chip tray.
Other specific regions or points within the gaming area may be selected.
[0098] In the next step 4808, camera parameters such as shutter value and gain value(s) are calculated and white balancing operations are performed. Numerous algorithms are publicly available to one skilled in the art for performing camera calibration.
[0099] In step 4810, additional camera calibration is performed to adjust the lens focus.
[00100] Once the camera calibration is complete and according to step 4812, an image of the table layout, clear of any objects on its surface, is captured and saved as a background image. Such an image may be used for detecting objects on the table. The background image may be continuously captured at various points during system operation in order to have a most recent background image.
[00101] In step 4814, while the table surface is still clear of objects, additional points of interest such as predetermined markers are captured.
[00102] In the final step 4816, the calibration parameters are stored in memory.
[00103] It must be noted that the calibration concepts may be applied for the lateral imaging system as well as other imaging systems.
[00104] In an optional embodiment, continuous calibration checks may be utilized to ensure that the initially calibrated environment remains relevant. For instance, a continuous brightness check may be performed periodically, and if it fails, an alert may be asserted through a feedback device indicating the need for re-calibration. Similar periodic, automatic checks may be performed for white balancing, perspective correction, and region of interest definition.
[00105] As an example, if lighting in the gaming region changes, calibration may need to be performed again. A continuous brightness check may be applied periodically, and if the brightness check fails, an alert may be asserted through one of the feedback devices indicating the need for re-calibration. Similar periodic, automatic checks may be performed for white balancing, perspective correction and the regions of interest.
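By way of illustration only, the periodic brightness check described above might be sketched as follows. The function name `brightness_check`, the 15% tolerance, and the toy frames are assumptions chosen for the example, not part of the disclosed system.

```python
def mean_brightness(image):
    """Average pixel intensity of a grayscale image (rows of 0-255 values)."""
    total = sum(sum(row) for row in image)
    count = sum(len(row) for row in image)
    return total / count

def brightness_check(image, calibrated_mean, tolerance=0.15):
    """Return True if current brightness is within `tolerance` (a fraction)
    of the mean brightness recorded at calibration time."""
    current = mean_brightness(image)
    return abs(current - calibrated_mean) <= tolerance * calibrated_mean

# Calibration-time reference: assumed mean intensity of 120
baseline = 120.0

# A frame whose lighting has drifted only slightly passes the check
ok_frame = [[118, 122], [121, 119]]
# A much darker frame fails and would trigger a re-calibration alert
dark_frame = [[40, 42], [41, 39]]

print(brightness_check(ok_frame, baseline))    # True
print(brightness_check(dark_frame, baseline))  # False
```

A failing check would be routed to a feedback device as described; the same pattern extends to white-balance or perspective checks by swapping the measured statistic.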
[00106] In an optional embodiment, a white sheet similar in shade to a playing card surface may be placed on the table during calibration in order to determine the value of the white sheet at various points on the gaming table and consequently the lighting conditions
at these various points. The recorded values may be subsequently utilized to determine threshold parameters for detecting positions of objects on the table.
[00107] It must be noted that not all steps of calibration need human input. Certain steps such as white balancing may be performed automatically.
[00108] In addition to the imaging systems described above, exemplary embodiments may also make use of RFID detectors for gambling chips containing an RFID. Figure 5 is an overhead view of a gaming table containing RFID detectors 70. When one or more chips 28 containing an RFID are placed on an RFID detector 70 situated below a betting region 26, the values of the chips 28 can be detected by the RFID detector 70. The same technology may be utilized to detect the values of RFID chips within the chip tray.
[00109] Referring now to Figure 6, a block diagram of the components of an exemplary embodiment is shown. Identity and Positioning module (IP module) 80 identifies the value and position of cards on the gaming table 12. Intelligent Position Analysis and Tracking module (IPAT module) 84 performs analysis of the identity and position data of cards and interprets them intelligently for the purpose of tracking game events, game states and general game progression. The Game Tracking module (GT module) 86 processes data from the IPAT module 84 and keeps track of game events and game status. The GT module 86 can optionally obtain input from Bet Recognition module 88. Bet Recognition module 88 identifies the value of wagers placed at the game. Player Tracking module keeps track of patrons and players that are participating at the games. Surveillance module 92 records video data from imaging system 32 and links game event data to recorded video. Surveillance module 92 provides efficient search and replay capability by way of linking game event time stamps to the recorded video. Analysis and Reporting module 94 analyzes the gathered data in order to generate reports on players, tables and casino personnel. Example reports include statistics on game related activities such as profitability, employee efficiency and player playing patterns.
Events occurring during the course of a game can be analyzed and appropriate actions can be taken such as player profiling, procedure violation alerts or fraud alerts.
[00110] Modules 80 to 94 communicate with one another through a network 96. A 100 Mbps Ethernet Local Area Network or Wireless Network can be used as a digital network. The digital network is not limited to the specified implementations, and can be of any other type, including local area network (LAN), Wide Area Network (WAN), wired or wireless Internet, or the World Wide Web, and can take the form of a proprietary extranet.
[00111] Controller 98, such as a processor or multiple processors, can be employed to execute modules 80 to 94 and to coordinate their interaction amongst themselves, with the imaging system 32 and with input/output devices 100, optional shoe 24 and optional RFID detectors 70. Further, controller 98 utilizes data stored in database 102 for providing operating parameters to any of the modules 80 to 94. Modules 80 to 94 may write data to database 102 or collect stored data from database 102. Input/output devices 100, such as a laptop computer, may be used to input operational parameters into database 102.
Examples of operational parameters are the position coordinates of the betting regions 26 on the gaming table 12, position coordinates of the dealer chip tray 30, game type and game rules.
[00112] Before describing how the present invention may be implemented, we first provide some preliminary definitions. Referring now to Figure 7, a plan view of card representations is shown. A card or card hand is first identified by an image from the imaging system 32 as a blob 110. A blob may be any object in the image of a gaming area, but for the purposes of this introduction we will refer to blobs 110 that are cards and card hands. The outer boundary of blob 110 is then traced to determine a contour 112, which is a sequence of boundary points forming the outer boundary of a card or a card hand. In determining a contour, digital imaging thresholding is used to establish thresholds of grey.
In the case of a card or card hand, the blob 110 would be white and bright on a table. From the blob 110 a path is traced around its boundary until the contour 112 is established. A contour 112 is then examined for regions of interest (ROI) 118, which identify a specific card. Although in Figure 7 ROI 118 has been shown to be the rank and suit of a card, an alternative ROI could be used to identify the pip pattern in the centre of a card. From the information obtained from ROIs 118 it is possible to identify cards in a card hand 120.
[00113] IP module 80 may be implemented in a number of different ways. In a first embodiment, overhead imaging system 32 (see Figure 2) located above the surface of the gaming table provides overhead images. An overhead image need not be at precisely ninety degrees above the gaming table 12. In one embodiment it has been found that seventy degrees works well to generate an overhead view. An overhead view enables the use of two dimensional Cartesian coordinates of a gaming region. One or more image processing algorithms process these overhead images of a gaming region to determine the identity and position of playing cards on the gaming table 12.
[00114] Referring now to Figure 8, a flowchart of an embodiment of an IP module 80 is shown. Beginning at step 140, initialization and calibration of global variables occurs.
Examples of calibration are manual or automated setting of camera properties for an imager 32 such as shutter value, gain levels and threshold levels. In the case of thresholds, a different threshold may be stored for each pixel in the image or different thresholds may be stored for different regions of the image. Alternatively, the threshold values may be dynamically calculated from each image. Dynamic determination of a threshold would calculate the threshold level to be used for filtering out playing cards from a darker table background.
[00115] Moving to step 142, the process waits to receive an overhead image of a gaming region from overhead imaging system 40. At step 144, a thresholding algorithm is applied to the overhead image in order to differentiate playing cards from the background to create a threshold image. A background subtraction algorithm may be combined with the thresholding algorithm for improved performance. Contrast information of the playing card against the background of the gaming table 12 can be utilized to determine static or adaptive threshold parameters. Static thresholds are fixed while dynamic thresholds may vary based upon input such as the lighting on a table. The threshold operation can be performed on a gray level image or on a color image. Step 144 requires that the surface of game table 12 be visually contrasted against the card. For instance, if the surface of game table 12 is predominantly white, then a threshold may not be effective for obtaining the outlines of playing cards. The output of the thresholded image will ideally show the playing cards as independent blobs 110. This may not always be the case due to issues of motion or occlusion. Other bright objects such as a dealer's hand may also be visible as blobs 110 in the thresholded image. Filtering operations such as erosion, dilation and smoothing may optionally be performed on the thresholded image in order to eliminate noise or to smooth the boundaries of a blob 110.
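The combination of thresholding with background subtraction described in step 144 can be sketched as follows. The thresholds (`diff_min`, `card_min`) and the toy frames are illustrative assumptions only.

```python
def threshold_blobs(frame, background, diff_min=40, card_min=200):
    """Mark pixels that are both much brighter than the stored background
    image (background subtraction) and bright in absolute terms
    (card-vs-felt threshold). Returns a binary mask of candidate blobs."""
    mask = []
    for frow, brow in zip(frame, background):
        mask.append([
            1 if (f - b) > diff_min and f > card_min else 0
            for f, b in zip(frow, brow)
        ])
    return mask

# Dark felt background (~60) with one bright card-like region (~230)
background = [[60] * 6 for _ in range(4)]
frame = [row[:] for row in background]
for r in (1, 2):
    for c in (2, 3, 4):
        frame[r][c] = 230

mask = threshold_blobs(frame, background)
print(mask[1])  # [0, 0, 1, 1, 1, 0]
```

A dynamic variant would recompute `card_min` per image or per region from recent lighting statistics, as the paragraph above suggests.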
[00116] In the next step 146, the contour 112 corresponding to each blob 110 is detected. A contour 112 can be a sequence of boundary points of the blob 110 that more or less define the shape of the blob 110. The contour 112 of a blob 110 can be extracted by traversing along the boundary points of the blob 110 using a boundary following algorithm. Alternatively, a connected components algorithm may also be utilized to obtain the contour 112.
[00117] Once the contours 112 have been obtained, processing moves to step 148 where shape analysis is performed in order to identify contours that are likely not cards or card hands and eliminate these from further analysis. By examining the area of a contour 112 and the external boundaries, a match may be made to the known size and/or dimensions of
cards. If a contour 112 does not match the expected dimensions of a card or card hand it can be discarded.
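The area-based shape filter of step 148 might look like the following sketch; the card footprint, slack fraction, and maximum hand size are assumed values for illustration.

```python
def polygon_area(points):
    """Shoelace formula for the area enclosed by a contour polygon."""
    area = 0.0
    n = len(points)
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

def plausible_card_contour(contour, card_area, max_cards=6, slack=0.25):
    """Keep a contour only if its enclosed area lies between one card and a
    small overlapping hand of cards, within a slack fraction."""
    a = polygon_area(contour)
    return card_area * (1 - slack) <= a <= card_area * max_cards * (1 + slack)

CARD_AREA = 50 * 70  # assumed card footprint in pixels

card = [(0, 0), (50, 0), (50, 70), (0, 70)]   # area 3500 -> kept
speck = [(0, 0), (5, 0), (5, 5), (0, 5)]      # noise, area 25 -> discarded
print(plausible_card_contour(card, CARD_AREA))   # True
print(plausible_card_contour(speck, CARD_AREA))  # False
```

Contours failing the test (dust, reflections, part of a dealer's hand) are dropped before the more expensive line and corner analysis.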
[00118] Moving next to step 150, line segments 114 forming the card and card hand boundaries are extracted. One way to extract line segments is to traverse along the boundary points of the contour 112 and test the traversed points with a line fitting algorithm. Another potential line detection algorithm that may be utilized is a Hough Transform. At the end of step 150, line segments 114 forming the card or card hand boundaries are obtained. It is to be noted that, in alternate embodiments, straight line segments 114 of the card and card hand boundaries may be obtained in other ways. For instance, straight line segments 114 can be obtained directly from an edge detected image.
For example, an edge detector such as the Laplace edge detector can be applied to the source image to obtain an edge map of the image from which straight line segments 114 can be detected. These algorithms are non-limiting examples of methods to extract positioning features, and one skilled in the art might use alternate methods to extract these card and card hand positioning features.
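A much simplified stand-in for the boundary-traversal line fitting of step 150 is shown below. It assumes an ideal, noise-free integer contour and uses an exact collinearity test; a practical implementation would use least-squares fitting or a Hough Transform as the text notes.

```python
def collinear(p, q, r, tol=1e-9):
    """True if r lies on the line through p and q (zero cross product)."""
    return abs((q[0]-p[0])*(r[1]-p[1]) - (q[1]-p[1])*(r[0]-p[0])) <= tol

def extract_segments(contour):
    """Traverse ordered boundary points and merge maximal collinear runs
    into straight line segments, returned as (start, end) pairs."""
    segments = []
    start = 0
    for i in range(2, len(contour)):
        if not collinear(contour[start], contour[start + 1], contour[i]):
            segments.append((contour[start], contour[i - 1]))
            start = i - 1
    segments.append((contour[start], contour[-1]))
    return segments

# An L-shaped boundary: one horizontal run then one vertical run
contour = [(0, 0), (1, 0), (2, 0), (3, 0), (3, 1), (3, 2), (3, 3)]
print(extract_segments(contour))
# [((0, 0), (3, 0)), ((3, 0), (3, 3))]
```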
[00119] Moving to step 152, one or more corners 116 of cards can be obtained from the detected straight line segments 114. Card corners 116 may be detected directly from the original image or thresholded image by applying a corner detector algorithm, such as, for example, a template matching method using templates of corner points.
Alternatively, the corner 116 may be detected by traversing points along contour 112 and fitting the points to a corner shape. Corner points 116 and line segments 114 are then utilized to create a position profile for cards and card hands, i.e. where they reside in the gaming region.
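Obtaining a corner 116 from two detected line segments 114 amounts to intersecting the lines through them. A minimal sketch (the segment coordinates are illustrative):

```python
def line_intersection(s1, s2):
    """Intersection point of the infinite lines through two segments,
    or None if they are parallel. Each segment is ((x1,y1),(x2,y2))."""
    (x1, y1), (x2, y2) = s1
    (x3, y3), (x4, y4) = s2
    d = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if d == 0:
        return None  # parallel or degenerate lines
    t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / d
    return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))

# Two detected edges of a card meet at the card's corner
edge_a = ((0, 0), (50, 0))    # bottom edge
edge_b = ((50, 0), (50, 70))  # right edge
print(line_intersection(edge_a, edge_b))  # (50.0, 0.0)
```

A real system would additionally check that the intersection lies near the ends of both segments before accepting it as a card corner.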
[00120] Moving to step 154, card corners 116 are utilized to obtain a Region of Interest (ROI) 118 encompassing a card identifying symbol, such as the number of the card, and the suit. A card identifying symbol can also include features located in the card such as the arrangement of pips on the card, or can be some other machine readable code.
[00121] Corners of a card are highly indicative of the position of a region of interest. For this very reason, they constitute the preferred reference points for extracting regions of interest. Occasionally, corners of a card may be undetectable within an amalgam of overlapping gaming objects, such as a card hand. The present invention provides a method of identifying such cards by extracting a region of interest from any detected card feature that may constitute a valid reference point.
[00122] Fig. 35 illustrates an overhead image of a card hand 3500 comprised of cards 3502, 3504, 3506, and 3508. The card 3504 overlaps the card 3502 and is overlapped by the card 3506 such that corners of the card 3504 are not detectable.
[00123] According to a preferred embodiment of the invention, the overhead image is analyzed to obtain the contour of the card hand 3500. Subsequently, line segments 3510, 3512, 3514, 3516, 3518, 3520, 3522, and 3524 forming the contour of the card hand 3500 are extracted. The detected line segments are thereafter utilized to detect convex corners 3530, 3532, 3534, 3536, 3538, and 3540.
[00124] As mentioned herein above, corners constitute the preferred reference points for extracting Regions of Interest. In the following description, the term "index corner" refers to a corner of a card in the vicinity of which a region of interest is located. The term "blank corner" refers to a corner of a card that is not an index corner.
[00125] The corner 3530 is the first one to be considered. A sample of pixels drawn within the contour, in the vicinity of the corner 3530, is analyzed in order to determine whether the corner 3530 is an index corner. A sufficient number of contrasting pixels are detected and the corner 3530 is identified as an index corner. Consequently, a region of interest is projected and extracted according to the position of the corner 3530, as well as the width, height, and offset of regions of interest from index corners.
[00126] Similarly, the corner 3532 is identified as an index corner and a corresponding region of interest is projected and extracted.
[00127] The corner 3534 is the third to be considered. Due to their coordinates, the corners 3532 and 3534 are identified as belonging to a same card, and consequently, the corner 3534 is dismissed from further analysis.
[00128] Similarly to corners 3530 and 3532, the corner 3536 is identified as an index corner and a corresponding region of interest is projected and extracted.
[00129] The corners 3538 and 3540 are the last ones to be considered. Due to their coordinates, the corners 3530, 3538 and 3540 are identified as belonging to a same card, and consequently, the corners 3538 and 3540 are dismissed from further analysis.
[00130] As a result of the corner analysis, the regions of interest of the cards 3502, 3506 and 3508 of the card hand 3500 have been extracted. However, none of the corners of the card 3504 has been detected and consequently, no corresponding region of interest has been extracted.
[00131] In order to extract any remaining regions of interest, the extracted line segments 3510, 3512, 3514, 3516, 3518, 3520, 3522, and 3524 forming the contour of the card hand 3500 are utilized according to a method provided by the present invention.
[00132] In Figure 36, a flowchart describing the preferred method for extracting a region of interest from a card edge segment is provided. It must be noted that a partial card edge segment may suffice for employing this method.
[00133] In step 3600, two scan line segments are determined. The scan line segments are of the same length as the analyzed line segment. Furthermore, the scan line segments are parallel to the analyzed line segment. Finally, a first of the scan line segments is offset according to a predetermined offset of the region of interest from a corresponding card edge. The second of the scan line segments is offset from the first scan line segment according to the predetermined width of the rank and suit symbols.
[00134] In step 3602, pixel rows delimited by the scan line segments are scanned, and for each of the rows a most contrasting color or brightness value is recorded.
[00135] Subsequently, in step 3604, the resulting sequence of most contrasting color or brightness values, referred to as a contrasting value scan line segment, is analyzed to identify regions that may correspond to a card rank and suit. The analysis may be performed according to pattern matching or pattern recognition algorithms.
[00136] According to a preferred embodiment, the sequence of contrasting color values is convolved with a mask of properties expected from rank characters and suit symbols.
For instance, in the context of a white card having darker coloured rank characters and suit symbols, the mask may consist of a stream of darker pixels corresponding to the height of rank characters, a stream of brighter pixels corresponding to the height of spaces separating rank characters and suit symbols, and a final stream of darker pixels corresponding to the height of suit symbols. The result of the convolution will give rise to peaks where a sequence of the set of contrasting color values corresponds to the expected properties described by the mask.
[00137] Several methods are available for performing such convolution, including but not limited to cross-correlation, squared difference, correlation coefficient, as well as their normalized versions.
[00138] In step 3606, the resulting peaks are detected, and the corresponding regions of interest are extracted.
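The mask matching of steps 3604-3606 can be sketched with the squared-difference variant listed above. The mask shape (three dark rows, two bright rows, two dark rows) and the toy scan-line values are assumptions for illustration; here the best match is found as the minimum score rather than a convolution peak.

```python
def match_mask(values, mask):
    """Slide a 1-D mask over a contrasting-value scan line and return the
    offset with the smallest sum of squared differences, i.e. the position
    that best matches the expected rank/gap/suit brightness profile."""
    best_pos, best_score = None, None
    for pos in range(len(values) - len(mask) + 1):
        score = sum((values[pos + i] - mask[i]) ** 2 for i in range(len(mask)))
        if best_score is None or score < best_score:
            best_pos, best_score = pos, score
    return best_pos

# Assumed profile: 3 dark rows (rank), 2 bright rows (gap), 2 dark rows (suit)
mask = [0, 0, 0, 255, 255, 0, 0]

# Scan line along a card edge: mostly white, with the rank/suit pattern at row 4
scan = [255, 255, 255, 255, 10, 5, 0, 250, 255, 5, 0, 255, 255, 255]
print(match_mask(scan, mask))  # 4
```

The returned offset locates the rank/suit pattern along the card edge, from which the region of interest is then projected.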
[00139] Figure 37 illustrates an analysis of the line segment 3510 according to a preferred embodiment of the invention.
[00140] First, two scan line segments 3700 and 3702 are determined. The scan line segments 3700 and 3702 are of the same length as the line segment 3510. Furthermore, the scan line segments are parallel to the line segment 3510. Finally, the scan line segment 3700 is offset from the line segment 3510 according to a predetermined offset of the region of interest from a corresponding card edge. The scan line segment 3702 is offset from the scan line segment 3700 according to the predetermined width of the rank characters and suit symbols.
[00141] Subsequently, rows delimited by the scan line segments 3700 and 3702 are scanned. For each of the rows, a most contrasting color or brightness value is recorded to form a sequence of contrasting color or brightness values 3704, also referred to as a contrasting value scan line segment.
[00142] Once the sequence 3704 is obtained, it is convolved with a mask 3706 of properties expected from rank characters and suit symbols. The mask 3706 consists of a stream of darker pixels corresponding to the height of rank characters, a stream of brighter pixels corresponding to the height of spaces separating rank characters and suit symbols, and a final stream of darker pixels corresponding to the height of suit symbols.
[00143] A result 3708 of the convolution gives rise to a peak 3710 where a subsequence 3712 of sequence 3704 corresponds to the expected properties described by the mask 3706. Finally, a region of interest 3714 corresponding to the card 3502 is extracted.
[00144] In Figure 38, a flowchart describing another embodiment of the method for extracting a region of interest from a line segment is provided.
[00145] In step 3800, several scan line segments are determined. The scan line segments are of the same length as the analyzed line segment. Furthermore, the scan line segments are parallel to the analyzed line segment. Finally, a first of the scan line segments is offset from the analyzed line segment according to a predetermined offset of the region of interest from a corresponding card edge. The other scan line segments are offset from the first scan line segment according to the predetermined width of the rank and suit symbols. The scan line segments are positioned in that manner to ensure that at least some of them would intersect any characters and symbols located along the analyzed line segment.
[00146] In step 3802, each scan line segment is scanned and points of contrasting color or brightness values are recorded to assemble a set of contrasting points which we will refer to as seed points.
[00147] Subsequently, in step 3804, the set of contrasting points is analyzed to identify clusters that appear to be defining, at least partially, rank characters and suit symbols. The clusters can be extracted by grouping the seed points or by further analyzing the vicinity of one or more of the seed points using a region growing algorithm.
[00148] Finally, in step 3806, regions of interest are extracted from the identified clusters of contrasting points.
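The region growing step mentioned in paragraph [00147] can be sketched as a 4-connected flood fill from a seed point; the intensity threshold and toy patch are assumed values.

```python
def grow_region(image, seed, threshold=128):
    """4-connected flood fill: starting from a seed pixel, collect all
    connected pixels darker than `threshold` (candidate symbol ink)."""
    rows, cols = len(image), len(image[0])
    stack, region = [seed], set()
    while stack:
        r, c = stack.pop()
        if (r, c) in region or not (0 <= r < rows and 0 <= c < cols):
            continue
        if image[r][c] >= threshold:
            continue  # too bright to be part of a rank/suit symbol
        region.add((r, c))
        stack.extend([(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)])
    return region

# White card patch (255) with a small dark symbol blob
patch = [
    [255, 255, 255, 255],
    [255,  20,  30, 255],
    [255,  25, 255, 255],
    [255, 255, 255, 255],
]
region = grow_region(patch, seed=(1, 1))
print(sorted(region))  # [(1, 1), (1, 2), (2, 1)]
```

The bounding box of the grown cluster then serves as the region of interest of step 3806.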
[00149] Figure 39 illustrates an analysis of the line segment 3510 according to a preferred embodiment of the invention.
[00150] First, two scan line segments 3900 and 3902 are determined. The scan line segments 3900 and 3902 are of the same length as the line segment 3510. Furthermore, the scan line segments 3900 and 3902 are parallel to the line segment 3510. Finally, the scan line segment 3900 is offset from the line segment 3510 according to a predetermined offset of the region of interest from a corresponding card edge segment. The scan line segment 3902 is offset from the scan line segment 3900 according to the predetermined width of rank characters and suit symbols. The scan line segments 3900 and 3902 are positioned in that manner to ensure that at least one of them would intersect any characters and symbols located along the line segment 3510.
[00151] The scan line segments 3900 and 3902 are scanned and points of contrasting color and brightness values are recorded to assemble a sequence of contrasting points.
Subsequently, the sequence is analyzed and clusters of seed points 3910, 3912 and 3914 are identified as likely to define, at least partially, rank characters and suit symbols.
[00152] Finally, regions of interest 3920, 3922, and 3924 are extracted respectively from the clusters of seed points 3910, 3912, and 3914. Therefore, the method has succeeded in extracting a region of interest of a card having no detectable corners.
[00153] Referring back to Figure 35, the same invention is applied to the line segments 3512, 3514, 3516, 3518, 3520, 3522, and 3524 as well, in order to identify any desirable region of interest that is yet to be extracted.
[00154] Although the invention has been described within the context of a hand of cards, it may be applied within the context of a single gaming object, or an amalgam of overlapping gaming objects.
[00155] Although the invention has been described as preceded by a corner analysis, it may be applied without any previous corner analysis. However, it is usually preferable to start with a corner analysis since corners are preferred over line segments as reference points.
[00156] Although the invention has been described as a method of extracting a region of interest from a card edge, it may do so from any detected card feature, provided that the feature constitutes a valid reference point for locating a region of interest. For instance, the method may be applied to extract regions of interest from detected corners, or detected pips, instead of line segments. Such versatility is a sizeable asset within the context of table games, where some playing cards may present a very limited number of detectable features.
[00157] It is important to note that the preceding corner analysis could have been performed according to the invention.
[00158] Referring back to Figure 8, at step 156, a recognition method may be applied to identify the value of the card. In one embodiment, the ROI 118 is rotated upright and a statistical classifier, also referred to as a machine learning model, can be applied to recognize the symbol. Prior to recognition, the ROI 118 may be pre-processed by thresholding the image in the ROI 118 and/or narrowing the ROI 118 to encompass the card identifying symbols. Examples of statistical classifiers that may be utilized with this invention include Neural Networks, Support Vector Machines, Hidden Markov Models and Bayesian Networks. A Feed-forward Neural Network is one example of a statistical classifier that may be used with this system. Training of the statistical classifier may happen in a supervised or unsupervised manner. In an alternate embodiment, a method that does not rely on a statistical classifier, such as template matching, may be utilized. In yet another embodiment, the pattern of pips on the cards may be utilized to recognize the cards, provided a sufficient portion of the pattern is visible in a card hand. A combination of recognition algorithms may be used to improve accuracy of recognition.
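The template matching alternative mentioned above can be sketched as follows; the 3x3 binary "templates" are toy stand-ins for real trained rank symbols, and the pixel-agreement score is one simple choice among many.

```python
def classify_roi(roi, templates):
    """Recognize the symbol in a binarized ROI by counting matching pixels
    against each stored template and returning the best-scoring label."""
    def score(a, b):
        return sum(x == y for row_a, row_b in zip(a, b)
                   for x, y in zip(row_a, row_b))
    return max(templates, key=lambda label: score(roi, templates[label]))

# Toy 3x3 binary templates standing in for trained rank symbols
templates = {
    "A": [[0, 1, 0],
          [1, 1, 1],
          [1, 0, 1]],
    "7": [[1, 1, 1],
          [0, 0, 1],
          [0, 1, 0]],
}

# An ROI that matches "7" except for one noisy pixel
roi = [[1, 1, 1],
       [0, 1, 1],
       [0, 1, 0]]
print(classify_roi(roi, templates))  # 7
```

A statistical classifier would replace the pixel-agreement score with a learned decision function over the same (or a transformed) input.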
[00159] The present invention provides a system for identifying a gaming object on a gaming table in an efficient and seamless manner. The system comprises at least one overhead camera for capturing an image of the table; a detection module for detecting a feature of the object on the image; a search module for extracting a region of interest of the image that describes the object from the feature; a feature space module for transforming a feature space of the region of interest to obtain a transformed region of interest; a dimensionality reduction module for reducing the transformed region into a reduced
representation according to dimensionality reduction algorithms, and an identity module trained to recognize the object from the transformed region.
[00160] Within the context of the system illustrated in Figure 6, the overhead camera corresponds to the Imager 32. As for the detection module, the search module, the feature space module, the dimensionality reduction module, and the identification module, they are components of the IP module 80.
[00161] Figure 40 is a block diagram of the preferred system for identifying a gaming object on a gaming table.
[00162] The Imager 32 provides an overhead image of the game table to a Detection Module 4000. Subsequently, the Detection Module 4000 detects features of potential gaming objects placed on the game table. Such detection may be performed according to any of the aforementioned methods; for instance, it may consist of the steps 142, 144, 146, 148, 150, and 152, as illustrated in Figure 8.
[00163] According to one embodiment of the present invention, the Detection Module 4000 comprises a cascade of classifiers trained to recognize specific features of interest such as corners and edges.
[00164] According to another embodiment of the present invention, the system further comprises a Booster Module, and the Detection Module 4000 comprises a cascade of classifiers. The Booster module serves the purpose of combining weak classifiers of the cascade into a stronger classifier as illustrated in Figure 49. It may operate according to one of several boosting algorithms including Discrete Adaboost, Real Adaboost, LogitBoost, and Gentle Adaboost.
[00165] Referring back to Figure 40, the Detection Module 4000 provides the image along with the detected features to a Search Module 4002. The latter extracts regions of interest within the image from the detected features. Such extraction may be performed according to any of the aforementioned methods; for instance, it may consist of the steps 3600, 3602, 3604, and 3606, illustrated in Figure 36.
[00166] The Search Module 4002 provides the extracted regions of interest to the Feature Space (FS) Module 4004. For each region of interest, the FS Module 4004 transforms a provided representation into a feature space, or a set of feature spaces, that is more appropriate for recognition purposes.
[00167] According to one embodiment, each region of interest provided to the FS Module 4004 is represented as a grid of pixels, wherein each pixel is assigned a color or brightness value.
[00168] Prior to performing a transformation, the FS Module 4004 must select a desirable feature space according to a required type, speed, and robustness of recognition.
The selection may be performed in a supervised manner, an unsupervised manner, or both.
[00169] Figure 41 illustrates an example of a feature space that may be used for recognition purposes. The feature space consists in a histogram of the grayscale values stored in each column of a pixel grid.
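As a minimal stand-in for the per-column feature space of Figure 41, the sketch below records a per-column intensity total rather than a full histogram; the toy patch and the choice of a column sum are assumptions made for brevity.

```python
def column_profile(grid):
    """Simplified per-column feature: the sum of grayscale values in each
    column of the pixel grid (a full histogram would bin values instead)."""
    cols = len(grid[0])
    return [sum(row[c] for row in grid) for c in range(cols)]

# A dark vertical stroke in column 1 of an otherwise white patch
patch = [
    [255,  10, 255],
    [255,  12, 255],
    [255,   8, 255],
]
print(column_profile(patch))  # [765, 30, 765]
```

The dark stroke shows up as a sharp dip in the profile, which is the kind of structure a recognizer in this feature space can exploit.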
[00170] Once a feature space is selected, the FS Module 4004 applies a corresponding feature space transformation on a corresponding image.
[00171] It is important to distinguish feature space transformations from geometrical transformations. The geometrical transformation of an image consists in reassigning the positions of pixels within a corresponding grid. While such a transformation does modify an image, it does not modify underlying semantics; the means by which the original image and its transformed version are represented is the same. On the other hand, feature space transformations modify underlying semantics.
[00172] One example of a feature space transformation consists in modifying the representation of colours within a pixel grid from RGB (Red, Green, and Blue) to HSV (Hue, Saturation, and Value or Brightness). In this particular case, the data is not modified, but its representation is. Such a transformation is advantageous in cases where it is desirable for the brightness of a pixel to be readily available. Furthermore, the HSV space is less sensitive to a certain type of noise than its RGB counterpart.
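The RGB-to-HSV change of representation can be shown with the Python standard library; the sample pixel grid is invented for demonstration:

```python
import colorsys

def rgb_grid_to_hsv(grid):
    """Convert a pixel grid of (r, g, b) tuples in [0, 255] to (h, s, v)
    tuples, each component in [0, 1]. The value (V) channel exposes pixel
    brightness directly, as discussed above."""
    return [[colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
             for (r, g, b) in row]
            for row in grid]

grid = [[(255, 0, 0), (0, 0, 0)],        # pure red, black
        [(128, 128, 128), (0, 255, 0)]]  # grey, pure green
hsv = rgb_grid_to_hsv(grid)
```

No pixel data is lost in this step; only the coordinate system in which each pixel is expressed changes.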
[00173] The Hough Line Transform is another example of a feature space transformation. It consists in transforming a binary image from a set of pixels to a set of lines. In the new feature space, each vector represents a line whereas in the original space, each vector represents the coordinates of a pixel. Consequently, such a transformation is particularly advantageous for applications where lines are to be analyzed.
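A minimal Hough accumulator in the rho-theta parameterization can illustrate the pixel-to-line change of feature space; the grid size and binning below are invented for demonstration:

```python
import math

def hough_dominant_line(points, n_theta=180):
    """Minimal Hough transform sketch: map a set of 'on' pixels to an
    accumulator over (theta, rho) line parameters, where each line is
    rho = x*cos(theta) + y*sin(theta), and return the cell with the most
    votes, i.e. the dominant line."""
    acc = {}
    for (x, y) in points:
        for t in range(n_theta):
            theta = math.pi * t / n_theta   # theta index t is in degrees here
            rho = int(round(x * math.cos(theta) + y * math.sin(theta)))
            acc[(t, rho)] = acc.get((t, rho), 0) + 1
    return max(acc, key=acc.get)

# Pixels lying on the horizontal line y = 2
pts = [(x, 2) for x in range(10)]
t, rho = hough_dominant_line(pts)
```

In the resulting feature space each vector is a line, so collinear structures such as card edges become single strong peaks rather than scattered pixels.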
[00174] Other feature space transformations include various filtering operations such as Laplace and Sobel. Pixels resulting from such transformations store image derivative information rather than image intensity.
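A Sobel-style derivative filter can be sketched as follows (a naive sliding-window cross-correlation, with border pixels left at zero; the sample image is invented):

```python
import numpy as np

def sobel_x(gray):
    """Sobel derivative filter sketch: slide the horizontal Sobel kernel over
    a grayscale grid so each output pixel stores horizontal image-derivative
    information instead of intensity."""
    k = np.array([[-1, 0, 1],
                  [-2, 0, 2],
                  [-1, 0, 1]], dtype=float)
    g = np.asarray(gray, dtype=float)
    out = np.zeros_like(g)
    for i in range(1, g.shape[0] - 1):
        for j in range(1, g.shape[1] - 1):
            out[i, j] = np.sum(k * g[i - 1:i + 2, j - 1:j + 2])
    return out

# Vertical step edge: dark left half, bright right half
img = [[0, 0, 10, 10]] * 4
edges = sobel_x(img)
```

Flat regions map to zero and intensity steps map to large responses, which is the derivative behaviour the paragraph describes.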
[00175] Canny edge detection, the Fast Fourier Transform (FFT), the Discrete Cosine Transform (DCT), and Wavelet transforms are other examples of feature space transformations. Images resulting from FFT and DCT are no longer represented spatially (by a pixel grid), but rather in a frequency domain, wherein each vector corresponds to the proportion of a given colour frequency in the image. Such transformations are practical because the resulting feature space is invariant with respect to some transformations, and robust with respect to others. For instance, an image resulting from a DCT is more robust to lighting variations, which makes recognition more reliable.
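The lighting-robustness claim can be made concrete with a naive 1-D DCT-II (illustrative only; real systems use fast library implementations): a uniform brightness change affects only the first (DC) coefficient, leaving all others untouched.

```python
import math

def dct_1d(x):
    """Naive 1-D DCT-II: express a signal as cosine-frequency coefficients."""
    N = len(x)
    return [sum(x[n] * math.cos(math.pi * k * (2 * n + 1) / (2 * N))
                for n in range(N))
            for k in range(N)]

# A constant (flat) image row: all energy lands in coefficient 0,
# so a uniform lighting shift moves only that one coefficient.
flat = [9.0, 9.0, 9.0, 9.0]
coeffs = dct_1d(flat)
```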
[00176] Within the context of the present invention, the use of different feature spaces provides for additional robustness with respect to parameters such as lighting variations, brightness, image noise, image resolution, and ambient smoke, as well as geometrical transformations such as rotations and translations. As a result, the system of the present invention provides for greater training and recognition accuracy.
[00177] According to a preferred embodiment of the present invention, Principal Component Analysis (PCA) is the main feature space transformation in the arsenal of the FS Module 4004. It is a linear transform that selects a new coordinate system for a given data set, such that the greatest variance by any projection of the data set lies along a first axis, known as the principal component, the second greatest variance along the second axis, and so on.
[00178] The first step of the PCA consists in constructing a 2D matrix A of size (n x wh), where each row is an image vector, given n images of w x h pixels. Each image vector is formed by concatenating all the pixel rows of a corresponding image into a vector. The second step consists in computing an average image from the matrix A by summing up all the rows and dividing by n. The resulting element vector of size (wh) is called u. The third step consists in subtracting u from all the rows of A to get a mean subtracted matrix B of size (n x wh). The fourth step consists in computing the dot products of all possible image pairs. Let C be the new (n x n) matrix where C[i][j] is the dot product of B[i] and B[j]; C is the covariance matrix. The penultimate step consists in computing the n eigenvalues and corresponding eigenvectors of C. Finally, all eigenvalues of C are sorted from the highest eigenvalue to the lowest eigenvalue.
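The steps of paragraph [00178] can be sketched directly (the toy 2 x 2 images are invented for demonstration):

```python
import numpy as np

def pca_steps(images):
    """PCA sketch following paragraph [00178]: n images of w x h pixels are
    flattened into the rows of an (n x wh) matrix A, the mean image u is
    subtracted, the (n x n) dot-product matrix C = B B^T is formed, and its
    eigenvalues/eigenvectors are sorted in decreasing order of eigenvalue."""
    A = np.array([np.asarray(im, dtype=float).ravel() for im in images])
    u = A.mean(axis=0)               # average image, length wh
    B = A - u                        # mean-subtracted matrix
    C = B @ B.T                      # C[i][j] = dot(B[i], B[j])
    vals, vecs = np.linalg.eigh(C)   # eigen-decomposition (C is symmetric)
    order = np.argsort(vals)[::-1]   # highest eigenvalue first
    return vals[order], vecs[:, order], u

imgs = [[[0, 0], [0, 0]],
        [[2, 2], [2, 2]],
        [[4, 4], [4, 4]]]
vals, vecs, u = pca_steps(imgs)
```

Working with the small (n x n) matrix C instead of the (wh x wh) covariance is the standard trick when the number of training images n is much smaller than the number of pixels wh.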
[00179] According to another embodiment, the FS Module 4004 applies predominantly one or more of the DCT, FFT, Log Polar Domains, or other techniques resulting in edge images.
[00180] Referring back to Figure 40, and according to a preferred embodiment of the invention, the FS Module 4004 provides the transformed representation, or set of representations, to a Dimensionality Reduction (DR) Module 4006. The DR Module 4006 reduces the dimensionality of the provided representations by applying feature selection techniques, feature extraction techniques, or a combination of both.
[00181] According to a preferred embodiment of the present invention, the representations provided by the FS Module 4004 result from the application of a PCA, and the DR Module 4006 reduces their dimensionality by applying a feature selection technique that consists in selecting a subset of the PCA coefficients that contain the most information.
[00182] According to one embodiment of the present invention, the representations provided by the FS Module 4004 result from the application of a DCT, and the DR Module 4006 reduces their dimensionality by applying a feature selection technique that consists in selecting a subset of the DCT coefficients that contain the most information.
[00183] According to another embodiment of the present invention, the DR Module 4006 reduces the dimensionality of the provided representations by applying a feature extraction technique that consists in projecting them into a feature space of fewer dimensions.
[00184] According to another embodiment of the present invention, the representations provided by the FS Module 4004 result from the application of a DCT, and the DR Module applies a combination of feature selection and feature extraction techniques that consists in selecting a subset of the DCT coefficients that contain the most information, and applying PCA on the selected coefficients.
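The feature selection step can be sketched as follows. Measuring "most information" as largest variance across the training vectors is an assumption for illustration; the patent does not fix the criterion:

```python
import numpy as np

def select_top_coefficients(vectors, k):
    """Feature selection sketch: keep only the k coefficients that vary the
    most across the training vectors, discarding near-constant (and thus
    uninformative) dimensions."""
    X = np.asarray(vectors, dtype=float)
    variances = X.var(axis=0)
    keep = np.argsort(variances)[::-1][:k]   # indices of top-variance coeffs
    return keep, X[:, keep]

# Four 5-dimensional "transform coefficient" vectors; only dims 1 and 3 vary
V = [[7, 0, 3, 10, 5],
     [7, 2, 3, 20, 5],
     [7, 4, 3, 30, 5],
     [7, 6, 3, 40, 5]]
keep, reduced = select_top_coefficients(V, k=2)
```

In the combined scheme of paragraph [00184], a PCA such as the one sketched earlier would then be applied to the `reduced` vectors.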
[00185] Within the context of the present invention, the application of dimensionality reduction techniques reduces computational overhead, thereby accelerating the training and recognition procedures performed by the Identity Module 4008. Furthermore, dimensionality reduction tends to eliminate, or at the very least reduce, noise, and therefore increase recognition and training efficiency.
[00186] According to another embodiment of the invention, the FS Module 4004 provides the transformed representation or set of transformed representations to an Identity Module 4008 trained to recognize gaming objects from dimensionality reduced representations of regions of interest.
[00187] Referring back to Figure 40, and according to a preferred embodiment of the present invention, the DR Module 4006 provides the dimensionality reduced representations to an Identity Module 4008, which identifies a corresponding gaming object.
[00188] Still according to a preferred embodiment of the present invention, the Identity Module 4008 comprises a statistical classifier trained to recognize gaming objects from dimensionality reduced representations.
[00189] According to one embodiment of the present invention, the Identity Module 4008 comprises a Feed-forward Neural Network such as the one illustrated in Figure 46 that consists of input nodes, multiple hidden layers, and output nodes. The hidden layers can be partially connected, as those shown in Figure 46, or fully connected. During the initial supervised training mode, a back propagation learning method is utilized in conjunction with an error function to allow the Neural Network to adjust its internal weights according to the inputs and outputs.
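The supervised back-propagation mode can be sketched with a tiny fully connected network. This is a minimal illustration, not the network of Figure 46; the architecture, learning rate, and toy labels (logical AND) are all invented for demonstration:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Training data: two inputs, one output (logical AND as toy labels)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [0], [0], [1]], dtype=float)

W1 = rng.normal(scale=0.5, size=(2, 4))   # input -> hidden weights
W2 = rng.normal(scale=0.5, size=(4, 1))   # hidden -> output weights

def forward(X):
    H = sigmoid(X @ W1)
    return H, sigmoid(H @ W2)

_, out0 = forward(X)
loss0 = np.mean((out0 - Y) ** 2)          # squared-error function

for _ in range(2000):
    H, out = forward(X)
    # backward pass: propagate the error derivative through each layer
    d_out = (out - Y) * out * (1 - out)
    d_hid = (d_out @ W2.T) * H * (1 - H)
    W2 -= H.T @ d_out                     # adjust internal weights
    W1 -= X.T @ d_hid

_, out1 = forward(X)
loss1 = np.mean((out1 - Y) ** 2)
```

The error function drives the weight adjustments exactly as the paragraph describes: each pass reduces the mismatch between the network's outputs and the supervised labels.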
[00190] According to another embodiment of the present invention, the Identification Module comprises a cascade of classifiers.
[00191] According to another embodiment of the present invention, the system further comprises a Booster Module, and the Identity Module 4008 comprises a cascade of classifiers. The Booster Module serves the purpose of combining weak classifiers of the cascade into a stronger classifier. It may operate according to one of several boosting algorithms including Discrete Adaboost, Real Adaboost, LogitBoost, and Gentle Adaboost.
[00192] Referring back to Figure 40, and according to one embodiment of the present invention, the system is used to perform deck verification. When such verification is required, the dealer presents the corresponding cards on the table, in response to which the Identity Module 4008 is automatically triggered to provide the rank and suit of each identified card to a Deck Verification Module 4010. The latter module analyzes the provided data to ensure that the deck of cards adheres to a provided set of standards.
[00193] According to one embodiment of the present invention, the Detection Module 4000 recognizes a configuration of playing cards suitable for a deck verification procedure and triggers the Identity Module 4008 to provide the rank and suit of each identified card to a Deck Verification Module 4010.
[00194] According to another embodiment of the present invention, the Identity Module 4008 is manually triggered to provide the rank and suit of each identified card to the Deck Verification Module 4010.
[00195] Referring back to Figure 8, once the identity and position profile of each visible card in the gaming region has been obtained, the data can be output to other modules at step 158. Examples of data output at step 158 may include the number of card hands, the Cartesian coordinates of each corner of a card in a hand (or other positional information such as line segments), and the identity of the card as a rank and/or suit.
[00196] At step 160 the process waits for a new image and, when received, processing returns to step 144.
[00197] Referring now to Figure 9, an overhead view of a gaming table with proximity detection sensors is shown. In an alternative embodiment, IP module 80 may utilize proximity detection sensors 170. Card shoe 24 is a card shoe reader, which dispenses playing cards and generates signals indicative of card identity. Examples of a card shoe reader 24 may include those disclosed in United States patents 5,374,061 to Albrecht, 5,941,769 to Order, 6,039,650 to Hill, or 6,126,166 to Lorson. Commercial card shoe readers such as, for example, the MP21 card reader unit sold by Rally Gaming or the Intelligent Shoe sold by Shuffle Master Inc. may be utilized. In an alternate embodiment of the card shoe reader, a card deck reader such as the readers commercially sold by Rally Gaming and Shuffle Master can be utilized to determine the identity of cards prior to their introduction into the game. Such a card deck reader would pre-determine a sequence of cards to be dealt into the game. An array of proximity detection sensors 170 can be positioned under the gaming table 12 parallel to the table surface, such that periodic sampling of the proximity detection sensors 170 produces a sequence of frames, where each frame contains the readings from the proximity detection sensors. Examples of proximity detection sensors 170 include optical sensors, infrared position detectors, photodiodes, capacitance position detectors, and ultrasound position detectors. Proximity detection sensors 170 can detect the presence or absence of playing cards (or other gaming objects) on the surface of gaming table 12. Output from the array of proximity detection sensors can be analog or digital and can be further processed in order to obtain data that represents objects on the table surface as blobs and thus replace step 142 of Figure 8. In this embodiment a shoe 24 would provide information on the card dealt and sensors 170 would provide positioning data.
The density of the sensor array (resolution) will determine what types of object positioning features may be obtained. To assist in obtaining positioning features, further processing may be performed such as shown in Figure , which is a plan view of a card position relative to proximity detection sensors 170. Sensors 170 provide signal strength information, where the value one represents an object detected and the value zero represents no object detected. Straight lines may be fitted to the readings of sensors 170 using a line fitting method. In this manner proximity detection sensors 170 may be utilized to determine position features such as line segments 114 or corners 116.
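A least-squares line fit over the "on" sensor readings can be sketched as follows (the grid and the choice of an ordinary y = m*x + b fit are assumptions for illustration):

```python
def fit_line(points):
    """Least-squares line fit sketch: fit y = m*x + b to the (x, y)
    coordinates of proximity sensors that reported an object (reading 1),
    yielding a line feature such as a card edge."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - m * sx) / n
    return m, b

# Sensor grid readings: 1 = object detected, 0 = nothing detected
grid = [[0, 0, 0, 0],
        [1, 1, 1, 1],   # a card edge lying along row y = 1
        [0, 0, 0, 0]]
on = [(x, y) for y, row in enumerate(grid) for x, v in enumerate(row) if v == 1]
m, b = fit_line(on)
```

A real implementation would need to handle vertical edges (infinite slope), for example by fitting in both orientations or using a total-least-squares method.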
[00198] In this embodiment, identity data generated from the card shoe reader 24 and positioning data generated from proximity detection sensors 170 may be grouped and output to other modules. Associating positional data to cards may be performed by the IPAT module 84.
[00199] In another alternate embodiment of the IP module 80, card reading may have an RFID based implementation. For example, RFID chips embedded inside playing cards may be wirelessly interrogated by RFID antennae or scanners in order to determine the identity of the cards. Multiple antennae may be used to wirelessly interrogate and triangulate the position of the RFID chips embedded inside the cards. Card positioning data may be obtained either by wireless interrogation and triangulation, a matrix of RFID sensors, or via an array of proximity sensors as explained herein.
[00200] We shall now describe the function of the Intelligent Position Analysis and Tracking module (IPAT module) 84 (see Figure ). The IPAT module 84 performs analysis of the identity and position data of cards/card hands and interprets them "intelligently" for the purpose of tracking game events, game states and general game progression. The IPAT module may perform one or more of the following tasks: a) Object modeling; b) Object motion tracking; c) Points in contour test; d) Detect occlusion of cards; e) Set status flags for card positional features; and f) Separate overlapping card hands into individual card hands.
[00201] According to the present invention, the IPAT module 84, in combination with the Imager 32, the IP module 80, and the card shoe 24, may also detect inconsistencies that occur on a game table as a result of an illegal or erroneous manipulation of playing cards.
[00202] According to a preferred embodiment of the present invention, the system for detecting inconsistencies that occur on a game table as a result of an illegal or erroneous manipulation of playing cards comprises a card shoe for storing playing cards to be dealt on the table; a card reader for determining an identity and a dealing order of each playing card as it is being dealt on the table from the shoe; an overhead camera for capturing images of the table; a recognition module for determining an identity and a position of each card positioned on the table from the images; and a tracking module for comparing the dealing order and identity determined by the card reader with the identity and the position determined by the recognition module, and detecting the inconsistency.
[00203] Within the context of the system illustrated in Fig. 6, the card shoe and card reader correspond to the card shoe 24, which comprises an embedded card reader. The overhead camera corresponds to the Imager 32. The recognition module corresponds to the IP module 80. Finally, the tracking module corresponds to the IPAT module 84.
[00204] In Figure 42, a flowchart describing the interaction between the IPAT module 84, IP module 80, and card shoe 24 for detecting such inconsistencies is provided. In step 4200, the IPAT module 84 is calibrated and its global variables are initialized. In step 4202, the IPAT module 84 receives data from the card shoe 24.
[00205] In the preferred embodiment of the present invention, the data is received immediately following each removal of a card from the card shoe 24. In another embodiment, the data is received following each removal of a predetermined number of cards from the card shoe 24. In yet another embodiment, the data is received periodically.
[00206] In the preferred embodiment of the present invention, the data consist of a rank and suit of a last card to be removed from the card shoe 24. In another embodiment, the data consist of a rank of a last card to be removed from the card shoe 24.
[00207] In step 4204, the IPAT module 84 receives data from the IP module 80.
[00208] In the preferred embodiment of the present invention, the data is received periodically. In another embodiment, the data is received in response to the realization of step 4202.
[00209] In the preferred embodiment of the present invention, the data consist of a rank, suit, and position of each card placed on the game table. In another embodiment, the data consist of a rank and suit of each card placed on the game table. In yet another embodiment, the data consist of a rank of each card placed on the game table.
[00210] In step 4206, the IPAT module 84 compares the data provided by the card shoe 24 with those provided by the IP module 80.
[00211] In the preferred embodiment of the present invention, the IPAT module 84 verifies whether the rank and suit of cards removed from the card shoe 24, as well as the order in which they were removed, correspond to the rank, suit, and position of cards placed on the game table according to a set of rules of the game being played.
[00212] In another embodiment, the IPAT module 84 verifies whether the rank and suit of cards removed from the card shoe 24 correspond to the rank and suit of those that are placed on the game table.
[00213] If an inconsistency is detected, the IPAT module 84 informs the surveillance module 92 according to step 4208. Otherwise, the IPAT module 84 returns to step 4202 as soon as subsequent data is provided by the card shoe 24.
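The comparison of step 4206 can be sketched as follows. The (rank, suit) tuples and the assumption that the game rules yield a single expected table ordering are simplifications for illustration:

```python
def detect_inconsistency(shoe_cards, table_cards):
    """Comparison sketch for step 4206: shoe_cards is the ordered list of
    (rank, suit) pairs read by the card shoe; table_cards is the list of
    (rank, suit) pairs recognized on the table, in the order the game rules
    prescribe. Reports a switched card, a wrong ordering, or None."""
    if sorted(shoe_cards) != sorted(table_cards):
        switched = [c for c in table_cards if c not in shoe_cards]
        return "card switched: %s" % switched
    if shoe_cards != table_cards:
        return "cards out of order"
    return None

dealt = [("5", "S"), ("6", "H"), ("Q", "C"), ("A", "D")]

r1 = detect_inconsistency(dealt, dealt)                          # consistent
r2 = detect_inconsistency(dealt, [("5", "S"), ("4", "H"),
                                  ("Q", "C"), ("A", "D")])        # switched card
r3 = detect_inconsistency(dealt, [("5", "S"), ("Q", "C"),
                                  ("6", "H"), ("A", "D")])        # permuted order
```

The three results mirror the three Baccarat scenarios described below: no inconsistency, a switched card, and a correct set of cards organized in the wrong order.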
[00214] The invention will now be described within the context of monitoring a game of Baccarat. According to the rules of the game, a dealer withdraws four cards from a card shoe and deals two hands of two cards, face down; one for the player, and one for the bank.
The player is required to flip the dealt cards and return them back to the dealer. The latter organizes the returned cards on the table and determines the outcome of the game. One known form of cheating consists in switching cards. More specifically, a player may hide cards of desirable value, switch a dealt card with one of the hidden cards, flip the illegally introduced card, and return it back to the dealer. The present invention provides an efficient and seamless means to detect such illegal procedures.
[00215] As mentioned hereinabove, according to the rules of Baccarat, the dealer must withdraw four cards from the card shoe. According to a first exemplary scenario, the dealer withdraws in order the Five of Spades, Six of Hearts, Queen of Clubs, and the Ace of Diamonds. The rank and suit of each of the four cards is read by the card shoe 24, and provided to the IPAT module 84.
[00216] The player flips the dealt cards and returns them to the dealer. The latter organizes the four cards on the table as illustrated in Figure 43. The Five of Spades 4300 and the Six of Hearts 4302 are placed in a region dedicated to the player's hand, and the Queen of Clubs 4304 and Ace of Diamonds 4306 are placed in a region dedicated to the bank's hand.
[00217] The Imager 32 captures overhead images of the table, and sends the images to the IP module 80 for processing. The IP module 80 determines the position, suit, and rank of cards 4300, 4302, 4304, and 4306, and provides the information to the IPAT module 84.
The latter compares the data received from the card shoe reader and the IP module, and finds no inconsistency. Consequently, it waits for a new set of data from the card shoe reader.
[00218] According to a second exemplary scenario, the dealer withdraws in order the Five of Spades, Six of Hearts, Queen of Clubs, and the Ace of Diamonds. The rank and suit of each of the four cards is read by the card shoe 24, and provided to the IPAT module 84.
[00219] The player switches one of the dealt cards with one of his hidden cards to form a new hand, flips the cards of the new hand, and returns them to the dealer. The latter arranges the four cards returned by the player as illustrated in Figure 44. The Five of Spades 4300 and Four of Hearts 4400 are placed in a region dedicated to the player's hand, and the Queen of Clubs 4304 and Ace of Diamonds 4306 are placed in a region dedicated to the bank's hand.
[00220] The Imager 32 captures overhead images of the table, and sends the images to the IP module 80 for processing. The IP module 80 determines the position, suit, and rank of cards 4300, 4400, 4304, and 4306, and provides the information to the IPAT module 84.
The latter compares the data received from the card shoe 24 and the IP module, and finds an inconsistency; the ranks of the cards 4300, 4302, 4304, and 4306 removed from the card shoe do not correspond to the ranks of the cards 4300, 4400, 4304, and 4306 placed on the table. More specifically, the card 4302 has been replaced by 4400, which likely results from a card switching procedure. Consequently, the IPAT module 84 provides a detailed description of the detected inconsistency to the surveillance module 92.
[00221] According to a third exemplary scenario, the dealer withdraws in order the Five of Spades, Six of Hearts, Queen of Clubs, and the Ace of Diamonds. The rank and suit of each of the four cards is read by the card shoe 24, and provided to the IPAT module 84.
[00222] The player flips the dealt cards and returns them to the dealer. The latter organizes the four cards on the table in an erroneous manner, as illustrated in Figure . The Five of Spades 4300 and the Queen of Clubs 4304 are placed in a region dedicated to the player's hand, and the Six of Hearts 4302 and Ace of Diamonds 4306 are placed in a region dedicated to the bank's hand.
[00223] The Imager 32 captures overhead images of the table, and sends the images to the IP module 80 for processing. The IP module 80 determines the position, suit, and rank of cards 4300, 4302, 4304, and 4306, and provides the information to the IPAT module 84.
The latter compares the data received from the card shoe reader 24 and the IP module, and finds an inconsistency; while the rank and suit of the cards removed from the card shoe correspond to the rank and suit of the cards positioned on the table, the order in which the cards were removed from the card shoe does not correspond to the order in which the cards were organized on the table. More specifically, the card 4302 has been permuted with the card 4304. Consequently, the IPAT module 84 provides a detailed description of the detected inconsistency to the surveillance module 92.
[00224] While the invention has been described within the context of monitoring a game of Baccarat, it is applicable to any table game involving playing cards dealt from a card shoe.
[00225] With regard to object modeling, the IP module 80 provides positioning features that can be utilized to model cards and track cards from frame to frame. Referring now to Figure 11, a plan view of card hand representations with positioning features is shown. A centre of mass 180 is shown as a positioning feature, but other features such as ROI 118, corners 116, line segments 114, or shapes or partial shapes of numbers, patterns, and pips on the card may be utilized.
[00226] Object representation or modeling refers to the parameters that can describe the object in each frame. Different aspects of the object can be represented, such as its shape or appearance. For modeling object boundaries, deformable contours (Kass, M., Witkin, A., and Terzopoulos, D. 1988. Snakes: Active contour models. International Journal of Computer Vision, 1(4):321-331) are an example representation that may be utilized. As other examples, coarse contour representations, ellipses, superquadrics, or B-splines can be used. The mentioned techniques for representing a contour define a set of parameters that can describe the contour. For example, in the case of deformable contours or B-splines, the parameters are usually a sequence of points. In the case of ellipses or superquadrics, the parameters are usually the axes dimensions and various deformation parameters, such as the angle of rotation or bending parameters. In general, some optimization techniques can be used to fit the parameterized model to the actual contour. A contour can become partially occluded. For example, a dealer's hand may partially obstruct the overhead view and occlude a part of a card hand contour. Some features can still be representative of a card hand, even if only part of it is visible. In the case of contours, such features include the portions of the contour which are unique in shape, such as a corner.
Since under partial occlusion some of these distinguishing features would likely still be visible, the partially occluded hand could likely be matched using a subset of card hand features. For low resolution data, features such as the curvature of the bounding contour could be used for tracking. Object modeling can group together different features to model an object. For instance, a group of corners and associated line segments can be collectively modeled as one card hand, and that way if some of the features within that group of features are not available because of occlusion, the remaining features will be sufficient to track the cards.
[00227] An object's model may not necessarily contain a static group of features. The model can be dynamic and can be updated and expanded with new data as it becomes available or as the existing position features change from frame to frame. As an example, after recognition of ROI 118 (see FIG. ), specific positioning features (such as geometrical and/or pattern features) detected on the rank and suit inside the ROI 118 may be added to the object's model. An example of a geometrical feature is strong corners obtained through Eigenvalues on a card suit image. An example of a pattern feature is Hu moments and Zernike moments obtained on an image of a card's interior pattern. As another example, if an object is slightly rotated, thus causing its position features to change slightly, the object's model can be updated with the new position features.
[00228] Object motion tracking generally refers to tracking an object that is moving from frame to frame in a temporal sequence of image frames. The position and/or other parameters of the object are being tracked through consecutive or periodic frames. In the case of card tracking, the objects in question are cards or card hands. Object motion tracking matches positioning features of objects over consecutive frames. Based on this comparison of consecutive frames, it is possible to track a moving hand that is shifted on gaming table 12. An assumption that tracking methods rely on is that an object's positioning profiles (comprised of positioning features such as corners, ROIs, or line segments) for consecutive frames are similar to each other. Based on this assumption, comparison of position profiles between one or more consecutive frames can be utilized to establish compatibility.
A compatibility of position profiles can indicate that the compared position profiles represent the same gaming object. Once compatibility has been established for position profiles of gaming objects between two (or more) frames, the identity of the gaming object from the first frame can be assigned to the gaming object in the second frame, thus eliminating the need for performing recognition of the gaming object in the second frame.
An advantage of object motion tracking is that it can potentially improve the speed of game monitoring: by assigning the identity after establishing compatibility between position profiles, it reduces the number of recognitions that need to be performed.
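The identity hand-off after a compatibility check can be sketched as follows. Representing a position profile as four corner coordinates matched index to index, and using a fixed distance threshold, are simplifying assumptions for illustration:

```python
def profile_distance(p, q):
    """City-block distance between two position profiles, each a list of
    (x, y) corner coordinates matched index to index."""
    return sum(abs(ax - bx) + abs(ay - by) for (ax, ay), (bx, by) in zip(p, q))

def track(prev_objects, new_profiles, max_shift=20):
    """Tracking sketch: assign each previously identified object to the
    nearest compatible new position profile, so the object's identity
    carries over without re-running recognition."""
    assignments = {}
    for identity, old_profile in prev_objects.items():
        best = min(new_profiles, key=lambda q: profile_distance(old_profile, q))
        if profile_distance(old_profile, best) <= max_shift:
            assignments[identity] = best
    return assignments

prev = {"AS": [(10, 10), (60, 10), (60, 90), (10, 90)]}      # known card corners
new = [[(13, 11), (63, 11), (63, 91), (13, 91)],             # same card, shifted
       [(200, 200), (250, 200), (250, 280), (200, 280)]]     # a different object
tracked = track(prev, new)
```

When no new profile falls within the threshold, the object is left unassigned and recognition would be re-run for it.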
[00229] One type of positioning feature or group of positioning features is a shape descriptor. A shape descriptor approximately defines the shape of a gaming object. A contour is an example of a shape descriptor. The four corner points of a playing card are another example.
[00230] One motion tracking technology is optical flow (B. K. P. Horn and B. Schunck, Determining optical flow, Artif. Intell., vol. 17, pp. 185-203, 1981). Based on frame differencing, each point in every frame is assigned a velocity. For card tracking, such data could be used to better estimate the motion of a card or a card hand and help keep track of its position parameters. Tracking methods can account for effects such as occlusion by being able to track the object based on only a subset of object positioning features at each frame. Examples of some available tracking techniques used include Kalman Filtering and Condensation (CONDENSATION: conditional density propagation for visual tracking, Michael Isard and Andrew Blake, Int. J. Computer Vision, 29, 1, 5-28, 1998).
[00231] One possible motion tracking approach that deals with occlusion is the layered approach. An example of such layered tracking is by B. J. Frey, N. Jojic and A. Kannan, Learning appearance and transparency manifolds of occluded objects in layers, in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2003 (CVPR 03). Each hand can be tracked using a separate layer. The larger contour containing overlapping hands may be tracked or detected using a combination of layers of individual cards or individual card hands.
[00232] With regard to the points in contour test, in an embodiment of the present invention, cards and card hands are modeled with contours and/or card corner points. The points in contour test may be utilized to: a) Determine if the card or card hand might be occluded; b) Determine the minimum number of cards in a card hand contour; and c) Ascertain if a contour is likely that of a card hand.
[00233] Before discussing an implementation of a points in contour test we first refer to some basic concepts.
[00234] Referring now to Figure 12, an illustrative example of applying corner points in a points in contour test is shown. There are two types of corners, convex corners and concave corners. A convex corner is the corner of a card; a concave corner is a corner within a contour 112 that is not a card corner. In Figure 12, contour 112 of card hand 120 has sixteen corners, ten of which are convex and six of which are concave. As shown in Figure 12, a card is identified by a first convex corner 116a and processing moves to each unexplained convex corner in turn until as many cards as are fully visible in a card hand 120 are detected. In the example shown in Figure 12, the next unexplained convex corner after 116a would be 116b, and so on. Concave corners, such as 119, are not examined to identify cards in this embodiment.
[00235] Referring now to Figure 13, an illustrative example of matching vertical card and horizontal card orientations is shown. As shown in feature 190, a vertical card 192 correctly matches the contour of card hand 120. In the case of feature 194, a horizontal card 196 does not match the contour of card hand 120, as mismatched corner 116a, mismatched line segment 114a, and mismatched area 182a do not match the contour of hand 120. It is to be noted that it is not necessary that each of features 116a, 114a, and 182a be checked. A subset of them can be checked. Alternatively, the process can begin with an unexplained line segment and try to match line segments using a line segment in contour test, whereby one or more corresponding lines can be superimposed.
[00236] Referring now to Figure 14 a flowchart of a points in contour test is shown. Beginning at step 200 a list of contours that may be card hands is determined based upon object positioning information received from the IP module 80. At step 202 a contour is selected. At step 204 a convex corner (i.e. a card corner) of the contour is selected.
Moving to step 206 the corners of a card are interpolated based upon the convex corner selected and a vertical card is placed on the contour to determine if it fits inside the contour. At step 208 if the vertical card fits inside the contour, processing moves to step 210 where the corners of the contour that match the vertical card are marked as explained.
At step 212 a test is made to determine if there are any more unexplained convex corners.
If no more unexplained convex corners exist processing moves to step 214 where a test is made to determine if any more contours exist to be examined. If at step 214 the result is yes, processing moves to step 202 and if no, processing moves to step 200. Returning to step 212, if there are more unexplained convex corners, processing moves to step 216 where the next unexplained convex corner is detected and processing then returns to step 200.
[00237] Returning to step 208, if the vertical superimposition is not successful, processing moves to step 218 where the corners of a card are interpolated based upon the convex corner selected and a horizontal card is placed on the contour to determine if it fits inside the contour. At step 220 a test is made to determine if the horizontal card fits the contour. If the superimposition was successful processing moves to step 210 as discussed above. If it was not successful a flag is set at step 222 to indicate the matching of a corner failed and processing moves to step 212. The flags set at step 222 can be used by game tracking module 86 to determine if a card hand is occluded. The flags set at step 222 can also be used by a recognition method of the IP module to perform recognition only on cards or card hands that are not occluded. In the embodiment of the IP module with a card shoe reader 24 and array of proximity sensors 170, the number of cards in a card hand as determined by the points in contour test can be utilized to assign a new card that has come out of the shoe to the appropriate card hand.
[00238] The number of times the points in contour method is repeated is an indication of the minimum number of cards in a card hand. If the points in contour method fails for one or more corners, it could mean that the contour does not belong to a card/card hand or that the contour may be occluded.
[00239] Motion detection (different from object motion tracking) in the vicinity of the card or card hand position feature can be performed by comparing consecutive frames.
Frame differencing is a method to detect motion, whereby from the most recent image, one or more temporally previous images may be subtracted. This difference image shows one or more areas of potential motion. We now refer to Figure 15 as an illustrative example of detecting motion through frame differencing. The hand of a dealer dealing a card 18 to a card hand 120 is shown as feature 230. The motion of the dealer about to deal card 18 is captured as image 232. Image 234 indicates the position of card 18 and the dealer hand 230 once the card 18 has been added to card hand 120. By subtracting image 232 from image 234 as shown by feature 236, a motion image 238 is generated defining a motion area 240.
[00240] As shown in image 242, the hand of the dealer 230 has been removed from the card hand 120. By subtracting image 234 from image 242 as shown by feature 244, a motion image 246 is generated defining a motion area 248.
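The subtraction of one image from the next described above can be sketched in a few lines of NumPy. The `motion_area` helper, the greyscale input format, and the threshold value are illustrative assumptions, not details taken from the specification.

```python
import numpy as np

def motion_area(frame_prev, frame_curr, threshold=30):
    """Frame differencing: subtract the previous greyscale frame from
    the current one and threshold the absolute difference into a binary
    motion mask (1 where a pixel changed noticeably, 0 elsewhere)."""
    # Cast to a signed type so the subtraction cannot wrap around.
    diff = np.abs(frame_curr.astype(np.int16) - frame_prev.astype(np.int16))
    return (diff > threshold).astype(np.uint8)
```

Summing the mask (or intersecting it with a card-hand contour mask) gives a cheap test for motion "on or right beside" a position feature, as used for the motion flag below.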
[00241] Motion detected on or right beside an object positioning feature (such as a contour) of a card or card hand can be an indication that the card or card hand may be occluded, and an appropriate motion flag can be set to record this potential occlusion.
[00242] Skin color detection algorithms on images can be utilized to detect hands of a player or dealer. If the hand of a player or dealer is on or right beside an object position feature of a card hand, it can be deciphered that the card or card hand may be partially or entirely occluded. Numerous skin color detection algorithms on images are readily available in the public domain. Non-parametric skin distribution modeling is an example of a skin detection algorithm that may be utilized. It must be noted that skin detection may not be sufficiently accurate if the table layout is brown or skin like in color or has skin colored patterns. Referring now to Figure 16 an illustrative example of a dealer hand occluding the contour of a card hand is shown.
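As one deliberately crude stand-in for the skin detection algorithms mentioned above, a widely used per-pixel RGB rule of thumb can be sketched. The function name and thresholds are assumptions; a production system would more likely use the non-parametric histogram models the text refers to.

```python
def is_skin_rgb(r, g, b):
    """Crude per-pixel skin test using a common RGB heuristic:
    skin pixels tend to be bright, red-dominant, and not grey.
    Returns True if the pixel is plausibly skin-colored."""
    return (r > 95 and g > 40 and b > 20
            and max(r, g, b) - min(r, g, b) > 15   # not grey
            and abs(r - g) > 15                     # red/green separation
            and r > g and r > b)                    # red dominant
```

Applied over a region around a card-hand contour, a high proportion of skin pixels would set the occlusion flag, subject to the brown-table caveat noted above.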
[00243] Analysis of a contour 112 of a card 18 or card hand 120 can be utilized to determine if it is occluded. The same is true for any gaming object. The values of the flags set by the points in contour test, motion detection, skin detection and contour analysis can be utilized to detect potential occlusion of a card or card hand. It is not necessary to utilize all of these occlusion detection methods. A subset of these methods may be utilized to detect potential occlusion. As shown in Figure 16 the hand of the dealer 230 has occluded the contour 112 of the card hand 120.
[00244] During the course of a game, occasionally, two individual card hands may overlap resulting in a single contour representing both card hands. One way to detect an overlap of card hands is to utilize object motion tracking, as described in a foregoing section, to track identified card corners (or contours or other position features) gradually as they move and end up overlapping another card hand. For instance, with reference to Figure 17 an illustrative example of changes in blob properties is shown. A card hand 120a may be moved by the dealer and result in overlapping with card hand 120b. One method to determine the occurrence of an overlap is to detect changes in area, centre of mass, and other geometrical parameters of blobs or contours. When two hands overlap there is normally a very large increase in the area of the resulting merged contour or blob as compared to the area of the original contour. As is shown in Figure 17, as indicated by reference line 250, centre of mass 180 moves to the right when card hands 120a and 120b overlap and the resulting blob 110b is larger than blob 110a.
[00245] If two contours or blobs are overlapping by a small area, then an erosion algorithm may be utilized to separate blobs/contours into separate card hands. Erosion is an image processing filter that can iteratively shrink the contour or blob. Shrinking is done in such a way that narrow portions of the contour disappear first. Figure 18 is an illustrative example of applying erosion. As shown in Figure 18 two overlapping card hands 120a and 120b resolve to a single blob 110c. By applying erosion, blob 110c is separated into two blobs 110d and 110e representing card hands 120a and 120b respectively.
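The "narrow portions disappear first" behaviour can be illustrated with a single pass of 3x3 binary erosion written directly in NumPy. The `erode` helper is a sketch, not the specification's filter; a real system would more likely call a library morphology routine.

```python
import numpy as np

def erode(mask):
    """One pass of 3x3 binary erosion on a 0/1 mask: a pixel survives
    only if it and all eight of its neighbours are set, so one-pixel
    bridges between two blobs vanish before the blob interiors do."""
    h, w = mask.shape
    padded = np.zeros((h + 2, w + 2), dtype=mask.dtype)
    padded[1:-1, 1:-1] = mask
    out = np.ones_like(mask)
    # AND together all nine shifted copies of the padded mask.
    for dy in (0, 1, 2):
        for dx in (0, 1, 2):
            out &= padded[dy:dy + h, dx:dx + w]
    return out
```

On a mask of two blocks joined by a thin bridge, one erosion pass removes the bridge, after which the two surviving components can be treated as separate card hands.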
[00246] Referring now to Figure 19 an illustrative example of the separation of two card hands is shown. Given a single contour formed by one or more individual card hands, the identified cards and position features can be utilized to separate the hands using known properties of how a table game is dealt. For instance, in the game of Blackjack, the dealer usually deals cards in a certain fashion, as shown in Figure 19. The cards can be ordered in decreasing distance from the dealer reference point. The dealer reference point 260 can be located at the bottom center of the chip tray. The ordered list of cards as shown in Figure 19 is: five spade 262, nine spade 264, six heart 266, queen club 268, seven diamond 270, and ace diamond 272 based upon dealer reference point 260. Arrow 274 indicates how the card hands dealt may be eventually rotated and separated into separate card hands. It is to be noted that in alternate embodiments ordering of the cards can be done with reference to different reference points depending on the location of a card hand. As an example, if a card hand is located closest to the first betting spot, the cards in that hand can be ordered in increasing distance from a reference point located above the first betting spot.
[00247] Referring now to Figure 20 an illustrative example of pair-wise rotation and analysis is shown. Cards 280, 282, 284 and 286 are selected in the order of furthest away from the dealer reference point 260 to the closest to the dealer reference point 260. The ordered list of cards shown in Figure 20 would be: three diamond 280, ace diamond 282, six heart 284, and five spade 286. Each consecutive pair of cards is checked for a valid card configuration as shown by features 288, 290 and 292. One of these pairs is not a valid card configuration.
In order to check for a valid card configuration, the cards are first rotated and this rotation is performed for every pair of cards that are compared.
[00248] The first two cards, three diamond 280 and ace diamond 282, are rotated upright at step 288 and the valid card configuration checked. If the configuration passes, the next pair of cards is checked at step 290. This configuration fails and the failed card (ace diamond 282) is removed from the card hand and made into a temporary separate card hand. The card configuration failed because according to Blackjack card configuration rules the new card should be placed to the bottom left, and not the bottom right, of the previous card. The next pair of cards, three diamond 280 and six heart 284, are then checked. Here the configuration is a valid Blackjack card configuration. At step 292, the next pair of cards, six heart 284 and five spade 286, are checked, which also pass the valid card configuration check. Ultimately, the cards get separated into two hands, each with a valid Blackjack card hand configuration. The first hand comprises three cards, three diamond 280, six heart 284 and five spade 286, while the second hand comprises one card, ace diamond 282.
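The distance-based ordering used in Figures 19 and 20 can be sketched as a simple sort. The `order_cards` helper and the (label, (x, y)) card format are hypothetical, not the specification's data layout.

```python
import math

def order_cards(cards, reference_point, increasing=False):
    """Sort card records by Euclidean distance from a reference point,
    e.g. the dealer reference point at the bottom centre of the chip
    tray. `cards` is a list of (label, (x, y)) pairs. By default the
    sort is by decreasing distance, matching the Figure 19 ordering;
    pass increasing=True for the betting-spot reference point case."""
    rx, ry = reference_point
    return sorted(cards,
                  key=lambda c: math.hypot(c[1][0] - rx, c[1][1] - ry),
                  reverse=not increasing)
```

The resulting ordered list is exactly the input the pair-wise configuration check walks through.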
[00249] Figures 21a and 21b are flowcharts of a card hand separation process as described with reference to Figure 20. Beginning at step 300, identity and positioning
information of cards and card hands is received from the IP module 80, from which a card hands list is determined. From the example of Figure 20 one contour of a card hand containing cards 280 to 286 can be separated into two actual card hands: 280, 284 and 286 as one card hand and 282 as another card hand. At step 302 a test is made to determine if there are any more card hands to analyze in the card hands list. If not, processing ends at step 304. If there are more card hands to analyze processing moves to step 306. At step 306 the next card hand in the card hands list is removed and assigned to a current card hand. At step 308 a test is made to determine if there are two or more cards in the current hand. If not processing returns to step 302, otherwise processing moves to step 310. At step 310 the cards in the current card hand, in our example cards 280 to 286, are sorted by decreasing distance to the dealer reference point 260. At step 312 a temporary card hand is created and a counter is initialized to zero to point to the first card of the sorted list of cards in the current card hand. At step 314 a test is made to determine if the counter has reached the last card in the current hand. If the test at step 314 is successful processing moves to step 316 where the temporary hand (i.e. a hand including cards that do not appear to be of the current card hand) is added to the card hands list. If the test at step 314 is negative processing moves to step 322 of Figure 21b. At step 322 the current and next card are selected from the current hand. At step 324 the cards selected at step 322 are rotated upright. At step 326 a test is made to determine if the two selected cards form a valid card pair configuration as per the dealing procedures of the game (Blackjack in this case). If not, processing moves to step 330 where the next card in the pair is removed from the current hand and placed in the temporary hand and then back to step 314 of Figure 21a.
If the test at step 326 is successful, the current card counter is incremented at step 328 and processing moves to step 314 of Figure 21a.
[00250] In the foregoing embodiment of the card hand separation process, the cards 280 to 286 in the card hand are first sorted according to their distance from a reference point 260 before they are compared pair-wise and separated if necessary. In an alternate embodiment, the cards 280 to 286 may first be compared pair-wise and separated into different card hands (if necessary), after which the cards in each resulting card hand may be ordered by sorting according to their distance from a reference point 260.
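The pair-wise walk of Figures 21a and 21b can be sketched as follows, assuming the cards are already sorted by decreasing distance from the reference point. The `valid_pair` predicate stands in for the game-specific (Blackjack) placement rule checked at step 326, and all names are hypothetical.

```python
def separate_hand(cards, valid_pair):
    """Walk consecutive pairs of a sorted card list; any card that does
    not form a valid configuration with the current card is moved into
    a temporary hand (step 330), otherwise the counter advances to the
    next pair (step 328). Returns (current_hand, temporary_hand)."""
    current, temporary = list(cards), []
    i = 0
    while i < len(current) - 1:
        if valid_pair(current[i], current[i + 1]):
            i += 1                                 # step 328
        else:
            temporary.append(current.pop(i + 1))   # step 330
    return current, temporary
```

With a predicate that rejects the ace of diamonds as the next card, the Figure 20 example resolves into a three-card hand and a one-card temporary hand, matching paragraph [00248].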
[00251] In the foregoing embodiments of the card hand separation process, the cards 280 to 286 in the card hand are separated based on the present data from the IP module and without prior knowledge of the game state or other data frames. In an alternate embodiment, the separation process may analyze one or more of the following parameters: current game state, previous game state, and data from the IP module 80 from different points, including past and future, with respect to the current data frame.
[00252] Although the foregoing methods illustrate how overlapping card hands can be separated into distinct card hands, it must be noted that the described card hand organization techniques can generally be applied to organize a number of cards into card hands. For instance, in an embodiment of the IP module 80 with RFID embedded playing cards, the data from the IP module 80 might not contain contour information. In this embodiment the described card hand organization methods may be utilized to organize the identity and position profiles of cards into distinct card hands.
[00253] We shall now discuss the functionality of the game tracking (GT) module 86. The GT module 86 processes input relating to card identities and positions to determine game events and game states.
[00254] The GT module 86 can have a single state embodiment or a multiple state embodiment. In the single state embodiment, at any given time in a game, one valid current game state is maintained by the GT module 86. When faced with ambiguity of game state, the single state embodiment forces a decision such that one valid current game state is chosen. In the multiple state embodiment, multiple possible game states may exist simultaneously at any given time in a game, and at the end of the game or at any point in the middle of the game, the GT module 86 may analyze the different game states and select one of them based on certain criteria. When faced with ambiguity of game state, the multiple state embodiment allows all potential game states to exist and move forward, thus deferring the decision of choosing one game state to a later point in the game. The multiple game state embodiment can be more effective in handling ambiguous data or game state scenarios.
[00255] In order to determine states, GT module 86 examines data frames. Data frames comprise data on an image provided to GT module 86 from IP module 80 and IPAT module 84. Referring now to Figure 22 an illustrative example of the front and back buffers of data frames is shown. Data frames are queued in a back buffer 350 and a front buffer 352. Data frames in front buffer 352 have yet to be examined by GT module 86 while data frames in back buffer 350 have been examined. Data frame 354 is an example of a data frame in back buffer 350 and data frame 356 is an example of a data frame in front buffer 352. Current data frame 358 indicates a data frame being processed by GT module 86.
[00256] A data frame may include the following data:
a) Card and card hand positioning features (such as contours and corners);
b) Identity of cards, linked to the card positioning features; and
c) Status flags (set by IPAT module 84) associated with the card and card hand positioning features.
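The three items above might be grouped into a per-frame record along the following lines. The class names and field types are illustrative assumptions, not the specification's data layout.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class CardRecord:
    """One card in a data frame: identity linked to position features."""
    identity: str                     # e.g. "six heart"
    corners: List[Tuple[int, int]]    # card corner points

@dataclass
class DataFrame:
    """Hypothetical container for the per-image data listed above."""
    contours: List[List[Tuple[int, int]]]    # card/card-hand contours
    cards: List[CardRecord]                  # identities linked to features
    status_flags: List[str] = field(default_factory=list)  # set by IPAT module 84
```

A queue of such records would fill the back buffer 350 and front buffer 352 of Figure 22.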
[00257] GT module 86 utilizes data frames as described with regard to Figure 22 to identify key events to move from one state to another as a game progresses. In the case of Blackjack, a key event is an event that indicates a change in the state of a game, such as a new card being added to a card hand, the split of a card hand, a card hand being moved, a new card provided from a shoe, or removal or disappearance of a card by occlusion.
[00258] A stored game state may be valid or invalid. A valid state is a state that adheres to the game rules, whereas an invalid state would be in conflict with the game rules. During the game tracking process, it is possible that the current game state cannot account for the key event in the current data frame 358 being analyzed. The data frame 358 can contain information that is in conflict with the game rules or the current game state. In such an event, the current game state may be updated to account for the data in the frame 358 as accurately as possible, but marked as an invalid state. As an example in Blackjack, a conflicting data frame would be when IP module 80 or IPAT module 84 indicates that the dealer has two cards, while one of the players only has one hand with one card, which is a scenario that conflicts with Blackjack game rules. In this example, the dealer hand in the game state is updated with the second dealer card and the game is set to an invalid state.
[00259] In the event of an invalid state or data frames with conflicting information, ambiguity resolution methods can be utilized to assist in accurately determining valid states. An embodiment of the present invention utilizes either one or a combination of backtracking, forward tracking, and multiple game states to resolve ambiguities.
[00260] To further explain how backtracking may be utilized to resolve ambiguity with regard to key events and states we refer now to Figure 23, an illustrative example of states with backward tracking. Beginning at state 370 a game is started. Based upon a key event 372a, which is discovered to be valid, the next state is 374. Key event 372b is also valid and the state 376 is established. Key event 372c is ambiguous with respect to state 376 and consequently cannot establish a new state. Feature 378 indicates backtracking to a previous game state 374 to attempt to resolve the ambiguity of key event 372c. At this point key event 372c is found to be not ambiguous with respect to game state 374 and the new state 380 is established based upon key event 372c to reflect this.
[00261] The use of backward tracking requires the system to store in memory previous game states and/or previous data frames. The number of temporally previous game states or data frames to be stored in memory can be either fixed to a set number, or can be variable, or determined by a key event.
[00262] Game states continue to be established until the game ends at game state 382 and reset 384 occurs to start a new game state 370.
[00263] Referring now to Figure 24 an illustrative example of states with forward tracking is shown. Beginning at state 390 a game is started. Based upon a key event 392a, which is discovered to be valid, the next state is 394. Key event 392b is valid, which results in a valid game state 396. Key event 392c is determined to be ambiguous with respect to game state 396. As a result, the method forward tracks through the front buffer 352 of data frames and identifies a future key event in a data frame in front buffer 352.
The combination of key event 392c and the future key event resolves the ambiguity, thus establishing next state 398. Feature 400 illustrates how ambiguity is resolved by looking for a valid future key event in front buffer 352 and combining it with key event 392c.
[00264] The forward tracking method requires the front buffer 352 (see Figure 22) to store data frames in memory that are temporally after the current frame 358 being analyzed.
The number of frames to store information for could either be fixed to a set number of data frames or can be variable.
[00265] Game states continue to be established until the game ends at game state 402 and reset 404 occurs to start a new game state 390.
[00266] Although backward tracking and forward tracking have been described as separate processes, they may be utilized in conjunction to resolve ambiguous data. If either one fails to establish a valid state, the other may be invoked in an attempt to establish a valid state.
[00267] Referring now to Figures 25a and 25b a flowchart of the process of single state tracking is shown. Beginning at step 410 an initialization for the start of tracking a game begins. At step 410 one or more game state indicators are initialized. Examples of game state indicators would be that no card hands have been recognized, a game has not started or a game has not ended, or an initial deal has not been started. In the case of Blackjack an initial deal would be the dealing of two cards to a player. Processing then moves to step 412 where the process waits for the next data frame to analyze. At step 414 a frame has arrived and the frame is analyzed to determine if a game has ended. Step 414 may invoke one or more tests such as:
a) Is the dealer hand complete? In the case of Blackjack, if a dealer hand has a sum more than or equal to seventeen, the dealer hand is marked complete.
b) Is step a) true and do all player card hands have at least two cards?
c) A check of motion data to determine that there is no motion in the dealer area.
d) No cards in the current frame and no motion on the table could also indicate a game has ended.
If the game has ended then processing returns to step 410. If the game has not ended, then at step 416 a test is made to determine if a game has started. The test at step 416 may determine if the initial deal, denoted by two cards near a betting region 26, has occurred. If not, processing returns to step 412. If the game has started, then processing moves to step 418.
[00268] At step 418 the positioning features and identities of cards and card hands in the data frame are matched to the card hands stored in the current game state. The matching process can take on different embodiments, such as priority fit. In the case of priority fit, card hands in the game state are ordered in priority from the right most hand (from the dealer's perspective) to the left most hand. In this ordering, the card hand at the active betting spot that is located farthest to the right of the dealer would have the highest pre-determined priority in picking cards/card hands in the data frame to match to itself.
The right most card hand in the game state would pick the best match of cards/card hands from the data frame, after which the second right most card hand in the game state would get to pick the matching cards/card hands from the remaining cards/card hands in the data frame.
[00269] In an alternate embodiment of matching, a best fit approach can be used in order to maximize matching for all card hands in a game state. In the best fit approach, no specific card hand or betting location is given pre-determined priority.
[00270] In some cases a perfect match with no leftover unmatched cards or card hands occurs. This indicates that the incoming data frame is consistent with the current game state and that there has been no change in the game state.
[00271] Moving now to step 420 a determination is made as to whether there are any unmatched cards or card hands left from the previous step. If there are no unmatched cards or card hands the process returns to step 412. Unmatched cards or card hands may be an indication of a change in the game state. At step 422, the unmatched cards or card hands are analyzed with respect to the rules of the game to determine a key event. At step 424, if the determined key event was valid, the next game state is established at step 426, after which the process returns to step 412. Returning to step 424, if the key event is invalid or ambiguous then processing moves to step 429 where an ambiguity resolution method such as backtracking or forward tracking may be applied in an effort to resolve the ambiguity.
At step 430 a test is made to determine if the ambiguity is resolved. If so, processing moves to step 426; otherwise, if the ambiguity is not resolved, then a next game state cannot be established and as a result, processing returns to step 412 and waits for the next frame.
[00272] We shall now discuss how backward tracking (shown as feature 380 of Figure 23) functions. Referring now to Figure 26, a flowchart of the process of backward tracking is shown.
[00273] The backward tracking process starts at step 450 by initializing a counter "i" to 1 and initializing a limit to the predetermined maximum number of previous game states to backtrack to. In the next step 452 the ambiguous key event from the single state tracking process (step 424 of FIG. 25b) is compared to the i-th previous game state to see if the key event is valid with respect to this previous game state. Moving to step 454, if the ambiguity is resolved by the comparison then backtracking has succeeded and the process ends at step 462. In step 454, if the ambiguity is not resolved then the process moves to step 456 to check if it has backtracked to the maximum limit. If the maximum limit is reached, then moving to step 460 it is determined that backtracking has not resolved the ambiguity and the process ends at step 462. If in step 456 the maximum limit has not been reached, then the process increments the counter at step 458 and returns to step 452.
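The loop of Figure 26 (steps 450 through 462) can be sketched as follows. The `is_valid` predicate stands in for the game-rule comparison of step 452, and the function shape and names are assumptions.

```python
def backward_track(key_event, previous_states, is_valid, max_back):
    """Try the ambiguous key event against up to `max_back` previous
    game states, most recent first, and return the first state the
    event is valid against (ambiguity resolved, step 462 via 454), or
    None if the limit is reached without success (step 460)."""
    for i in range(1, max_back + 1):          # counter "i", step 450/458
        if i > len(previous_states):
            break                             # fewer stored states than the limit
        state = previous_states[-i]           # i-th previous game state, step 452
        if is_valid(key_event, state):
            return state                      # step 454: resolved
    return None                               # step 460: not resolved
```

Note that the number of states actually stored bounds the search regardless of the configured limit, matching paragraph [00261].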
[00274] Backward tracking can be used to track to previous frames, in order to look for a valid key event, or to track to previous valid game states.
[00275] Referring now to Figure 27 an illustrative example of backward tracking is shown. Figure 27 shows how backward tracking can be used to backward track to a previous game state in order to resolve ambiguity. In this example, the IP module 80 and IPAT module 84 provide frames containing identity, location and orientation data of playing cards. In the problem scenario a valid state 490 exists with hand A 492 and hand B 494 both having two cards. At the next key event 496, the dealer accidentally dropped a card on hand A 492 so it now contains three cards and a valid game state 498 is established. At key event 500 the dealer has picked up the card and placed it on hand B 494 so that hand A 492 now contains two cards and hand B 494 now contains three cards, resulting in an invalid game state 502. Key event 500 is ambiguous with respect to current game state 498 and invalid state 502 occurs because hand A 492 cannot be matched between the invalid game state 502 and the valid game state 498. The backtracking method is then activated, and the key event 500 is applied to previous valid game state 490, which results in the resolution of the ambiguity and the establishing of a new valid game state (not shown) similar to invalid game state 502. The game can then continue to update with new inputs.
[00276] It is also possible that backward tracking may not be able to account for certain key events, in which case other conflict resolution methods described next can be utilized.
[00277] We shall next discuss forward tracking in more detail. Forward tracking requires the front buffer 352 of Figure 22 to store data frames. The number of frames to store information for could either be fixed or can be variable. Data frames can be analyzed after a predetermined number of frames are present in front buffer 352.
[00278] Referring now to Figure 28 a flowchart of the process of forward tracking is shown. The forward tracking process starts at step 510 by initializing a counter "i" to 1 and initializing a limit to the predetermined maximum number of data frames in front buffer 352. At step 512 the i-th data frame in the front buffer is analyzed to determine a key event (as described in previous sections). In the next step 514, the key event is compared to the current game state to determine if the key event is valid and if it resolves the ambiguity.
From step 516, if the ambiguity is resolved then forward tracking has succeeded and the process ends at step 522. If the ambiguity is not resolved then moving to step 518 a determination is made on whether the end of the front buffer 352 has been reached. If the end of the front buffer has been reached then forward tracking has not been able to resolve the ambiguity and processing ends at step 522. If at step 518 the end of the front buffer 352 has not been reached, then the counter is incremented in step 520, after which the process returns to step 512.
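The loop of Figure 28 (steps 510 through 522) admits a similar sketch. Here `resolves` stands in for the key-event extraction and validity check of steps 512 and 514, and the names are hypothetical.

```python
def forward_track(front_buffer, current_state, resolves, max_frames):
    """Scan up to `max_frames` frames of the front buffer for a future
    key event that, combined with the current game state, resolves the
    ambiguity. Returns the index of the resolving frame (which can then
    be promoted to current frame 358), or None if the end of the buffer
    is reached without resolution (step 518 -> 522)."""
    for i in range(min(max_frames, len(front_buffer))):
        if resolves(front_buffer[i], current_state):
            return i                  # step 516: resolved
    return None                       # ambiguity unresolved
```

As the text notes, either tracking direction can be tried first, with the other invoked on failure, e.g. `backward_track(...) or forward_track(...)` in a combined resolver.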
[00279] Referring now to Figure 29 an illustrative example of forward tracking for the game of Blackjack is shown. In this forward tracking example, orientation information on playing cards is not available to the game tracking module 86. Valid state 550 indicates that there are two hands, hand A 552 and hand B 554, caused by one split, and there are two bets in betting location 556. Key event 558 shows an additional third bet in betting location 556, the addition of a new card to hand A 552 and the offset top card of hand A 552 (indicating possible overlap between two hands), and the combination of the foregoing three features indicates a potential split of hand A 552 or a double down onto hand A 552.
Key event 558 is ambiguous with respect to game state 550 since it is uncertain whether a split or a double down happened, and as a result an invalid state 560 is created. From game state 550, forward tracking (shown by feature 566) into the front buffer of data frames, key event 564 shows hand A 552 containing two overlapping card hands whereby each of the two card hands has two cards. Key event 564 is consistent with a split scenario, resolves the split/double-down ambiguity, and is valid with respect to game state 550, and as a result valid game state 562 is established.
[00280] After forward tracking has been done, the data frame that produced the key event that resolved the ambiguity can be established as the current frame 358 (see FIG. 22). It is to be noted that forward tracking can involve analyzing a plurality of data frames in order to resolve ambiguity.
[00281] In the foregoing sections, the game tracking module stored the game state in a single current valid state. The current game state was updated based on events, and at the end, the state reflected the game outcome. In the case of ambiguity or conflicts, ambiguity resolution methods such as backtracking or forward tracking were invoked to resolve the ambiguity, and if resolved a new current game state was established. This single state model may not directly account for all possible scenarios caused by key events, such as, for example, human error in the case when a card is completely withdrawn from the table and treated as a "burnt card" in Blackjack. In this scenario, the regular Blackjack game rules were not followed, and as a result, the single game state may remain in an invalid/ambiguous state until the end of the game. This is because the game state may not be able to account for the key event that a card was removed. In some scenarios an ambiguous key event may lead to two potential game states and the single state model might inadvertently pick the wrong next state since the model requires a single valid current game state at any given time.
[00282] A multiple game state model can overcome some deficiencies of the single state model by allowing flexibility in the number of current game states maintained and updated in parallel.
[00283] Referring now to Figure 30, an illustrative example of multi state tracking is shown. All variables are initialized at the start of the game at the start game state. The term node is herein referred to as an instance of the game state. On the occurrence of every key event, the node to the left is always copied with the current state and the node to the right is always updated with the new key event. At key event E1 (feature 572), the start game state 570 is copied to the left as node 574 and updated to the right with key event E1 as shown by node 576. This process continues at key event E2 (feature 578) to create nodes 580, 582, 584 and 586. At the end of this update, the previous game state can be destroyed. This process continues for each key event until the end state is reached.
[00284] The representation is very similar to a binary tree with a rule that every child node to the left is a copy of the current node and every child node to the right is updated with the key event. An advantage of this representation is that all possible valid combinations of sequential key events are automatically utilized to update the game states.
[00285J The multiple game state game tracking model can handle backtrackcing since the Previous game state can be copied over as a node for comparison with the new input.
Although Figure 30 illustrates a tree structure, the actual implementation in software may take different forms including a linked list of game states or an array of game states. The tree structure is utilized to explain the concept of multiple state game tracking in a visual format.
[00286] Storing all the possible game states may decrease performance of the multi game state model. An optimization to the multiple state game tracking method could be the use of two variables: one that stores how many key events have passed, and the other that stores when and which key event caused the most recent valid status update in the game state. All the nodes that have not caused a valid status update in the past fixed number of key events can be nullified.
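As a rough illustration of the copy-left/update-right rule of paragraph [00284] together with the pruning optimization just described, consider the following sketch. The `Node` class, the tuple-of-events representation, and the staleness threshold are assumptions for illustration only, not the disclosed implementation.

```python
class Node:
    """One instance of the game state, per paragraph [00283]."""
    def __init__(self, events, events_seen=0, last_valid=0):
        self.events = tuple(events)      # key events applied so far
        self.events_seen = events_seen   # how many key events have passed
        self.last_valid = last_valid     # event index of last valid update

def advance(frontier, key_event, max_stale=3):
    """On each key event, copy every node (left child) and update it
    (right child); prune nodes with no valid update in max_stale events."""
    nxt = []
    for node in frontier:
        seen = node.events_seen + 1
        copied = Node(node.events, seen, node.last_valid)       # left child
        updated = Node(node.events + (key_event,), seen, seen)  # right child
        for child in (copied, updated):
            if seen - child.last_valid <= max_stale:
                nxt.append(child)
    return nxt

frontier = [Node(())]           # start game state
for ev in ["E1", "E2"]:         # key events E1 and E2, as in Figure 30
    frontier = advance(frontier, ev)
combos = {n.events for n in frontier}
```

After E1 and E2 the frontier holds every combination of applied key events — the empty sequence, E1 alone, E2 alone, and E1 followed by E2 — which is the property paragraph [00284] identifies as the advantage of the representation.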
[00287] The multiple game state model can be extended to create new states every time there is ambiguity during the update of a game state. As an example, if there is an ambiguous situation where two game states are possible, the game tracking module can create two new states, and update both the states based on future key events. This is shown by Figures 31a and 31b, which are illustrative examples of multiple valid game states.
Referring first to Figure 31a, state 600 shows two card hands 602 and 604, each having two cards. Key event 606 occurs when the dealer adds a new card 610 oriented horizontally, as usually done for a double down, overlapping both card hands 602 and 604. Key event 606 is ambiguous in that it is not clear if the double down card 610 was added to hand 602 or hand 604. Because of this ambiguity, two new game states 614 and 616 are created.
Game state 614 represents the new double down card 610 added to hand 602 to give it three cards, while state 616 represents the new double down card 610 added to hand 604 to give it three cards. Key event 618 happens at a later time when the dealer moves the new card 610 over to the left to cover hand 602, which clarifies the configuration of state 620.
The new state 620 resolves the ambiguity of which hand the double down card was dealt to. Game state 620 finds a complete match between its current game state and the new input from key event 618. However, game state 622 will not find a match with the new input as hand 604 does not have three cards; therefore game state 622 is invalid and may not proceed forward.
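The branch-on-ambiguity behaviour of Figures 31a and 31b can be sketched as follows; the `GameState` class and hand representation are hypothetical names chosen for illustration, not the patent's implementation.

```python
from dataclasses import dataclass

@dataclass
class GameState:
    hands: dict   # hand id -> number of cards in that hand

def branch_on_ambiguity(state, candidate_hands):
    """An ambiguous double-down card could belong to any candidate hand:
    create one successor state per possibility."""
    successors = []
    for hand_id in candidate_hands:
        hands = dict(state.hands)
        hands[hand_id] += 1
        successors.append(GameState(hands))
    return successors

def filter_by_observation(states, observed_hands):
    """Keep only states whose configuration matches the new key event;
    the rest are invalid and do not proceed forward."""
    return [s for s in states if s.hands == observed_hands]

# Two two-card hands; a double-down card overlaps both (ambiguous).
start = GameState({"A": 2, "B": 2})
candidates = branch_on_ambiguity(start, ["A", "B"])     # like states 614/616
# A later key event shows hand A with three cards (like key event 618):
valid = filter_by_observation(candidates, {"A": 3, "B": 2})
```

The later observation invalidates one branch and leaves a single surviving state, just as key event 618 validates state 620 and eliminates state 622.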
[00288] The game state that gets updated with a valid status consistently with new key events will likely prevail ultimately and will likely accurately reflect the actual game outcome. The multiple state model may lead to more than one valid end game state. In this scenario, an evaluation of the end game states needs to be made as to which end game state is the correct one. In order to assist with this evaluation, a game state may be provided with a likelihood score. The likelihood score of a game state represents the likelihood that the game state is accurate. The likelihood score can also be utilized in the middle of the game to determine the most likely game state from the list of multiple states that may be maintained and updated. Examples of parameters that may be included in a likelihood score are: a) How many key events have passed for the specific game state; the higher the number of key events the larger the likelihood score.
b) The time stamp of the most recent key event for the specific game state; the more recent the time stamp the larger the likelihood score.
c) The total number of cards and card hands that were matched for this game state; the higher the number of matches the larger the likelihood score.
d) The total number of cards and card hands that were not matched for this game state; the lower the number of un-matched cards / card hands the larger the likelihood score.
e) The number of cards in a game state; the larger the number of cards the higher the likelihood score.
f) Addition or removal of chips and the chip values at betting spots (if the data is available); the closer the chip movements and values match game rules the larger the likelihood score.
[00289] A historical record of a plurality of the foregoing parameters can be stored for each game state (the historical record would be the scores of its parent and ancestor nodes, or scores for previous data frames that were compared to the game state); the historical record can be traversed and combined with the current likelihood score to determine an aggregate likelihood score.
[00290] The likelihood scores may be used or combined into a formula to provide a single likelihood score. The formula may provide different priority levels for the different scoring parameters. The formula may combine scoring parameters over a period of time, such as the past two seconds, or over a number of data frames, such as the past forty data frames, to determine a current likelihood score. In the event that more than one valid end game state exists, the likelihood score(s) of each state may be compared to determine which end game state is the correct game state.
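One way such a formula could combine parameters (a) through (f) is a simple weighted sum, with the weights expressing the different priority levels. The weights, parameter names, and example values below are illustrative assumptions only; the disclosure does not fix a particular formula.

```python
def likelihood_score(state, weights=None):
    """Weighted combination of the scoring parameters (a)-(f)."""
    w = weights or {
        "events_passed": 1.0,      # (a) more key events -> higher score
        "recency": 2.0,            # (b) more recent last event -> higher
        "matched": 3.0,            # (c) matched cards/hands -> higher
        "unmatched": -3.0,         # (d) unmatched cards/hands -> lower
        "num_cards": 0.5,          # (e) more cards -> higher
        "chip_consistency": 1.5,   # (f) chip moves matching rules -> higher
    }
    return sum(w[k] * state.get(k, 0.0) for k in w)

# Two hypothetical end game states with identical histories except that
# state_a matched more cards/hands than state_b:
state_a = {"events_passed": 5, "recency": 1.0, "matched": 7,
           "unmatched": 0, "num_cards": 7, "chip_consistency": 1.0}
state_b = {"events_passed": 5, "recency": 1.0, "matched": 4,
           "unmatched": 3, "num_cards": 7, "chip_consistency": 1.0}
best = max([state_a, state_b], key=likelihood_score)
```

Comparing the scores of multiple valid end game states in this manner selects the state most likely to reflect the actual game outcome.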
[00291] In an alternate embodiment, multiple game state game tracking may be integrated with a card shoe based reader. In this embodiment, the dispensing of a new identified card may be classified as a key event, which may trigger the creation of new game states where each new game state represents a different card hand receiving the new card. The likelihood score concepts and other concepts explained in this section may be integrated with this embodiment to assist with determining the most likely correct game state.
[00292] Returning to Figure 6, we will now discuss bet recognition module 88. Bet recognition module 88 can determine the value of wagers placed by players at the gaming table. In one embodiment, an RFID based bet recognition system can be implemented, as shown in Figure 5. Different embodiments of RFID based bet recognition can be used in conjunction with gaming chips containing RFID transmitters. As an example, the RFID bet recognition system sold by Progressive Gaming International or by Chipco International can be utilized.
[00293] In another embodiment, a vision based bet recognition system can be employed in conjunction with the other modules of this system. There are numerous vision based bet recognition embodiments, such as those described in U.S. patents 5,782,647 to Fishbine et al.; 5,103,081 to Fisher et al.; 5,548,110 to Storch et al.; and 4,814,589 to Storch et al.
Commercially available implementations of vision based bet recognition, such as the MP21 system marketed by Bally Gaming or the BRAVO system marketed by Genesis Gaming, may be utilized with the invention.
[00294] The bet recognition module 88 can interact with the other modules to provide more comprehensive game tracking. As an example, the game tracking module 86 can send a capture trigger to the bet recognition module 88 at the start of a game to automatically capture bets at a table game.
[00295] Referring to Figure 6, we will now discuss player tracking module 90. Player tracking module 90 can obtain input from the IP module 80 relating to player identity cards. The player tracking module 90 can also obtain input from the game tracking module 86 relating to game events such as the beginning and end of each game. By associating each recognized player identity card with the wager located closest to the card in an overhead image of the gaming region, the wager can be associated with that player identity card. In this manner, comp points can be automatically accumulated to specific player identity cards.
[00296] Optionally, the system can recognize special player identity cards with machine readable indicia printed or affixed to them (via stickers, for example). The machine readable indicia can include matrix codes, barcodes or other identification indicia.
[00297] Optionally, biometrics technologies such as face recognition can be utilized to assist with identification of players.
[00298] Referring now to Figure 32, a flowchart of the process of player tracking is shown. The process invoked by player tracking module 90 starts at step 630 and moves to step 632 where the appropriate imaging devices are calibrated and global variables are initialized. At step 634 processing waits to obtain positioning and identity of a player identity card from IP module 80. At step 636 an association is made between a player identity card and the closest active betting region. At step 638 complementary points are added to the player identity card based upon betting and game activity. Once a game ends, processing returns to step 634.
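The association of step 636 — matching a recognized identity card to the closest active betting region — can be sketched as a nearest-neighbour lookup in overhead image coordinates. The coordinates, spot names, and distance metric here are illustrative assumptions.

```python
import math

def closest_region(card_xy, regions):
    """regions maps betting-spot name -> (x, y) centre of an active bet;
    returns the name of the spot nearest the identity card."""
    return min(regions, key=lambda r: math.dist(card_xy, regions[r]))

# Hypothetical overhead-image coordinates for three active betting spots:
active_bets = {"spot1": (100, 220), "spot2": (260, 215), "spot3": (420, 230)}
player_card = (250, 300)   # centre of the recognized player identity card
spot = closest_region(player_card, active_bets)
```

Once the card is tied to a betting spot in this way, comp points from the wagering at that spot can be accumulated to the card, as described in paragraph [00295].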
[00299] We will now discuss the functionality of surveillance module 92. Surveillance module 92 obtains input relating to automatically detected game events from one or more of the other modules and associates the game events to specific points in recorded video.
The surveillance module 92 can include means for recording images or video of a gaming table. The recording means can include the imagers 32. The recording means can be computer or software activated, and the recordings can be stored in a digital medium such as a computer hard drive. Less preferred recording means such as analog cameras or analog media such as video cassettes may also be utilized.
[00300] Referring now to Figure 33, a flowchart of the process of surveillance is shown.
Beginning at step 650 the process starts, and at step 652 the devices used by the surveillance module are calibrated and global variables are initialized. Moving to step 654, recording begins. At step 656 input is obtained from other modules. The surveillance module 92 can receive automatically detected game event input from one or more of the other modules. As an example, the surveillance module 92 can receive an indicator from the game tracking module 86 that a game has just begun or has just ended. As another example, the surveillance module 92 can receive input from the bet recognition module 88 that chips have been tampered with. In yet another example, the surveillance module 92 can receive input from the player tracking module 90 that a specific player is playing a game. At step 658 a game event or player data related event is coupled to an event marker on the video. The surveillance module 92 associates the game events to specific points in recorded video using digital markers. Various embodiments of markers and associations are possible. As a non-limiting example, the surveillance module can keep an index file of game events, the associated time at which they took place, and the associated video file that contains the recorded video of that game event. Associating automatically tracked table game events/data to recorded video by using event markers or other markers can provide efficient data organization and retrieval features. In order to assist surveillance operators, data may be rendered onto the digital video. For instance, a color coded small box may be rendered beside each betting spot on the video. The color of the box may be utilized to indicate the current game status for the player. As an example, the color red may be used to indicate that the player has bust and the color green may be used to indicate that the player has won. Various symbols, text, numbers or markings may be rendered onto the surveillance video to indicate game events, alerts or provide data.
An advantage of this feature is that it enables surveillance operators to view data faster. For example, it is easier for a surveillance operator to see a green colored box beside a betting spot and understand that the player has won, than to total up the player's cards and the dealer's cards to determine who won. In this feature, game data may be rendered directly onto the video during recording, or the data may be stored in a database and then dynamically rendered onto the video during playback only. Furthermore, additional features such as, by example, notes and incident reports can be incorporated into the surveillance module. Additionally, sound recording may be incorporated into the surveillance module in order to capture the sounds happening at the gaming table. For example, sound capturing devices (for example, microphones) may be positioned in the overhead imaging system or lateral imaging system or at any other location in the vicinity of the gaming region. The captured sound may be included in the recorded video. Optionally, speech recognition software or algorithms may be used to interpret the sounds captured at the gaming table. At step 660 the event data is recorded on video. Processing then returns to step 656.
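The index-file idea mentioned above — game events keyed to the time and video file in which they occur — can be sketched as a simple marker list. The class name, record layout, and file names are assumptions for illustration; the disclosure leaves the marker format open.

```python
class EventIndex:
    """Index of (timestamp, event type, video file) surveillance markers."""
    def __init__(self):
        self.markers = []

    def mark(self, timestamp_s, event_type, video_file):
        """Record a digital marker tying a game event to a video point."""
        self.markers.append((timestamp_s, event_type, video_file))

    def find(self, event_type):
        """Return all markers for an event type, e.g. for clip playback."""
        return [m for m in self.markers if m[1] == event_type]

idx = EventIndex()
idx.mark(12.5, "game_start", "table7_0001.mp4")
idx.mark(95.0, "chip_tamper", "table7_0001.mp4")
idx.mark(140.2, "game_end", "table7_0001.mp4")
clips = idx.find("chip_tamper")
```

A lookup like `find("chip_tamper")` corresponds to the event-marker search of steps 676–678 in Figure 34, letting an operator jump directly to the relevant video clip.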
[00301] The surveillance module 92 can replay certain video sequences relating to gaming events based on a selection of a game event. Figure 34 is a flowchart of the process of utilizing surveillance data. Figure 34 illustrates how a user interface may be coupled with the data collected by surveillance module 92 to display data of interest to a user. Processing begins at step 670, and at step 672 calibration of the necessary hardware and the initialization of data variables occurs. At step 674 the process waits for input from the user on what video is requested. The user can select a specific gaming table and view recorded video clips organized by game. Alternatively, the user can select a specific player and view video clips organized by player. Similarly, the user can potentially select certain game events such as tampering of chips and view the clips associated with those game events. At step 676 a search is made for the event markers that are relevant to the user input of step 674 and are located on the recorded media. At step 678 a test is made to determine if any event markers were found. If not, processing moves to step 680 where a message indicating no events were located is displayed to the user. Processing then returns to step 674. If event markers have been found at step 678, then processing moves to step 682 and the relevant images are displayed to the user. Control then returns to step 674 where the user may view the video. During display the user may utilize the standard features of video and sound playback, for example: speed up, slow down, freeze frame, and increase resolution.
[00302] We shall now discuss the analysis and reporting module 94 of Figure 6. Analysis and reporting module 94 can mine data in the database 102 to provide reports to casino employees. The module can be configured to perform functions including automated player tracking, including exact handle, duration of play, decisions per hour, player skill level, player proficiency and true house advantage. The module 94 can be configured to automatically track operational efficiency measures such as hands dealt per hour reports, procedure violations, employee efficiency ranks, actual handle for each table and actual house advantage for each table. The module 94 can be configured to provide card counter alerts by examining player playing patterns. It can be configured to automatically detect fraudulent or undesired activities such as shuffle tracking, inconsistent deck penetration by dealers and procedure violations. The module 94 can be configured to provide any combination or type of statistical data by performing data mining on the recorded data in the database.
[00303] Output, including alerts and player compensation notifications, can be through output devices such as monitors, LCD displays, or PDAs. An output device can be of any type and is not limited to visual displays; it can include auditory or other sensory means. The software can potentially be configured to generate any type of report with respect to casino operations.
[00304] Module 94 can be configured to accept input from a user interface running on input devices. These inputs can include, without limitation, training parameters, configuration commands, dealer identity, table status, and other inputs required to operate the system.
[00305] Although not shown in Figure 6, a chip tray recognition module may be provided to determine the contents of the dealer's chip bank. In one embodiment an RFID based chip tray recognition system can be implemented. In another embodiment, a vision based chip tray recognition system can be implemented. The chip tray recognition module can send data relating to the value of chips in the dealer's chip tray to other modules.
[00306] Although not shown in Figure 6, a deck checking module may be provided. A deck checking module would receive card identity and location data from the IP module. The card identity and location data can be utilized to perform automated verification of deck checking as a dealer performs manual deck checking.
[00307] Although not shown in Figure 6, a dealer identity module may be employed to track the identity of a dealer. The dealer can optionally either key in his or her unique identity code at the game table or optionally use an identity card and associated reader to register his or her identity. A biometrics system may be used to facilitate dealer or employee identification.
[00308] All of or parts of the disclosed invention can be utilized to enable new game development such as fixed jackpot games, progressive jackpot games, bonusing games, and manual and electronic side betting games. Partially automated Blackjack based and Baccarat based games can be developed similar to the popular game Rapid Roulette. As an example, a player can make an electronic side bet at a table where the win or loss of the side bet would be dependent on whether a specific game outcome (such as achieving a player hand total of 21) or specific game event (such as receiving a pair in the initial deal of two cards) occurs. The disclosed system can automatically detect game events and the outcome of a game and consequently establish whether the player's side bet lost or won.
Upon determination of the win or loss of the side bet, Analysis and Reporting module 94 can send a signal on the side bet win/loss to outside systems or modules. As an example, the disclosed invention may be utilized in conjunction with the table game electronic auxiliary bet systems offered by DEQ Systems Corp. to automatically determine game outcome and payouts/debits for player side bets. As another example, the disclosed invention may be utilized in conjunction with the Progressive Jackpot games (such as Progressive Blackjack) offered by Progressive Gaming International Corp. to automatically determine game outcome, and consequently determine payouts/debits for player wagers. As yet another example, the disclosed invention can be utilized in conjunction with IGT's State-Of-The-Art Bonusing Concepts such as Automated Payout Multiplier for table games to trigger multiple payouts to a player based on the game wins/losses of the player as tracked by the disclosed invention.
[00309] The terms imagers and imaging devices have been used interchangeably in this document. The imagers can have any combination of sensor, lens and/or interface. Possible interfaces include, without limitation, 10/100 Ethernet, Gigabit Ethernet, USB, USB 2, FireWire, optical fiber, PAL or NTSC interfaces. For analog interfaces such as NTSC and PAL, a processor having a capture card in combination with a frame grabber can be utilized to obtain digital images or digital video.
[00310] The image processing and computer vision algorithms in the software can utilize any type or combination of color spaces or digital file formats. Possible color spaces include, without limitation, RGB, HSL, CMYK, grayscale and binary color spaces.
[00311] The overhead imaging system may be associated with one or more display signs. Display sign(s) can be non-electronic, electronic or digital. A display sign can be an electronic display displaying game related events happening at the table in real time. A display and the housing unit for the overhead imaging devices may be integrated into a large unit. The overhead imaging system may be located on or near the ceiling above the gaming region.
[00312] Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims (63)

1. A method of tracking a gaming object on a gaming table, said method comprising obtaining a position profile for a gaming object and resolving said position profile on said gaming table.

2. The method of claim 1, wherein said position profile comprises a shape descriptor and said resolving comprises: locating a first positioning feature of said shape descriptor; interpolating a second positioning feature based on said first positioning feature; and analyzing said first positioning feature and said second positioning feature in relation to said shape descriptor.

3. The method of claim 2, wherein said analyzing utilizes a points in contour test.
4. The method of claim 2, wherein said first positioning feature and said second positioning feature are convex corners, said convex corners utilized to determine if a playing card resides within said shape descriptor.

5. The method of claim 4, further comprising identifying convex corners as matched corners, thereby preventing said matched corners from being analyzed.
6. The method of claim 2, further comprising determining if said gaming object contains a playing card.
7. The method of claim 2, further comprising determining if an object other than a playing card resides within said shape descriptor.
8. The method of claim 1, further comprising organizing cards into card hands defined by a plurality of playing cards on said gaming table, said organizing comprising: determining a position profile for one or more of said playing cards; selecting at least two cards from said playing cards; identifying a card configuration defined by said at least two cards; evaluating the validity of said card configuration according to a set of rules associated with the game being played; and determining if said at least two cards belong in the same card hand based upon said evaluating.
9. The method of claim 8, further comprising establishing an order of selection of said plurality of playing cards.

10. The method of claim 9, wherein said establishing an order comprises ordering said plurality of playing cards by distance from a reference point.

11. The method of claim 8, further comprising rotating said position profile for the purpose of said evaluating.
12. The method of claim 1, further comprising partitioning a contour comprising overlapping card hands by: detecting said contour; and applying an erosion algorithm to said contour, whereby said contour is eroded into at least two contours.
13. The method of claim 1, further comprising performing an analysis of said gaming object by: detecting an event in a vicinity of said gaming object according to said position profile; and analyzing said gaming object based upon said event.
14. The method of claim 13, wherein said event comprises a presence of motion in the vicinity of said position profile.

15. The method of claim 13, wherein said event comprises detecting an object overlapping said position profile of said gaming object.
16. The method of claim 13, wherein said event comprises the presence of a skin colored object in the vicinity of said position profile in an overhead image of said gaming object.
17. A method of tracking gaming objects on a gaming table comprising: recording temporally sequential data relating to a plurality of said gaming objects; determining an identity and a position profile of a tracked one of said gaming objects at a first instant in time from said data; determining a position profile of an investigated one of said objects at a second instant in time from said data; identifying a compatibility between said position profile of said investigated one and said position profile of said tracked one; and assigning said identity to said investigated one of said objects according to said compatibility.

18. The method of claim 17, wherein said recording sequential data comprises recording top view images of said game table periodically.
19. The method of claim 17, wherein said tracked one of said objects is a playing card, and said position profile of said tracked one comprises a position of a boundary feature of said playing card.

20. The method of claim 17, further comprising utilizing multiple data frames within said sequential data for said determining an identity and a position profile.
21. The method of claim 17, further comprising utilizing multiple data frames within said sequential data for said identifying a compatibility.
22. A system for tracking a gaming object on a gaming table, said system comprising means for obtaining a position profile for a gaming object and means for resolving said position profile on said gaming table.
23. The system of claim 22, wherein said position profile comprises a shape descriptor and said means for resolving comprises: means for locating a first positioning feature of said shape descriptor; means for interpolating a second positioning feature based on said first positioning feature; and means for analyzing said first positioning feature and said second positioning feature in relation to said shape descriptor.
24. The system of claim 23, wherein said means for analyzing utilizes a points in contour test.

25. The system of claim 23, wherein said first positioning feature and said second positioning feature are convex corners, said convex corners utilized to determine if a playing card resides within said shape descriptor.
26. The system of claim 25, further comprising means for identifying convex corners as matched corners, thereby preventing said matched corners from being analyzed.

27. The system of claim 23, further comprising means for determining if said gaming object contains a playing card.

28. The system of claim 23, further comprising means for determining if an object other than a playing card resides within said shape descriptor.
29. The system of claim 22, further comprising means for organizing cards into card hands defined by a plurality of playing cards on said gaming table, said means for organizing cards comprising: means for determining a position profile for one or more of said playing cards; means for selecting at least two cards from said playing cards; means for identifying a card configuration defined by said at least two cards; means for evaluating the validity of said card configuration according to a set of rules associated with the game being played; and means for determining if said at least two cards belong in the same card hand based upon said evaluating.

30. The system of claim 29, further comprising means for establishing an order of selection of said plurality of playing cards.
31. The system of claim 30, wherein said means for establishing an order comprises means for ordering said plurality of playing cards by distance from a reference point.
32. The system of claim 29, further comprising means for rotating said position profile for the purpose of said evaluating.
33. The system of claim 22, further comprising means for partitioning a contour comprising overlapping card hands, said means for partitioning a contour comprising: means for detecting said contour; and means for applying an erosion algorithm to said contour, whereby said contour is eroded into at least two contours.
34. The system of claim 22, further comprising means for performing an analysis of said gaming object, said means for performing an analysis comprising: means for detecting an event in a vicinity of said gaming object according to said position profile; and means for analyzing said gaming object based upon said event.

35. The system of claim 34, wherein said event comprises a presence of motion in the vicinity of said position profile.
36. The system of claim 34, wherein said event comprises detecting an object overlapping said position profile of said gaming object.
37. The system of claim 34, wherein said event comprises the presence of a skin colored object in the vicinity of said position profile in an overhead image of said gaming object.
38. A system for tracking gaming objects on a gaming table comprising: means for recording temporally sequential data relating to a plurality of said gaming objects; means for determining an identity and a position profile of a tracked one of said gaming objects at a first instant in time from said data; means for determining a position profile of an investigated one of said objects at a second instant in time from said data; means for identifying a compatibility between said position profile of said investigated one and said position profile of said tracked one; and means for assigning said identity to said investigated one of said objects according to said compatibility.
39. The system of claim 38, wherein said means for recording sequential data comprises means for recording top view images of said game table periodically.

40. The system of claim 38, wherein said tracked one of said objects is a playing card, and said position profile of said tracked one comprises a position of a boundary feature of said playing card.
41. The system of claim 38, further comprising means for utilizing multiple data frames within said sequential data for use by said means for determining an identity and a position profile.

42. The system of claim 38, further comprising means for utilizing multiple data frames within said sequential data for use by said means for identifying a compatibility.
43. The method of claim 1, wherein said method is embodied in computer instructions in a computer readable medium.

44. The method of claim 17, wherein said method is embodied in computer instructions in a computer readable medium.

45. A method of tracking the progress of a game on a gaming table comprising: recording data frames and game states as data while said game is in progress; establishing a first state of said game from said data; identifying an occurrence of a game event that follows said first state; evaluating whether said game event and a set of rules of said game provide sufficient information to accurately create a second state; determining that further information is required to accurately create said second state according to the results of said evaluating; obtaining said further information from said data; and creating a second state according to said game event, said set of rules and said further information.
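The control flow of the state-tracking method above can be sketched in a few lines of Python. Everything here is illustrative: the specification defines no API, so the function names (`advance_state`, `is_sufficient`, `recorded_data`) and the toy blackjack-style state are assumptions.

```python
# Sketch of claim 45's loop: apply a game event to the current state,
# consulting the recorded data only when the event plus the rules
# under-determine the next state.

def advance_state(first_state, event, apply_rules, is_sufficient, recorded_data):
    """Return the second state implied by `event`.

    apply_rules(state, event, extra) -> new state
    is_sufficient(state, event)      -> True if no extra data is needed
    recorded_data(state, event)      -> supplementary info from stored frames
    """
    if is_sufficient(first_state, event):
        return apply_rules(first_state, event, None)
    extra = recorded_data(first_state, event)  # e.g. earlier/later data frames
    return apply_rules(first_state, event, extra)

# Toy example: a "card dealt" event that omits which hand received the card.
state = {"player": 2, "dealer": 2}
event = {"type": "card", "hand": None}          # hand unknown -> insufficient

def is_sufficient(s, e):
    return e.get("hand") is not None

def recorded_data(s, e):
    return {"hand": "player"}                   # recovered from stored frames

def apply_rules(s, e, extra):
    hand = e["hand"] if e.get("hand") else extra["hand"]
    s2 = dict(s)
    s2[hand] += 1
    return s2

second = advance_state(state, event, apply_rules, is_sufficient, recorded_data)
```

Here the ambiguous event forces a lookup into the recorded data, after which the second state can be created unambiguously.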
46. The method of claim 45, wherein said evaluating comprises evaluating whether said game event is coherent with respect to said first state and said set of rules and said determining comprises determining that said game event is not coherent with respect to said first state and said set of rules.
47. The method of claim 45, wherein said recording data comprises recording overhead images of said gaming table while said game is in progress.
48. The method of claim 45, wherein said recording data comprises recording data collected from RFID sensors within said gaming table.
49. The method of claim 45, wherein said recording data comprises recording data collected from proximity detection sensors within said gaming table.

50. The method of claim 45, wherein said first state is defined by a plurality of parameters related to a plurality of playing cards positioned on said gaming table.

51. The method of claim 45, further comprising: storing a back buffer of previous data frames; storing a front buffer of future data frames; utilizing either or both of said back buffer and said front buffer to create said second state from said first state based upon a current frame.
52. A method of tracking the progress of a game on a gaming table comprising: recording data relating to said game while said game is in progress; establishing a plurality of potential game states of said game; identifying an occurrence of a game event that follows said plurality of potential game states; applying said game event to at least two of said plurality of potential game states to establish at least one new potential game state; adding said at least one new potential game state to said plurality of potential game states to establish an updated plurality of potential states; evaluating a likelihood of each potential game state; and identifying at least one likely potential game state of said updated plurality based on said evaluating.
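The multiple-hypothesis bookkeeping in the method above can be sketched as follows. The scoring function, pruning threshold, and card-count states are invented for illustration; the specification only indicates (in the dependent claims) that scores may be based on quantities such as total card counts.

```python
# Sketch of claim 52: apply a game event to every candidate state, add the
# results to the hypothesis set, score each hypothesis, prune the unlikely
# ones, and single out the most likely potential game state.

def update_hypotheses(states, event, apply_event, score, threshold):
    """Return (plausible hypotheses, most likely hypothesis)."""
    expanded = states + [apply_event(s, event) for s in states]
    scored = [(score(s), s) for s in expanded]
    kept = [s for (p, s) in scored if p >= threshold]   # claim 53: prune
    best = max(scored, key=lambda ps: ps[0])[1]         # claim 54: most likely
    return kept, best

# Toy states: just a running card count; the observation is the number of
# cards actually visible in the current data frame.
states = [{"cards": 4}, {"cards": 5}]
observed = 6

def apply_event(s, e):
    return {"cards": s["cards"] + e}                    # event: one card dealt

def score(s):
    # Likelihood falls off with disagreement against the observed frame.
    return 1.0 / (1 + abs(s["cards"] - observed))

kept, best = update_hypotheses(states, 1, apply_event, score, threshold=0.4)
```

The hypothesis disagreeing most with the observed frame is pruned, and the hypothesis matching it exactly is identified as most likely.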
53. The method of claim 52, further comprising identifying at least one unlikely potential game state and removing said at least one unlikely potential game state from said updated plurality.
54. The method of claim 52, wherein said identifying at least one likely potential game state comprises identifying a most likely potential game state.

55. The method of claim 52, wherein said evaluating comprises calculating a likelihood score for each potential game state.
56. The method of claim 55, wherein said evaluating a likelihood score comprises counting the total number of cards in a potential game state.
57. The method of claim 55, wherein said evaluating a likelihood score comprises counting unmatched cards or card hands between a current data frame and a potential game state.

58. The method of claim 55, wherein said evaluating comprises analyzing a historical record of likelihood scores for a potential game state.

59. The method of claim 45, further comprising examining said game states previous to said first state and utilizing one or more of said game states previous to said first state to create said second state based upon said game event.

60. A system of tracking the progress of a game on a gaming table comprising: means for recording data frames and game states as data while said game is in progress; means for establishing a first state of said game from said data; means for identifying an occurrence of a game event that follows said first state; means for evaluating whether said game event and a set of rules of said game provide sufficient information to accurately create a second state; means for determining that further information is required to accurately create said second state according to the results of said evaluating; means for obtaining said further information from said data; and means for creating a second state according to said game event, said set of rules and said further information.
61. The system of claim 60, wherein said means for evaluating determines whether said game event is coherent with respect to said first state and said set of rules and said means for determining comprises means for determining that said game event is not coherent with respect to said first state and said set of rules.
62. The system of claim 60, further comprising: means for storing a back buffer of previous data frames; means for storing a front buffer of future data frames; means for utilizing either or both of said back buffer and said front buffer to create said second state from said first state based upon a current data frame.
63. A system for tracking the progress of a game on a gaming table comprising: means for recording data relating to said game while said game is in progress; means for establishing a plurality of potential game states of said game; means for identifying an occurrence of a game event that follows said plurality of potential game states; means for applying said game event to at least two of said plurality of potential game states to establish at least one new potential game state; means for adding said at least one new potential game state to said plurality of potential game states to establish an updated plurality of potential states; means for evaluating a likelihood of each potential game state; and means for identifying at least one likely potential game state of said updated plurality based on said evaluating.
64. The system of claim 63, further comprising means for identifying at least one unlikely potential game state and removing said at least one unlikely potential game state from said updated plurality.

65. The system of claim 63, wherein said identifying at least one likely potential game state comprises means for identifying a most likely potential game state.
66. The system of claim 63, wherein said evaluating comprises means for calculating a likelihood score for each potential game state.
67. The system of claim 63, wherein said evaluating a likelihood score comprises means for counting the total number of cards in a potential game state.
68. The system of claim 63, wherein said evaluating a likelihood score comprises means for counting unmatched cards or card hands between a current data frame and a potential game state.
69. The system of claim 63, wherein said evaluating comprises means for analyzing a historical record of likelihood scores for a potential game state.

70. The system of claim 60, further comprising means for examining said game states previous to said first state and means for utilizing one or more of said game states previous to said first state to create said second state based upon said game event.
71. The method of claim 45, wherein said method is embodied in computer instructions in a computer readable medium.
72. The method of claim 52, wherein said method is embodied in computer instructions in a computer readable medium.
73. A system for identifying a gaming object on a gaming table comprising: at least one overhead camera for capturing an image of said table; a detection module for detecting a feature of said object on said image; a search module for extracting a region of interest of said image that describes said object from said feature; a feature space module for transforming a feature space of said region of interest to obtain a transformed region of interest; and an identity module comprising a statistical classifier trained to recognize said object from said transformed region.
74. The system of claim 73, wherein said feature space module comprises a Principal Component Analysis module for transforming said feature space according to principal component analysis algorithms.

75. The system of claim 73, further comprising a dimensionality reduction module for reducing said transformed region into a reduced representation according to dimensionality reduction algorithms, wherein said statistical classifier is trained to recognize said object from said reduced representation.
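The principal-component transform and dimensionality reduction of claims 74 and 75 can be sketched with NumPy. The data shapes (flattened 4x4 "card index" images) and the function names are illustrative only; the specification does not fix an image size or an implementation.

```python
# Sketch of a PCA feature-space module: learn principal axes from flattened
# region-of-interest images, then project each new region into a reduced
# representation suitable for a statistical classifier.
import numpy as np

def fit_pca(samples, n_components):
    """Learn a PCA basis from flattened region-of-interest images
    (one image per row); returns the mean and top principal axes."""
    mean = samples.mean(axis=0)
    centered = samples - mean
    # Right singular vectors of the centered data are the principal axes.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:n_components]

def transform(roi, mean, components):
    """Project one flattened region of interest into the reduced space."""
    return components @ (roi - mean)

# Toy data: 12 "images" of 4x4 = 16 pixels, reduced to 3 features.
rng = np.random.default_rng(0)
samples = rng.normal(size=(12, 16))
mean, comps = fit_pca(samples, 3)
feature = transform(samples[0], mean, comps)
```

The 3-element `feature` vector, rather than the raw 16 pixels, would then be what the trained classifier of claim 73 sees.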
76. The system of claim 73, wherein said identity module comprises a cascade of classifiers.
77. The system of claim 73, wherein said detection module comprises a cascade of classifiers.
78. The system of claim 76, further comprising a boosting module for combining weak ones of said cascade of classifiers.
79. The system of claim 77, further comprising a boosting module for combining weak ones of said cascade of classifiers.

80. The system of claim 76, wherein said detection module comprises a cascade of classifiers, further comprising a boosting module for combining weak classifiers of said cascades of classifiers.
81. The system of claim 73, wherein said object is a card belonging to a deck of cards, and further comprising a deck verification module for receiving a suit and a rank of said card from said statistical classifier, and verifying that said deck of cards adheres to a provided set of standards.

82. The system of claim 73, wherein said object is a playing card, and said region of interest is a region of said image occupied by an index of said card.
83. The system of claim 82, wherein said region of interest is a region of said image occupied by a suit of said card.
84. A method of identifying a value of a playing card placed on a game table comprising: capturing an image of said table; detecting at least one feature of said playing card on said image; delimiting a target region of said image according to said feature, wherein said target region overlaps a region of interest, and said region of interest describes said value; scanning said target region for a pattern of contrasting points; detecting said pattern; delimiting said region of interest of said image according to a position of said pattern; and analyzing said region of interest to identify said value.

85. The method of claim 84, wherein said feature is a segment of an edge of said card.

86. The method of claim 85, further comprising determining at least two scan lines parallel to said edge within said target region, wherein said scanning is performed along said lines, and whereby said scanning is more efficient.
87. The method of claim 84, wherein said scanning is performed along lines perpendicular to said edge, and said detecting comprises recording a most contrasting point for each of said lines to obtain a series of points, and applying a pattern recognition algorithm to said series to identify a pattern characteristic of a card identifying symbol.
88. The method of claim 87, wherein said applying a pattern recognition algorithm comprises convolving said pattern with a mask of properties expected from a card identifying symbol.
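The scan-and-convolve steps of claims 87 and 88 can be sketched in plain Python. The contrast values and the three-tap mask below are invented for illustration; the specification does not state a particular mask.

```python
# Sketch of claims 87-88: take the most contrasting point on each scan line
# to build a 1-D series, then convolve that series with a mask of the
# contrast profile expected from a card identifying symbol.

def most_contrasting(points):
    """Index of the strongest contrast along one scan line (claim 87)."""
    return max(range(len(points)), key=lambda i: points[i])

def convolve(series, mask):
    """Valid-mode 1-D convolution of the contrast series with the mask."""
    n = len(series) - len(mask) + 1
    return [sum(series[i + j] * m for j, m in enumerate(mask)) for i in range(n)]

# One contrast value per scan line; a card index produces a localized bump.
series = [0, 1, 0, 5, 6, 5, 0, 1, 0]
mask = [1, 2, 1]                    # assumed bump profile of an index symbol
scores = convolve(series, mask)
best = scores.index(max(scores))    # scan line where the symbol pattern starts
```

The peak of `scores` localizes the symbol, which in the method above delimits the region of interest handed to the identification step.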
89. The method of claim 84, wherein said feature is a corner of said card.
90. A system for detecting an inconsistency with respect to playing cards dealt on a game table comprising: a card reader for determining an identity of each playing card as it is being dealt on said table; an overhead camera for capturing images of said table; a recognition module for determining an identity of each card positioned on said table from said images; and a tracking module for comparing said identity determined by said card reader with said identity determined by said recognition module, and detecting said inconsistency.
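The tracking module's comparison in the system above reduces to checking the reader's dealt sequence against the identities the vision system recognises on the table. A minimal sketch, with invented card codes and a hypothetical function name:

```python
# Sketch of claim 90's tracking module: pairwise-compare the identities
# reported by the card reader (in dealing order) with those recognised by
# the overhead recognition module, and flag every position that disagrees.

def find_inconsistencies(reader_cards, vision_cards):
    """Return the dealing positions where the two sources disagree."""
    return [i for i, (r, v) in enumerate(zip(reader_cards, vision_cards))
            if r != v]

reader = ["AS", "KD", "7C"]      # identities reported while dealing
vision = ["AS", "KD", "7H"]      # identities recognised on the table
flagged = find_inconsistencies(reader, vision)
```

Any non-empty result signals an inconsistency, e.g. a swapped or substituted card between the shoe and the table.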
91. The system of claim 90, wherein said card reader determines a dealing order of said each playing card as it is being dealt on said table, said recognition module determines a position of said each card positioned on said table, and said tracking module compares said identity and said order determined by said card reader with said identity and said position determined by said recognition module and detects said inconsistency according to procedures of a game.
92. The system of claim 90, wherein said recognition module determines an approximate identity of said each card positioned on said table, and said tracking module compares said approximate identity with said identity determined by said card reader, and detects said inconsistency.
93. The system of claim 90, wherein said card reader is comprised in a card shoe for storing playing cards to be dealt on said table.

DATED this 3rd day of May 2006
TANGAM GAMING TECHNOLOGY INC
WATERMARK PATENT TRADEMARK ATTORNEYS
290 BURWOOD ROAD
HAWTHORN VIC 3122
AU2006201849A 2005-05-03 2006-05-03 Gaming object position analysis and tracking Abandoned AU2006201849A1 (en)

Applications Claiming Priority (14)

Application Number Priority Date Filing Date Title
US67693605P 2005-05-03 2005-05-03
US60/676936 2005-05-03
US69340605P 2005-06-24 2005-06-24
US60/693,406 2005-06-24
US72348105P 2005-10-05 2005-10-05
US72345205P 2005-10-05 2005-10-05
US60/723,452 2005-10-05
US60/723,481 2005-10-05
US73633405P 2005-11-15 2005-11-15
US60/736,334 2005-11-15
US76036506P 2006-01-20 2006-01-20
US60/760,365 2006-01-20
US77105806P 2006-02-08 2006-02-08
US60/771,058 2006-02-08

Publications (1)

Publication Number Publication Date
AU2006201849A1 true AU2006201849A1 (en) 2006-11-23

Family

ID=37461021

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2006201849A Abandoned AU2006201849A1 (en) 2005-05-03 2006-05-03 Gaming object position analysis and tracking

Country Status (2)

Country Link
US (1) US20070077987A1 (en)
AU (1) AU2006201849A1 (en)

Families Citing this family (64)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8905834B2 (en) * 2007-11-09 2014-12-09 Igt Transparent card display
US7901285B2 (en) * 2004-05-07 2011-03-08 Image Fidelity, LLC Automated game monitoring
EP1901822A2 (en) * 2005-05-19 2008-03-26 Image Fidelity Llc. Remote gaming with live table games
US9524606B1 (en) 2005-05-23 2016-12-20 Visualimits, Llc Method and system for providing dynamic casino game signage with selectable messaging timed to play of a table game
US7388494B2 (en) * 2005-12-20 2008-06-17 Pitney Bowes Inc. RFID systems and methods for probabalistic location determination
US7704144B2 (en) * 2006-01-20 2010-04-27 Igt Player ranking for tournament play
US7690996B2 (en) 2006-11-06 2010-04-06 Igt Server based gaming system and method for providing one or more tournaments at gaming tables
AU2008205438B2 (en) 2007-09-13 2012-07-26 Universal Entertainment Corporation Gaming machine and gaming system using chips
US9174114B1 (en) * 2007-11-13 2015-11-03 Genesis Gaming Solutions, Inc. System and method for generating reports associated with casino table operation
US8896444B1 (en) 2007-11-13 2014-11-25 Genesis Gaming Solutions, Inc. System and method for casino table operation
US9165420B1 (en) 2007-11-13 2015-10-20 Genesis Gaming Solutions, Inc. Bet spot indicator on a gaming table
US8333655B2 (en) 2008-07-11 2012-12-18 Wms Gaming Inc. Methods of receiving electronic wagers in a wagering game via a handheld electronic wager input device
US8529345B2 (en) 2008-10-02 2013-09-10 Igt Gaming system including a gaming table with mobile user input devices
US8306819B2 (en) * 2009-03-09 2012-11-06 Microsoft Corporation Enhanced automatic speech recognition using mapping between unsupervised and supervised speech model parameters trained on same acoustic training data
CN101853389A (en) * 2009-04-01 2010-10-06 索尼株式会社 Detection device and method for multi-class targets
US20100273547A1 (en) * 2009-04-28 2010-10-28 Stasi Perry B Method and system for capturing live table game data
US20110065496A1 (en) * 2009-09-11 2011-03-17 Wms Gaming, Inc. Augmented reality mechanism for wagering game systems
GB2483168B (en) 2009-10-13 2013-06-12 Pointgrab Ltd Computer vision gesture based control of a device
US8433140B2 (en) * 2009-11-02 2013-04-30 Microsoft Corporation Image metadata propagation
US9710491B2 (en) * 2009-11-02 2017-07-18 Microsoft Technology Licensing, Llc Content-based image search
US20110106798A1 (en) * 2009-11-02 2011-05-05 Microsoft Corporation Search Result Enhancement Through Image Duplicate Detection
CN102456128B (en) * 2010-10-27 2014-06-11 徐继圣 Stereoscopic vision dice point identification system and method in uncontrolled environments
WO2012081012A1 (en) * 2010-12-16 2012-06-21 Pointgrab Ltd. Computer vision based hand identification
FR2982057B1 (en) * 2011-10-28 2014-06-13 Peoleo METHOD FOR RECOGNIZING AN IMAGE IN A SCENE
US8938124B2 (en) 2012-05-10 2015-01-20 Pointgrab Ltd. Computer vision based tracking of a hand
US10046230B1 (en) 2012-10-01 2018-08-14 Genesis Gaming Solutions, Inc. Tabletop insert for gaming table
EP4039344B1 (en) * 2013-08-08 2024-05-15 Angel Playing Cards Co., Ltd. A method for administrating a package of shuffled playing cards
EP3048555B1 (en) * 2013-09-20 2020-07-15 Fujitsu Limited Image processing device, image processing method, and image processing program
US20150199872A1 (en) * 2013-09-23 2015-07-16 Konami Gaming, Inc. System and methods for operating gaming environments
US20150087417A1 (en) * 2013-09-23 2015-03-26 Konami Gaming, Inc. System and methods for operating gaming environments
WO2015143207A2 (en) * 2014-03-19 2015-09-24 Maurice Mills On-line remote game system
US10515284B2 (en) 2014-09-30 2019-12-24 Qualcomm Incorporated Single-processor computer vision hardware control and application execution
US20170132466A1 (en) 2014-09-30 2017-05-11 Qualcomm Incorporated Low-power iris scan initialization
US10096206B2 (en) * 2015-05-29 2018-10-09 Arb Labs Inc. Systems, methods and devices for monitoring betting activities
US10410066B2 (en) 2015-05-29 2019-09-10 Arb Labs Inc. Systems, methods and devices for monitoring betting activities
KR102305619B1 (en) 2015-08-03 2021-09-27 엔제루 구루푸 가부시키가이샤 Management system for table games, substitute currency for gaming, inspection device, and management system of substitute currency for gaming
US11074780B2 (en) 2015-08-03 2021-07-27 Angel Playing Cards Co., Ltd. Management system of substitute currency for gaming
CN115624738A (en) 2015-08-03 2023-01-20 天使集团股份有限公司 Medal, inspection device, method for manufacturing medal, and management system for table game
AU2016302657A1 (en) * 2015-08-03 2018-02-22 Angel Playing Cards Co., Ltd. Fraud detection system at game parlor
US10970962B2 (en) 2015-08-03 2021-04-06 Angel Playing Cards Co., Ltd. Management system of substitute currency for gaming
AU2015261614A1 (en) * 2015-09-04 2017-03-23 Musigma Business Solutions Pvt. Ltd. Analytics system and method
US10872505B2 (en) * 2016-01-05 2020-12-22 Ags Llc Electronic gaming devices for playing a card game having multiple wagering opportunities
WO2017129611A1 (en) * 2016-01-27 2017-08-03 Evolution Malta Ltd Method and system for card shuffle integrity tracking
US10650550B1 (en) * 2016-03-30 2020-05-12 Visualimits, Llc Automatic region of interest detection for casino tables
US11308642B2 (en) * 2017-03-30 2022-04-19 Visualimits Llc Automatic region of interest detection for casino tables
US10217312B1 (en) * 2016-03-30 2019-02-26 Visualimits, Llc Automatic region of interest detection for casino tables
US20210090380A1 (en) * 2016-05-09 2021-03-25 Ags Llc Methods, devices and systems for processing wagers associated with games having multiple wagers
SG11201809960YA (en) * 2016-05-16 2018-12-28 Sensen Networks Group Pty Ltd System and method for automated table game activity recognition
AU2017305765A1 (en) * 2016-08-02 2019-02-21 Angel Playing Cards Co., Ltd. Inspection system and management system
US10061984B2 (en) * 2016-10-24 2018-08-28 Accenture Global Solutions Limited Processing an image to identify a metric associated with the image and/or to determine a value for the metric
US10614332B2 (en) 2016-12-16 2020-04-07 Qualcomm Incorporated Light source modulation for iris size adjustment
US10984235B2 (en) * 2016-12-16 2021-04-20 Qualcomm Incorporated Low power data generation for iris-related detection and authentication
AT519722B1 (en) 2017-02-27 2021-09-15 Revolutionary Tech Systems Ag Method for the detection of at least one token object
KR102501264B1 (en) * 2017-10-02 2023-02-20 센센 네트웍스 그룹 피티와이 엘티디 System and method for object detection based on machine learning
WO2019068190A1 (en) 2017-10-03 2019-04-11 Arb Labs Inc. Progressive betting systems
US10841458B2 (en) * 2018-03-02 2020-11-17 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and storage medium
WO2020072664A1 (en) * 2018-10-02 2020-04-09 Gaming Partners International Usa, Inc. Vision based recognition of gaming chips
CN112541564B (en) * 2019-09-20 2024-02-20 腾讯科技(深圳)有限公司 Method and device for reducing calculation complexity of Bayes deep neural network
EP4046064A4 (en) * 2019-10-15 2024-03-06 Arb Labs Inc. Systems and methods for tracking playing chips
SG10201913152SA (en) * 2019-12-24 2021-07-29 Sensetime Int Pte Ltd Method And Apparatus For Detecting Dealing Sequence, Storage Medium And Electronic Device
CN111445504A (en) * 2020-03-25 2020-07-24 哈尔滨工程大学 Water-to-air distortion correction algorithm based on image sequence
CN112767227B (en) * 2021-03-12 2023-03-31 中山大学 Image watermarking method capable of resisting screen shooting
CN113457113B (en) * 2021-06-30 2024-05-31 南京邮电大学 Card game anti-cheating system and method based on NFC technology
AU2021240186A1 (en) * 2021-09-14 2023-03-30 Sensetime International Pte. Ltd. Status switching method and apparatus, edge computing device and computer storage medium

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0566015A3 (en) * 1992-04-14 1994-07-06 Eastman Kodak Co Neural network optical character recognition system and method for classifying characters in a moving web
US6126166A (en) * 1996-10-28 2000-10-03 Advanced Casino Technologies, Inc. Card-recognition and gaming-control device
US6134344A (en) * 1997-06-26 2000-10-17 Lucent Technologies Inc. Method and apparatus for improving the efficiency of support vector machines
US20020147042A1 (en) * 2001-02-14 2002-10-10 Vt Tech Corp. System and method for detecting the result of a game of chance
US20030062675A1 (en) * 2001-09-28 2003-04-03 Canon Kabushiki Kaisha Image experiencing system and information processing method
US20080113783A1 (en) * 2006-11-10 2008-05-15 Zbigniew Czyzewski Casino table game monitoring system
US6973213B2 (en) * 2001-10-12 2005-12-06 Xerox Corporation Background-based image segmentation
WO2003060846A2 (en) * 2001-12-21 2003-07-24 Cias, Inc. Combination casino table game imaging system for automatically recognizing the faces of players -- as well as terrorists and other undesirables -- and for recognizing wagered gaming chips
US6996268B2 (en) * 2001-12-28 2006-02-07 International Business Machines Corporation System and method for gathering, indexing, and supplying publicly available data charts
GB2389540A (en) * 2002-05-30 2003-12-17 Prime Table Games Isle Of Man Game Playing Apparatus
US20040023722A1 (en) * 2002-08-03 2004-02-05 Vt Tech Corp. Virtual video stream manager
US8905834B2 (en) * 2007-11-09 2014-12-09 Igt Transparent card display
UA72328C2 (en) * 2002-11-26 2005-02-15 Олександр Іванович Кириченко Game equipment for table games with the use of playing-cards and tokens, specifically the playing-cards for black jack game
EP1663419B1 (en) * 2003-09-05 2008-02-20 Bally Gaming International, Inc. Systems, methods, and devices for monitoring card games, such as baccarat
US8693043B2 (en) * 2003-12-19 2014-04-08 Kofax, Inc. Automatic document separation
US7901285B2 (en) * 2004-05-07 2011-03-08 Image Fidelity, LLC Automated game monitoring
US7499588B2 (en) * 2004-05-20 2009-03-03 Microsoft Corporation Low resolution OCR for camera acquired documents
US20060205508A1 (en) * 2005-03-14 2006-09-14 Original Deal, Inc. On-line table gaming with physical game objects
US7591728B2 (en) * 2005-07-01 2009-09-22 Gioia Systems, Llc Online gaming system configured for remote user interaction
US8342533B2 (en) * 2005-09-12 2013-01-01 Bally Gaming, Inc. Systems, methods and articles to facilitate playing card games with multi-compartment playing card receivers

Also Published As

Publication number Publication date
US20070077987A1 (en) 2007-04-05

Similar Documents

Publication Publication Date Title
US8016665B2 (en) Table game tracking
AU2006201849A1 (en) Gaming object position analysis and tracking
US20060252554A1 (en) Gaming object position analysis and tracking
US20070111773A1 (en) Automated tracking of playing cards
AU2021102740A4 (en) Computing apparatus applying machine learning to captured images of a gaming surface
US11798353B2 (en) System and method for synthetic image training of a neural network associated with a casino table game monitoring system
JP7481389B2 (en) Fraud detection system for amusement arcades
US11580746B2 (en) System and method for automated table game activity recognition
US11749053B2 (en) Systems, methods and devices for monitoring betting activities
US20050026680A1 (en) System, apparatus and method for automatically tracking a table game
US20060160600A1 (en) Card game system with automatic bet recognition
US20060160608A1 (en) Card game system with automatic bet recognition
JP2023065688A (en) Management system of substitution coins for game
AU2019201016B2 (en) Systems, methods and devices for monitoring betting activities
KR20240111733A (en) Fraud detection system in casino
KR20240105353A (en) Fraud detection system in casino

Legal Events

Date Code Title Description
DA2 Applications for amendment section 104

Free format text: THE NATURE OF THE AMENDMENT IS: AMEND PRIORITY DETAILS FROM 60/760,365 20 JAN 2005 US TO 60/760,365 20 JAN 2006 US.

MK1 Application lapsed section 142(2)(a) - no request for examination in relevant period