NZ788283A - System and method for automated table game activity recognition - Google Patents
System and method for automated table game activity recognition
- Publication number
- NZ788283A
- Authority
- NZ
- New Zealand
Abstract
Some embodiments relate to a system for automated gaming recognition, the system comprising: at least one image sensor configured to capture image frames of a field of view including a table game; at least one depth sensor configured to capture depth of field images of the field of view; and a computing device configured to receive the image frames and the depth of field images, and configured to process the received image frames and depth of field images in order to produce an automated recognition of at least one gaming state appearing in the field of view. Embodiments also relate to methods and computer-readable media for automated gaming recognition. Further embodiments relate to methods and systems for monitoring game play and/or gaming events on a gaming table.
Description
"System and method for automated table game activity recognition"
Cross-Reference
This application is a divisional application of New Zealand Patent Application No.
, which is a New Zealand national phase of filed on 16 May
2017, which claims the benefit of Australian Patent Application No. 2016901829 filed on 16
May 2016, the disclosures of which are incorporated herein by reference in their entirety.
Technical Field
[0001a] The described embodiments relate generally to monitoring table games. In particular,
embodiments relate to systems and methods for monitoring events in table games at gaming
venues.
Background
Casinos and other such venues are now using surveillance technology and other
management software in an effort to monitor players and plan their business strategy. They
seek to deploy real-time behaviour analytics, algorithms (or processes), and player tracking
techniques to maximise player revenue, optimise staffing and optimise the allocation of venue
floor space to the types of games which maximise venue revenue. Most casino-goers
participate in loyalty programs which require them to use player cards instead of coins, paper
money, or tickets. This has given casinos the opportunity to record and analyse individual
gambling behaviour, create player profiles and record such things as the amount each player bets, their wins and losses, and the rate at which they push slot machine buttons. However,
table games are less easily monitored than either slot machines or button operated gaming
machines.
Systems for monitoring and managing table games have typically proven to be
expensive to install and maintain, and have failed to achieve the accuracy levels which are
needed to be truly useful. Other options include having sensors in the casino chips and other
offline yield management solutions, however these have proven ineffective. The operating
environment of gaming venues is fast paced, with high levels of visual and auditory noise
and distractions, cards and betting chips can be in disordered positions on the table, and
illumination can vary considerably.
Casinos or other such gaming venues conduct several table based games such as
Baccarat, Blackjack and Roulette that involve players betting on the occurrence or non-occurrence of
specific events. Individual games have their own set of defined events that initiate the game,
determine the result of bets placed during the game or terminate a game. Most games are
conducted by a designated dealer who undertakes certain actions specific to each game that
may initiate a game, trigger events that determine the result of bets placed or terminate a
game.
Casinos and other such gaming venues have an interest in ascertaining transactional
data associated with events occurring on gaming tables or playing surfaces. This information
may assist in the planning of the venue’s business strategy and monitoring behaviour of
players. Information regarding the events and outcomes of games on tables may form a basis
for Casinos to ascertain optimal staffing, floor space allocation to specific games and other
such revenue enhancing or patron experience-enhancing decisions. One method of
ascertaining transactional data associated with events occurring on gaming tables employed
by Casinos is random sampling by individuals who visually inspect the events occurring on a
subset of tables and report the observed information. The reported information may be
extrapolated to estimate the overall level of activity on tables in the venue. However, such
visual inspection occurs at intervals of an hour or more and relies on human judgement, so
there can be inefficiencies with such methods.
It is desired to address or ameliorate one or more shortcomings or disadvantages
associated with prior techniques for monitoring events in table games at gaming venues, or to
at least provide a useful alternative.
Throughout this specification the word "comprise", or variations such as "comprises"
or "comprising", will be understood to imply the inclusion of a stated element, integer or step,
or group of elements, integers or steps, but not the exclusion of any other element, integer or
step, or group of elements, integers or steps.
In this specification, a statement that an element may be “at least one of” a list of
options is to be understood that the element may be any one of the listed options, or may be
any combination of two or more of the listed options.
Any discussion of documents, acts, materials, devices, articles or the like which has
been included in the present specification is not to be taken as an admission that any or all of
these matters form part of the prior art base or were common general knowledge in the field
relevant to the present disclosure as it existed before the priority date of each claim of this
application.
Summary
According to a first aspect, some embodiments provide a system for automated gaming recognition, the system comprising: at least one image sensor configured to capture
image frames of a field of view including a table game; at least one depth sensor configured to
capture depth of field images of the field of view; and a computing device configured to receive the image frames and the depth of field images, and configured to process the received image frames and depth of field images in order to produce an automated recognition of at least one gaming state appearing in the field of view.
According to a second aspect some embodiments provide a method of automated
gaming recognition, the method comprising: obtaining image frames of a field of view
including a table game; obtaining depth of field images of the field of view; and processing
the received image frames and depth of field images in order to produce an automated
recognition of at least one gaming state appearing in the field of view.
According to a further aspect some embodiments provide a non-transitory computer
readable medium for automated gaming recognition, comprising instructions which, when
executed by one or more processors, cause performance of the following: obtaining image frames of a field of view including a table game; obtaining depth of field images of the field
of view; and processing the received image frames and depth of field images in order to
produce an automated recognition of at least one gaming state appearing in the field of view.
The image frames may comprise images within or constituting the visible spectrum,
or may comprise infrared or ultraviolet images. The depth of field images may comprise time
of flight data points for the field of view, and/or phase information data points reflecting
depth of field. The at least one gaming state appearing in the field of view may comprise one
or more or all of: game start; chip detection; chip value estimation, chip stack height
estimation; and game end. Game start and/or game end may be effected by card detection or
dolly detection. The table game may be a card game such as poker, blackjack or baccarat, or a
non-card based game such as roulette.
Some embodiments relate to a method of monitoring game play on a table surface of
a gaming table, the method comprising: analysing in real time captured images of the table
surface to identify a presence of a game object in any one of a plurality of first pre-defined
regions of interest on the table surface; in response to a game object being identified as
present in the any one of a plurality of the first pre-defined regions of interest, recording a
time stamp for a game event; and transmitting game event data to a server, the game event
data comprising the time stamp, an indication of the gaming event and an identifier of the any
one of a plurality of the first pre-defined regions of interest.
The game object may be a game card or a position marker. The analysing may
comprise identifying a presence of at least one wager object on the table surface. The at least
one wager object may be different from the game object. The presence of the at least one
wager object may be identified in one or more of a plurality of second pre-defined regions of interest. The analysing may comprise identifying one or more groups of wager objects in the one or more second pre-defined regions of interest.
The analysing may further comprise: estimating a height of each of the one or more groups of wager objects with respect to the table surface; and estimating a number of wager
objects present in each group of wager objects. The analysing may further comprise
identifying a colour of an upper-most one of each group of wager objects.
The method may further comprise automatically estimating a wager amount
associated with each second pre-defined region of interest in which the presence of at least
one wager object is identified, wherein the estimating is based on the identified colour of the
upper-most wager object of each group of wager objects and the estimated number of wager
objects in each group of wager objects in the respective second region of interest.
The captured images may comprise multi-spectral images and the analysing may
further comprise multi-frame processing of the multi-spectral images to identify the presence
of the game object in any one of the plurality of the first pre-defined regions of interest or
second pre-defined regions of interest on the table surface.
Some embodiments relate to a system of monitoring game play on a table surface of
a gaming table, the system comprising: at least one camera configured to capture images of a
table surface; and a computing device in communication with the camera, said computing
device configured to analyse in real time captured images of the table surface to automatically
identify a presence of a game object in any one of a plurality of first pre-defined regions of
interest on the table surface.
The game object may be a game card or a position marker, for example. The
computing device may be configured to identify a presence of at least one wager object on the table surface. The at least one wager object may be different from the game object. The presence of the at least one wager object is identified by the computing device in one or more of a plurality of second pre-defined regions of interest. The computing device may be configured to identify one or more groups of wager objects in the one or more second pre-defined regions
of interest.
The at least one camera may further comprise a depth sensing device to communicate to the computing device depth data of the game objects with respect to the table surface. The computing device may be further configured to estimate a height of each of the one or more groups of wager objects with respect to the table surface. The computing device may be further
configured to estimate a number of wager objects present in each group of wager objects. The
computing device may be further configured to identify a colour of an upper-most one of each
group of wager objects.
The computing device may be configured to automatically estimate a wager amount
associated with each second pre-defined region of interest in which the presence of at least
one wager object is identified, wherein the estimating is based on the identified colour of the upper-most wager object of each group of wager objects and the estimated number of wager objects in each group of wager objects in the respective second region of interest.
The captured images may comprise multi-spectral images and the computing device
may be configured to perform multi-frame processing of the multi-spectral images to identify the presence of the game object in any one of the plurality of the first pre-defined regions of interest or second pre-defined regions of interest on the table surface.
Some embodiments relate to a system for automated monitoring of gaming events on a
gaming table comprising: a depth imaging device configured to capture depth of a plurality of
game objects on a gaming region of the gaming table; a plurality of visual imaging cameras
configured to capture visual images of the gaming region; a gaming configuration module
comprising: configuration data associated with a plurality of games, configuration of the
gaming table, location of regions of interest, patterns to recognise game objects, and
definition of gaming events as a change in state of game objects on the gaming table; and a
computer system that receives data from the depth imaging device and the plurality of visual
imaging cameras, and is configured to access the configuration data in the gaming
configuration module to automatically recognise objects on the gaming region and gaming
events occurring during a game.
The methods described herein may be fully automated, so that game activity
monitoring can occur without any need for human judgement or intervention. However, some
human interaction can occur in system configuration steps, such as establishing regions of
interest for betting and for locating game objects, like cards or chips.
Brief Description of Drawings
Figure 1 is a block diagram of a Gaming Monitoring System.
Figure 2 is a schematic diagram of a system for automated table gaming recognition,
forming part of the Gaming Monitoring System of Figure 1.
Figure 3 is an image of a surface of a Gaming Table that may form part of a Gaming
Environment of the system of Figure 1.
Figure 4 is an image of a surface of another Gaming Table that may form part of the
Gaming Environment of the system of Figure 1.
Figure 5 is an image of a surface of another Gaming Table that may form part of the
Gaming Environment of the system of Figure 1.
Figure 6 is an image of a surface of another Gaming Table that may form part of the
Gaming Environment of the system of Figure 1.
Figure 7 is a block diagram of a Depth Sensing Device and Camera for use in the
system of Figure 1.
Figure 8 is a front view of the inside of a housing of a Depth Sensing Device and
Camera according to some embodiments.
Figure 9 is a block diagram of a Computing Device of the system of Figure 1.
Figure 10 is a block diagram of a Message Broker Server of the system of Figure 1.
Figure 11 is a block diagram of a Database Server of the system of Figure 1.
Figure 12 is a block diagram of a Web Application Server of the system of Figure 1.
Figure 13 is a screen shot of a Web Application showing an interface for managing
the configuration of another Gaming Table that may form part of a Gaming Environment of
the system of Figure 1.
Figure 14 is another screen shot of the Web Application showing an interface for
managing the configuration of another Gaming Table that may form part of a Gaming
Environment of the system of Figure 1.
Figure 15 is another screen shot of the Web Application showing an interface for
managing the configuration of another Gaming Table that may form part of a Gaming
Environment of the system of Figure 1.
Figure 16 is a set of images to illustrate the results of several types of thresholding
operations on a sample image.
Figure 17 is a set of images to illustrate the results of further types of thresholding
operations on a sample image.
Figure 18 is a set of images to illustrate the results of erosion and dilation operations
on a sample image.
Figure 19 is a flowchart of an edge detection process.
Figure 20 is a diagram to illustrate edge detection in part of the process of Figure 19.
Figure 21 is an example graph to illustrate example criteria applied in the edge
detection process of Figure 19.
Figure 22 is a set of images to illustrate application of a contour detection process on
a sample image with different parameters.
Figure 23(a) is a plot of a set of points over which a plane estimation process is to be
applied.
Figure 23(b) is a plot of the result of the application of the plane estimation process
and the orthogonal distances of the estimated plane to points shown in the plot of Figure
23(a).
Figure 24 is a flowchart of a Game Monitoring System according to some
embodiments.
Figure 25 is a flowchart of a Game Monitoring System according to further
embodiments.
Figures 26(a) and 26(b) are image frames that illustrate the application of some card
detection processes on another Gaming Table.
Figure 27(a) is an image frame of another Gaming Table over which card and chip
detection processes may be applied.
Figure 27(b) is an image frame of the Gaming Table of Figure 27(a) obtained by
applying a thresholding technique to the image frame of Figure 27(a).
Figures 28(a) and 28(b) are image frames obtained by an infrared camera that
illustrate the application of some card and chip detection processes on another Gaming Table.
Figure 29(a) is an image frame that may be an input to a Chip Detection Process.
Figure 29(b) is an image frame obtained by application of a binary thresholding
operation on the image frame of Figure 29(a).
Figure 29(c) is an image frame obtained by application of an erosion operation on the
image frame of Figure 29(b).
Figure 29(d) is an image frame obtained by application of a dilation operation on the
image frame of Figure 29(c).
Figure 29(e) is an image frame that illustrates the results of application of a Chip
Detection Process on the input image frame of Figure 29(a).
Figure 30 is an image frame that illustrates the results of application of a Chip Detection Process on a Gaming Table wherein part of the view of a gaming table is
obstructed.
Figure 31 is an image frame that illustrates the results of application of a Chip Detection Process to the Gaming Table of Figure 30, obtained after the obstruction in Figure 30 is no longer in the image frame.
Figure 32 is an image frame that illustrates the results of application of a Chip
Detection Process to a Gaming Table, wherein the input image frame is based on an image
captured by a visual image camera.
Figure 33 is an image frame that illustrates the results of application of a Chip
Detection Process to the Gaming Table of Figure 32, wherein the input image frame is based
on an image captured by an infrared camera.
Figure 34 is a flowchart of a Multi Frame Processing Technique according to some
embodiments.
Detailed Description
Described embodiments relate generally to monitoring table games. In particular, embodiments relate to systems and methods for monitoring events in table games at gaming
venues.
Gaming Monitoring System: Figure 1 is a block diagram of a Gaming Monitoring
System 100 according to some embodiments. The system 100 may comprise a plurality of
Gaming Monitoring Setups 105, a Gaming Monitoring Infrastructure 115, an Administrator
Client 170 and a Database Client 180. The Gaming Monitoring Setup 105 comprises a
Gaming Environment 110, a Depth Sensing Device and Camera 120 and a Computing Device
130. The system 100 is suited for installation and operation in one or more gaming rooms of
a gaming venue, such as a casino. The gaming rooms each have one or multiple gaming
tables located therein and some or each of those tables may form part of a respective Gaming
Monitoring setup 105. Commercially available devices such as MicrosoftTM Kinect or AsusTM Xtion or an InfineonTM 3D Image Sensor REAL3TM or other similar depth sensing devices with
camera functions can be employed as a Depth Sensing Device and Camera, for example. The
depth sensing device and camera 120 is coupled with or connected to a Computing Device
130 to receive instructions from the Computing Device 130 and transmit recorded data to the
Computing Device 130 using a link 107. For example, a MicrosoftTM Kinect device may be
connected to a Computing Device using a USB port on the Computing Device.
A gaming venue may have multiple Gaming Environments, for example an area or
room where table games are played, and to monitor each one of those Gaming Environments,
there may be multiple ones of Gaming Monitoring Setup 105. Multiple Gaming Monitoring Setups 105 may be coupled or linked with a common Gaming Monitoring Infrastructure 115 using a network link 187. The network link 187 comprises network link 117 between the
Computing Device 130 and a Message Broker Server 140; and a network link 167 between a
Database Server 150 and the Computing Device 130. The Gaming Monitoring Infrastructure
115 may also be coupled with or linked to Gaming Monitoring Setups 105 in two or more
different gaming venues. In some embodiments where a gaming venue may have a large
number of Gaming Environments 110, multiple ones of Gaming Monitoring Infrastructure
115 may be coupled with different subsets of Gaming Monitoring Setups 105 in the same
venue.
The Gaming Monitoring Infrastructure 115 comprises the Message Broker Server
140, the Database Server 150, and a Web Application Server 160. The Message Broker Server
140 may be connected to a plurality of Computing Devices 130 through the two way Network
Link 117. Network link 127 may exist between the Message Broker Server 140 and the
Database Server 150 to enable the transfer of data or instructions. Network link 137 may exist
between the Web Application Server 160 and the Database Server 150 to enable the transfer
of data or instructions. Each of the servers 140, 150 and 160 may be implemented as
standalone servers or may be implemented as distinct virtual servers on one or more physical
servers or may be implemented in a cloud computing service. Each of the servers 140, 150
and 160 may also be implemented through a network of more than one server configured to
handle increased performance or high availability requirements.
The Administrator Client 170 may be an end user computing device such as a
Computer or a Tablet, for example, and may be connected to the Web Application Server 160 through the Network Link 147. The Database Client 180 may be an end user computing
device or an interface to relay data to other end user computing devices or other databases and
may be connected to the Database Server 150 through the Network Link 157.
Gaming Environment: Configuration of a Gaming Environment 110 may vary depending on a specific game being conducted, but most games monitored by any one of the
embodiments have common elements. Figure 2 illustrates a system for automated table
gaming recognition 200 in accordance with some embodiments. The main functions of the
system of the presently described embodiment of the invention are to detect when a game starts
and finishes, to detect a location of placed chips, and to estimate the value and the height
(how many chips) of a chip stack. This system is based on a combination of image processing
techniques and sensing device features.
The Gaming Environment 110 comprises a playing surface or a gaming table 210
over and on which the game is conducted. The playing surface 210 is commonly a
substantially horizontal planar surface and may have placed thereon various game objects,
such as cards 211 or chips 213 or other objects, that may be detected by the Gaming
Monitoring System 100. The depth sensing device and camera 120 may be mounted on a
pillar or post 220 at a height so as to position the depth sensing device and camera 120 above
any obstructions in the field of view of the depth sensing device and angled to direct the field
of view of the depth sensing device and camera 120 somewhat downwardly towards the
gaming table 210. The obstructions may be temporary obstructions, such as a dealer
conducting a game at a table or a participant of a game or a passer-by, for example. In some
embodiments, the depth sensing device or camera 120 may be positioned behind and above
the dealers’ shoulders to get an unobstructed view of a playing surface of the gaming table
210. The position of the depth sensing device or camera 120 and the computing device 130
may be above or adjacent to other display screens on a pillar or post that are located at that
gaming table 210.
Figure 3 is an image 300 that illustrates part of the playing surface of a gaming table
configured for the game of Blackjack. The playing surface or gaming table comprises a
plurality of pre-defined regions of interest and, depending on the nature of the game, may
have a specific orientation and function with respect to the operation of the game. A predefined
region of interest may be designated for detection of specific game objects such as
game cards or wager objects. For example, in Figure 3, first pre-defined regions of interest
305 are designated to locate cards 211 dealt to a player and second pre-defined regions of
interest 308 are designated to locate the chips or wager objects 213 a player may wager in a
game. In some embodiments, one or more of a pre-defined region of interest may overlap with
one or more of another pre-defined region of interest. In some embodiments, one pre-defined
region of interest may form a part of another pre-defined region of interest.
Participants of the game include players who may place bets and dealers who
conduct the game. To place bets or conduct the game, objects described as Game Objects are
used by the players or dealers. Game Objects may comprise cards 211 in a specific shape with
specific markings to identify them, Chips or wager objects 213 or other such objects may
designate amounts players may wager in a game, or may comprise other objects with a
distinct shape that may designate the outcome of a game such as a position marker or a dolly
used in a game of roulette. The game is conducted through a series of Gaming Events that
comprises the start of a game, placing of bets by players during a game, intermediate outcomes during a game and the end of a game determining the final outcome of the game.
During a game, a player may place bets by placing his wager objects (i.e. betting tokens or
chips) in a region of interest designated for placing of bets. For example, in the game of
blackjack as shown in Figure 3, a player may place a bet during a game, by placing one or
more wager objects, such as chips 213, in their designated region 308 for placing a bet. The
chips or wager objects may be arranged in groups or stacks within a region of interest. Often a
group or stack of wager objects will comprise a common colour of wager objects.
In certain games, a player may choose a region of interest that is associated with the
likelihood of success and payoffs associated with a bet placed. For example, the playing surface 210 of Figure 6 is a playing surface with markings for a betting area for a game of
roulette. Different regions of interest 308 on the playing surface 600 may have different
prospects of success and payoffs for a player’s bet. Playing surfaces may have several different configurations in terms of location and structure of various regions of interest,
depending on different rules and/or betting conventions associated with the game. Figure 4 is
an image 400 of a gaming table configured for a game of Baccarat. Figure 5 is an image 500 of
a gaming table configured for another game according to some embodiments.
Depth Sensing Device and Camera: The Depth Sensing Device and Camera 120
performs the functions of capturing visual images and depth information of the field of view
before the device. The device 120 is placed before a playing surface or a gaming table in
order to capture all the designated regions of interest identified on the gaming table. The
device 120 comprises an infrared projector 710, an infrared sensor 720, a camera 730, a
processor 740, a communication port 750 and internal data links 705 that connect the infrared
sensor 720 and camera 730 with the processor 740. Internal data link 706 connects the
processor 740 with the communication port 750. The Depth Sensing Device and Camera 120,
may capture images from multiple spectrums of human-visible and/or human-invisible light.
For example, the Depth Sensing Device and Camera 120 may capture an image from the
visible light spectrum through camera 730 and from the infrared spectrum through the
infrared sensor 720, and consequently may operate as a multi-spectral camera.
The Depth Sensing Device and Camera 120 may rely on the Time of Flight
technique to sense depth of the field of view or scene before it. The infrared projector 710
may project a pulsed or modulated light in a continuous wave that may be sinusoidal or a
square wave. Multiple phases of projected light may be projected and sensed to improve
accuracy of the depth information. The measured phase shift between the light pulse emitted
by the infrared projector 710 and the reflected pulse sensed by the infrared sensor 720 is
relied upon by the processor 740 to calculate the depth of the field before the device. In some
embodiments, the Depth Sensing Device and Camera 120 may include a structured-light 3D
scanner working on the principle of using a projected light pattern and a camera for depth
sensing. In some embodiments, the Depth Sensing Device and Camera 120 may include a
stereo camera that performs the function of depth sensing by using images from two or more
lenses and corresponding image sensors. The Depth Sensing Device and Camera 120 may rely on other alternative means of depth or range sensing or a combination of two or more
techniques to acquire depth information of the Gaming Environment and of the table playing
surface 210 in particular.
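The specification does not give the conversion from phase shift to distance; as a hedged illustration only, for a continuous-wave time-of-flight sensor modulated at frequency $f$, the measured phase shift $\Delta\varphi$ between the emitted and reflected signals maps to a distance $d$ as:

$$d = \frac{c\,\Delta\varphi}{4\pi f}$$

where $c$ is the speed of light and the factor $4\pi$ accounts for the round trip of the projected light. The unambiguous range of such a sensor is $c/2f$, which is one reason multiple phases and modulation frequencies may be projected and sensed to improve accuracy, as noted above.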
The sensed depth information is combined with pixel grid information determined by
the processor and presented as an output to the communication port 750. The pixel grid
information is also combined with the visual images captured by the camera 730 and
presented as output through the port 750 in combination with the depth information. Apart from the depth information, the infrared sensor 720 also senses the intensity of the infrared light reflected by the field of view and this information is combined with the pixel grid information by the processor 740 and passed to the port 750. The port 750 may be in the form
of a physical port such as a USB port, for example, or a wireless transmitter such as a wireless
network adapter. The Depth Sensing Device and Camera 120 returns the sensed depth, colour
and infrared imaging data in different coordinate spaces that may be mapped with each other
to get mapped depth, colour and infrared data associated with a specific region or point in the
field of view of the device. Figure 8 is a front view of the inside of a housing of a Depth
Sensing Device and Camera according to one embodiment 800 and illustrates some of its
components including the Infrared Projector 710, Infrared Sensor 720, and Camera 730.
Computing Device: The data generated by the Depth Sensing Device and Camera
120 is received by the Computing Device 130 through the communication port 990. The port
990 may be in the form of a USB port or a wireless adapter that couples with the communication port 750 to receive sensor data or transmit instructions to the Depth Sensing
Device and Camera 120. Hardware Components 910 of the computing device 130 comprise
Memory 914, Processor 912 and other components necessary for operation of the computing
device. Memory 914 stores the necessary Software Modules 920 which comprise: an Image
Processing Library 922; Depth Sensing Device and Camera API 924; Runtime Environment
Driver 926; Gaming Monitoring Module 928; Batch Scripts 930; Scheduled Jobs 932; and a
Message Producer Module 934.
The Image Processing Library 922 is a set of programs to perform basic image
processing operations, such as performing thresholding operations, morphological operations
on images and other programs necessary for the image processing steps undertaken by
Gaming Monitoring Module 928. OpenCV is an example of an Image Processing Library that
may be employed. The Depth Sensing Device and Camera API 924 is a set of programs that
enables the Computing Device 130 to establish a communication channel with one or more Depth Sensing Device and Camera 120. For example, if a MicrosoftTM KinectTM device is employed as a Depth Sensing Device and Camera, then a Kinect for WindowsTM SDK will be employed as the Depth Sensing Device and Camera API 924. This API 924 enables the
Computing Device 130 to make queries to the Depth Sensing Device and Camera 120 in an
appropriate protocol and to understand the format of the returned results. This API 924
enables the data generated by the Depth Sensing Device and Camera 120 to be received and
processed by the Gaming Monitoring Module 928. The computing device 130 also has the
necessary Runtime Environment Drivers 926 to provide the necessary dependencies for the
execution of the Gaming Monitoring Module 928. The Gaming Monitoring Module 928
comprises the programs that monitor gaming events occurring in the course of a game.
The Software Modules 920 also comprise Batch Scripts 930 that may be in the form of Windows PowerShell scripts or scripts in other scripting languages to perform the necessary housekeeping and maintenance operations for the Gaming Monitoring Module 928. The batch scripts 930 may be executed on a scheduled basis through the Scheduled Jobs 932 that may be in the form of Windows scheduler jobs or other similar job scheduling services. The Message Producer Module 934, based on instructions from the Gaming Monitoring Module 928, produces messages that are passed on to the Message Broker Server 140. The Message
Producer Module may be based on a standard messaging system, such as RabbitMQ or Kafka,
for example. Based on the stored Message Broker Configuration 946 in the Configuration Module 940, the Message Producer Module 934 may communicate messages to the Message Broker Server 140 through the Communication Port 990 and the network link 117. The Configuration
Module 940 also comprises Table Configuration 942 and Game Start and End Trigger
Configuration 944. The components of the Configuration Module 940 are stored in the form
of one or more configuration files in the Memory 914. The configuration files may be stored
in an XML format, for example.
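The specification names RabbitMQ or Kafka as candidate messaging systems but does not include code. The following is a minimal, hypothetical sketch of how the Message Producer Module 934 might publish a gaming event message to the Message Broker Server 140 using the pika RabbitMQ client; the host name, queue name and payload fields are illustrative assumptions modelled on the Gaming Event Data fields described later.

```python
import json
import time

import pika  # RabbitMQ client library

# Hypothetical broker details; in practice these would be read from the
# Message Broker Configuration held in the Configuration Module 940.
connection = pika.BlockingConnection(pika.ConnectionParameters(host="broker.local"))
channel = connection.channel()
channel.queue_declare(queue="gaming_events", durable=True)

# Illustrative gaming event payload (field names are assumptions).
event = {
    "timestamp": time.time(),        # time the gaming event was recognised
    "table_id": "TBL-042",           # unique identifier of the gaming table
    "event_type": "bet_placed",      # nature of the gaming event
    "region_of_interest": "bet_3",   # region associated with the event
    "estimated_bet_value": 125.0,    # estimated wager amount
}

channel.basic_publish(exchange="", routing_key="gaming_events", body=json.dumps(event))
connection.close()
```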
Message Broker Server: The Message Broker Server 140 implements a message
brokering service and listens for messages from a plurality of Computing Devices 130
through the network link 117. The Message Broker Server 140 may be located on the same
premises as the Computing Device 130 within a common local network or it may be located
off-premises (remotely) but still in communication via the network link 117 established
between the two premises to enable the transfer of messages and data. The Message Broker
Server 140 may be centralised and connected to Computing Devices 130 in a plurality of
gaming venues to provide a centralised message brokering service. The Message Broker
Server 140 has Hardware Components 1010 comprising Memory 1014, Processor 1012 and
other necessary hardware components for the operation of the server. The Message Queue
Module 1020 implements a queue to receive, interpret and process messages from a plurality of Computing Devices 130. The messages are received through the Communication Port 1090, which may be in the form of a Network Adapter or other similar ports capable of enabling two way transfer of data and instructions to and from the Message Broker Server 140. The
Message Queue Module 1020 may be implemented through a message broker package such
as RabbitMQ or Kafka. The Message Queue Module 1020, on receiving a message comprising transaction information regarding gaming events occurring on a gaming table, initiates a
Database Parsing Module 1030. The Database Parsing Module 1030 parses the message
received by the Message Queue Module 1020 into a database query that is subsequently
executed on the Database Server 150 through the Network Link 127.
Database Server: The Database Server 150 serves the purpose of receiving gaming event data from the Message Broker Server 140, storing table configuration data that is created through the Web Application Server 160 and serving as a repository for the Database
Client 180 to provide access to the gaming event data captured by the Gaming Monitoring
System 100. The Database Server 150 has Hardware Components 1110 comprising Memory
1114, Processor 1112 and other necessary hardware components for the operation of the
server. A Communication Port 1190 may be in the form of a Network Adapter or other similar
ports capable of enabling two way transfer of data and instructions to and from the Database
Server 150 through one or more network links. Database Module 1120 may be implemented
through a database management system such as MySQLTM, Postgres or MicrosoftTM SQL
Server.
The Database Module 1120 holds data comprising Table Configuration Data 1122
and Gaming Event Data 1124. Gaming Event Data 1124 comprises transaction data
representing Gaming Events that occur on a gaming table or a playing surface. The records
forming Gaming Event Data may comprise a timestamp for the time a gaming event was
recognised; a unique identifier for the gaming table on which the gaming event occurred; an identifier for the nature of the gaming event, such as placing of a bet, intermediate outcome of a game, or final outcome of a game; an identifier of a region of interest associated with the gaming event; an estimate of a bet value associated with a region of interest; and other relevant attributes representing a gaming event.
The Table Configuration Data 1122 comprises: unique identifiers for gaming tables
and associated Computing Device 130; location of regions of interest 308 on the gaming table
in the form of polygons and coordinates of pixels associated with the Depth Sensing Device and Camera 120 forming the endpoints of the polygons; the nature of the region of interest
308, whether it is a region for placing cards or for placing chips or for placing a specific
gaming object to be detected; nature of game start and end triggering events, whether the start
of a game is detected by placing of cards on the region of interest or the placing of a specific
gaming object on a specific region of interest; model contours for game objects such as cards
or chips, for example to enable detection by the Gaming Monitoring Module 928; and other
relevant data necessary to represent the parameters relied on by the Gaming Monitoring
System 100. In some embodiments, the Table Configuration Data 1122 and Gaming Event
Data 1124 may be held in separate database servers to enable greater scalability and
manageability of the Gaming Monitoring System 100.
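The exact schema of the XML configuration files is not reproduced in this specification; the sketch below assumes a hypothetical layout in which each region of interest is a polygon of pixel coordinates, and parses it with Python's standard xml.etree module.

```python
import xml.etree.ElementTree as ET

# Assumed XML layout (not the actual schema used by the system):
# <table id="TBL-042">
#   <region id="bet_3" type="chips">
#     <point x="412" y="310"/><point x="480" y="310"/><point x="446" y="360"/>
#   </region>
# </table>
def load_regions_of_interest(path):
    regions = {}
    for region in ET.parse(path).getroot().iter("region"):
        polygon = [(int(p.get("x")), int(p.get("y"))) for p in region.iter("point")]
        regions[region.get("id")] = {
            "type": region.get("type"),  # e.g. cards / chips / position marker
            "polygon": polygon,          # vertices in camera pixel coordinates
        }
    return regions
```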
The Database Server 150 also comprises a Table Configuration Propagator Module
1140 which performs the function of propagating Table Configuration Data 1122 to the
respective Computing Device 130. The Table Configuration Propagator Module may be
implemented through a combination of database scripts and command line scripts that first generate Table Configuration 942, Game Start and End Trigger Configuration 944 and
Message Broker Configuration 946 in the form of a configuration file such as an XML file,
for example. The generated configuration files may be transferred to the respective
Computing Device 130 through the Communication Port 1190 relying on a Network Link
167. The Network Link 167 may be a local network link if the Database Server 150 and the
Computing Device 130 are in the same local network or a network link spanning multiple
computer networks if the Database Server 150 and the Computing Device 130 are located in
separate networks. The transfer of the configuration files may be effected through an appropriate network protocol such as File Transfer Protocol or SSH File Transfer Protocol,
for example.
Web Application Server: A Web Application Server 160 hosts a Web Application
that facilitates the configuration and management of the Table Configuration Data 1122 on
the Database Server 150. The Web Application Server 160 has Hardware Components 1210
comprising Memory 1214, Processor 1212 and other necessary hardware components for the
operation of the server. A Communication Port 1290 may be in the form of a Network
Adapter or other similar ports capable of enabling two way transfer of data and instructions to and from the Web Application Server 160 through one or more network links. The Web
Application Server comprises a Web Application Module 1220 which comprises web
interfaces that enable a user to create and update Table Configuration Data 1122 on the
Database Server 150. The web application may be implemented through a web application
framework such as Django in python or ASP.NET or other similar web frameworks, for
example. The Web Application Server 160 also comprises a Database Parsing Module 1230
that translates instructions received by the Web Application Module 1220 through the web
interface into specific database queries or commands that will create or update the Table Configuration Data 1122 to reflect the operations undertaken by an Administrator Client 170.
The database queries or commands are executed on the Database Server 150 through a Network Link 137. The Network Link 137 may be a local area network link if the Database Server 150 and the Web Application Server 160 are in a common network or it may span multiple networks if the Database Server 150 and the Web Application Server 160 are located
in separate networks.
Web Interface: Figure 13 is a screen shot 1300 of a Web Application showing an
interface for managing the configuration of an embodiment of the Gaming Table that may
form part of a Gaming Environment 110. Parameters that may be required in setting up a
gaming table, which may include a unique identifier for a table and an IP address of an associated computing device, for example, are located in the screen region 1310. Part of an XML configuration file that may be propagated to the Computing Device 130 to codify the
Table Configuration 942 is shown in the screen area 1320. A button 1330 may be used to
create records for additional gaming tables and a submit button 1340 enables a user to submit
a new configuration.
Figure 14 is another screen shot 1400 of the Web Application showing another
interface for managing the configuration of an embodiment of the Gaming Table that may
form part of a Gaming Environment 110. A deploy button 1410 may be clicked to deploy a
set of saved configurations to the Computing Device 130 through the network link 167. The
delete button 1424 may be clicked to delete any saved configurations. 1420 is a sample of part of another XML file that may be used to store and propagate configuration information to Computing Device 130. Screen regions 1412, 1414 and 1416 represent depth, colour and infrared image streams from the Depth Sensing Device and Camera 120. Details of configurations associated with individual streams may be viewed by clicking the button 1422. The configuration details may be edited by clicking on the button 1418. The screen region
1440 allows a user to set up default configurations for all tables that may be saved by clicking
the save button 1430.
The button 1430 may be used to save changes to gaming table configurations before
deployment.
Figure 15 is another screen shot 1500 of the Web Application showing another
interface for managing the configuration of an embodiment of the Gaming Table that may
form part of a Gaming Environment 110. The interface shown in screenshot 1500 allows the types, locations and boundaries of the regions of interest to be defined using user interface
tools. Such defined regions of interest then become “pre-defined regions of interest” as
referred to herein once they are saved into the game configuration data. The button 1510 may
be clicked to get a refreshed image of a gaming table if the position of the gaming table with
respect to the Depth Sensing Device and Camera 120 changes. The image frame 1515 shown
in screenshot 1500 includes one or more bounded regions of interest 1520, with each bounded
region of interest defined by a polygon 1525. Custom polygons may be drawn using selectable handles 1560 and added to a list of polygons 1530, for example. Save button 1540 enables a user to save changes made to polygons and a remapping button 1550 enables a user to remap existing polygons to different locations.
To develop a system for automated recognition of gaming, it is necessary to
understand the behaviour of games. We deal with two kinds of gaming tables. One is a card-based game or card game, which is any game using playing cards as the primary device with which the game is played, be they traditional or game-specific. Examples of this type are
blackjack, baccarat, and poker. The other type of table game is not based on cards, for
example roulette. In this game, players may choose to place bets on either a single number or
a range of numbers, the colours red or black, or whether the number is odd or even. To determine the winning number and colour, a croupier spins a wheel in one direction, then spins a ball in the opposite direction around a tilted circular track running around the
circumference of the wheel.
The behaviour of card-based games is as follows.
• Players are free to place bets for some time.
• Dealer says 'no more bets' and starts dealing cards to players. This is when the
game starts.
• After the game results have been finalized, the dealer collects losing players’
chips and gives out winning chips.
• Then the dealer clears out all the cards. This is when the game ends.
The behaviour of non-card based games, especially roulette, is as follows.
• Players are free to place bets for some time.
• Dealer spins the wheel and waits for some time until the ball is close to
stopping, then announces 'no more bets'. This is when the game starts.
• After the ball has stopped, the dealer puts a dolly on the table on the winning number. Winning / losing chips are allocated / collected by the dealer.
• After that, the game ends.
Overall Monitoring Process: The Gaming Monitoring System 100 in its operation relies on two fundamental aspects: Contour Detection; and Plane Estimation. Contour Detection comprises a set of processes or techniques that may be performed in real-time or near real-time, to recognise shapes in images captured by the Depth Sensing Device and Camera 120. Near real-time processing may comprise processing with a latency of a few seconds, for example 2-3 seconds or less after the occurrence of an event. Plane Estimation comprises a set of processes or techniques that may be performed in real-time or near real-time, to estimate the position of a plane representing a gaming table or a playing surface based
on the depth data and images captured by the Depth Sensing Device and Camera 120. Once
the step of Plane Estimation is performed, the obtained plane position information may be
combined with additional depth data captured by the Depth Sensing Device and Camera 120
to estimate the height of a stack of game objects on a gaming table and make an inference
about the value associated with a stack of game objects such as a stack of chips, for example.
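The plane-estimation mathematics is not set out in this specification; as a hedged sketch only, assuming depth points sampled from the bare table surface, a least-squares fit of the plane z = ax + by + c can be computed with NumPy, and the height of a stack of game objects is then the orthogonal distance of the stack's top surface from the fitted plane (the chip thickness constant is an assumption):

```python
import numpy as np

def fit_table_plane(points):
    """Least-squares fit of z = a*x + b*y + c to an Nx3 array of depth points."""
    design = np.column_stack([points[:, 0], points[:, 1], np.ones(len(points))])
    (a, b, c), *_ = np.linalg.lstsq(design, points[:, 2], rcond=None)
    return a, b, c

def height_above_plane(point, plane):
    """Orthogonal distance of a 3D point from the plane z = a*x + b*y + c."""
    a, b, c = plane
    x, y, z = point
    return abs(a * x + b * y + c - z) / np.sqrt(a * a + b * b + 1.0)

# Usage sketch: the number of chips in a stack could be inferred by dividing
# the stack height by a nominal chip thickness (CHIP_THICKNESS is assumed).
# n_chips = round(height_above_plane(stack_top, plane) / CHIP_THICKNESS)
```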
Image pre-processing steps: Before the Contour Detection or Plane Estimation
techniques may be applied, a number of Image Pre-processing Steps are applied to the images
captured by the Depth Sensing Device and Camera 120. These Image Pre-processing Steps
improve the performance and accuracy of processes implementing the Contour Detection and
Plane Estimation techniques.
Thresholding: One image pre-processing technique that may be employed is
Thresholding. The Depth Sensing Device and Camera 120 returns data in colour and infrared
image frames of the Gaming Environment 110. Thresholding techniques may be applied to
both streams of colour and infrared data. Each pixel in a particular frame of either colour or
infrared stream is represented by a numeric value that indicates the colour or intensity of the
pixel. A colour image frame may be converted into a greyscale image frame before
performing a thresholding operation. Global thresholding is one method of implementing
thresholding. In Global thresholding each pixel value is compared with an arbitrary threshold value; if the pixel value is greater than the threshold it is assigned a value corresponding to white, for example, 255 in an 8 bit scale; else it is assigned a value corresponding to black, for example, 0 in an 8 bit scale. Through a series of images 1600, Figure 16 illustrates an
example of the results of thresholding on a sample image 1601. Image 1602 is the result of the
application of Global Thresholding to the Image 1601 using the value of 127, the Image 1601
being represented in an 8 bit format. Global thresholding may not be sufficient for a variety of
real world applications. An image may have different lighting conditions in various parts of the image and application of the Global Thresholding technique may diminish the parts of an
image with low lighting conditions.
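The specification identifies OpenCV as one example of an Image Processing Library; a minimal sketch of the global thresholding just described (threshold 127 on an 8-bit greyscale image, as in image 1602 of Figure 16) might be:

```python
import cv2

grey = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)  # greyscale conversion
# Pixels greater than 127 are set to 255 (white); all others to 0 (black).
_, binary = cv2.threshold(grey, 127, 255, cv2.THRESH_BINARY)
```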
Adaptive Thresholding is an alternative to address the limitations of Global
Thresholding. In Adaptive Thresholding, threshold values for different, small regions of an
image are found and applied to the regions for the purposes of thresholding. The threshold
values for adaptive thresholding may be calculated by taking the mean values of the pixels in
a neighbourhood of a pixel, or a weighted sum of neighbourhood pixels where the weights
may be taken from a Gaussian distribution. In Figure 16, image 1603 is an example of the output of application of the Adaptive Mean Thresholding technique and image 1604 is an example of the output of the application of the Adaptive Gaussian Thresholding technique to the
original image 1601. Another alternative method of thresholding is Otsu’s Binarization. In
this method the thresholding is performed based on image histograms. Through a series of
images 1700, Figure 17 illustrates an example of the application of Otsu’s Binarization
technique on a set of sample images 1701 with representative histograms 1702. One or more
of the alternative thresholding techniques may be applied at a pre-processing stage and to the
colour or infrared image frames. The Image Processing Library 922 may provide reusable
libraries that implement the proposed thresholding techniques that can be invoked by the
Gaming Monitoring Module 928 in the image pre-processing stage.
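Continuing the same hedged OpenCV sketch, the Adaptive Mean, Adaptive Gaussian and Otsu variants described above could be invoked as follows; the 11x11 block size and the constant 2 are illustrative assumptions:

```python
import cv2

grey = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)

# Threshold each pixel against the mean of its 11x11 neighbourhood, minus 2.
mean_thr = cv2.adaptiveThreshold(grey, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
                                 cv2.THRESH_BINARY, 11, 2)
# Threshold against a Gaussian-weighted sum of the neighbourhood pixels.
gauss_thr = cv2.adaptiveThreshold(grey, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                                  cv2.THRESH_BINARY, 11, 2)
# Otsu's Binarization chooses the threshold from the image histogram.
_, otsu_thr = cv2.threshold(grey, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
```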
Morphological Transformations: Morphological transformations are operations based on image shape, and are normally performed on binary images. They need two inputs: the original image, and a structuring element or kernel which decides the nature of the operation. Two basic morphological operators are Erosion and Dilation.
Morphological Transformations are performed on the images captured by the Depth
Sensing Device and Camera 120 in order to enhance the features to be detected in the images and improve the performance and accuracy of Contour Detection processes. Erosion and
Dilation are examples of morphological transformations that may be applied during the image
pre-processing stage. Both the erosion and dilation processes require two inputs, image data in
the form of a matrix captured by the Depth Sensing Device and Camera 120 and a structuring
element, or kernel which determines the nature of the morphological operation performed on
the input image. The Kernel may be in the shape of a square or a circle and has a defined
centre and is applied as an operation by traversing through the input image.
Erosion: A morphological transformation of erosion comprises a sharpening of foreground objects in an image by using a kernel such that, as the kernel traverses through an image, the value of a pixel is left at a value of 1, or a value corresponding to the white colour, only if all the values under the kernel are 1 or a value corresponding to the white colour. Kernels of size 3x3 or 5x5 or other sizes may be employed for the operation of erosion. The erosion operation erodes away the boundary of foreground objects. Through a series of
images 1800, Figure 18 illustrates an example of the application of erosion and dilation
operators. An example of the effect of the erosion operation on an input image 1801 may be seen in erosion output image 1802. The operation of erosion may be performed by a predefined library in the Image Processing Library 922. For example, if the OpenCV library is used, the function “erode” may be invoked by the Gaming Monitoring Module 928 to operate on an image captured by the Depth Sensing Device and Camera 120.
To achieve erosion, the kernel slides through the image (as in 2D convolution). A pixel in the original image (either 1 or 0) will be considered 1 only if all the pixels under the kernel are 1; otherwise it is eroded (made to zero).
Dilation: An operation of dilation is the inverse of erosion. For example, in a dilation operation using a 3x3 square matrix kernel, the pixel at the centre of the kernel may be left at a value of 1, or a value corresponding to the white colour, if any one of the values in the corresponding kernel is 1 or a value corresponding to the white colour. An example of the effect of this operation on an input image 1802 may be seen in the dilation output image 1803.
As a consequence of dilation, the features in an image become more continuous and connected. The operation of dilation may be performed by a predefined library in the Image Processing Library 922. For example, if the OpenCV library is used, the function “dilate” may be invoked by the Gaming Monitoring Module 928 to operate on an image captured by the Depth Sensing Device and Camera 120.
Dilation is just the opposite of erosion. Here, a pixel element is 1 if at least one pixel under the kernel is 1, so it increases the white region in the image, or the size of the foreground object increases. Normally, in cases like noise removal, erosion is followed by dilation, because erosion removes white noises but it also shrinks the object, so we dilate it afterwards. Since noise elements are removed by erosion they are not reintroduced by dilation, but the object area increases. It is also useful in joining broken parts of an object.
The application of a thresholding technique to an image produces a binary image. To
further enhance features present in an image, the morphological transformations of erosion
and dilation are applied. Advantageously, the morphological transformations assist in
reduction of noise from images, isolation of individual elements and joining disparate
elements in an image.
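A hedged OpenCV sketch of the erosion-then-dilation sequence described above, using an assumed 5x5 square kernel:

```python
import cv2
import numpy as np

binary = cv2.imread("thresholded.png", cv2.IMREAD_GRAYSCALE)
kernel = np.ones((5, 5), np.uint8)  # 5x5 square structuring element

eroded = cv2.erode(binary, kernel, iterations=1)    # erodes foreground boundaries, removing white noise
dilated = cv2.dilate(eroded, kernel, iterations=1)  # restores object area and joins broken parts
```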
An image contour comprises a curve joining all continuous points along the
boundary of an object represented in an image. Contours are a useful tool for shape analysis
and object detection and recognition. Contour Approximation is used to approximate the
similarity of a certain shape to that of the desired shape in the application. The desired shape
may be in the form of a polygon or a circle or an ellipse, for example. For better accuracy and
performance, contour detection operations may be performed on binary images after an Edge Detection operation has been performed.
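A hedged sketch of contour detection followed by contour approximation on a binary, edge-detected image, again assuming OpenCV (the 2% perimeter tolerance is an illustrative choice):

```python
import cv2

edges = cv2.imread("edges.png", cv2.IMREAD_GRAYSCALE)
# OpenCV 4.x returns (contours, hierarchy); OpenCV 3.x returns three values.
contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
for contour in contours:
    # Approximate the contour by a polygon whose maximum deviation from the
    # contour is 2% of its perimeter; e.g. a 4-vertex result may suggest a card.
    epsilon = 0.02 * cv2.arcLength(contour, True)
    polygon = cv2.approxPolyDP(contour, epsilon, True)
```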
Edge Detection: Edge detection is an image processing technique for finding the boundaries of objects within images. It works by detecting discontinuities in brightness. Among such techniques, Canny is a popular multi-stage edge detection algorithm (or process) which can be described in the following steps.
Edge detection may be performed by detecting brightness discontinuities between neighbouring pixels and pixel clusters. Several image processing techniques may be
employed to perform this operation. Some embodiments implement the Canny Edge detection
operator or process to detect edges in images captured by the Depth Sensing Device and Camera 120. Figure 19 is a flowchart 1900 that represents a series of steps that are involved in the implementation of the Canny Edge detection operator. The step 1910 involves the
preparation of an image for use as an input to the operator. This step may comprise
application of an appropriate thresholding technique to the image, and application of erosion
or dilation to e the performance of rest of the edge detection process. The step 1920
comprises reduction of unwanted noise from the image. This may be achieved with the
application of a 5x5 an filtering kernel, for example. This step smoothens the features
in the image and improves the performance of rest of the process. An example of a Gaussian
ing kernel that may be employed is as s:
⎡2  4  5  4  2⎤
⎢4  9 12  9  4⎥
⎢5 12 15 12  5⎥
⎢4  9 12  9  4⎥
⎣2  4  5  4  2⎦
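As a sketch of how this smoothing step may be applied (assuming OpenCV's Python bindings; the normalisation of the kernel by the sum of its entries, 159, is an assumption, since the text shows only the integer weights):

import cv2
import numpy as np

# The 5x5 Gaussian kernel above, normalised by the sum of its entries
# so that overall image brightness is preserved (assumed normalisation).
kernel = np.array([[2, 4, 5, 4, 2],
                   [4, 9, 12, 9, 4],
                   [5, 12, 15, 12, 5],
                   [4, 9, 12, 9, 4],
                   [2, 4, 5, 4, 2]], dtype=np.float32) / 159.0

gray = cv2.imread("input_frame.png", cv2.IMREAD_GRAYSCALE)
# Filter with the kernel (correlation, which is identical to convolution
# for this symmetric kernel).
smoothed = cv2.filter2D(gray, -1, kernel)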
The step 1930 comprises estimation of the intensity gradient of the image. To
perform this operation, the input image is filtered by two Sobel kernels. Operation of the
kernel Gx returns a first derivative of the image in the horizontal direction and kernel Gy
returns a first derivative of the input image in the vertical direction. The kernels Gx and Gy that
may be used are:
Gx = ⎡−1 0 1⎤        Gy = ⎡−1 −2 −1⎤
     ⎢−2 0 2⎥             ⎢ 0  0  0⎥
     ⎣−1 0 1⎦             ⎣ 1  2  1⎦
Based on the first horizontal and vertical derivatives of the input image, the edge
gradient G and the gradient direction θ of each pixel can be calculated as follows:

G = √(Gx² + Gy²)        θ = tan⁻¹(Gy / Gx)
The gradient direction is generally perpendicular to the edges, and may be rounded to
one of four angles which represent the vertical, horizontal and two diagonal directions.
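A minimal sketch of this gradient estimation step (assuming OpenCV's Python bindings; the file name and variable names are illustrative):

import cv2
import numpy as np

gray = cv2.imread("input_frame.png", cv2.IMREAD_GRAYSCALE)
smoothed = cv2.GaussianBlur(gray, (5, 5), 1.4)  # step 1920

# First derivatives using the Sobel kernels Gx and Gy shown above.
gx = cv2.Sobel(smoothed, cv2.CV_64F, 1, 0, ksize=3)  # horizontal derivative
gy = cv2.Sobel(smoothed, cv2.CV_64F, 0, 1, ksize=3)  # vertical derivative

# Edge gradient magnitude and direction for each pixel.
magnitude = np.sqrt(gx ** 2 + gy ** 2)
direction = np.degrees(np.arctan2(gy, gx))

# Round the direction to the four angles (0, 45, 90, 135 degrees) used
# during non-maximum suppression.
quantised = (np.round(direction / 45.0) * 45.0) % 180.0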
In step 1940, a complete scan of the edge intensity image may be done to remove any
unwanted pixels which may not constitute an edge or a desired edge. This is achieved by
checking whether each pixel is a local maximum in its neighbourhood in the direction of its
gradient. As illustrated in graph 2000 of Figure 20, point A is on an edge in a vertical
direction; the gradient direction is normal to the edge. Points B and C are located along the
gradient direction, therefore point A may be compared against points B and C to observe if it
forms a local maximum. If so, it is considered for the next stage 1950 in the process;
otherwise it may be suppressed by assigning to point A a pixel value of 0. The result is a
binary image with pixels of value 1 corresponding to thin edges and 0 to no edge.
In step 1950, it is estimated which of the edges detected in the previous step are true
positives, meaning they are more likely to represent an edge in the real world represented by
the input image, rather than a false positive. In order to perform this operation, two threshold
values may be defined: minVal and maxVal. Edges with an intensity gradient greater than
maxVal are considered to be sure edges, and those below minVal may be considered as
non-edges and discarded. The edges which lie within the two thresholds may be further
classified as edges or non-edges by their connectivity property. If they are connected to
sure-edge pixels, they are considered to form part of the edge. Otherwise, they may be
discarded as false positives. For example, in graph 2100 of Figure 21, edge A is above
maxVal, therefore it may be considered a true positive. Although edge C is below maxVal, it
is connected to edge A, and therefore it may also be treated as a true positive edge, and the
entire curve may be considered valid. Although edge B is above minVal and is in the same
region as that of edge C, it is not connected to any true positive edges and therefore it may be
treated as a false positive. Values for minVal and maxVal are chosen to achieve the optimal
result. For example, minVal may be set to a value between 20 and 60, and maxVal may be set
to a value between 60 and 180. This stage may also remove noise in the form of small pixel
clusters.
Some or all of the steps identified in Figure 19 may be performed through programs
available in the Image Processing Library 922. For example, if the OpenCV library is used,
the “canny” edge detection function call may be used. Other edge detection methods may
also be utilized as an alternative to Canny edge detection to achieve the same result of
identifying edges in an input image.
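As an illustrative sketch (assuming OpenCV's Python bindings; the threshold values are hypothetical picks from the ranges given above):

import cv2

gray = cv2.imread("table_frame.png", cv2.IMREAD_GRAYSCALE)
smoothed = cv2.GaussianBlur(gray, (5, 5), 1.4)  # step 1920

# cv2.Canny performs gradient estimation, non-maximum suppression and
# hysteresis thresholding internally (steps 1930-1950).
edges = cv2.Canny(smoothed, 40, 120)  # minVal = 40, maxVal = 120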
Contour Detection: After an edge detection operator has been applied to an input
image to identify edges, contour detection processes may be applied to the result of the edge
detection operation to approximate the similarity of shapes in an image to certain model
shapes, such as a polygon or a circle, for example.
Contours may be explained as a curve joining all the continuous points (along the
boundary) having the same colour or intensity. For better accuracy, binary images should be
used as an input to contour detection algorithms (or processes); thresholding or Canny edge
detection should therefore be applied before finding contours. In this application, border
following algorithms (or processes) are used for the topological analysis of digitized binary
images. These algorithms (or processes) determine the surroundedness relations among the
borders of a binary image. Since outer borders and hole borders have a one-to-one
correspondence to the connected components of the 1-pixels and to the holes, respectively,
the algorithm (or process) yields a representation of a binary image from which one may
extract some features without reconstructing the image. The second border following
algorithm (or process), which is a modified version of the first, follows only the outermost
borders (i.e., the outer borders which are not surrounded by holes).
Contour approximation is also performed, which approximates a contour shape to
another shape (a polygon) with a smaller number of vertices, depending upon the precision
specified. It may be implemented through the Douglas-Peucker algorithm as follows:
function DouglasPeucker(PointList[], epsilon)
    // Find the point with the maximum distance from the line joining
    // the first and last points
    dmax = 0
    index = 0
    end = length(PointList)
    for i = 2 to (end - 1) {
        d = perpendicularDistance(PointList[i],
                Line(PointList[1], PointList[end]))
        if (d > dmax) {
            index = i
            dmax = d
        }
    }
    // If max distance is greater than epsilon, recursively simplify
    if (dmax > epsilon) {
        // Recursive calls on the two sub-lists split at the farthest point
        recResults1[] = DouglasPeucker(PointList[1...index], epsilon)
        recResults2[] = DouglasPeucker(PointList[index...end], epsilon)
        // Build the result list
        ResultList[] = {recResults1[1...length(recResults1) - 1],
                        recResults2[1...length(recResults2)]}
    } else {
        ResultList[] = {PointList[1], PointList[end]}
    }
    // Return the result
    return ResultList[]
In Figure 22, a series of images 2200 represent various stages of the application of
the Douglas-Peucker algorithm. Image 2210 is the original image that may be used as an input
image. The line 2205 in image 2220 represents an approximated curve for a value of
epsilon equal to 10% of the arc length. The line 2215 in image 2230 represents an approximated
curve for a value of epsilon equal to 1% of the arc length.
Contour estimation operations may be performed using pre-packaged functions in the
Image Processing Library 922 by invoking them in the Gaming Monitoring Module 928. For
example, if OpenCV is used for implementing the contour estimation process, then the
functions “findContours”, “drawContours” or “approxPolyDP” may be invoked to
implement the process.
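A minimal sketch of this step (assuming OpenCV 4's Python bindings, where findContours returns two values; file name and thresholds are illustrative):

import cv2

gray = cv2.imread("table_frame.png", cv2.IMREAD_GRAYSCALE)
edges = cv2.Canny(gray, 40, 120)

# Contours from the binary edge image produced by the Canny step.
contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)

for contour in contours:
    # Epsilon as a fraction of the arc length, mirroring the 10% and 1%
    # examples of Figure 22.
    epsilon = 0.01 * cv2.arcLength(contour, True)
    approx = cv2.approxPolyDP(contour, epsilon, True)
    # A small vertex count suggests a polygonal shape; a large one, a
    # curve such as a circle or an ellipse.
    print(len(approx))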
Plane Estimation Process: As a precursor to the estimation of the height of a stack of
chips on the gaming table or the playing surface, the Gaming Monitoring System 100
estimates the position of a plane which comprises the gaming table or the playing surface.
One method of estimating the position of the plane is through Principal Component Analysis
(PCA). PCA minimizes the perpendicular distances from a set of data to a fitted model. This
is the linear case of what is known as Orthogonal Regression or Total Least Squares. For
example, given two data vectors, x and y, a line in the form of a linear equation with two
variables can be estimated that minimizes the perpendicular distances from each of the points
(xi, yi) to the line. More generally, an r-dimensional hyperplane can be fit in p-dimensional
space, where r is less than p. The choice of r is equivalent to the choice of the number of
components to retain during PCA.
The basic mathematical model of a plane can be formulated as:
ax + by + cz + d = 0
The values a, b, c and d need to be estimated to minimize the distance from points
selected on the gaming table or the playing surface. Based on a binary version of an image
obtained during the chip detection stage, 100 or more points that are not detected as chips may
be chosen as input points for PCA. Since these chosen points are points in a two dimensional
space, the depth information associated with the points is utilized to obtain co-ordinates in a
three dimensional space. The principle of the Pinhole Camera Model is relied on to convert the
points in the two dimensional space to points in the three dimensional space. Given a point in
the two dimensional space with coordinates (x, y), depth value Z and (Cx, Cy) as the x, y
coordinates of the optical centre of the Depth Sensing Device and Camera 120, the
coordinates (X, Y, Z) of the same point in the three dimensional space can be determined using
the following equations:
X = (x − Cx) · Z / fx        Y = (y − Cy) · Z / fy

where fx and fy denote the focal lengths of the Depth Sensing Device and Camera 120 in the
horizontal and vertical directions.
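A sketch of this back-projection (the intrinsics cx, cy, fx and fy are assumed known from camera calibration; the function name is illustrative):

import numpy as np

def back_project(x, y, z, cx, cy, fx, fy):
    # Pinhole-model back-projection of pixel (x, y) with depth z to a
    # 3D point (X, Y, Z), per the equations above.
    X = (x - cx) * z / fx
    Y = (y - cy) * z / fy
    return np.array([X, Y, z])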
Figures 23(a) and 23(b) illustrate the application of PCA to a set of points 2310 in a
three dimensional coordinate graph 2300. In coordinate graph 2301 in Figure 23(b), the
application of PCA enables identification of a plane 2350 that minimizes the orthogonal
distances 2320 between the points 2310 and the plane 2350. The plane estimation operations
may be performed using pre-packaged functions in the Image Processing Library 922 by
invoking them in the Gaming Monitoring Module 928. For example, if OpenCV is used for
implementing the plane estimation process, then the functions implemented by the class
“PCA” may be invoked to implement the plane estimation process, for example.
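A minimal sketch of the plane fit (using NumPy's SVD as the PCA step; the OpenCV PCA class mentioned above could be used equivalently, and the function name is illustrative):

import numpy as np

def fit_plane_pca(points):
    # Fit a plane ax + by + cz + d = 0 to an (N, 3) array of 3D points
    # by PCA / total least squares.
    centroid = points.mean(axis=0)
    centered = points - centroid
    # The right singular vector with the smallest singular value is the
    # direction of least variance, i.e. the plane normal (a, b, c).
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]
    d = -normal.dot(centroid)
    return normal, d

# Usage: 100 or more non-chip table points, back-projected to 3D.
# normal, d = fit_plane_pca(table_points)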
To estimate the value of chips, a traditional Euclidean distance between images can be used
to classify chips. Chip template images are collected in advance for comparison purposes,
and a k-nearest neighbours algorithm is then used to assign the value of chips. However, there are
some chip types with similar colour. To distinguish these types of chips, the
reflectivity of each chip type in the infrared image is further utilized to classify them with the
same k-nearest neighbours algorithm.
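An illustrative sketch of this classification (k = 1 for brevity; the template structure and the equal weighting of colour and infrared distances are assumptions, not the representation described here):

import numpy as np

def classify_chip(chip_img, chip_ir, templates):
    # templates maps a chip value to a (colour_template, ir_template)
    # pair; the nearest template by Euclidean distance wins.
    best_value, best_dist = None, np.inf
    for value, (tpl, tpl_ir) in templates.items():
        # Colour distance plus infrared reflectivity distance, so chip
        # types of similar colour are separated by their IR response.
        dist = (np.linalg.norm(chip_img.astype(float) - tpl.astype(float)) +
                np.linalg.norm(chip_ir.astype(float) - tpl_ir.astype(float)))
        if dist < best_dist:
            best_value, best_dist = value, dist
    return best_value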
Once the table plane has been estimated, the distance from the centre of each chip to that
plane may be estimated to obtain the height of a stack of chips. The number of chips may also
be estimated based on this height by a linear or non-linear mapping. The distance from a point
to a plane can be derived as follows. Given a plane in 3-dimensional space
ax + by + cz + d = 0
and a point x0 = (x0, y0, z0), the normal vector to the plane is given by

n = (a, b, c)

and the distance from that point to the plane is calculated as

D = |a·x0 + b·y0 + c·z0 + d| / √(a² + b² + c²)
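A sketch of the height and count estimation (the per-chip thickness constant is an assumed value for illustration, not a figure from the text, and the linear mapping is one of the options mentioned above):

import numpy as np

def point_plane_distance(point, normal, d):
    # Distance from a 3D point to the plane ax + by + cz + d = 0,
    # per the formula above.
    return abs(normal.dot(point) + d) / np.linalg.norm(normal)

CHIP_THICKNESS_MM = 3.3  # assumed thickness of a single chip

def estimate_chip_count(stack_height_mm):
    # Linear mapping from stack height to chip count.
    return int(round(stack_height_mm / CHIP_THICKNESS_MM))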
Claims (12)
1. A system for automated gaming recognition, the system comprising: at least one image sensor configured to capture image frames of a field of view including a table game; at least one depth sensor configured to capture depth of field images of the field of view; and a computing device configured to receive the image frames and the depth of field images, and configured to process the received image frames and depth of field images in order to produce an automated recognition of at least one gaming state appearing in the field of view.
2. The system of Claim 1, wherein the image frames comprise images within or constituting the visible spectrum or infrared or ultraviolet images.
3. The system of Claim 1 or claim 2, wherein the depth of field images comprise time of flight data points for the field of view and phase information data points reflecting depth of field.
4. The system of any one of Claims 1 to 3, wherein the at least one gaming state appearing in the field of view comprises one or more or all of: game start; chip detection; chip value estimation; chip stack height estimation; and game end.
5. A method of automated gaming recognition, the method comprising: obtaining image frames of a field of view including a table game; obtaining depth of field images of the field of view; and processing the received image frames and depth of field images in order to produce an automated recognition of at least one gaming state appearing in the field of view.
6. The method of Claim 5, wherein the image frames comprise images within or constituting the visible spectrum or infrared or ultraviolet images.
7. The method of Claim 5 or claim 6, wherein the depth of field images comprise time of flight data points for the field of view and phase information data points reflecting depth of field.
8. The method of any one of Claims 5 to 7, wherein the at least one gaming state appearing in the field of view comprises one or more or all of: game start; chip detection; chip value estimation; chip stack height estimation; and game end.
9. A non-transitory computer readable medium, comprising instructions for automated gaming recognition which, when executed by one or more processors, causes performance of the following: obtaining image frames of a field of view including a table game; obtaining depth of field images of the field of view; and processing the received image frames and depth of field images in order to produce an automated recognition of at least one gaming state appearing in the field of view.
10. The non-transitory computer readable medium according to Claim 9 wherein the image frames comprise images within or constituting the visible spectrum or infrared or ultraviolet images.
11. The non-transitory computer readable medium according to Claim 9 wherein the depth of field images comprise time of flight data points for the field of view and phase information data points reflecting depth of field.
12. The non-transitory computer readable medium according to Claim 9 wherein the at least one gaming state appearing in the field of view comprises one or more or all of: game start; chip detection; chip value estimation; chip stack height estimation; and game end.