Title:
VISION BASED RECOGNITION OF GAMING CHIPS
Document Type and Number:
WIPO Patent Application WO/2020/072664
Kind Code:
A1
Abstract:
Vision-based recognition systems and methods can be used to identify gaming chips and stacks of gaming chips on a gaming table. The vision-based recognition systems and methods can include imaging devices that capture images of gaming tables, and an image service that can analyze captured images to identify gaming chips therein and determine denominations of the identified gaming chips. To identify gaming chips, an image service can identify gaming chip edges, gaming chip transitions, and other features. To determine gaming chip denominations, an image service can identify colors, patterns, and other gaming chip indicia, and match or correlate the identified colors, patterns, and indicia to stored colors, patterns, indicia, and other metrics associated with gaming chip denominations. To augment gaming chip identification and denomination determination, an image service can perform image processing methods, such as, for example, executing machine learning models and applying various algorithms.

Inventors:
GELINOTTE EMMANUEL (FR)
REED JEFFREY (US)
OROS NICOLAS YVAN (US)
RAJARAMAN ARUN (US)
Application Number:
PCT/US2019/054320
Publication Date:
April 09, 2020
Filing Date:
October 02, 2019
Assignee:
GAMING PARTNERS INT USA INC (US)
International Classes:
G07D9/04
Domestic Patent References:
WO2016053521A1 (2016-04-07)
Foreign References:
US20030220136A1 (2003-11-27)
US5742656A (1998-04-21)
US20180075698A1 (2018-03-15)
US20110227703A1 (2011-09-22)
US6567159B1 (2003-05-20)
US20070077987A1 (2007-04-05)
US20110052049A1 (2011-03-03)
US20050111730A1 (2005-05-26)
US20180247134A1 (2018-08-30)
US20050164781A1 (2005-07-28)
Attorney, Agent or Firm:
THOMPSON, Adam J. (US)
Claims:
CLAIMS

Therefore, at least the following is claimed:

1. A system comprising:

a gaming table;

at least one imaging device; and

at least one computing device in communication with the at least one imaging device, the at least one computing device being configured to at least:

receive an image from the at least one imaging device;

locate at least one stack of gaming chips in the image;

determine a count of a plurality of gaming chips in the at least one stack of gaming chips; and

determine a denomination of each of the plurality of gaming chips in the at least one stack of gaming chips.

2. The system of claim 1, wherein the at least one computing device is further configured to:

perform a horizontal edge detection on the image;

perform a horizontal line and transition estimation based at least in part on the horizontal edge detection; and

determine the count based at least in part on the horizontal line and transition estimation.

3. The system of claim 1, wherein the at least one computing device is further configured to: determine a height of the at least one stack of gaming chips; and determine the count of the plurality of gaming chips by dividing the height by a relative gaming chip thickness.

4. The system of claim 3, wherein the relative gaming chip thickness is associated with a capture angle of the image.

5. The system of claim 1, wherein the at least one computing device is further configured to locate the at least one stack by excluding background of the image from analysis.

6. The system of claim 5, wherein the background is excluded at least in part by applying a mirror algorithm.

7. The system of claim 1, wherein the at least one computing device is further configured to verify the count of the plurality of gaming chips in the at least one stack of gaming chips based on reading an RFID tag in each of the plurality of gaming chips in the at least one stack of gaming chips.

8. The system of claim 1, wherein the at least one computing device is further configured to:

determine a dominant color associated with at least one of the plurality of gaming chips; and

determine the denomination of the at least one of the plurality of gaming chips based at least in part on the dominant color.

9. The system of claim 1, wherein the at least one computing device is further configured to locate the at least one stack of gaming chips in the image based at least in part on a spiking neural network (SNN) model.

10. The system of claim 1, wherein the at least one computing device is further configured to generate a histogram corresponding to a region of a gaming chip of the plurality of gaming chips.

11. The system of claim 1, wherein the at least one computing device is further configured to:

determine a plurality of color percentages corresponding to counts of pixel color; and

determine that each of the plurality of color percentages falls within predefined ranges associated with one of a plurality of denominations of gaming currency.

12. The system of claim 1, wherein the at least one computing device is further configured to partition colors of pixels into clusters.

13. The system of claim 1, wherein the at least one computing device is further configured to partition colors of pixels into clusters based at least in part on K-means clustering.

14. A method comprising:

processing, via at least one computing device, at least one image to locate a stack of gaming chips;

determining, via the at least one computing device, a count of a plurality of gaming chips in the stack of gaming chips; and

determining, via the at least one computing device, a denomination of each of the plurality of gaming chips in the at least one stack of gaming chips.

15. The method of claim 14, further comprising:

performing, via the at least one computing device, a horizontal edge detection on the image;

performing, via the at least one computing device, a horizontal line and transition estimation based at least in part on the horizontal edge detection; and

determining, via the at least one computing device, the count based at least in part on the horizontal line and transition estimation.

16. The method of claim 14, further comprising:

determining, via the at least one computing device, a dominant color associated with at least one of the plurality of gaming chips; and

determining, via the at least one computing device, the denomination of the at least one of the plurality of gaming chips based at least in part on the dominant color.

17. The method of claim 14, further comprising generating, via the at least one computing device, a histogram corresponding to a region of a gaming chip of the plurality of gaming chips.

18. The method of claim 14, further comprising:

determining, via the at least one computing device, a plurality of color percentages corresponding to counts of pixel color; and

determining, via the at least one computing device, that each of the plurality of color percentages falls within predefined ranges associated with one of a plurality of denominations of gaming currency.

19. The method of claim 14, further comprising partitioning, via the at least one computing device, colors of pixels into clusters.

20. The method of claim 19, wherein the partitioning is performed based at least in part on K-means clustering.

Description:
VISION BASED RECOGNITION OF GAMING CHIPS

CROSS-REFERENCE TO RELATED APPLICATIONS

[0001] This application claims the benefit of and priority to U.S. Provisional Patent Application No. 62/739,918, filed October 2, 2019, and entitled “VISION BASED RECOGNITION OF GAMING CHIPS,” which is incorporated herein by reference in its entirety.

BACKGROUND

[0002] Gaming chips can be used in place of currency and credits utilized in wagering and other gaming environments. Chips are typically used on a gaming table, often positioned in stacks of chips. To track transactions and wagering activities at a gaming table, a person can visually count chips located thereon and correlate the identified chips to one or more denominations of currency, of credits, or other values. A gaming table can include hundreds of gaming chips, and, therefore, human-based manual tracking of chips can become prohibitively costly and time consuming. Previous approaches to tracking chips have utilized image recognition systems to monitor table game activity; however, previous systems are not tolerant to partial visual obfuscations of chips or to atypical chip arrangements, and may be incapable of resolving and identifying gaming chips in images when viewed against background imagery.

Accordingly, there is a long-felt, but unresolved need for vision-based chip recognition systems and methods that can accurately and precisely identify gaming chips of varying denominations on a gaming table in a manner that resolves partial visual obfuscations, demonstrates tolerance of atypical chip arrangements, and resolves chip images against background imagery.

SUMMARY

[0003] Briefly described, and according to one embodiment, aspects of the present disclosure generally relate to systems and methods for vision-based recognition of gaming chips. In various embodiments, vision-based recognition may also be referred to as image-based recognition and can include performing image-based recognition on individual images, on video feeds, on computed image data, such as by combining two or more images or computing data from one or more images, or on other visual data.

[0004] Described herein, in various embodiments, are systems and methods for: 1) identifying gaming chips and stacks of gaming chips on a gaming table; and 2) determining a denomination of identified gaming chips. A vision-based gaming chip recognition system can be deployed in environments with one or more gaming tables, such as, for example, casinos. The system can utilize one or more cameras to capture images of the gaming tables. The system can include an image service that analyzes captured images to identify, if present therein, gaming chips and gaming chip stacks. The image service can also analyze images of gaming chips and gaming chip stacks to identify denominations thereof.

[0005] For example, the system can include, in one or more databases, tables and/or other data objects that relate colors and color patterns to specific denominations. The image service can identify colors and color patterns in a gaming chip image, compare the identified colors and color patterns to one or more stored color and/or pattern tables, and, based on matches identified therebetween, determine a denomination of gaming chips in the gaming image. In the same example, the image service may also perform one or more image processing methods to correct or control for image distortions, remove background imagery, and resolve partial views of gaming chips.

[0006] The image service can perform image processing methods that can include, but are not limited to: 1) removing background imagery, for example, via executing mirror algorithms; 2) resolving partial views, for example, by replacing portions of the partial views using views from other cameras and/or by combining camera views; and 3) controlling for image distortions, for example, by executing one or more K-means processes as described herein. To recognize gaming chips and gaming chip stacks, the image service can perform techniques including, but not limited to: 1) edge detection algorithms and methods; 2) gaming chip transition detection algorithms and methods; 3) machine learning methods, for example, to train machine learning models to recognize colors and color patterns of denominations, and recognize those patterns in gaming chip images; and 4) other methods described herein.

[0007] The system can also include RFID elements for detecting gaming chips and gaming chip stacks. For example, the gaming chips can include RFID tags that can be read by RFID readers configured within a gaming table. Upon reading a gaming chip’s RFID tag, the system may receive an RFID identifier that can be correlated with stored RFID identifiers to identify the gaming chip and/or the gaming chip’s denomination.

[0008] According to a first clause, a system including: A) a gaming table; B) at least one imaging device; and C) at least one computing device in communication with the at least one imaging device, the at least one computing device being configured to at least: 1) receive an image from the at least one imaging device; 2) locate at least one stack of gaming chips in the image; 3) determine a count of a plurality of gaming chips in the at least one stack of gaming chips; and 4) determine a denomination of each of the plurality of gaming chips in the at least one stack of gaming chips.

[0009] According to a second clause, the system of the first clause or any other clause, wherein the at least one computing device is further configured to: A) perform a horizontal edge detection on the image; B) perform a horizontal line and transition estimation based at least in part on the horizontal edge detection; and C) determine the count based at least in part on the horizontal line and transition estimation.

[0010] According to a third clause, the system of the first clause or any other clause, wherein the at least one computing device is further configured to: 1) determine a height of the at least one stack of gaming chips; and 2) determine the count of the plurality of gaming chips by dividing the height by a relative gaming chip thickness.

[0011] According to a fourth clause, the system of the third clause or any other clause, wherein the relative gaming chip thickness is associated with a capture angle of the image.

[0012] According to a fifth clause, the system of the first clause or any other clause, wherein the at least one computing device is further configured to locate the at least one stack by excluding background of the image from analysis.

[0013] According to a sixth clause, the system of the first clause or any other clause, wherein the background is excluded at least in part by applying a mirror algorithm.

[0014] According to a seventh clause, the system of the first clause or any other clause, wherein the at least one computing device is further configured to verify the count of the plurality of gaming chips in the at least one stack of gaming chips based on reading an RFID tag in each of the plurality of gaming chips in the at least one stack of gaming chips.

[0015] According to an eighth clause, the system of the first clause or any other clause, wherein the at least one computing device is further configured to: A) determine a dominant color associated with at least one of the plurality of gaming chips; and B) determine the denomination of the at least one of the plurality of gaming chips based at least in part on the dominant color.

[0016] According to a ninth clause, the system of the first clause or any other clause, wherein the at least one computing device is further configured to locate the at least one stack of gaming chips in the image based at least in part on a spiking neural network (SNN) model.

[0017] According to a tenth clause, the system of the first clause or any other clause, wherein the at least one computing device is further configured to generate a histogram corresponding to a region of a gaming chip of the plurality of gaming chips.

[0018] According to an eleventh clause, the system of the first clause or any other clause, wherein the at least one computing device is further configured to: A) determine a plurality of color percentages corresponding to counts of pixel color; and B) determine that each of the plurality of color percentages falls within predefined ranges associated with one of a plurality of denominations of gaming currency.

[0019] According to a twelfth clause, the system of the first clause or any other clause, wherein the at least one computing device is further configured to partition colors of pixels into clusters.

[0020] According to a thirteenth clause, the system of the first clause or any other clause, wherein the at least one computing device is further configured to partition colors of pixels into clusters based at least in part on K-means clustering.

[0021] According to a fourteenth clause, a method including: A) processing, via at least one computing device, at least one image to locate a stack of gaming chips; B) determining, via the at least one computing device, a count of a plurality of gaming chips in the stack of gaming chips; and C) determining, via the at least one computing device, a denomination of each of the plurality of gaming chips in the at least one stack of gaming chips.

[0022] According to a fifteenth clause, the method of the fourteenth clause or any other clause, further including: A) performing, via the at least one computing device, a horizontal edge detection on the image; B) performing, via the at least one computing device, a horizontal line and transition estimation based at least in part on the horizontal edge detection; and C) determining, via the at least one computing device, the count based at least in part on the horizontal line and transition estimation.

[0023] According to a sixteenth clause, the method of the fourteenth clause or any other clause, further including: A) determining, via the at least one computing device, a dominant color associated with at least one of the plurality of gaming chips; and B) determining, via the at least one computing device, the denomination of the at least one of the plurality of gaming chips based at least in part on the dominant color.

[0024] According to a seventeenth clause, the method of the fourteenth clause or any other clause, further including generating, via the at least one computing device, a histogram corresponding to a region of a gaming chip of the plurality of gaming chips.

[0025] According to an eighteenth clause, the method of the fourteenth clause or any other clause, further including: A) determining, via the at least one computing device, a plurality of color percentages corresponding to counts of pixel color; and B) determining, via the at least one computing device, that each of the plurality of color percentages falls within predefined ranges associated with one of a plurality of denominations of gaming currency.

[0026] According to a nineteenth clause, the method of the fourteenth clause or any other clause, further including partitioning, via the at least one computing device, colors of pixels into clusters.

[0027] According to a twentieth clause, the method of the fourteenth clause or any other clause, wherein partitioning is performed based at least in part on K-means clustering.

BRIEF DESCRIPTION OF THE DRAWINGS

[0028] For a more complete understanding of the embodiments and the advantages thereof, reference is now made to the following description, in conjunction with the accompanying figures briefly described as follows:

[0029] FIG. 1 is an illustration of a gaming table and a camera according to various embodiments of the present disclosure.

[0030] FIG. 2 is a drawing of a networked environment according to various example embodiments.

[0031] FIG. 3 illustrates an example flowchart of certain functionality implemented by portions of an image service executed in a computing environment in the networked environment of FIG. 2 according to various embodiments of the present disclosure.

[0032] FIG. 4 illustrates generated chip transitions using a spiking neural network model according to various embodiments of the present disclosure.

[0033] FIG. 5 is an image representing chip transition hits on a contrast enhanced grayscale image according to various embodiments of the present disclosure.

[0034] FIG. 6 is an image representing merged hits for the contrast enhanced grayscale image from FIG. 5 according to various embodiments of the present disclosure.

[0035] FIG. 7 is an image showing candidate areas according to various embodiments of the present disclosure.

[0036] FIG. 8 is an image indicating identified stacks of gaming chips according to various embodiments of the present disclosure.

[0037] FIG. 9 shows example images from different stages in chip identification according to various embodiments of the present disclosure.

[0038] FIG. 10 shows images of example gaming chips with various denominations according to various embodiments of the present disclosure.

[0039] FIG. 11 is an example color histogram according to various embodiments of the present disclosure.

[0040] FIG. 12 is an image of an example gaming chip side by side with a 2D flat view of a side of the gaming chip according to various embodiments of the present disclosure.

[0041] FIG. 13 is an image depicting a comparison of a detected gaming chip to a 2D flat view according to various embodiments of the present disclosure.

[0042] FIG. 14 is an example of a bimodal stack histogram according to various embodiments of the present disclosure.

[0043] FIG. 15 is a schematic block diagram that illustrates an example computing environment employed in the networked environment of FIG. 2 according to various embodiments.

[0044] The drawings illustrate only example embodiments and are therefore not to be considered limiting of the scope described herein, as other equally effective embodiments are within the scope and spirit of this disclosure. The elements and features shown in the drawings are not necessarily drawn to scale, emphasis instead being placed upon clearly illustrating the principles of the embodiments. Additionally, certain dimensions may be exaggerated to help visually convey certain principles. In the drawings, similar reference numerals between figures designate like or corresponding, but not necessarily the same, elements.

DETAILED DESCRIPTION

[0045] In the following paragraphs, the embodiments are described in further detail by way of example with reference to the attached drawings. In the description, well known components, methods, and/or processing techniques are omitted or briefly described so as not to obscure the embodiments. As used herein, the “present disclosure” refers to any one of the embodiments described herein and any equivalents. Furthermore, reference to various feature(s) of the “present embodiment” is not to suggest that all embodiments must include the referenced feature(s).

[0046] Among embodiments, some aspects of the present disclosure are implemented by a computer program executed by one or more processors, as described and illustrated. As would be apparent to one having ordinary skill in the art, one or more embodiments may be implemented, at least in part, by computer-readable instructions in various forms, and the present disclosure is not intended to be limiting to a particular set or sequence of instructions executed by the processor.

[0047] The embodiments described herein are not limited in application to the details set forth in the following description or illustrated in the drawings. The disclosed subject matter is capable of other embodiments and of being practiced or carried out in various ways. Also, the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use of "including," "comprising," or "having" and variations thereof herein is meant to encompass the items listed thereafter, additional items, and equivalents thereof. The terms "connected" and "coupled" are used broadly and encompass both direct and indirect connections and couplings. In addition, the terms "connected" and "coupled" are not limited to electrical, physical, or mechanical connections or couplings. As used herein the terms "machine," "computer," "server," and "work station" are not limited to a device with a single processor, but may encompass multiple devices (e.g., computers) linked in a system, devices with multiple processors, special purpose devices, devices with various peripherals and input and output devices, software acting as a computer or server, and combinations of the above.

[0048] The term gaming chip, as used herein, can include any chip, plaque, jeton, or other gaming currency that may be used in a casino, gaming room, or digital game. Each gaming chip can represent a value that is predetermined or not. The gaming chips can be made from a rigid plastic material or clay to obtain a structure that is solid enough to resist conditions of use in casinos. The gaming chips can be used throughout a casino. For example, at gaming tables, gaming chips can be received for play or at the conclusion of a game or hand, cash can be received and gaming chips paid out (buy-in), and gaming chips may be paid out during play. In a cashier area, gaming chips can be received and cash can be paid out (cash out). Alternatively, cash can be received and gaming chips can be paid out (buy-in).

[0049] Turning now to the drawings, exemplary embodiments are described in detail. With reference to FIG. 1, shown is a gaming table 100 in a networked environment according to various embodiments of the present disclosure. The gaming table 100 can be monitored by one or more imaging devices 103, such as, for example, a camera. One or more gaming chips can be played on the gaming table as part of a wagering game. The video feed from the one or more imaging devices 103 can be used to locate stacks of one or more gaming chips, count the gaming chips in each stack, and evaluate a denomination for each of the gaming chips.

[0050] With reference to FIG. 2, shown is a networked environment 200 according to various embodiments. The networked environment 200 includes a computing environment 203, one or more gaming table devices 206, and one or more cameras 209, which are in data communication with each other via a network 212. The network 212 includes, for example, the Internet, intranets, extranets, wide area networks (WANs), local area networks (LANs), wired networks, wireless networks, or other suitable networks, etc., or any combination of two or more such networks. For example, such networks may comprise satellite networks, cable networks, Ethernet networks, and other types of networks.

[0051] The computing environment 203 can include, for example, a server computer or any other system providing computing capability. Alternatively, the computing environment 203 may employ a plurality of computing devices that may be arranged, for example, in one or more server banks or computer banks or other arrangements. Such computing devices may be located in a single installation or may be distributed among many different geographical locations. For example, the computing environment 203 may include a plurality of computing devices that together may comprise a hosted computing resource, a grid computing resource, and/or any other distributed computing arrangement. In some cases, the computing environment 203 may correspond to an elastic computing resource where the allotted capacity of processing, network, storage, or other computing-related resources may vary over time.

[0052] Various applications and/or other functionality may be executed in the computing environment 203 according to various embodiments. Also, various data is stored in a data store 215 that is accessible to the computing environment 203. The data store 215 may be representative of a plurality of data stores 215 as can be appreciated. The data stored in the data store 215, for example, is associated with the operation of the various applications and/or functional entities described below. The data store 215 can include currency data 218, gaming data 221, and training data 224, among other data.

[0053] The components executed on the computing environment 203, for example, include an image service 227 and an RFID service 228, and other applications, services, processes, systems, engines, or functionality not discussed in detail herein. The image service 227 is executed to recognize gaming chips used on one or more gaming tables 206. As an example, a stack of gaming chips can be positioned in various locations on a gaming table 206 during a wagering game. One or more images can be captured of the gaming table 206 and the image service 227 can locate and/or identify the one or more stacks of gaming chips in the images, count the gaming chips in each stack, and evaluate denominations for the gaming chips. It can be appreciated that some or all of the functionality as described with reference to the image service 227 can be executed in a computing device at the gaming table 206.

[0054] The currency data 218 can include a list of all active gaming chips including any identifiers associated with the gaming chips, such as, for example, RFID tag identifiers, barcode identifiers, visual characteristics including color information (e.g., such as color pixel values, thresholds, color patterns, etc.), and other identifiers. Active gaming chips can correspond to gaming chips indicated as currently in use in the data store 215 excluding gaming chips that are yet to be deployed, decommissioned, and/or damaged or broken. The gaming data 221 can store a history of sensor inputs received as well as any configuration, calibration, and control settings. The training data 224 can include data corresponding to transitions (e.g., transitions in color or patterns on a gaming chip) in various denominations of gaming chips, color tables for the various denominations of gaming chips, 2D views of gaming chips, and other training information. For example, the training data 224 can include data corresponding to a color table that relates chip colors, chip color patterns, and other color information (e.g., such as pixel values) to denominations (e.g., of currency, credits, prize values, etc.). As another example, the training data 224 can include 2D views of gaming chips, such as the 2D view shown in FIG. 10.

[0055] The gaming table 206 is representative of a plurality of gaming tables that may be coupled to the network 212. The gaming table 206 can include, for example, one or more computing devices with a processor-based system such as a computer system. Such a computer system may be embodied in the form of an embedded computing device or other devices with like capability. The gaming table 206 can include one or more cameras 230, one or more sensors 233, a chip tray 236, one or more bet spots 239, a chip recycler 242, and a bill validator 245. The cameras 209 and 230 can be imaging devices 103 (FIG. 1).

[0056] Similar to cameras 209, the cameras 230 can capture images of a surface of the gaming table 206. The gaming table 206 and/or cameras 230 can send the images to the image service 227 via the network 212. The images can be sent to the image service 227 as a video stream of the surface of the gaming table 206. In some embodiments, the image can be sent based on differences from a previous frame in a video, such as based on a key frame. The image service 227 can receive images from various angles from cameras 209 and 230.

[0057] The sensors 233 can include RFID antennas, video barcode scanners, weigh scales, and other sensors. The sensors 233 can be used to identify gaming chips played on a gaming table. For example, an RFID reader can utilize an RFID antenna to read RFID information from RFID tags configured within the gaming chips. The RFID information from each RFID tag can include an identifier associated with the gaming chip. As another example, video barcode scanners can read barcode information, or other information, from barcodes or barcode tags located on the gaming chips. The barcode information for each gaming chip can include an identifier associated with the gaming chip. In another example, weigh scales (e.g., located beneath bet spots 239, a chip tray 236, etc.) can read weights of one or more and/or each of the gaming chips. Detected weights of the gaming chips can be compared against known weights of a plurality of gaming chip denominations, and the gaming chips can be identified based on the comparisons.
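For illustration only, the weight comparison described above can be sketched as picking the closest known per-denomination weight within a tolerance. The denominations, weights, and tolerance below are hypothetical placeholders, not values from this disclosure.

```python
# Hypothetical sketch: identify a denomination from a measured chip weight.
KNOWN_WEIGHTS_G = {1: 8.5, 5: 9.0, 25: 10.0, 100: 10.5}  # denomination -> grams (assumed)

def denomination_from_weight(measured_g: float, tolerance_g: float = 0.2):
    """Return the denomination whose known weight is closest to the measured
    weight, or None when no known weight is within the tolerance."""
    best = min(KNOWN_WEIGHTS_G, key=lambda d: abs(KNOWN_WEIGHTS_G[d] - measured_g))
    return best if abs(KNOWN_WEIGHTS_G[best] - measured_g) <= tolerance_g else None

print(denomination_from_weight(9.95))  # 25
```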

[0058] The RFID service 228 can validate RFID currency based on information read from RFID readers via the sensors 233 corresponding to an RFID antenna. An RFID antenna can be positioned at the chip tray 236, at each of the bet spots 239, at the chip recycler 242, and in other positions. The gaming table 206 can read RFID tags from RFID-enabled gaming chips using the RFID antennas. The information from the RFID tags can be stored along with data related to the RFID antenna used to read the RFID tag. For example, an identifier from one or more RFID-enabled gaming chips can be read by an RFID antenna at a particular bet spot 239. An identifier can include information that uniquely identifies each of the one or more RFID-enabled gaming chips. In one example, a gaming table 206 can read one or more RFID-enabled gaming chips via an RFID antenna. In the same example, the gaming table 206 can record and store one or more RFID identifiers that identify the one or more RFID-enabled gaming chips, and can also record and store information that identifies the RFID antenna or table area where the one or more RFID-enabled gaming chips were read. Information identifying the RFID antenna or table area can include, but is not limited to: 1) an identifier (e.g., such as a string of characters); 2) a table identifier that identifies a particular gaming table 206 into which the RFID antenna is installed; and 3) a bet spot identifier that identifies a particular bet spot 239 onto which the one or more RFID-enabled gaming chips were disposed.

[0059] The gaming table device 206 can determine that a patron placed a wager of the RFID-enabled gaming chips based on the RFID antenna corresponding to the particular bet spot 239 where the wager was placed. The gaming table device 206 can transmit a count of the gaming chips read at each of the RFID antennas. The count can include one or more identifiers from each gaming chip. In one embodiment, the gaming table 206 can perform a read of all RFID antennas at least once per game being played, and can transmit the at least one count to the RFID service 228 after each game. The gaming table 206 can perform reads of all RFID antennas several times per game. In some embodiments, the gaming table 206 automatically and repeatedly sends any detected table and/or gaming chip changes based on readings of RFID-enabled gaming chips that occur during the game. The system can verify or validate a count of gaming chips in a stack determined by the image service 227 by reading RFID tags in each of the gaming chips in the stack using an RFID reader including an RFID antenna.
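The verification step described above can be pictured as a simple cross-check between the vision-based count and the RFID tags read for the same area. The function name and data shapes below are illustrative assumptions, not part of this disclosure.

```python
# Hypothetical sketch: cross-check a vision-based chip count against the
# unique RFID identifiers read at the same bet spot.

def verify_stack_count(vision_count: int, rfid_identifiers: list[str]) -> bool:
    """Return True when the number of unique RFID tags matches the count of
    chips the image service detected in the stack."""
    return vision_count == len(set(rfid_identifiers))

# Example: 20 chips detected in the image, 20 unique RFID tags read.
print(verify_stack_count(20, [f"CHIP-{i:04d}" for i in range(20)]))  # True
```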

[0060] The image service 227 can validate gaming chips using the cameras 209 and/or the cameras 230. One or more cameras 230 can be positioned on the table, and one or more cameras 209 can be positioned separate from the table 206. The cameras 230 can be positioned overhead or above the table 206. The cameras 209 can also be positioned on the table 206, such as, for example in a chip tray 236, chip recycler 242, on top of a bill validator 245, or at another position. The cameras 209 and 230 can record a video stream of various angles or segments of the table 206. For example, multiple cameras 230 can be positioned in the chip tray 236 pointing toward the bet spots 239. A camera 230 can be directed to a single bet spot 239 or a group of bet spots 239. In one embodiment, the gaming table 206 can join or stitch together video feeds from multiple cameras 230 to generate a video feed of an area, such as, for example a video feed for all bet spots 239. Similarly, the image service 227 can join video together from cameras 209 and/or cameras 230.

[0061] The image service 227 can identify stacks of gaming chips, determine a count of the gaming chips in each stack, and determine a denomination for the gaming chips in each stack at a variety of positions on the gaming table 206. The gaming table 206 or the image service 227 can perform image recognition on frames of the video feeds to identify information for gaming chips on the gaming table 206. For example, a height of a stack of gaming chips can be determined, and the count of the chips can be calculated based on the height. In the same example, the count of the chips can be calculated based on computations relating the overall stack height to individual chip thickness. In an exemplary scenario, the image service 227 may receive an image, identify a gaming chip stack depicted in the image, and determine a stack height of 300 pixels. The image service 227 may retrieve a stored gaming chip thickness of 15 pixels. To compute a gaming chip count, the image service 227 can divide the 300 pixel stack height by the 15 pixel gaming chip thickness to obtain a gaming chip count of 20. The image service 227 can round the resulting gaming chip count when appropriate. As an example, if a stack height of 307 pixels had been determined, dividing by the 15 pixel thickness yields 20.466, which the image service 227 can round to a count of 20 gaming chips.
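The height-based count in the example above reduces to a division and a rounding step. A minimal sketch, assuming the stack height and per-chip thickness are already measured in pixels:

```python
# Illustrative sketch of the height-based chip count described above.

def count_chips(stack_height_px: float, chip_thickness_px: float) -> int:
    """Divide the measured stack height by the per-chip thickness and round."""
    if chip_thickness_px <= 0:
        raise ValueError("chip thickness must be positive")
    return round(stack_height_px / chip_thickness_px)

print(count_chips(300, 15))  # 20
print(count_chips(307, 15))  # 20 (20.466... rounds to 20)
```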

[0062] The gaming chip thickness can be associated with a predefined capture angle at which the image was captured and also associated with a position of the gaming chip stack within the image. The gaming chip thickness can also vary based on where in the field of view of each camera 209/230 the gaming chip stacks are identified. As an example, the image service 227 can identify a thickness of 12 pixels in a first portion of a field of view for a camera 209/230 and a thickness of 15 pixels for a second portion of a field of view. In some embodiments, the image service 227 can calculate a distance and angle to an identified gaming chip stack, and can calculate a thickness per gaming chip based on the distance and angle. In some embodiments, the image service 227 can determine a width of an identified gaming chip stack, and can calculate a thickness per gaming chip based on the width of the gaming chip stack.

[0063] The data store 215 can store a variety of gaming chip thicknesses associated with various capture angles and various gaming chip stack image positions. The data store 215 can store a variety of dimensional information for various gaming chips, such as a ratio of width to height, width to depth, and height to depth, including ratios at various camera angles and distances.

[0064] The denomination of each of the gaming chips in the stack can be determined based on various visual characteristics. In one example, a color pattern on the edge of the gaming chips can be used to determine the denomination. In another example, one or more visual security features can be used to determine the denomination. For example, each denomination of currency can have a different holographic symbol or other visual security protection. The image service 227 or the gaming table device 206 can determine the denomination by identifying which visual security protection each gaming chip in a video feed contains.

[0065] The chip recycler 242 can operate in a similar fashion to a coin recycler. The chip recycler 242 can be used in addition to or in place of chip tray 236. At the end of the game or hand, if a dealer has collected gaming chips from players, the gaming chips can be placed into an input area, such as a funnel, hopper, or tube, and then validated (authenticated), counted, sorted, and stored by the chip recycler 242. If gaming chips are to be paid out to players, exchanged for cash, or exchanged for other gaming chips, then the gaming table 206 or a table management system or a control system executed in the computing environment 203 can instruct the chip recycler 242 how much in gaming chips and which denominations to pay out. A chip recycler 242 within a cashier cage, a bank or vault, or kiosk (not shown) can operate in a similar fashion. A user places the gaming chips in the chip recycler 242, the chip recycler 242 processes the gaming chips, and the chip recycler 242 either automatically outputs gaming chips in other denominations or outputs cash equal to the input value.

[0066] With reference to FIG. 3, shown is a process 300 in a flowchart according to various embodiments of the present disclosure. It is noted that embodiments described herein may be practiced using an alternative order of the steps illustrated in FIG. 3. That is, the process 300 illustrated in FIG. 3 is provided as an example only, and the embodiments may be practiced using process flows that differ from those illustrated. Additionally, it is noted that not all steps are required in every embodiment. In other words, one or more of the steps may be omitted or replaced, without departing from the spirit and scope of the embodiments. Further, steps may be performed in different orders, in parallel with one another, or omitted entirely, and/or certain additional steps may be performed without departing from the scope and spirit of the embodiments.

[0067] At box 303, the process 300 can include locating one or more stacks of gaming chips. Stacks can be composed of gaming chips with different values, colors, and other characteristics. The gaming chips may be located in a float tray, a splash tube, an RFID checkpoint device, a bet spot 239 (FIG. 2), or some other area. The splash tube can be an area to place recently played gaming chips. The RFID checkpoint device can be an RFID reader with an RFID antenna configured to read chips at a position in front of a dealer. In some embodiments, the stacks of gaming chips can be separated by a spacer or be placed in predefined areas, such as, for example, in a row of a chip tray 236. At the checkpoint, at the bet spots 239, or on a cage counter, the stacks of gaming chips can be placed in front of other stacks, therefore occluding chips partially or even completely. By obtaining different angles from more than one camera 209 and 230, the occluded gaming chips can be seen.

[0068] At box 306, the process 300 can include counting gaming chips in each stack. The image service 227 can determine edges depicted in an image to identify a number of gaming chips in the stack. For example, a height of a stack of gaming chips depicted in an image can be determined. The stack of gaming chips can be isolated in the image, and edges between the gaming chips in the stack can be determined. A count of the chips can be calculated based on the edges.

[0069] At box 309, the process 300 can include evaluating denominations for each of the gaming chips identified. In some embodiments, the gaming chips may be sorted. Different denominations of gaming chips can have different diameters. The gaming chips may be sorted according to size, or otherwise sorted according to denomination. It can be assumed that gaming chips are sorted in specific areas, such as in a chip tray. In other areas, such as a bet spot or splash tube, the gaming chips cannot be assumed to be sorted.

[0070] With reference to FIG. 4, shown are chip transitions 400 generated using a spiking neural network (SNN) model reconstruction process, according to various embodiments of the present disclosure. The chip transitions 400 can include one or more transitions 403, 406, and 409. The chip transitions 400 can be stored in training data 224 (FIG. 2) and can be generated during a training procedure. An image service 227 (FIG. 2) can retrieve and utilize the chip transitions 400 to perform chip identification processes described herein. In at least one embodiment, the chip transitions 400 can include mirrored versions of transitions. For example, the chip transitions 400 can include mirrored transitions 403’, 406’, and 409’ that are also generated using the spiking neural network (SNN) model reconstruction process. Mirrored transitions can be used to augment chip identification processes, for example, in instances where a portion of chips in a chip stack are oriented upside down relative to other chips in the stack. In some embodiments, for each stack of gaming chips, the image service 227 can alter the chip transitions 400 to be oriented to match an orientation of the stack of gaming chips.

[0071] With reference to FIG. 5, shown is an image 500 representing exemplary chip transition hits 503, 505, 507, 509, 511, 513, and 515 that may be detected by an image service 227. The image 500 is representative of a contrast enhanced grayscale image that can be captured and provided via one or more cameras 209 and/or 230, according to various embodiments of the present disclosure. In at least one embodiment, the chip transition hits 503, 505, 507, 509, 511, 513, and 515 represent areas of the image 500 where the image service 227 has identified a particular pattern, color, color pattern, edge, or other structure indicative that the area within the hit, or a portion thereof, is a portion of a chip. Each chip transition hit can be analyzed to determine if one or more gaming chips and/or one or more stacks of gaming chips are depicted within the image.

[0072] With reference to FIG. 6, shown is an image 600 representing merged hits for the contrast enhanced grayscale image 500 (FIG. 5) according to various embodiments of the present disclosure. The image service 227 can perform one or more hit resolving processes to merge hits into a new merged hit that includes the areas of the old hits, but is also expanded to include nearby areas determined to include a chip or portion thereof. For example, the image service 227 can merge the transition hits 505 and 507 to generate a merged transition hit 603. As another example, the image service 227 can merge the transition hits 511 and 513 to generate a merged hit 605.

[0073] In various embodiments, iterative merging of individual and merged hits into newly merged hits can augment the chip recognition processes by incrementally increasing an area of an image recognized as including a chip or portion thereof. Increasing the recognized chip area can facilitate resolving patterns, edges, colors, and other indicia required to identify a chip and correlate the chip to a particular denomination. For example, two hits may each include a portion of a chip-identifying pattern. The image service 227, upon analyzing the pattern portion of each hit, may fail to recognize the overall chip-identifying pattern. However, upon generating a merged hit, the merged hit may fully resolve the chip-identifying pattern such that it can be recognized by the image service 227.
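The hit-merging behavior described in the preceding paragraphs can be sketched as repeated merging of overlapping bounding boxes. Representing a hit as an (x, y, w, h) box and merging on any overlap are assumptions made for illustration only.

```python
# A minimal sketch of merging overlapping transition hits into larger hits.

def overlaps(a, b):
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def merge(a, b):
    """Return the smallest box covering both hits."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    x1, y1 = min(ax, bx), min(ay, by)
    x2, y2 = max(ax + aw, bx + bw), max(ay + ah, by + bh)
    return (x1, y1, x2 - x1, y2 - y1)

def merge_hits(hits):
    """Iteratively merge overlapping hits until no more merges are possible."""
    hits = list(hits)
    merged = True
    while merged:
        merged = False
        for i in range(len(hits)):
            for j in range(i + 1, len(hits)):
                if overlaps(hits[i], hits[j]):
                    hits[i] = merge(hits[i], hits[j])
                    del hits[j]
                    merged = True
                    break
            if merged:
                break
    return hits

# Two partially overlapping hits collapse into one merged hit; the third stays.
print(merge_hits([(10, 10, 30, 30), (35, 12, 30, 30), (200, 200, 20, 20)]))
```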

[0074] With reference to FIG. 7, shown is an image 700 showing various candidate areas according to various embodiments of the present disclosure. The solid-lined candidate areas 703, 705, 711, and 713 can represent areas of the image 700 determined to possibly be background imagery. The candidate areas 703, 705, 711, and 713 can be generated by an image service 227 that analyzes patterns and other indicia in and around hit areas. The image service 227 can determine areas of the image 700 that are likely to include a chip and areas that are likely to include background imagery. The image service 227 can track the areas determined to likely include a chip or chip stack, which are indicated via generation of dash-lined candidate areas 707 and 709.

[0075] In at least one embodiment, suspected background candidate areas 703, 705, 711, and 713 can be filtered out using mirroring techniques, while the suspected chip candidate areas can be retained as suspected stacks and/or chips. For example, the image service 227 can apply a mirror model algorithm to the image 700 to assess if the candidate areas 703, 705, 711, and 713 constitute background imagery or depict an object (or portion thereof), such as a chip or chip stack. To execute the mirroring model algorithm, the image service 227 can use the SNN model to learn an area of a background image during training. The background image can correspond to an image taken from the same or similar view as the image 700 while no chips or stacks of chips are present. The image service 227 can match the candidate areas 703, 705, 707, 709, 711, and 713 from the image 700 with the same areas from the background image. If a candidate area matches the corresponding area of the background image, the candidate area can be excluded from further chip recognition and analysis processes. For example, the image service 227 can determine that candidate areas 703, 705, 711, and 713 match corresponding areas of the background image, and, accordingly, the image service 227 can exclude the candidate areas from further analyses. In the same example, the image service 227 can determine that candidate areas 707 and 709 do not match corresponding areas of the background image, and the image service 227 can retain the candidate areas for further analysis. The image service 227 can also discard any hits in the excluded candidate areas.
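One way to approximate the background exclusion step, assuming a stored empty-table reference image rather than the trained mirror model described above, is a per-candidate-area difference test; the threshold is an assumed tuning parameter.

```python
# Sketch (not the mirror algorithm itself): compare a candidate area against
# the same area of a stored empty-table image and exclude it when the mean
# absolute difference is small.
import cv2
import numpy as np

def is_background(frame: np.ndarray, background: np.ndarray,
                  box: tuple, threshold: float = 12.0) -> bool:
    """Return True when the candidate box looks like the stored background."""
    x, y, w, h = box
    diff = cv2.absdiff(frame[y:y + h, x:x + w], background[y:y + h, x:x + w])
    return float(diff.mean()) < threshold

# Synthetic demo: a bright square stands in for a chip stack on an empty table.
background = np.zeros((100, 100, 3), np.uint8)
frame = background.copy()
frame[40:60, 40:60] = 255
print(is_background(frame, background, (40, 40, 20, 20)))  # False: kept for analysis
print(is_background(frame, background, (0, 0, 20, 20)))    # True: excluded
```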

[0076] With reference to FIG. 8, shown is an image 800 indicating an identified stack of gaming chips 803 according to various embodiments of the present disclosure. To identify the stack 803, the image service 227 can merge candidate areas that are not located in excluded areas, and can further process the merged candidate area (e.g., by applying mirror algorithms, etc.) to precisely identify areas of the image 800 that include the stack 803. For example, the image service 227 can merge candidate areas 707 and 709, and process the merged area to identify the stack 803. In at least one embodiment, the image service 227 can apply a size filter to hits and/or candidate areas so that the hits and/or candidate areas containing more than one stack are split into multiple hits and/or multiple candidate areas, each hit and/or candidate area encompassing one stack.

[0077] With reference to FIG. 9, shown are example images 901, 903, 905, and 907 from different stages in chip identification processes according to various embodiments of the present disclosure. In image 901, the image service 227 can estimate the top of a stack 904 of gaming chips by identifying the rim 909 between the upper face 912 of a highest gaming chip 902 and an edge 915 of the highest gaming chip 902. The image service 227 can detect transitions between the gaming chips. Once the transitions are detected, the image service 227 can determine a count of gaming chips in the stack 904. Further, the image service 227 can isolate each individual gaming chip for color analysis or other analysis.

[0078] The estimation of the top of the stack 904 relies on multiple concepts. First, the top of a stack 904 is at the bottom of an ellipse representing the upper face 912 of the highest chip 902. The region around the top of the stack transition is generally lighter above and darker below due to the lighting conditions. The upper face 912 just above the top of the stack transition is usually heterogeneous compared to the gaming chip edges. The image 901 can correspond to an identified area that includes a stack 904 of gaming chips. The image service 227 can perform horizontal edge detection on the image 901 to generate the image 903. As an example, the image service 227 can compute a horizontal edge detection using a Sobel filter or other filter to generate one or more horizontal lines emphasizing edges 906 between chips.
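A hedged sketch of the horizontal edge detection step using a Sobel filter, as mentioned above; the file names are placeholders and the preprocessing choices are assumptions rather than the exact processing used in this disclosure.

```python
import cv2
import numpy as np

# Placeholder input: a contrast-enhanced grayscale crop around one stack.
stack = cv2.imread("stack_area.png", cv2.IMREAD_GRAYSCALE)
if stack is None:
    raise FileNotFoundError("placeholder image not found")
stack = cv2.equalizeHist(stack)

# dx=0, dy=1 emphasizes vertical intensity changes, i.e. the near-horizontal
# lines formed by the edges between stacked chips.
edges = np.abs(cv2.Sobel(stack, cv2.CV_32F, dx=0, dy=1, ksize=3))
edges = cv2.normalize(edges, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)

cv2.imwrite("stack_horizontal_edges.png", edges)
```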

[0079] The image service 227 can reduce the image area of an edge 906 to one horizontal line by averaging multiple horizontal lines clustered near the same edge 906. For example, the image service 227 can perform a horizontal line and transition estimation on the image 903 and average the horizontal lines therein to generate the image 905. Within the image 905, the image service 227 can find peaks 908 in the averaged horizontal lines. The peaks 908 can represent the locations of transitions 910 between chips. In image 903, some transitions 910 may be missing due to a lack of contrast in the original image 901 that was captured by cameras 209 and/or 230 (FIG. 2). The image service 227 can fill the missing transitions 910 by estimating the chip thickness. For example, the image service 227 can identify a chip in one or more of the images 901, 903, 905, and 907, and can compute locations of chip transitions 910 based on calculations of the chip thickness (e.g., a transition 910 may occur per every chip’s thickness worth of distance in a stack 904). Further, as shown in image 907, the image service 227 can identify individual gaming chips in the original image 901 using the lines estimated in image 905. In at least one embodiment, the image service 227 can include separation lines 912 between each of the gaming chips in image 903 as shown in the image 907. The separation lines 912 can further facilitate resolving transitions 910 between the chips.
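The line and transition estimation described above can be sketched as a row-average profile with peak picking, followed by filling gaps that span a whole number of chip thicknesses. The threshold and the simple local-maximum test are assumptions for illustration.

```python
import numpy as np

def find_transitions(edge_img: np.ndarray, chip_px: float,
                     min_strength: float = 40.0) -> list[int]:
    """Return row indices of estimated chip transitions in an edge image."""
    profile = edge_img.astype(np.float32).mean(axis=1)   # one value per row
    peaks = [y for y in range(1, len(profile) - 1)
             if profile[y] >= min_strength
             and profile[y] >= profile[y - 1]
             and profile[y] >= profile[y + 1]]

    # Fill in transitions missed because of low contrast: if two peaks are
    # about n chip thicknesses apart, interpolate the n-1 missing rows.
    filled = []
    for a, b in zip(peaks, peaks[1:]):
        filled.append(a)
        n = round((b - a) / chip_px)
        for k in range(1, max(n, 1)):
            filled.append(a + round(k * (b - a) / n))
    if peaks:
        filled.append(peaks[-1])
    return filled
```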

[0080] With reference to FIG. 10, shown are images depicting example gaming chips 1000 of various denominations according to various embodiments of the present disclosure. In some embodiments, an assumption can be made that each denomination of gaming chips 1000 corresponds to a dominant color. The image service 227 can identify a color with the highest number of pixels. The image service 227 can determine the identified color as the dominant color. In some embodiments, the image service 227 can modify an image prior to identifying the color of pixels. The image service 227 can perform a color correction and/or color enhancement algorithm based on the camera used to capture the image, lighting conditions associated with the area, or some other deficiency. As an example, the image service 227 may adjust a contrast, temperature, or other property of the image based on known sensor deficiencies in the camera. In some embodiments, the image service 227 can perform a training process involving analyzing image captures from cameras 209/230 including items of known colors to identify necessary color correction and/or color enhancement.

[0081] The image service 227 can utilize the training data 224 to attribute a denomination to an imaged gaming chip 1001 based on the dominant color. As an example, an imaged gaming chip 1001 may have a denomination of $25 US Dollars when the dominant color is green, have a denomination of $1,000 US Dollars when the dominant color is a first shade of blue, and have a denomination of $5,000 US Dollars when the dominant color is a second shade of blue.

[0082] During a training process, a color table can be created with denominations and dominant colors. The color table can include hue, saturation, and value (HSV) for each of the dominant colors. The color table can be stored in training data 224. The image service 227 can utilize a predefined color-value table. The predefined color-value table can be manually tuned or automatically trained for each casino environment including the specific set of gaming chips 1000 used in the casino, the lighting conditions, the ambient lighting, and other factors. The color table can include definitions of upper and lower HSV values for each color, which can be stored in the training data 224 as a color value (CV) table.
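A CV table of the kind described above might be represented as nested HSV bounds per denomination. Every numeric value below is a made-up placeholder, not data from this disclosure.

```python
# Illustrative shape of a color value (CV) table: per-denomination HSV bounds
# (OpenCV convention: H in 0-179, S and V in 0-255). Values are assumptions.
CV_TABLE = {
    25: {    # $25 chip
        "green": {"hsv_min": (40, 80, 60),  "hsv_max": (80, 255, 255)},
        "blue":  {"hsv_min": (100, 60, 60), "hsv_max": (130, 255, 255)},
        "brown": {"hsv_min": (5, 60, 40),   "hsv_max": (20, 200, 200)},
    },
    1000: {  # $1,000 chip, first shade of blue
        "blue":  {"hsv_min": (100, 120, 80), "hsv_max": (115, 255, 255)},
    },
}
```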

[0083] In one embodiment, the training process can include manually setting the minimum and maximum HSV values corresponding to the dominant color and storing the entered settings in training data 224. In other embodiments, the training process can include using machine learning to learn colors of bands patterned onto the chips 1000. The image service 227 can utilize a machine learning classifier to learn HSV values corresponding to the dominant colors, such as, for example, using histogram distance classification training. With machine learning, the image service 227 can analyze images of each denomination of gaming chips 1000 from various angles in a casino environment. The image service 227 can learn the colors of bands for each denomination and store the ranges of colors as a CV table. In some embodiments, machine learning can be used to generate the CV table from various images, while in others, machine learning can be used to tune and/or calibrate one or more stored standard CV tables based on processing various images depicting chips. Periodically, a CV table can be manually adjusted or automatically retrained. In at least one embodiment, a light grey zone can be used to compute a white balance in real time in order to reduce and/or remove undesired lighting effects caused by lights emitted by any signage or other light sources.

[0084] The gaming chips 1000 can have multiple colors, such as, for example, from chip inserts, from multiple injections during an injection mold process, or from different materials in a compression mold process. In some embodiments, the gaming chips 1000 can have up to three different colors. By determining a percentage of pixel counts for each of the colors on the gaming chip 1001 within a region of interest, the image service 227 can identify a denomination for the gaming chip 1001. The image service 227 can identify regions of interest, such as region 1021, to ignore the edges 1011 of the gaming chip 1001 to focus on the color of the inserts 1013, 1015, and 1017 located in the central area of the gaming chip 1001. The region of interest 1021 can be determined to correspond to a middle or central area of the gaming chip 1001. As an example, $25 denomination gaming chips can have a green pixel count between 30% and 60%, a blue pixel count between 0% and 30%, and a brown pixel count between 0% and 30%. In this example, if the pixel counts fall in those ranges for a gaming chip 1001, the image service 227 can identify the gaming chip 1001 as a $25 denomination.

[0085] With reference to FIG. 11, shown is an example color histogram 1100 according to various embodiments of the present disclosure. For each located chip stack, the image service 227 can count the gaming chips in the stack. The image service 227 can determine a contour for each of the gaming chips in the stack. For each gaming chip in the stack, the image service 227 can calculate a color histogram 1100 using a CV table from training data 224. The color histogram 1100 can include a combined red, green, blue (RGB) histogram 1101, and can also include histograms for each color in an RGB color model. For example, the color histogram 1100 can include a green histogram 1103, a blue histogram 1105, and a red histogram 1107.

[0086] For each color in the CV table, the image service 227 can apply a color mask to a chip image, apply morphological transformations, such as erosion, closing, and other transformations, and count the number of pixels corresponding to the respective color. The image service 227 can select a dominant color based on the number of pixels. As an example, the image service 227 can select the dominant color as the color with the highest pixel count. The image service 227 can calculate a percentage of pixels for each color from a histogram 1100, 1101, 1103, and/or 1105. The calculated pixel percentages can represent color percentages that can be compared to values in a CV table in training data 224.
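
The masking, morphology, and dominant-color selection described above might be sketched as follows, assuming OpenCV and an HSV chip image; the kernel size is an assumption for illustration.

```python
import cv2
import numpy as np

def dominant_color(chip_hsv, cv_entry, kernel_size=3):
    """Select the dominant CV-table color of a chip image after noise cleanup.

    Applies each color mask, then erosion and closing to suppress speckle,
    and picks the color with the highest surviving pixel count.
    A sketch under assumed parameters, not the patented implementation.
    """
    kernel = np.ones((kernel_size, kernel_size), np.uint8)
    counts = {}
    for name, bounds in cv_entry.items():
        mask = cv2.inRange(chip_hsv, np.array(bounds["hsv_lower"], np.uint8),
                                     np.array(bounds["hsv_upper"], np.uint8))
        mask = cv2.erode(mask, kernel, iterations=1)
        mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
        counts[name] = cv2.countNonZero(mask)
    return max(counts, key=counts.get), counts
```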

[0087] The image service 227 can determine the chip value or denomination corresponding to the gaming chip with the determined dominant color. If the pixel percentage for each color falls within a range of one of the CV tables in training data 224, the image service 227 can identify the gaming chip as the corresponding denomination in training data 224. As an example, for a $25 gaming chip, the image service 227 can determine a region has pixel percentages of 36% green, 12% blue, and 21% brown. The image service 227 can identify the gaming chip as a $25 denomination by determining that those values fall within the ranges of 30%-60% green, 0-30% blue, and 0-30% brown.
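
A hedged sketch of the range check described above, reusing the hypothetical CV_TABLE structure and the $25 example values from the disclosure (36% green, 12% blue, 21% brown):

```python
def match_denomination(percentages, cv_table):
    """Return the first denomination whose stored percentage ranges contain
    the measured color percentages, or None if nothing matches.

    percentages: e.g. {"green": 36.0, "blue": 12.0, "brown": 21.0}
    cv_table: the hypothetical CV_TABLE structure sketched earlier.
    """
    for denom, colors in cv_table.items():
        within_ranges = all(
            colors[name]["pct_range"][0]
            <= percentages.get(name, 0.0)
            <= colors[name]["pct_range"][1]
            for name in colors
        )
        if within_ranges:
            return denom
    return None

# Example from the disclosure: 36% green, 12% blue, and 21% brown fall inside
# the 30-60% / 0-30% / 0-30% ranges, so the chip would be matched as $25.
```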

[0088] Turning to FIG. 12, shown is an image 1200 of an example gaming chip 1201 side by side with a 2D flat view of a side 1203 of the gaming chip 1201 according to various embodiments of the present disclosure. In at least one embodiment, the present system can compute and leverage chip dissimilarity values to support gaming chip recognition and identification. In one or more embodiments, a chip dissimilarity value can refer to a difference (e.g., a percent difference) between a control image and a test image of a gaming chip. In various embodiments, if a chip dissimilarity value is within a predefined threshold, the test image can be determined to be equivalent to the control image.
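
One simple way to realize a chip dissimilarity value of this kind is a mean absolute per-pixel difference expressed as a percentage of the maximum possible difference; the resize step and the threshold below are assumptions for illustration only.

```python
import cv2
import numpy as np

def chip_dissimilarity(test_bgr, control_bgr, threshold_pct=10.0):
    """Compute a dissimilarity value between a test chip image and a control
    image, then compare it to a predefined threshold.

    Returns (dissimilarity_pct, is_equivalent). Illustrative sketch only.
    """
    # Bring the test image to the control image's size before comparing.
    test = cv2.resize(test_bgr, (control_bgr.shape[1], control_bgr.shape[0]))
    diff = cv2.absdiff(test, control_bgr).astype(np.float64)
    dissimilarity_pct = 100.0 * diff.mean() / 255.0
    return dissimilarity_pct, dissimilarity_pct <= threshold_pct
```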

[0089] The image service 227 can calculate a chip dissimilarity for the gaming chip 1201. To calculate the chip dissimilarity, the image service 227 can compare each pixel value of the detected chip 1201 to flattened 2D views of the chip side 1203, such as a flattened 2D view 1205. During a training process, artwork for each gaming chip 1201 can be used to generate the 2D flat views 1205 of the sides 1203 of the gaming chips 1201. Images of the side 1203 of the gaming chip 1201 can also be used to generate the 2D flat views 1205 of the sides 1203 of the gaming chips 1201. The 2D flat views 1205 can each be stored in training data 224 and associated with a denomination.

[0090] With reference to FIG. 13, shown is an image 1300 of an image service 227 (FIG. 2) performing a comparison of a detected gaming chip (not illustrated) to a 2D flat view 1303 according to various embodiments of the present disclosure. The image service 227 can identify a gaming chip from one or more images, such as, for example, those received from camera 209 or 230. The image service 227 can extract a region 1301 of the identified gaming chip corresponding to an edge thereof. The image service 227 can compare the region 1301 against regions of various 2D flat views 1303 of different denominations of gaming chips in training data 224. As an example, the region 1301 can be compared against region 1306 from training data 224.

[0091] The image service 227 can slide the region 1301 along the view 1303 to determine if a match exists between regions therein, such as, for example, region 1306. For each position while sliding the region 1301, the image service 227 can calculate a cost function. The cost function can be a sum of absolute pixel differences between the region 1301 and the corresponding region of the view 1303. For example, the image service 227 can calculate a cost function between the region 1301 and the region 1306 by calculating and summing pixel differences between the regions. The image service 227 can store the lowest cost function calculated during sliding with respect to views 1303 of each denomination. The image service 227 can compare the lowest cost function result for each denomination to determine which denomination has the lowest cost function. The image service 227 can identify the detected gaming chip as having the denomination with the lowest cost function.
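
The sliding comparison described above amounts to a sum-of-absolute-differences (SAD) search along each flat view; a minimal sketch, assuming uint8 arrays where each flat view is at least as tall and wide as the extracted region, is shown below.

```python
import numpy as np

def lowest_sad(region, flat_view):
    """Slide `region` horizontally along `flat_view` and return the lowest
    sum-of-absolute-differences cost. A sketch, not the patented matcher.
    """
    h, w = region.shape[:2]
    best = None
    for x in range(flat_view.shape[1] - w + 1):
        window = flat_view[:h, x:x + w]
        # Cast to a signed type so the subtraction does not wrap around.
        cost = np.abs(window.astype(np.int32) - region.astype(np.int32)).sum()
        best = cost if best is None else min(best, cost)
    return best

def classify_by_flat_views(region, flat_views_by_denom):
    """Pick the denomination whose 2D flat view yields the lowest cost."""
    costs = {denom: lowest_sad(region, view)
             for denom, view in flat_views_by_denom.items()}
    return min(costs, key=costs.get), costs
```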

[0092] In some embodiments, the image service 227 can identify a denomination of a gaming chip by comparing an image of a detected gaming chip with sub-parts of sample images in training data using deformation. In other embodiments, the image service 227 can determine denomination using one or more of a hidden Markov model (HMM), a dynamic time warping (DTW) algorithm, and other algorithms to encode variations in sizes, patterns, colorings, and other features for the gaming chips.

[0093] With reference to FIG. 14, shown is an example of a bimodal stack histogram 1400 with a first peak 1401 and a second peak 1403 according to various embodiments of the present disclosure. The image service 227 can detect chip stacks as discussed herein and identify pixel values to generate the histogram 1400. The image service 227 can recognize gaming chip denominations by computing dominant color percentages in gaming chip images, and comparing the computed percentages to stored percentages associated with specific denominations. Lighting and other factors can obfuscate true pixel values, and result in a histogram 1400 that is appropriately bimodal (e.g., indicating that a gaming chip stack includes two dominant colors), but includes pixel noise that undesirably weights computed color percentages. To compensate, the image service 227 can resolve the true pixel values by reducing the diversity of pixel values in an image (e.g., thereby decreasing a number of colors therein). The image service 227 can perform clustering techniques on an image in a manner such that only dominant colors are retained. In an image of gaming chips, the image service 227 can order the different colors therein by relative strength based on a computed dominant color histogram 1400 or a dominant color spectrum. The image service 227 can also tag each color in the histogram 1400 or spectrum with a label. The image service 227 can then determine equivalency of two or more colors in the image. Two colors can be determined not to be equivalent if the distance between their color components is large. The image service 227 can merge colors determined to be equivalent to a standard dominant color. The dominance of one or more colors in an image depicting gaming chips can be utilized to identify a denomination of the gaming chips therein. For example, a light shade of blue of a gaming chip may be determined as the dominant color. The image service 227 can identify the denomination of the gaming chip by matching the dominant color to a dominant color of a particular denomination.

[0094] To reduce the range of pixel values and improve dominant color and denomination recognition processes, the image service 227 can perform methods to evaluate and combine image areas of similar color. In at least one embodiment, the image service 227 can classify the denomination of gaming chips by performing one or more K-means clustering techniques. For example, a gaming chip stack image can be partitioned into a number of clusters 1404 to create a first clustered image 1405 based on pixel color and detected edges. In particular embodiments, the K-means process may provide optimal results if the number of clusters 1404 present is the same as a selected K value. For illustrative and descriptive purposes, a number of clusters 1404 are indicated in FIG. 14. For each cluster 1404, the image service 227 can determine the colors in the cluster 1404 that are as close as possible to each other (e.g., based on pixel value) while being as far as possible from the colors in other clusters 1404.
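
A minimal K-means sketch using OpenCV's cv2.kmeans on the pixels of a chip-stack image in CIELAB space, including the centroid repainting ("standardizing") step described in the following paragraph; the iteration criteria and attempt count are assumptions, and the disclosure gives 10 initial clusters only as one example.

```python
import cv2
import numpy as np

def cluster_chip_pixels(stack_bgr, k=10):
    """Partition a chip-stack image into k color clusters with K-means in
    CIELAB space and repaint each pixel with its cluster centroid color.

    Returns (clustered_bgr, labels, centers_lab). Illustrative sketch only.
    """
    lab = cv2.cvtColor(stack_bgr, cv2.COLOR_BGR2LAB)
    samples = lab.reshape(-1, 3).astype(np.float32)

    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
    _, labels, centers = cv2.kmeans(samples, k, None, criteria, 5,
                                    cv2.KMEANS_PP_CENTERS)

    # Repaint every pixel with its centroid color (the standardizing step).
    clustered_lab = centers[labels.flatten()].reshape(lab.shape).astype(np.uint8)
    clustered_bgr = cv2.cvtColor(clustered_lab, cv2.COLOR_LAB2BGR)
    return clustered_bgr, labels.reshape(lab.shape[:2]), centers
```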

[0095] Each cluster 1404 can be defined based on pixels, pixel values, and a cluster centroid assigned by the image service 227. In the cluster 1404, the cluster centroid can be positioned therein to minimize the sum of the distances between the pixels of the cluster 1404 and the cluster centroid. The image service 227 can modify all of the pixels in the cluster 1404 to take the color of the cluster centroid, thereby standardizing the pixels to a single color. The image service 227 can further reduce the number of distinct colors by merging clusters produced by the K-means techniques. For example, the image service 227 can identify clusters 1404 that are in proximity and share substantially similar or identical color, and can merge the clusters 1404 into a merged cluster 1406. By evaluating and merging various clusters 1404, the image service 227 can generate a second clustered image 1407. In at least one exemplary embodiment, K-means clustering and merging may be performed by generating 10 initial clusters 1404 in a CIELAB Euclidean space, identifying clusters 1404 of similar colors, and merging clusters 1404 of similar colors into 2 merged clusters 1406.
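
Cluster merging by CIELAB centroid distance could be sketched as follows, taking the labels and centers produced by the K-means sketch above; the distance threshold and the union-find bookkeeping are assumptions chosen only to illustrate the 10-to-2 merge example.

```python
import numpy as np

def merge_similar_clusters(labels, centers_lab, distance_threshold=25.0):
    """Merge clusters whose centroids are close in CIELAB Euclidean distance.

    Returns (merged_labels, merged_centers). Illustrative sketch only.
    """
    k = centers_lab.shape[0]
    parent = list(range(k))

    def find(i):
        # Union-find with path compression to group similar clusters.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i in range(k):
        for j in range(i + 1, k):
            if np.linalg.norm(centers_lab[i] - centers_lab[j]) < distance_threshold:
                parent[find(j)] = find(i)

    roots = sorted({find(i) for i in range(k)})
    remap = {root: new for new, root in enumerate(roots)}
    merged_labels = np.vectorize(lambda l: remap[find(int(l))])(labels)
    merged_centers = np.array([
        centers_lab[[i for i in range(k) if find(i) == root]].mean(axis=0)
        for root in roots
    ])
    return merged_labels, merged_centers
```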

[0096] From a second clustered image 1407, the image service 227 can compute dominant color percentages that may be used to identify a denomination of gaming chips included therein by comparing the computed dominant color percentages to stored color percentages associated with specific denominations. For example, from a second clustered image 1407, the image service 227 can compute dominant color percentages 1409 and 1411 to be 51.2% for a first dominant color and 48.8% for a second dominant color. The image service 227 can compare the dominant color percentages 1409 and 1411 to stored color percentages, and identify a denomination associated with dominant color percentages of about 51.2% of the first dominant color and about 48.8% of the second dominant color. By correlating the percentages to a denomination, the image service 227 can identify a denomination of the gaming chips included in the images 1405 and 1407. It will be understood by one of ordinary skill in the art that the above techniques can be repeated for individual image areas, for example, to identify multiple gaming chip denominations included in a single image.
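
Finally, a hedged sketch of computing dominant color percentages from the merged cluster labels and matching them against stored per-denomination percentages; the tolerance value and the stored mapping are hypothetical, not values from the disclosure.

```python
import numpy as np

def dominant_color_percentages(merged_labels):
    """Compute the percentage of pixels falling in each merged cluster."""
    counts = np.bincount(np.asarray(merged_labels).ravel())
    return 100.0 * counts / counts.sum()

def match_by_percentages(percentages, stored, tolerance=5.0):
    """Match sorted dominant-color percentages to stored per-denomination
    percentages within a tolerance. `stored` is a hypothetical mapping such
    as {"$25": [51.2, 48.8]}.
    """
    observed = sorted(percentages, reverse=True)
    for denom, expected in stored.items():
        if len(expected) == len(observed) and all(
            abs(o - e) <= tolerance
            for o, e in zip(observed, sorted(expected, reverse=True))
        ):
            return denom
    return None
```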

[0097] Turning to FIG. 15, an example hardware diagram of a computing device 1500 is illustrated. Any of the image service 227, RFID service 228, cameras 209, or functionality described in the gaming table 206 may be implemented, in part, using one or more elements of the computing device 1500. The computing device 1500 can include one or more of a processor 1510, a Random Access Memory (“RAM”) 1520, a Read Only Memory (“ROM”) 1530, a memory device 1540, a network interface 1550, and an Input Output (“I/O”) interface 1560. The elements of the computing device 1500 are communicatively coupled via a bus 1502.

[0098] The processor 1510 can include an arithmetic processor, Application Specific Integrated Circuit (“ASIC”), or other types of hardware or software processors. The RAM and ROM 1520 and 1530 can include a memory that stores computer-readable instructions to be executed by the processor 1510. The memory device 1540 stores computer-readable instructions thereon that, when executed by the processor 1510, direct the processor 1510 to execute various aspects of the present disclosure described herein. When the processor 1510 includes an ASIC, the processes described herein may be executed by the ASIC according to an embedded circuitry design of the ASIC, by firmware of the ASIC, or both an embedded circuitry design and firmware of the ASIC. As non-limiting examples, the memory device 1540 comprises one or more of an optical disc, a magnetic disc, a semiconductor memory (i.e., a semiconductor, floating gate, or similar flash-based memory), a magnetic tape memory, a removable memory, combinations thereof, or any other known memory means for storing computer-readable instructions. The network interface 1550 can include hardware interfaces to communicate over data networks. The I/O interface 1560 can include device input and output interfaces such as keyboard, pointing device, display, communication, and other interfaces. The bus 1502 can electrically and communicatively couple the processor 1510, the RAM 1520, the ROM 1530, the memory device 1540, the network interface 1550, and the I/O interface 1560, so that data and instructions may be communicated among them.

[0099] In operation, the processor 1510 is configured to retrieve computer-readable instructions stored on the memory device 1540, the RAM 1520, the ROM 1530, or another storage means, and copy the computer-readable instructions to the RAM 1520 or the ROM 1530 for execution, for example. The processor 1510 is further configured to execute the computer-readable instructions to implement various aspects and features of the present disclosure. For example, the processor 1510 may be adapted and configured to execute the processes described above with reference to FIG. 3, including the processes described as being performed by the image service 227 or gaming table 206. Also, the memory device 1540 may store the data stored in the database 215.

CONCLUSION

[0100] From the foregoing, it will be understood that various aspects of the processes described herein are software processes that execute on computer systems that form parts of the system. Accordingly, it will be understood that various embodiments of the system described herein are generally implemented as specially-configured computers including various computer hardware components and, in many cases, significant additional features as compared to conventional or known computers, processes, or the like, as discussed in greater detail herein. Embodiments within the scope of the present disclosure also include computer-readable media for carrying or having computer-executable instructions or data structures stored thereon. Such computer-readable media can be any available media which can be accessed by a computer, or downloadable through communication networks. By way of example, and not limitation, such computer-readable media can comprise various forms of data storage devices or media such as RAM, ROM, flash memory, EEPROM, CD-ROM, DVD, or other optical disk storage, magnetic disk storage, solid state drives (SSDs) or other data storage devices, any type of removable non-volatile memories such as secure digital (SD), flash memory, memory stick, etc., or any other medium which can be used to carry or store computer program code in the form of computer-executable instructions or data structures and which can be accessed by a computer.

[0101] When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired or wireless) to a computer, the computer properly views the connection as a computer-readable medium. Thus, any such connection is properly termed and considered a computer-readable medium. Combinations of the above should also be included within the scope of computer-readable media. Computer-executable instructions comprise, for example, instructions and data which cause a computer to perform one specific function or a group of functions.

[0102] Those skilled in the art will understand the features and aspects of a suitable computing environment in which aspects of the disclosure may be implemented. Although not required, some of the embodiments of the claimed inventions may be described in the context of computer-executable instructions, such as program modules or engines, as described earlier, being executed by computers in networked environments. Such program modules are often reflected and illustrated by flow charts, sequence diagrams, exemplary screen displays, and other techniques used by those skilled in the art to communicate how to make and use such computer program modules. Generally, program modules include routines, programs, functions, objects, components, data structures, application programming interface (API) calls to other computers whether local or remote, etc. that perform particular tasks or implement particular defined data types, within the computer. Computer-executable instructions, associated data structures and/or schemas, and program modules represent examples of the program code for executing steps of the methods disclosed herein. The particular sequence of such executable instructions or associated data structures represents examples of corresponding acts for implementing the functions described in such steps.

[0103] Those skilled in the art will also appreciate that the claimed and/or described systems and methods may be practiced in network computing environments with many types of computer system configurations, including personal computers, smartphones, tablets, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, networked PCs, minicomputers, mainframe computers, and the like. Embodiments of the claimed invention are practiced in distributed computing environments where tasks are performed by local and remote processing devices that are linked (either by hardwired links, wireless links, or by a combination of hardwired or wireless links) through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.

[0104] An exemplary system for implementing various aspects of the described operations, which is not illustrated, includes a computing device including a processing unit, a system memory, and a system bus that couples various system components including the system memory to the processing unit. The computer will typically include one or more data storage devices for reading and writing data. The data storage devices provide nonvolatile storage of computer-executable instructions, data structures, program modules, and other data for the computer.

[0105] Computer program code that implements the functionality described herein typically comprises one or more program modules that may be stored on a data storage device. This program code, as is known to those skilled in the art, usually includes an operating system, one or more application programs, other program modules, and program data. A user may enter commands and information into the computer through a keyboard, touch screen, pointing device, a script containing computer program code written in a scripting language, or other input devices (not shown), such as a microphone. These and other input devices are often connected to the processing unit through known electrical, optical, or wireless connections.

[0106] The computer that effects many aspects of the described processes will typically operate in a networked environment using logical connections to one or more remote computers or data sources, which are described further below. Remote computers may be another personal computer, a server, a router, a network PC, a peer device or other common network node, and typically include many or all of the elements described above relative to the main computer system in which the inventions are embodied. The logical connections between computers include a local area network (LAN), a wide area network (WAN), virtual networks (WAN or LAN), and wireless LANs (WLAN) that are presented here by way of example and not limitation. Such networking environments are commonplace in office-wide or enterprise-wide computer networks, intranets, and the Internet.

[0107] When used in a LAN or WLAN networking environment, a computer system implementing aspects of the invention is connected to the local network through a network interface or adapter. When used in a WAN or WLAN networking environment, the computer may include a modem, a wireless link, or other mechanisms for establishing communications over the wide area network, such as the Internet. In a networked environment, program modules depicted relative to the computer, or portions thereof, may be stored in a remote data storage device. It will be appreciated that the network connections described or shown are exemplary and other mechanisms of establishing communications over wide area networks or the Internet may be used.

[0108] While various aspects have been described in the context of a preferred embodiment, additional aspects, features, and methodologies of the claimed inventions will be readily discernible from the description herein, by those of ordinary skill in the art. Many embodiments and adaptations of the disclosure and claimed inventions other than those herein described, as well as many variations, modifications, and equivalent arrangements and methodologies, will be apparent from or reasonably suggested by the disclosure and the foregoing description thereof, without departing from the substance or scope of the claims. Furthermore, any sequence(s) and/or temporal order of steps of various processes described and claimed herein are those considered to be the best mode contemplated for carrying out the claimed inventions. It should also be understood that, although steps of various processes may be shown and described as being in a preferred sequence or temporal order, the steps of any such processes are not limited to being carried out in any particular sequence or order, absent a specific indication of such to achieve a particular intended result. In most cases, the steps of such processes may be carried out in a variety of different sequences and orders, while still falling within the scope of the claimed inventions. In addition, some steps may be carried out simultaneously, contemporaneously, or in synchronization with other steps.

[0109] A phrase, such as “at least one of X, Y, or Z,” unless specifically stated otherwise, is to be understood with the context as used in general to present that an item, term, etc., can be either X, Y, or Z, or any combination thereof (e.g., X, Y, and/or Z). Similarly, “at least one of X, Y, and Z,” unless specifically stated otherwise, is to be understood to present that an item, term, etc., can be either X, Y, and Z, or any combination thereof (e.g., X, Y, and/or Z). Thus, as used herein, such phrases are not generally intended to, and should not, imply that certain embodiments require at least one of either X, Y, or Z to be present, but not, for example, one X and one Y. Further, such phrases should not imply that certain embodiments require each of at least one of X, at least one of Y, and at least one of Z to be present.

[0110] The embodiments were chosen and described in order to explain the principles of the claimed inventions and their practical application so as to enable others skilled in the art to utilize the inventions and various embodiments and with various modifications as are suited to the particular use contemplated. Alternative embodiments will become apparent to those skilled in the art to which the claimed inventions pertain without departing from their spirit and scope. Accordingly, the scope of the claimed inventions is defined by the appended claims rather than the foregoing description and the exemplary embodiments described therein.