WO2020072664A1 - Vision based recognition of gaming chips - Google Patents
Vision based recognition of gaming chips
- Publication number
- WO2020072664A1 (PCT/US2019/054320)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- gaming
- gaming chips
- computing device
- image
- chip
- Prior art date
Classifications
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07D—HANDLING OF COINS OR VALUABLE PAPERS, e.g. TESTING, SORTING BY DENOMINATIONS, COUNTING, DISPENSING, CHANGING OR DEPOSITING
- G07D9/00—Counting coins; Handling of coins not provided for in the other groups of this subclass
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07D—HANDLING OF COINS OR VALUABLE PAPERS, e.g. TESTING, SORTING BY DENOMINATIONS, COUNTING, DISPENSING, CHANGING OR DEPOSITING
- G07D9/00—Counting coins; Handling of coins not provided for in the other groups of this subclass
- G07D9/04—Hand- or motor-driven devices for counting coins
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07D—HANDLING OF COINS OR VALUABLE PAPERS, e.g. TESTING, SORTING BY DENOMINATIONS, COUNTING, DISPENSING, CHANGING OR DEPOSITING
- G07D9/00—Counting coins; Handling of coins not provided for in the other groups of this subclass
- G07D9/06—Devices for stacking or otherwise arranging coins on a support, e.g. apertured plate for use in counting coins
Definitions
- Gaming chips can be used in place of currency and credits utilized in wagering and other gaming environments. Chips are typically used on a gaming table, often positioned in stacks of chips. To track transactions and wagering activities at a gaming table, a person can visually count chips located thereon and correlate the identified chips to one or more denominations of currency, of credits, or other values. A gaming table can include hundreds of gaming chips, and, therefore, human-based manual tracking of chips can become prohibitively costly and time consuming. Previous approaches to tracking chips have utilized image recognition systems to monitor table game activity; however, previous systems are not tolerant to partial visual obfuscations of chips or to atypical chip arrangements, and may be incapable of resolving and identifying gaming chips in images when viewed against background imagery.
- vision-based recognition may also be referred to as image-based recognition and can include performing image-based recognition on individual images, on video feeds, on computed image data, such as by combining two or more images or computing data from one or more images, or on other visual data.
- a vision-based gaming chip recognition system can be deployed in environments with one or more gaming tables, such as, for example, casinos.
- the system can utilize one or more cameras to capture images of the gaming tables.
- the system can include an image service that analyzes captured images to identify, if present therein, gaming chips and gaming chip stacks.
- the image service can also analyze images of gaming chips and gaming chips stacks to identify denominations thereof.
- the system can include, in one or more databases, tables and/or other data objects that relate colors and color patterns to specific denominations.
- the image service can identify colors and color patterns in a gaming chip image, compare the identified colors and color patterns to one or more stored color and/or pattern tables, and, based on matches identified therebetween, determine a denomination of gaming chips in the gaming chip image.
- the image service may also perform one or more image processing methods to correct or control for image distortions, remove background imagery, and resolve partial views of gaming chips.
- the image service can perform image processing methods that can include, but are not limited to: 1) removing background imagery, for example, via executing mirror algorithms; 2) resolving partial views, for example, by replacing portions of the partial views using views from other cameras and/or by combining camera views; and 3) controlling for image distortions, for example, by executing one or more K-means processes as described herein.
- image processing methods can include, but are not limited to: 1) edge detection algorithms and methods; 2) gaming chip transition detection algorithms and methods; 3) machine learning methods, for example, to train machine learning models to recognize colors and color patterns of denominations, and recognize those patterns in gaming chip images; and 4) other methods described herein.
- the system can also include RFID elements for detecting gaming chips and gaming chip stacks.
- the gaming chips can include RFID tags that can be read by RFID readers configured within a gaming table.
- the system may receive an RFID identifier that can be correlated with stored RFID identifiers to identify the gaming chip and/or the gaming chip’s denomination.
- a system including: A) a gaming table; B) at least one imaging device; and C) at least one computing device in communication with the at least one imaging device, the at least one computing device being configured to at least: 1) receive an image from the at least one imaging device; 2) locate at least one stack of gaming chips in the image; 3) determine a count of a plurality of gaming chips in the at least one stack of gaming chips; and 4) determine a denomination of each of the plurality of gaming chips in the at least one stack of gaming chips.
- the system of the first clause or any other clause wherein the at least one computing device is further configured to: A) perform a horizontal edge detection on the image; B) perform a horizontal line and transition estimation based at least in part on the horizontal edge detection; and C) determine the count based at least in part on the horizontal line and transition estimation.
- the system of the first clause or any other clause wherein the at least one computing device is further configured to: 1) determine a height of the at least one stack of gaming chips; and 2) determine the count of the plurality of gaming chips by dividing the height by a relative gaming chip thickness.
- the system of the first clause or any other clause wherein the at least one computing device is further configured to locate the at least one stack by excluding background of the image from analysis.
- the system of the first clause or any other clause wherein the at least one computing device is further configured to verify the count of the plurality of gaming chips in the at least one stack of gaming chips based on reading an RFID tag in each of the plurality of gaming chips in the at least one stack of gaming chips.
- the system of the first clause or any other clause wherein the at least one computing device is further configured to: A) determine a dominant color associated with at least one of the plurality of gaming chips; and B) determine the denomination of the at least one of the plurality of gaming chips based at least in part on the dominant color.
- the at least one computing device is further configured to locate the at least one stack of gaming chips in the image based at least in part on a spiking neural network (SNN) model.
- the system of the first clause or any other clause wherein the at least one computing device is further configured to generate a histogram corresponding to a region of a gaming chip of the plurality of gaming chips.
- the system of the first clause or any other clause wherein the at least one computing device is further configured to: A) determine a plurality of color percentages corresponding to counts of pixel color; and B) determine each of the plurality of color percentages fall within predefined ranges associated with one of a plurality of denominations of gaming currency.
- the system of the first clause or any other clause wherein the at least one computing device is further configured to partition colors of pixels into clusters.
- the system of the first clause or any other clause wherein the at least one computing device is further configured to partition colors of pixels into clusters based at least in part on K-means clustering.
- a method including: A) processing, via at least one computing device, at least one image to locate a stack of gaming chips; B) determining, via the at least one computing device, a count of a plurality of gaming chips in the stack of gaming chips; and C) determining, via the at least one computing device, a denomination of each of the plurality of gaming chips in the at least one stack of gaming chips.
- the method of the fourteenth clause or any other clause further including: A) performing, via the at least one computing device, a horizontal edge detection on the image; B) performing, via the at least one computing device, a horizontal line and transition estimation based at least in part on the horizontal edge detection; and C) determining, via the at least one computing device, the count based at least in part on the horizontal line and transition estimation.
- the method of the fourteenth clause or any other clause further including: A) determining, via the at least one computing device, a dominant color associated with at least one of the plurality of gaming chips; and B) determining, via the at least one computing device, the denomination of the at least one of the plurality of gaming chips based at least in part on the dominant color.
- the method of the fourteenth clause or any other clause further including generating, via the at least one computing device, a histogram corresponding to a region of a gaming chip of the plurality of gaming chips.
- the method of the fourteenth clause or any other clause further including: A) determining, via the at least one computing device, a plurality of color percentages corresponding to counts of pixel color; and B) determining, via the at least one computing device, each of the plurality of color percentages fall within predefined ranges associated with one of a plurality of denominations of gaming currency.
- the method of the fourteenth clause or any other clause further including partitioning, via the at least one computing device, colors of pixels into clusters.
- partitioning is performed based at least in part on K-means clustering.
- FIG. 1 is an illustration of a gaming table and a camera according to various embodiments of the present disclosure.
- FIG. 2 is a drawing of a networked environment according to various example embodiments.
- FIG. 3 illustrates an example flowchart of certain functionality implemented by portions of image service executed in a computing environment in the networked environment of FIG. 2 according to various embodiments of the present disclosure.
- FIG. 4 illustrates generated chip transitions using a spiking neural network model according to various embodiments of the present disclosure.
- FIG. 5 is an image representing chip transition hits on a contrast enhanced grayscale image according to various embodiments of the present disclosure.
- FIG. 6 is an image representing merged hits for the contrast enhanced grayscale image from FIG. 5 according to various embodiments of the present disclosure.
- FIG. 9 shows example images from different stages in chip identification according to various embodiments of the present disclosure.
- FIG. 10 shows images of example gaming chips with various denominations according to various embodiments of the present disclosure.
- FIG. 11 is an example color histogram according to various embodiments of the present disclosure.
- FIG. 12 is an image of an example gaming chip side by side with a 2D flat view of a side of the gaming chip according to various embodiments of the present disclosure.
- FIG. 13 is an image detecting a comparison of a detected gaming chip to a 2D flat view according to various embodiments of the present disclosure.
- FIG. 14 is an example of a bimodal stack histogram according to various embodiments of the present disclosure.
- FIG. 15 is a schematic block diagram that illustrates an example computing environment employed in the networked environment of FIG. 2 according to various embodiments.
- The terms “connected” and “coupled” are used broadly and encompass both direct and indirect connections and couplings. In addition, the terms “connected” and “coupled” are not limited to electrical, physical, or mechanical connections or couplings.
- The terms “machine” and “work station” are not limited to a device with a single processor, but may encompass multiple devices (e.g., computers) linked in a system, devices with multiple processors, special purpose devices, devices with various peripherals and input and output devices, software acting as a computer or server, and combinations of the above.
- Turning to FIG. 1, shown is a gaming table 100 in a networked environment according to various embodiments of the present disclosure.
- the gaming table 100 can be monitored by one or more imaging devices 103, such as, for example, a camera.
- One or more gaming chips can be played on the gaming table as part of a wagering game.
- the video feed from the one or more imaging devices 103 can be used to locate stacks of one or more gaming chips, count the gaming chips in each stack, and evaluate a denomination for each of the gaming chips.
- the networked environment 200 includes a computing environment 203, one or more gaming table devices 206, and one or more cameras 209, which are in data communication with each other via a network 212.
- the network 212 includes, for example, the Internet, intranets, extranets, wide area networks (WANs), local area networks (LANs), wired networks, wireless networks, or other suitable networks, etc., or any combination of two or more such networks.
- such networks may comprise satellite networks, cable networks, Ethernet networks, and other types of networks.
- the computing environment 203 can include, for example, a server computer or any other system providing computing capability.
- the computing environment 203 may employ a plurality of computing devices that may be arranged, for example, in one or more server banks or computer banks or other arrangements. Such computing devices may be located in a single installation or may be distributed among many different geographical locations.
- the computing environment 203 may include a plurality of computing devices that together may comprise a hosted computing resource, a grid computing resource, and/or any other distributed computing arrangement.
- the computing environment 203 may correspond to an elastic computing resource where the allotted capacity of processing, network, storage, or other computing-related resources may vary over time.
- Various applications and/or other functionality may be executed in the computing environment 203 according to various embodiments.
- various data is stored in a data store 215 that is accessible to the computing environment 203.
- the data store 215 may be representative of a plurality of data stores 215 as can be appreciated.
- the data stored in the data store 215, for example, is associated with the operation of the various applications and/or functional entities described below.
- the data store 215 can include currency data 218, gaming data 221, and training data 224, among other data.
- the components executed on the computing environment 203 for example, include an image service 227 and an RFID service 228, and other applications, services, processes, systems, engines, or functionality not discussed in detail herein.
- the image service 227 is executed to recognize gaming chips used on one or more tables 206.
- a stack of gaming chips can be positioned in various locations on a gaming table 206 during a wagering game.
- One or more images can be captured of the gaming table 206 and the image service 227 can locate and/or identify the one or more stacks of gaming chips in the images, count the gaming chips in each stack, and evaluate denominations for the gaming chips. It can be appreciated that some or all of the functionality as described with reference to the image service 227 can be executed in a computing device at the gaming table 206.
- the currency data 218 can include a list of all active gaming chips including any identifiers associated with the gaming chips, such as, for example, RFID tag identifiers, barcode identifiers, visual characteristics including color information (e.g., such as color pixel values, thresholds, color patterns, etc.), and other identifiers.
- Active gaming chips can correspond to gaming chips indicated as currently in use in the data store 215, excluding gaming chips that have yet to be deployed, have been decommissioned, or are damaged or broken.
- the gaming data 221 can store a history of sensor inputs received as well as any configuration, calibration, and control settings.
- the training data 224 can include data corresponding to transitions (e.g., transitions in color or patterns on a gaming chip) in various denominations of gaming chips, color tables for the various denominations of gaming chips, 2D views of gaming chips, and other training information.
- the training data 224 can include data corresponding to a color table that relates chip colors, chip color patterns, and other color information (e.g., such as pixel values) to denominations (e.g., of currency, credits, prize values, etc.).
- the training data 224 can include 2D views of gaming chips, such as the 2D view shown in FIG. 10.
- the gaming table 206 is representative of a plurality of gaming tables that may be coupled to the network 212.
- the gaming table 206 can include, for example, one or more computing devices with a processor-based system such as a computer system. Such a computer system may be embodied in the form of an embedded computing device or other devices with like capability.
- the gaming table 206 can include one or more cameras 230, one or more sensors 233, a chip tray 236, one or more bet spots 239, a chip recycler 242, and a bill validator 245.
- the cameras 209 and 230 can be imaging devices 103 (FIG. 1).
- the cameras 230 can capture images of a surface of the gaming table 206.
- the gaming table 206 and/or cameras 230 can send the images to the image service 227 via the network 212.
- the images can be sent to the image service 227 as a video stream of the surface of the gaming table 206.
- the image can be sent based on differences from a previous frame in a video, such as based on a key frame.
- the image service 227 can receive images from various angles from cameras 209 and 230.
- the sensors 233 can include RFID antennas, video barcode scanners, weigh scales, and other sensors.
- the sensors 233 can be used to identify gaming chips played on a gaming table.
- an RFID reader can utilize an RFID antenna to read RFID information from RFID tags configured within the gaming chips.
- the RFID information from each RFID tag can include an identifier associated with the gaming chip.
- video barcode scanners can read barcode information, or other information, from barcodes or barcode tags located on the gaming chips.
- the barcode information for each gaming chip can include an identifier associated with the gaming chip.
- weigh scales (e.g., located beneath the bet spots 239, the chip tray 236, etc.) can detect weights of gaming chips placed thereon.
- Detected weights of the gaming chips can be compared against known weights of a plurality of gaming chip denominations, and the gaming chips can be identified based on the comparisons.
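- As a rough illustration of the weight comparison described above, the following Python sketch matches a measured per-chip weight to the closest known denomination weight; the weights and the tolerance are hypothetical values for illustration, not figures from this disclosure.

```python
# Hypothetical per-chip weights in grams; real values vary by chip set.
KNOWN_CHIP_WEIGHTS = {25: 10.0, 100: 10.6, 1000: 11.5}
WEIGHT_TOLERANCE = 0.2  # grams; assumed measurement tolerance


def identify_denomination_by_weight(measured_weight):
    """Return the denomination whose known weight is closest to the measured
    weight, or None if no denomination falls within the tolerance."""
    best_denom, best_error = None, None
    for denom, weight in KNOWN_CHIP_WEIGHTS.items():
        error = abs(measured_weight - weight)
        if error <= WEIGHT_TOLERANCE and (best_error is None or error < best_error):
            best_denom, best_error = denom, error
    return best_denom


print(identify_denomination_by_weight(10.55))  # -> 100
```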
- the RFID service 228 can validate RFID currency based on information read from RFID readers via the sensors 233 corresponding to an RFID antenna.
- An RFID antenna can be positioned at the chip tray 236, at each of the bet spots 239, at the chip recycler 242, and in other positions.
- the gaming table 206 can read RFID tags from RFID-enabled gaming chips using the RFID antennas. The information from the RFID tags can be stored along with data related to the RFID antenna used to read the RFID tag. For example, an identifier from one or more RFID-enabled gaming chips can be read by an RFID antenna at a particular bet spot 239. An identifier can include information that uniquely identifies each of the one or more RFID-enabled gaming chips.
- a gaming table 206 can read one or more RFID-enabled gaming chips via an RFID antenna.
- the gaming table 206 can record and store one or more RFID identifiers that identify the one or more RFID-enabled gaming chips, and can also record and store information that identifies the RFID antenna or table area where the one or more RFID-enabled gaming chips were read.
- Information identifying the RFID antenna or table area can include, but is not limited to: 1) an identifier (e.g., such as a string of characters); 2) a table identifier that identifies a particular gaming table 206 into which the RFID antenna is installed; and 3) a bet spot identifier that identifies a particular bet spot 239 onto which the one or more RFID-enabled gaming chips were disposed.
- the gaming table device 206 can determine that a patron placed a wager of the RFID-enabled gaming chips based on the bet spot identifier corresponding to the particular bet spot 239 where the wager was placed.
- the gaming table device 206 can transmit a count of the gaming chips read at each of the RFID antennas. The count can include one or more identifiers from each gaming chip.
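- A minimal sketch of how an RFID read might be recorded and correlated with stored identifiers, assuming a simple record of chip, table, and bet spot identifiers; the field names and the active-chip lookup are illustrative assumptions, not the data model of this disclosure.

```python
from dataclasses import dataclass


@dataclass
class RfidRead:
    chip_identifier: str      # uniquely identifies the RFID-enabled gaming chip
    table_identifier: str     # identifies the gaming table containing the antenna
    bet_spot_identifier: str  # identifies the bet spot where the chip was read


# Hypothetical store of active chips keyed by RFID identifier (currency data).
active_chips = {"CHIP-00042": {"denomination": 25, "status": "active"}}


def correlate_read(read: RfidRead):
    """Look up a read identifier against stored active-chip records."""
    return active_chips.get(read.chip_identifier)


print(correlate_read(RfidRead("CHIP-00042", "TABLE-7", "BET-3")))
```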
- the gaming table 206 can perform a read of all RFID antennas at least once per game being played, and can transmit the at least one count to the RFID service 228 after each game.
- the gaming table 206 can perform reads of all RFID antennas several times per game.
- the gaming table 206 automatically and repeatedly sends any detected table and/or gaming chip changes based on readings of RFID-enabled gaming chips that occur during the game.
- the system can verify or validate a count of gaming chips in a stack determined by the image service 227 by reading RFID tags in each of the gaming chips in the stack using an RFID reader including an RFID antenna.
- the image service 227 can validate gaming chips using the cameras 209 and/or the cameras 230.
- One or more cameras 230 can be positioned on the table, and one or more cameras 209 can be positioned separate from the table 206.
- the cameras 209 can be positioned overhead or above the table 206.
- the cameras 230 can also be positioned on the table 206, such as, for example, in a chip tray 236, a chip recycler 242, on top of a bill validator 245, or at another position.
- the cameras 209 and 230 can record a video stream of various angles or segments of the table 206. For example, multiple cameras 230 can be positioned in the chip tray 236 pointing toward the bet spots 239.
- a camera 230 can be directed to a single bet spot 239 or a group of bet spots 239.
- the gaming table 206 can join or stitch together video feeds from multiple cameras 230 to generate a video feed of an area, such as, for example a video feed for all bet spots 239.
- the image service 227 can join video together from cameras 209 and/or cameras 230.
- the image service 227 can identify stacks of gaming chips, determine a count of the gaming chips in each stack, and determine a denomination for the gaming chips in each stack at a variety of positions on the gaming table 206.
- the gaming table 206 or the image service 227 can perform image recognition on frames of the video feeds to identify information for gaming chips on the gaming table 206. For example, a height of a stack of gaming chips can be determined, and the count of the chips can be calculated based on the height. In the same example, the count of the chips can be calculated based on computations relating the overall stack height to individual chip thickness.
- the image service 227 may receive an image, identify a gaming chip stack depicted in the image, and determine a stack height of 300 pixels.
- the image service 227 may retrieve a stored gaming chip thickness of 15 pixels. To compute a gaming chip count, the image service 227 can divide the 300 pixel stack height by the 15 pixel gaming chip thickness to obtain a gaming chip count of 20. The image service 227 can round the resulting gaming chip count when appropriate. As an example, if a stack height of 307 pixels had been determined, dividing by the 15 pixel thickness yields 20.466, which the image service 227 can round to a count of 20 gaming chips.
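- A minimal sketch of the height-division computation described above, using the pixel values from the example; in practice the stored thickness depends on the capture angle and the stack's position within the image.

```python
def count_chips(stack_height_px: float, chip_thickness_px: float) -> int:
    """Estimate the number of chips by dividing the stack height by the
    per-chip thickness and rounding to the nearest whole chip."""
    return round(stack_height_px / chip_thickness_px)


print(count_chips(300, 15))  # -> 20
print(count_chips(307, 15))  # 307 / 15 = 20.466..., rounds to 20
```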
- the gaming chip thickness can be associated with a predefined capture angle at which the image was captured and also associated with a position of the gaming chip stack within the image.
- the gaming chip thickness can also vary based on where in the field of view for each camera 209/230 the gaming chip stacks are identified.
- the image service 227 can identify a thickness of 12 pixels in a first portion of a field of view for a camera 209/230 and a thickness of 15 pixels for a second portion of a field of view.
- the image service 227 can calculate a distance and angle to an identified gaming chip stack, and can calculate a thickness per gaming chip based on the distance and angle.
- the image service 227 can determine a width of an identified gaming chip stack, and can calculate a thickness per gaming chip based on the width of the gaming chip stack.
- the data store 215 can store a variety of gaming chip thicknesses associated with various capture angles and various gaming chip stack image positions.
- the data store 215 can store a variety of dimensional information for various gaming chips, such as a ratio of width to height, width to depth, and height to depth, including ratios at various camera angles and distances.
- the denomination of each of the gaming chips in the stack can be determined based on various visual characteristics.
- a color pattern on the edge of the gaming chips can be used to determine the denomination.
- one or more visual security features can be used to determine the denomination
- each denomination of currency can have a different holographic symbol or other visual security protection.
- the image service 227 or the gaming table device 206 can determine the denomination by identifying which visual security protection each gaming chip in a video feed contains.
- the chip recycler 242 can operate in a similar fashion to a coin recycler.
- the chip recycler 242 can be used in addition to or in place of chip tray 236.
- the gaming chips can be placed into an input area, such as a funnel, hopper, or tube, and then validated (authenticated), counted, sorted, and stored by the chip recycler 242. If gaming chips are to be paid out to players, exchanged for cash, or exchanged for other gaming chips, then the gaming table 206 or a table management system or a control system executed in the computing environment 203 can instruct the chip recycler 242 how much in gaming chips and which denominations to pay out.
- a chip recycler 242 within a cashier cage, a bank or vault, or kiosk can operate in a similar fashion. A user places the gaming chips in the chip recycler 242, the chip recycler 242 processes the gaming chips, and the chip recycler 242 either automatically outputs gaming chips in other
- Turning to FIG. 3, shown is a process 300 in a flowchart according to various embodiments of the present disclosure. It is noted that embodiments described herein may be practiced using an alternative order of the steps illustrated in FIG. 3. That is, the process 300 illustrated in FIG. 3 is provided as an example only, and the embodiments may be practiced using process flows that differ from those illustrated. Additionally, it is noted that not all steps are required in every embodiment. In other words, one or more of the steps may be omitted or replaced without departing from the spirit and scope of the embodiments. Further, steps may be performed in different orders, in parallel with one another, or omitted entirely, and certain additional steps may be performed without departing from the scope and spirit of the embodiments.
- the process 300 can include locating one or more stacks of gaming chips.
- Stacks can be composed of gaming chips with different values, colors, and other characteristics.
- the gaming chips may be located in a float tray, a splash tube, an RFID checkpoint device, a bet spot 239 (FIG. 2), or some other area.
- the splash tube can be an area to place recently played gaming chips.
- the RFID checkpoint device can be an RFID reader with an RFID antenna configured to read chips at a position in front of a dealer.
- the stacks of gaming chips can be separated by a spacer or be placed in predefined areas, such as, for example, in a row of a chip tray 236.
- the stacks of gaming chips can be placed in front of other stacks, thereby occluding chips partially or even completely.
- in some embodiments, the occluded gaming chips can still be resolved, for example, by using views from other cameras or by combining camera views.
- the process 300 can include counting gaming chips in each stack.
- the image service 227 can determine edges depicted in an image to identify a number of gaming chips in the stack. For example, a height of a stack of gaming chips depicted in an image can be determined. The stack of gaming chips can be isolated in the image, and edges between the gaming chips in the stack can be determined. A count of the chips can be calculated based on the edges.
- the process 300 can include evaluating denominations for each of the gaming chips identified.
- the gaming chips may be sorted. Different denominations of gaming chips can have different diameters.
- the gaming chips may be sorted according to size, or otherwise sorted according to denomination. It can be assumed that gaming chips are sorted in specific areas, such as in a chip tray. In other areas, such as a bet spot or splash tube, the gaming chips cannot be assumed to be sorted.
- chip transitions 400 generated using a spiking neural network (SNN) model reconstruction process, according to various embodiments of the present disclosure.
- the chip transitions 400 can include one or more transitions 403, 406, and 409.
- the chip transitions 400 can be stored in training data 224 (FIG. 2) and can be generated during a training procedure.
- An image service 227 (FIG. 2) can retrieve and utilize the chip transitions 400 to perform chip identification processes described herein.
- the chip transitions 400 can include mirrored versions of transitions.
- the chip transitions 400 can include mirrored transitions 403’, 406’, and 409’ that are also generated using the spiking neural network (SNN) model reconstruction process.
- Mirrored transitions can be used to augment chip identification processes, for example, in instances where a portion of chips in a chip stack are oriented upside down relative to other chips in the stack.
- the image service 227 can alter the chip transitions 400 to be orientated to match an orientation of the stack of gaming chips.
- an image 500 representing exemplary chip transition hits 503, 505, 507, 509, 511, 513, and 515 that may be detected by an image service 227.
- the image 500 is representative of a contrast enhanced grayscale image that can be captured and provided via one or more cameras 209 and/or 230, according to various embodiments of the present disclosure.
- the chip transition hits 503, 505, 507, 509, 511, 513, and 515 represent areas of the image 500 where the image service 227 has identified a particular pattern, color, color pattern, edge, or other structure indicative that the area within the hit, or a portion thereof, is a portion of a chip.
- Each chip transition hit can be analyzed to determine if one or more gaming chips and/or one or more stacks of gaming chips are depicted within the image.
- Turning to FIG. 6, shown is an image 600 representing merged hits for the contrast enhanced grayscale image 500 (FIG. 5) according to various embodiments of the present disclosure.
- the image service 227 can perform one or more hit resolving processes to merge hits into a new merged hit that includes the areas of the old hits, but is also expanded to include nearby areas determined to include a chip or portion thereof.
- the image service 227 can merge the transition hits 505 and 507 to generate a merged transition hit 603.
- the image service 227 can merge the transition hits 511 and 513 to generate a merged hit 605.
- iterative merging of individual and merged hits into newly merged hits can augment the chip recognition processes by incrementally increasing an area of an image recognized as including a chip or portion thereof.
- Increasing the recognized chip area can facilitate resolving patterns, edges, colors, and other indicia required to identify a chip and correlate the chip to a particular denomination.
- two hits may each include a portion of a chip-identifying pattern.
- An image service 227 upon analyzing the pattern portion of each hit, may fail to recognize the overall chip-identifying pattern. However, upon generating a merged hit, the merged hit may fully resolve the chip-identifying pattern such that it can be recognized by the image service 227.
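- The hit-merging step can be pictured as combining overlapping or nearby bounding boxes into a single expanded box. The sketch below, with an assumed pixel-gap threshold, is one way such a merge might be implemented; this disclosure does not prescribe a specific merge rule.

```python
def boxes_are_near(a, b, gap=10):
    """Return True if boxes (x1, y1, x2, y2) overlap or lie within `gap` pixels."""
    ax1, ay1, ax2, ay2 = a
    bx1, by1, bx2, by2 = b
    return not (ax2 + gap < bx1 or bx2 + gap < ax1 or
                ay2 + gap < by1 or by2 + gap < ay1)


def merge_hits(hits, gap=10):
    """Iteratively merge nearby hit boxes until no more merges are possible."""
    hits = list(hits)
    merged = True
    while merged:
        merged = False
        for i in range(len(hits)):
            for j in range(i + 1, len(hits)):
                if boxes_are_near(hits[i], hits[j], gap):
                    a, b = hits[i], hits[j]
                    hits[i] = (min(a[0], b[0]), min(a[1], b[1]),
                               max(a[2], b[2]), max(a[3], b[3]))
                    del hits[j]
                    merged = True
                    break
            if merged:
                break
    return hits


# Two nearby hits merge into one expanded hit; the distant hit is untouched.
print(merge_hits([(10, 10, 40, 60), (45, 12, 80, 58), (200, 200, 230, 250)]))
```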
- the solid-lined candidate areas 703, 705, 711, and 713 can represent areas of the image 700 determined to possibly be background imagery.
- the candidate areas 703, 705, 711, and 713 can be generated by an image service 227 that analyzes patterns and other indicia in and around hit areas.
- the image service 227 can determine areas of the image 700 that are likely to include a chip and areas that are likely to include background imagery.
- the image service 227 can track the areas determined to likely include a chip or chip stack, which are indicated via generation of dash-lined candidate areas 707 and 709.
- suspected background candidate areas 703, 705, 711, and 713 can be filtered out using mirroring techniques, while the suspected chip candidate areas can be retained as suspected stacks and/or chips.
- the image service 227 can apply a mirror model algorithm to the image 700 to assess if the candidate areas 703, 705, 711, and 713 constitute background imagery or depicts an object (or portion thereof), such as a chip or chip stack.
- the image service 227 can use the SNN model to learn an area of a background image during training.
- the background image can correspond to an image taken from the same or a similar view as the image 700 while no chips or stacks of chips are present.
- the image service 227 can match the candidate areas 703, 705, 707, 709, 711, and 713 from the image 700 with the same areas from the background image. If a candidate area matches the corresponding area of the background image, the candidate area can be excluded from further chip recognition and analysis processes. For example, the image service 227 can determine that candidate areas 703, 705, 711, and 713 match corresponding areas of the background image, and, accordingly, the image service 227 can exclude the candidate areas from further analyses. In the same example, the image service 227 can determine that candidate areas 707 and 709 do not match corresponding areas of the background image, and the image service 227 can retain the candidate areas for further analysis. The image service 227 can also discard any hits in the excluded candidate areas.
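- One way to realize the background-matching step above is to compare each candidate region of a captured image to the corresponding region of a stored background image and discard candidates whose difference falls below a threshold. The sketch below uses a mean absolute difference with an assumed threshold; the mirror-model algorithm and SNN-learned background described in this disclosure are not reproduced here.

```python
import numpy as np


def is_background(image, background, box, threshold=12.0):
    """Return True if the candidate region in `image` closely matches the same
    region of the stored `background` image (mean absolute pixel difference)."""
    x1, y1, x2, y2 = box
    region = image[y1:y2, x1:x2].astype(np.float32)
    reference = background[y1:y2, x1:x2].astype(np.float32)
    return float(np.mean(np.abs(region - reference))) < threshold


def filter_candidates(image, background, candidate_boxes):
    """Keep only candidate areas that do not match the background image."""
    return [box for box in candidate_boxes
            if not is_background(image, background, box)]
```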
- the image service 227 can merge candidate areas that are not located in excluded areas, and can further process the merged candidate area (e.g., by applying mirror algorithms, etc.) to precisely identify areas of the image 800 that include the stack 803.
- the image service 227 can merge candidate areas 707 and 709, and process the merged area to identify the stack 803.
- the image service 227 can apply a size filter to hits and/or candidate areas so that the hits and/or candidate areas containing more than one stack are split into multiple hits and/or multiple candidate areas, each hit and/or candidate area encompassing one stack.
- in image 901, the image service 227 can estimate the top of a stack 904 of gaming chips by identifying the rim 909 between the upper face 912 of a highest gaming chip 902 and an edge 915 of the highest gaming chip 902.
- the image service 227 can detect transitions between the gaming chips. Once the transitions are detected, the image service 227 can determine a count of gaming chips in the stack 904. Further, the image service 227 can isolate each individual gaming chip for color analysis or other analysis.
- the estimation of the top of the stack 904 relies on multiple concepts.
- the top of a stack 904 is at the bottom of an ellipse representing the upper face 912 of the highest chip 902.
- the region around the top of the stack transition is generally lighter above and darker below due to the lighting conditions.
- the upper face 912 just above the top of the stack transition is usually heterogeneous compared to the gaming chip edges.
- the image 901 can correspond to an identified area that includes a stack 904 of gaming chips.
- the image service 227 can perform horizontal edge detection on the image 901 to generate the image 903.
- the image service 227 can compute a horizontal edge detection using a Sobel filter or other filter to generate one or more horizontal lines emphasizing edges 906 between chips.
- the image service 227 can reduce the image area of an edge 906 to one horizontal line by averaging multiple horizontal lines clustered near the same edge 906. For example, the image service 227 can perform a horizontal line and transition estimation on the image 903 and average the horizontal lines therein to generate the image 905. Within the image 905, the image service 227 can find peaks 908 in the averaged horizontal lines. The peaks 908 can represent the locations of transitions 910 between chips. In image 903, some transitions 910 may be missing due to a lack of contrast in the original image 901 that was captured by cameras 209 and/or 230 (FIG. 2). The image service 227 can fill the missing transitions 910 by estimating the chip thickness.
- the image service 227 can identify a chip in one or more of the images 901, 903, 905, and 907, and can compute locations of chip transitions 910 based on calculations of the chip thickness (e.g., a transition 910 may occur per every chip’s thickness worth of distance in a stack 904). Further, as shown in image 907, the image service 227 can identify individual gaming chips in the original image 901 using the lines estimated in image 905. In at least one embodiment, the image service 227 can include separation lines 912 between each of the gaming chips in image 903 as shown in the image 907. The separation lines 912 can further facilitate resolving transitions 910 between the chips.
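- A sketch of the horizontal edge detection and transition estimation described above, using OpenCV's Sobel filter and a simple row-average peak search; the smoothing and peak-selection parameters are assumptions and would need tuning for real table imagery.

```python
import cv2
import numpy as np
from scipy.signal import find_peaks


def estimate_transitions(stack_image_gray, approx_chip_thickness_px=15):
    """Detect horizontal chip-to-chip transitions in a cropped grayscale image
    of a chip stack and return their row positions."""
    # Emphasize horizontal edges (vertical intensity gradient).
    edges = cv2.Sobel(stack_image_gray, cv2.CV_32F, dx=0, dy=1, ksize=3)
    # Collapse each row to a single value by averaging its absolute response.
    row_profile = np.mean(np.abs(edges), axis=1)
    # Peaks in the profile mark candidate transitions; require peaks to be at
    # least roughly one chip thickness apart.
    peaks, _ = find_peaks(row_profile, distance=int(0.7 * approx_chip_thickness_px))
    return peaks


def count_from_transitions(transition_rows, stack_top_row, stack_bottom_row,
                           approx_chip_thickness_px=15):
    """Estimate a chip count, filling gaps where low contrast hid a transition."""
    if len(transition_rows) >= 2:
        spacing = np.median(np.diff(transition_rows))
    else:
        spacing = approx_chip_thickness_px
    return round((stack_bottom_row - stack_top_row) / spacing)
```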
- Turning to FIG. 10, shown are images depicting example gaming chips 1000 of various denominations according to various embodiments of the present disclosure.
- an assumption can be made that each denomination of gaming chips 1000 corresponds to a dominant color.
- the image service 227 can identify a color with the highest number of pixels. The image service 227 can determine the identified color as the dominant color. In some embodiments, the image service 227 can modify an image prior to identifying the color of pixels.
- the image service 227 can perform a color correction and/or color enhancement algorithm based on the camera used to capture the image, lighting conditions associated with the area, or some other deficiency. As an example, the image service 227 may adjust a contrast, temperature, or other property of the image based on known sensor deficiencies in the camera. In some embodiments, the image service 227 can perform a training process involving analyzing image captures from cameras 209/230 including items of known colors to identify necessary color correction and/or color enhancement.
- the image service 227 can utilize the training data 224 to attribute a denomination to an imaged gaming chip 1001 based on the dominant color.
- an imaged gaming chip 1001 may have a denomination of $25 US Dollars when the dominant color is green, have a denomination of $1,000 US Dollars when the dominant color is a first shade of blue, and have a denomination of $5,000 US Dollars when the dominant color is a second shade of blue.
- a color table can be created with denominations and dominant colors.
- the color table can include hue, saturation, and value (HSV) data for each of the dominant colors.
- the color table can be stored in training data 224.
- the image service 227 can utilize a predefined color-value table.
- the predefined color-value table can be manually tuned or automatically trained for each casino environment including the specific set of gaming chips 1000 used in the casino, the lighting conditions, the ambient lighting, and other factors.
- the color table can include definitions of upper and lower HSV values for each color, which can be stored in the training data 224 as a color value (CV) table.
- the training process can include manually setting the minimum and maximum HSV values corresponding to the dominant color and storing the entered settings in training data 224.
- the training process can include using machine learning to learn colors of bands patterned onto chips 1000.
- the image service 227 can utilize a machine learning classifier to learn HSV values corresponding to the dominant colors, such as, for example, using histogram distance classification training. With machine learning, the image service 227 can analyze images of each denomination of gaming chips 1000 from various angles in a casino environment. The image service 227 can learn the colors of bands for each denomination and store the ranges of colors as a CV table.
- machine learning can be used to generate the CV table from various images, while in others, machine learning can be used to tune and/or calibrate one or more stored standard CV tables based on processing various images depicting chips.
- a CV table can be manually adjusted or automatically retrained.
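- One simple way to build a color value (CV) table from labeled training images, consistent with the min/max HSV description above, is to compute per-channel bounds over chip-band pixels for each denomination. This is an illustrative sketch under that assumption; the histogram-distance classification mentioned above is not shown here.

```python
import cv2
import numpy as np


def learn_hsv_ranges(training_images_by_denomination):
    """Build a CV table mapping denomination -> (lower HSV, upper HSV) from
    BGR training images of each denomination's chip bands."""
    cv_table = {}
    for denomination, images in training_images_by_denomination.items():
        hsv_pixels = []
        for image_bgr in images:
            hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
            hsv_pixels.append(hsv.reshape(-1, 3))
        all_pixels = np.vstack(hsv_pixels)
        # Use percentiles rather than strict min/max to tolerate outlier pixels.
        lower = np.percentile(all_pixels, 2, axis=0).astype(np.uint8)
        upper = np.percentile(all_pixels, 98, axis=0).astype(np.uint8)
        cv_table[denomination] = (lower, upper)
    return cv_table
```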
- a light grey zone can be used to compute a white balance in real time in order to reduce and/or remove undesired lighting effects caused by light emitted by signage or other light sources.
- the gaming chips 1000 can have multiple colors, such as, for example, from chip inserts, from multiple injections during an injection mold process, or from different materials in a compression mold process. In some embodiments, the gaming chips 1000 can have up to three different colors.
- the image service 227 can identify a denomination for the gaming chip 1001.
- the image service 227 can identify regions of interest, such as region 1021, to ignore the edges 1011 of the gaming chip 1001 to focus on the color of the inserts 1013, 1015, and 1017 located in the central area of the gaming chip 1001.
- the region of interest 1021 can be determined to correspond to a middle or central area of the gaming chip 1001.
- $25 denomination gaming chips can have a green pixel count between 30% and 60%, a blue pixel count between 0% and 30%, and a brown pixel count between 0% and 30%.
- the image service 227 can identify the gaming chip 1001 as a $25 denomination.
- the image service 227 can count the gaming chips in the stack.
- the image service 227 can determine a contour for each of the gaming chips in the stack.
- the image service 227 can calculate a color histogram 1100 using a CV table from training data 224.
- the color histogram 1100 can include a combined red, green, blue (RGB) histogram 1101, and can also include histograms for each color in an RGB color model.
- the color histogram 1100 can include a green histogram 1103, a blue histogram 1105, and a red histogram 1107.
- the image service 227 can apply a color mask to a chip image, apply morphological transformations, such as erosion, closing, and other transformations, and count a number of pixels corresponding to the respective color.
- the image service 227 can select a dominant color based on the number of pixels. As an example, the image service 227 can select the dominant color as the color with the highest pixel count.
- the image service 227 can calculate a percentage of pixels for each color from a histogram 1100, 1101, 1103, and/or 1105. The calculated pixel percentages can represent color percentages that can be compared to values in a CV table in training data 224.
- the image service 227 can determine the chip value or denomination corresponding to the gaming chip with the determined dominant color. If the pixel percentage for each color falls within a range of one of the CV tables in training data 224, the image service 227 can identify the gaming chip as having the corresponding denomination in training data 224. As an example, for a $25 gaming chip, the image service 227 can determine a region has pixel percentages of 36% green, 12% blue, and 21% brown. The image service 227 can identify the gaming chip as a $25 denomination by determining that those values fall within the ranges of 30%-60% green, 0-30% blue, and 0-30% brown.
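- The percentage-range check in the example above might look like the following sketch, where HSV masks (hypothetical ranges) count green, blue, and brown pixels in a region of interest and the results are compared to the stored $25 ranges quoted in the text.

```python
import cv2
import numpy as np

# Hypothetical HSV ranges for the colors of interest; real ranges come from a
# trained CV table and depend on the chip set and lighting.
COLOR_RANGES = {
    "green": ((40, 60, 60), (80, 255, 255)),
    "blue": ((100, 60, 60), (130, 255, 255)),
    "brown": ((10, 60, 20), (25, 255, 200)),
}

# Denomination ranges as stated above: 30-60% green, 0-30% blue, 0-30% brown.
DENOMINATION_RANGES = {25: {"green": (30, 60), "blue": (0, 30), "brown": (0, 30)}}


def color_percentages(region_bgr):
    """Compute the percentage of region pixels matching each color of interest."""
    hsv = cv2.cvtColor(region_bgr, cv2.COLOR_BGR2HSV)
    total = hsv.shape[0] * hsv.shape[1]
    percentages = {}
    for name, (lower, upper) in COLOR_RANGES.items():
        mask = cv2.inRange(hsv, np.array(lower, np.uint8), np.array(upper, np.uint8))
        percentages[name] = 100.0 * cv2.countNonZero(mask) / total
    return percentages


def classify_denomination(region_bgr):
    """Return the first denomination whose ranges contain all color percentages."""
    percentages = color_percentages(region_bgr)
    for denomination, ranges in DENOMINATION_RANGES.items():
        if all(low <= percentages[c] <= high for c, (low, high) in ranges.items()):
            return denomination
    return None
```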
- Turning to FIG. 12, shown is an image 1200 of an example gaming chip 1201 side by side with a 2D flat view of a side 1203 of the gaming chip 1201 according to various embodiments of the present disclosure.
- the present system can compute and leverage chip dissimilarity values to support gaming chip recognition and identification.
- a chip dissimilarity value can refer to a difference (e.g., such as a percent difference) between a control image and a test image of a gaming chip.
- the test image can be determined to be equivalent to the control image.
- the image service 227 can calculate a chip dissimilarity for the gaming chip 1201.
- the image service can compare each pixel value of the detected chip 1201 to flattened 2D views of the chip side 1203, such as a flattened 2D view 1205.
- art work for each gaming chip 1201 can be used to generate the 2D flat views 1205 of the sides 1203 of the gaming chips 1201.
- Images of the side 1203 of the gaming chip 1201 can also be used to generate the 2D flat views 1205 of the sides 1203 of the gaming chips 1201.
- the 2D flat views 1205 can each be stored in training data 224 and associated with a denomination.
- the image service 227 can slide the region 1301 along the view 1303 to determine if a match exists between regions therein, such as, for example, region 1306. For each position while sliding the region 1301, the image service 227 can calculate a cost function.
- the cost function can be a sum of absolute pixel differences between the region 1301 and the corresponding region of the view 1303.
- the image service 227 can calculate a cost function between the region 1301 and the region 1306 by calculating and summing pixel differences between the regions.
- the image service can store the lowest cost function calculated during sliding with respect to views 1303 of each denomination.
- the image service 227 can compare the lowest cost function result for each denomination.
- the image service 227 can identify the detected gaming chip as having the denomination with the lowest cost function.
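- A minimal sketch of the sliding-window cost function described above, computing the sum of absolute pixel differences between a detected-chip region and each horizontal position along a stored 2D flat view, and keeping the lowest cost per denomination; image preparation such as scaling and alignment is simplified away.

```python
import numpy as np


def sliding_sad(region, flat_view):
    """Slide `region` horizontally along `flat_view` (both grayscale arrays of
    the same height) and return the lowest sum-of-absolute-differences cost."""
    h, w = region.shape
    best = None
    for x in range(flat_view.shape[1] - w + 1):
        window = flat_view[:, x:x + w].astype(np.float32)
        cost = float(np.sum(np.abs(window - region.astype(np.float32))))
        best = cost if best is None or cost < best else best
    return best


def match_denomination(region, flat_views_by_denomination):
    """Return the denomination whose 2D flat view produces the lowest cost."""
    costs = {d: sliding_sad(region, v) for d, v in flat_views_by_denomination.items()}
    return min(costs, key=costs.get)
```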
- the image service 227 can identify a denomination of a gaming chip by comparing an image of a detected gaming chip with sub-parts of sample images in training data using deformation. In other embodiments, the image service 227 can determine denomination using one or more of a hidden Markov model (HMM), a dynamic time warping (DTW) algorithm, and other algorithms to encode variations in sizes, patterns, colorings, and other features for the gaming chips.
- Turning to FIG. 14, shown is an example of a bimodal stack histogram 1400 with a first peak 1401 and a second peak 1403 according to various embodiments of the present disclosure.
- the image service 227 can detect chip stacks as discussed herein and identify pixel values to generate the histogram 1400.
- the image service can recognize gaming chip denominations by computing dominant color percentages in gaming chip images, and comparing the computed percentages to stored percentages associated with specific denominations. Lighting and other factors can obfuscate true pixel values, and result in a histogram 1400 that is appropriately bimodal (e.g., indicating that a gaming chip stack includes two dominant colors), but includes pixel noise that undesirably weights computed color percentages.
- the image service 227 can resolve the true pixel values by reducing the diversity of pixel values in an image (e.g., thereby decreasing a number of colors therein).
- the image service 227 can perform clustering techniques on an image in a manner such that only dominant colors are retained.
- the image service 227 can order the different colors therein by relative strength based on a computed dominant color histogram 1400 or a dominant color spectrum.
- the image service 227 can also tag each color in the histogram 1400 spectrum with a label.
- the image service 227 can then determine equivalency of two or more colors in the image. Two colors can be determined to not be equivalent if the distance between the color components is high.
- the image service 227 can merge colors determined to be equivalent to a standard dominant color.
- the dominance of one or more colors in an image depicting gaming chips can be utilized to identify a denomination thereof.
- the image service 227 can identify the denomination of the gaming chip by matching the dominant color to a dominant color of a particular denomination.
- the image service 227 can perform methods to evaluate and combine image areas of similar color.
- the image service 227 can classify the denomination of gaming chips by performing one or more K-means clustering techniques. For example, a gaming chip stack image can be partitioned into a number of clusters 1404 to create a first clustered image 1405 based on pixel color and detected edges.
- the K-means process may provide optimal results if the number of clusters 1404 present is the same as a selected K value. For illustrative and descriptive purposes, a number of clusters 1404 are indicated in FIG. 14. For each cluster 1404, the image service 227 can determine the colors in the cluster 1404 that are as close as possible to each other (e.g., based on pixel value) while being as far as possible from the colors in other clusters 1404.
- Each cluster 1404 can be defined based on pixels, pixel values, and a cluster centroid assigned by the image service 227.
- the cluster centroid can be positioned therein to minimize the sum of the distances between the pixels of the cluster 1404 and the cluster centroid.
- the image service 227 can modify all of the pixels in the cluster 1404 to take the color of the cluster centroid, thereby standardizing the pixels to a single color.
- the image service 227 can reduce the number of distinct colors after performing cluster merging via K-means techniques. For example, the image service 227 can identify clusters 1404 that are in proximity and share substantially similar or identical color, and can merge the clusters 1404 into a merged cluster 1406.
- K-means clustering and merging may be performed by generating 10 initial clusters 1404 in a CIELAB Euclidean space, identifying clusters 1404 of similar colors, and merging clusters 1404 of similar colors into 2 merged clusters 1406.
- the image service 227 can compute dominant color percentages that may be used to identify a denomination of gaming chips included therein by comparing the computed dominant color percentages to stored color percentages associated with specific denominations. For example, from a second clustered image 1407, the image service 227 can compute dominant color percentages 1409 and 1411 to be 51.2% for a first dominant color and 48.8% for a second dominant color. The image service 227 can compare the dominant color percentages 1409 and 1411 to stored color percentages, and identify a denomination associated with dominant color percentages of about 51.2% of the first dominant color and about 48.8% of the second dominant color.
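- A sketch of the K-means reduction described above: pixels are converted to CIELAB, clustered (K = 10 here, matching the example), clusters with nearby centroids are merged, and the resulting dominant-color percentages can then be compared to stored values. The merge distance is an assumed parameter, and scikit-learn's KMeans stands in for whatever clustering implementation a deployment might use.

```python
import cv2
import numpy as np
from sklearn.cluster import KMeans


def dominant_color_percentages(stack_image_bgr, k=10, merge_distance=20.0):
    """Cluster pixel colors in CIELAB space, merge similar clusters, and return
    the percentage of pixels assigned to each merged dominant color."""
    lab = cv2.cvtColor(stack_image_bgr, cv2.COLOR_BGR2LAB)
    pixels = lab.reshape(-1, 3).astype(np.float32)
    kmeans = KMeans(n_clusters=k, n_init=10, random_state=0).fit(pixels)
    centroids = kmeans.cluster_centers_
    labels = kmeans.labels_

    # Merge clusters whose centroids are within `merge_distance` in Lab space.
    merged_label = list(range(k))
    for i in range(k):
        for j in range(i + 1, k):
            if np.linalg.norm(centroids[i] - centroids[j]) < merge_distance:
                merged_label[j] = merged_label[i]

    counts = {}
    for label in labels:
        key = merged_label[label]
        counts[key] = counts.get(key, 0) + 1
    total = len(labels)
    return {key: 100.0 * count / total for key, count in counts.items()}
```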
- the image service 227 can identify a denomination of the gaming chips included in the images 1405 and 1407. It will be understood by one of ordinary skill in the art that the above techniques can be repeated for individual image areas, for example, to identify multiple gaming chip denominations included in a single image.
- Turning to FIG. 15, an example hardware diagram of a computing device 1500 is illustrated. Any of the image service 227, RFID service 228, cameras 209, or functionality described in the gaming table 206 may be implemented, in part, using one or more elements of the computing device 1500.
- the computing device 1500 can include one or more of a processor 1510, a Random Access Memory (“RAM”) 1520, a Read Only Memory (“ROM”) 1530, a memory device 1540, a network interface 1550, and an Input Output (“I/O”) interface 1560.
- the elements of the computing device 1500 are communicatively coupled via a bus 1502.
- the processor 1510 can include an arithmetic processor, Application Specific Integrated Circuit (“ASIC”), or other types of hardware or software processors.
- the RAM and ROM 1520 and 1530 can include a memory that stores computer-readable instructions to be executed by the processor 1510.
- the memory device 1540 stores computer-readable instructions thereon that, when executed by the processor 1510, direct the processor 1510 to execute various aspects of the present disclosure described herein.
- the processor 1510 includes an ASIC
- the processes described herein may be executed by the ASIC according to an embedded circuitry design of the ASIC, by firmware of the ASIC, or both an embedded circuitry design and firmware of the ASIC.
- the memory device 1540 comprises one or more of an optical disc, a magnetic disc, a semiconductor memory (i.e., a semiconductor, floating gate, or similar flash based memory), a magnetic tape memory, a removable memory, combinations thereof, or any other known memory means for storing computer-readable instructions.
- the network interface 1550 can include hardware interfaces to communicate over one or more networks, such as the network 212.
- the I/O interface 1560 can include device input and output interfaces such as keyboard, pointing device, display, communication, and other interfaces.
- the bus 1502 can electrically and communicatively couple the processor 1510, the RAM 1520, the ROM 1530, the memory device 1540, the network interface 1550, and the I/O interface 1560, so that data and instructions may be communicated among them.
- the processor 1510 is configured to retrieve computer-readable instructions stored on the memory device 1540, the RAM 1520, the ROM 1530, or another storage means, and copy the computer-readable instructions to the RAM 1520 or the ROM 1530 for execution, for example.
- the processor 1510 is further configured to execute the computer-readable instructions to implement various aspects and features of the present disclosure.
- the processor 1510 may be adapted and configured to execute the processes described above with reference to FIG. 3, including the processes described as being performed by the image service 227 or gaming table 206.
- the memory device 1540 may store the data stored in the data store 215.
CONCLUSION
- such computer-readable media can comprise various forms of data storage devices or media such as RAM, ROM, flash memory, EEPROM, CD-ROM, DVD, or other optical disk storage, magnetic disk storage, solid state drives (SSDs) or other data storage devices, any type of removable non-volatile memories such as secure digital (SD), flash memory, memory stick, etc., or any other medium which can be used to carry or store computer program code in the form of computer-executable instructions or data structures and which can be accessed by a computer.
- Computer-executable instructions comprise, for example, instructions and data which cause a computer to perform one specific function or a group of functions.
- program modules are often reflected and illustrated by flow charts, sequence diagrams, exemplary screen displays, and other techniques used by those skilled in the art to communicate how to make and use such computer program modules.
- program modules include routines, programs, functions, objects, components, data structures, application programming interface (API) calls to other computers whether local or remote, etc. that perform particular tasks or implement particular defined data types, within the computer.
- Computer-executable instructions, associated data structures and/or schemas, and program modules represent examples of the program code for executing steps of the methods disclosed herein.
- the particular sequence of such executable instructions or associated data structures represent examples of corresponding acts for implementing the functions described in such steps.
- An exemplary system for implementing various aspects of the described operations includes a computing device including a processing unit, a system memory, and a system bus that couples various system components including the system memory to the processing unit.
- the computer will typically include one or more data storage devices for reading data from and writing data to.
- the data storage devices provide nonvolatile storage of computer-executable instructions, data structures, program modules, and other data for the computer.
- Computer program code that implements the functionality described herein typically comprises one or more program modules that may be stored on a data storage device.
- This program code usually includes an operating system, one or more application programs, other program modules, and program data.
- a user may enter commands and information into the computer through a keyboard, touch screen, pointing device, a script containing computer program code written in a scripting language, or other input devices (not shown), such as a microphone, etc.
- input devices are often connected to the processing unit through known electrical, optical, or wireless connections.
- the computer that effects many aspects of the described processes will typically operate in a networked environment using logical connections to one or more remote computers or data sources, which are described further below.
- Remote computers may be another personal computer, a server, a router, a network PC, a peer device or other common network node, and typically include many or all of the elements described above relative to the main computer system in which the inventions are embodied.
- the logical connections between computers include a local area network (LAN), a wide area network (WAN), virtual networks (WAN or LAN), and wireless LANs (WLAN) that are presented here by way of example and not limitation.
- When used in a LAN or WLAN networking environment, a computer system implementing aspects of the invention is connected to the local network through a network interface or adapter.
- When used in a WAN or WLAN networking environment, the computer may include a modem, a wireless link, or other mechanisms for establishing communications over the wide area network, such as the Internet.
- program modules depicted relative to the computer, or portions thereof may be stored in a remote data storage device. It will be appreciated that the network connections described or shown are exemplary and other mechanisms of establishing communications over wide area networks or the Internet may be used.
- although steps of various processes may be shown and described as being in a preferred sequence or temporal order, the steps of any such processes are not limited to being carried out in any particular sequence or order, absent a specific indication of such to achieve a particular intended result. In most cases, the steps of such processes may be carried out in a variety of different sequences and orders, while still falling within the scope of the claimed inventions. In addition, some steps may be carried out simultaneously with other steps.
- the phrase “at least one of X, Y, and Z,” unless specifically stated otherwise, is to be understood to mean that an item, term, etc. can be X, Y, or Z, or any combination thereof (e.g., X, Y, and/or Z).
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Image Analysis (AREA)
Abstract
Vision-based recognition systems and methods can be used to identify gaming chips and stacks of gaming chips on a gaming table. The vision-based recognition systems and methods can include imaging devices that capture images of gaming tables, and an image service that can analyze captured images to identify gaming chips therein and determine denominations of the identified gaming chips. To identify gaming chips, an image service can identify gaming chip edges, gaming chip transitions, and other features. To determine gaming chip denominations, an image service can identify colors, patterns, and other gaming chip indicia, and match or correlate the identified colors, patterns, and indicia to stored colors, patterns, indicia, and other metrics associated with gaming chip denominations. To augment gaming chip identification and denomination determination, an image service can perform image processing methods, such as, for example, executing machine learning models and applying various algorithms.
Description
VISION BASED RECOGNITION OF GAMING CHIPS
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims the benefit of and priority to U.S. Provisional Patent Application No. 62/739,918, filed October 2, 2018, and entitled “VISION BASED RECOGNITION OF GAMING CHIPS,” which is incorporated herein by reference in its entirety.
BACKGROUND
[0002] Gaming chips can be used in place of currency and credits utilized in wagering and other gaming environments. Chips are typically used on a gaming table, often positioned in stacks of chips. To track transactions and wagering activities at a gaming table, a person can visually count chips located thereon and correlate the identified chips to one or more denominations of currency, of credits, or other values. A gaming table can include hundreds of gaming chips, and, therefore, human-based manual tracking of chips can become prohibitively costly and time consuming. Previous approaches to tracking chips have utilized image recognition systems to monitor table game activity; however, previous systems are not tolerant to partial visual obfuscations of chips or to atypical chip arrangements, and may be incapable of resolving and identifying gaming chips in images when viewed against background imagery.
Accordingly, there is a long-felt, but unresolved need for vision-based chip recognition systems and methods that can accurately and precisely identify gaming chips of varying denominations on a gaming table in a manner that resolves partial visual obfuscations, demonstrates tolerance of atypical chip arrangements, and resolves chip images against background imagery.
SUMMARY
[0003] Briefly described, and according to one embodiment, aspects of the present disclosure generally relate to systems and methods for vision-based recognition of gaming chips. In various embodiments, vision-based recognition may also be referred to as image-based recognition and can include performing image-based recognition on individual images, on video feeds, on computed image data, such as by combining two or more images or computing data from one or more images, or on other visual data.
[0004] Described herein, in various embodiments, are systems and methods for: 1) identifying gaming chips and stacks of gaming chips on a gaming table; and 2) determining a denomination of identified gaming chips. A vision-based gaming chip recognition system can be deployed in environments with one or more gaming tables, such as, for example, casinos. The system can utilize one or more cameras to capture images of the gaming tables. The system can include an image service that analyzes captured images to identify, if present therein, gaming chips and gaming chip stacks.
The image service can also analyze images of gaming chips and gaming chip stacks to identify denominations thereof.
[0005] For example, the system can include, in one or more databases, tables and/or other data objects that relate colors and color patterns to specific denominations. The image service can identify colors and color patterns in a gaming chip image, compare the identified colors and color patterns to one or more stored color and/or pattern tables, and, based on matches identified there between, determine a denomination of gaming chips in the gaming image. In the same example, the image service may also perform one or more image processing methods to correct or control for image distortions, remove background imagery, and resolve partial views of gaming chips.
[0006] The image service can perform image processing methods that can include, but are not limited to: 1) removing background imagery, for example, via executing mirror algorithms; 2) resolving partial views, for example, by replacing portions of the partial views using views from other cameras and/or by combining camera views; and 3) controlling for image distortions, for example, by executing one or more K-means processes as described herein. To recognize gaming chips and gaming chip stacks, the image service can perform techniques including, but not limited to: 1) edge detection algorithms and methods; 2) gaming chip transition detection algorithms and methods; 3) machine learning methods, for example, to train machine learning models to recognize colors and color patterns of denominations, and recognize those patterns in gaming chip images; and 4) other methods described herein.
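As a non-limiting illustration of the K-means processing referenced above and in the twelfth, thirteenth, and twentieth clauses below, a minimal Python sketch that partitions the pixel colors of a chip image region into clusters is shown here; the scikit-learn dependency, function name, and default of three clusters are assumptions for illustration, not the disclosed implementation.

```python
# Illustrative sketch only: partition pixel colors of a chip image region into
# clusters with K-means, approximating the dominant colors of a gaming chip.
import numpy as np
from sklearn.cluster import KMeans

def cluster_pixel_colors(region_bgr, n_clusters=3):
    """Return cluster center colors and the fraction of pixels in each cluster."""
    pixels = region_bgr.reshape(-1, 3).astype(np.float32)
    kmeans = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(pixels)
    counts = np.bincount(kmeans.labels_, minlength=n_clusters)
    return kmeans.cluster_centers_, counts / counts.sum()
```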
[0007] The system can also include RFID elements for detecting gaming chips and gaming chip stacks. For example, the gaming chips can include RFID tags that can be read by RFID readers configured within a gaming table. Upon reading a gaming chip’s RFID tag, the system may receive an RFID identifier that can be correlated with stored RFID identifiers to identify the gaming chip and/or the gaming chip’s denomination.
[0008] According to a first clause, a system including: A) a gaming table; B) at least one imaging device; and C) at least one computing device in communication with the at least one imaging device, the at least one computing device being configured to at least: 1) receive an image from the at least one imaging device; 2) locate at least one stack of gaming chips in the image; 3) determine a count of a plurality of gaming chips in the at least one stack of gaming chips; and 4) determine a denomination of each of the plurality of gaming chips in the at least one stack of gaming chips.
[0009] According to a second clause, the system of the first clause or any other clause, wherein the at least one computing device is further configured to: A) perform a
horizontal edge detection on the image; B) perform a horizontal line and transition estimation based at least in part on the horizontal edge detection; and C) determine the count based at least in part on the horizontal line and transition estimation.
[0010] According to a third clause, the system of the first clause or any other clause, wherein the at least one computing device is further configured to: 1) determine a height of the at least one stack of gaming chips; and 2) determine the count of the plurality of gaming chips by dividing the height by a relative gaming chip thickness.
[0011] According to a fourth clause, the system of the third clause or any other clause, wherein the relative gaming chip thickness is associated with a capture angle of the image.
[0012] According to a fifth clause, the system of the first clause or any other clause, wherein the at least one computing device is further configured to locate the at least one stack by excluding background of the image from analysis.
[0013] According to a sixth clause, the system of the fifth clause or any other clause, wherein the background is excluded at least in part by applying a mirror algorithm.
[0014] According to a seventh clause, the system of the first clause or any other clause, wherein the at least one computing device is further configured to verify the count of the plurality of gaming chips in the at least one stack of gaming chips based on reading an RFID tag in each of the plurality of gaming chips in the at least one stack of gaming chips.
[0015] According to an eighth clause, the system of the first clause or any other clause, wherein the at least one computing device is further configured to: A) determine a dominant color associated with at least one of the plurality of gaming chips; and B) determine the denomination of the at least one of the plurality of gaming chips based at least in part on the dominant color.
[0016] According to a ninth clause, the system of the first clause or any other clause, wherein the at least one computing device is further configured to locate the at least one stack of gaming chips in the image based at least in part on a spiking neural network (SNN) model.
[0017] According to a tenth clause, the system of the first clause or any other clause, wherein the at least one computing device is further configured to generate a histogram corresponding to a region of a gaming chip of the plurality of gaming chips.
[0018] According to an eleventh clause, the system of the first clause or any other clause, wherein the at least one computing device is further configured to: A) determine a plurality of color percentages corresponding to counts of pixel color; and B) determine that each of the plurality of color percentages falls within predefined ranges associated with one of a plurality of denominations of gaming currency.
[0019] According to a twelfth clause, the system of the first clause or any other clause, wherein the at least one computing device is further configured to partition colors of pixels into clusters.
[0020] According to a thirteenth clause, the system of the first clause or any other clause, wherein the at least one computing device is further configured to partition colors of pixels into clusters based at least in part on K-means clustering.
[0021] According to a fourteenth clause, a method including: A) processing, via at least one computing device, at least one image to locate a stack of gaming chips; B) determining, via the at least one computing device, a count of a plurality of gaming chips in the stack of gaming chips; and C) determining, via the at least one computing device, a denomination of each of the plurality of gaming chips in the at least one stack of gaming chips.
[0022] According to a fifteenth clause, the method of the fourteenth clause or any other clause, further including: A) performing, via the at least one computing device, a horizontal edge detection on the image; B) performing, via the at least one computing device, a horizontal line and transition estimation based at least in part on the horizontal edge detection; and C) determining, via the at least one computing device, the count based at least in part on the horizontal line and transition estimation.
[0023] According to a sixteenth clause, the method of the fourteenth clause or any other clause, further including: A) determining, via the at least one computing device, a dominant color associated with at least one of the plurality of gaming chips; and B) determining, via the at least one computing device, the denomination of the at least one of the plurality of gaming chips based at least in part on the dominant color.
[0024] According to a seventeenth clause, the method of the fourteenth clause or any other clause, further including generating, via the at least one computing device, a histogram corresponding to a region of a gaming chip of the plurality of gaming chips.
[0025] According to an eighteenth clause, the method of the fourteenth clause or any other clause, further including: A) determining, via the at least one computing device, a plurality of color percentages corresponding to counts of pixel color; and B) determining, via the at least one computing device, that each of the plurality of color percentages falls within predefined ranges associated with one of a plurality of denominations of gaming currency.
[0026] According to a nineteenth clause, the method of the fourteenth clause or any other clause, further including partitioning, via the at least one computing device, colors of pixels into clusters.
[0027] According to a twentieth clause, the method of the fourteenth clause or any other clause, wherein partitioning is performed based at least in part on K-means clustering.
BRIEF DESCRIPTION OF THE DRAWINGS
[0028] For a more complete understanding of the embodiments and the advantages thereof, reference is now made to the following description, in conjunction with the accompanying figures briefly described as follows:
[0029] FIG. 1 is an illustration of a gaming table and a camera according to various embodiments of the present disclosure.
[0030] FIG. 2 is a drawing of a networked environment according to various example embodiments.
[0031] FIG. 3 illustrates an example flowchart of certain functionality implemented by portions of the image service executed in a computing environment in the networked environment of FIG. 2 according to various embodiments of the present disclosure.
[0032] FIG. 4 illustrates generated chip transitions using a spiking neural network model according to various embodiments of the present disclosure.
[0033] FIG. 5 is an image representing chip transition hits on a contrast enhanced grayscale image according to various embodiments of the present disclosure.
[0034] FIG. 6 is an image representing merged hits for the contrast enhanced grayscale image from FIG. 5 according to various embodiments of the present disclosure.
[0035] FIG. 7 is an image showing candidate areas according to various embodiments of the present disclosure.
[0036] FIG. 8 is an image indicating identified stacks of gaming chips according to various embodiments of the present disclosure.
[0037] FIG. 9 shows example images from different stages in chip identification according to various embodiments of the present disclosure.
[0038] FIG. 10 shows images of example gaming chips with various denominations according to various embodiments of the present disclosure.
[0039] FIG. 11 is an example color histogram according to various embodiments of the present disclosure.
[0040] FIG. 12 is an image of an example gaming chip side by side with a 2D flat view of a side of the gaming chip according to various embodiments of the present disclosure.
[0041] FIG. 13 is an image depicting a comparison of a detected gaming chip to a 2D flat view according to various embodiments of the present disclosure.
[0042] FIG. 14 is an example of a bimodal stack histogram according to various embodiments of the present disclosure.
[0043] FIG. 15 is a schematic block diagram that illustrates an example computing environment employed in the networked environment of FIG. 2 according to various embodiments.
[0044] The drawings illustrate only example embodiments and are therefore not to be considered limiting of the scope described herein, as other equally effective embodiments are within the scope and spirit of this disclosure. The elements and features shown in the drawings are not necessarily drawn to scale, emphasis instead being placed upon clearly illustrating the principles of the embodiments. Additionally, certain dimensions may be exaggerated to help visually convey certain principles. In the
drawings, similar reference numerals between figures designate like or corresponding, but not necessarily the same, elements.
DETAILED DESCRIPTION
[0045] In the following paragraphs, the embodiments are described in further detail by way of example with reference to the attached drawings. In the description, well known components, methods, and/or processing techniques are omitted or briefly described so as not to obscure the embodiments. As used herein, the “present disclosure” refers to any one of the embodiments described herein and any equivalents. Furthermore, reference to various feature(s) of the “present embodiment” is not to suggest that all embodiments must include the referenced feature(s).
[0046] Among embodiments, some aspects of the present disclosure are implemented by a computer program executed by one or more processors, as described and illustrated. As would be apparent to one having ordinary skill in the art, one or more embodiments may be implemented, at least in part, by computer-readable instructions in various forms, and the present disclosure is not intended to be limiting to a particular set or sequence of instructions executed by the processor.
[0047] The embodiments described herein are not limited in application to the details set forth in the following description or illustrated in the drawings. The disclosed subject matter is capable of other embodiments and of being practiced or carried out in various ways. Also, the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use of "including,"
"comprising," or "having" and variations thereof herein is meant to encompass the items listed thereafter, additional items, and equivalents thereof. The terms "connected" and "coupled" are used broadly and encompass both direct and indirect connections and couplings. In addition, the terms "connected" and "coupled" are not limited to electrical, physical, or mechanical connections or couplings. As used herein the terms "machine," "computer," "server," and "work station" are not limited to a device with a single
processor, but may encompass multiple devices (e.g., computers) linked in a system, devices with multiple processors, special purpose devices, devices with various peripherals and input and output devices, software acting as a computer or server, and combinations of the above.
[0048] A gaming chip or chip, as used herein, can include any chip, plaque, jeton, or other gaming currency that may be used in a casino, gaming room, or digital game. Each gaming chip can represent a value that may or may not be predetermined. The gaming chips can be made from a rigid plastic material or clay to obtain a structure that is solid enough to resist conditions of use in casinos. The gaming chips can be used throughout a casino. For example, at gaming tables, gaming chips can be received for play or at the conclusion of a game or hand, cash can be received and gaming chips paid out (buy-in), and gaming chips may be paid out during play. In a cashier area, gaming chips can be received and cash can be paid out (cash out). Alternatively, cash can be received and gaming chips can be paid out (buy-in).
[0049] Turning now to the drawings, exemplary embodiments are described in detail. With reference to FIG. 1, shown is a gaming table 100 in a networked environment according to various embodiments of the present disclosure. The gaming table 100 can be monitored by one or more imaging devices 103, such as, for example, a camera. One or more gaming chips can be played on the gaming table as part of a wagering game. The video feed from the one or more imaging devices 103 can be used to locate stacks of one or more gaming chips, count the gaming chips in each stack, and evaluate a denomination for each of the gaming chips.
[0050] With reference to FIG. 2, shown is a networked environment 200 according to various embodiments. The networked environment 200 includes a computing environment 203, one or more gaming table devices 206, and one or more cameras 209,
which are in data communication with each other via a network 212. The network 212 includes, for example, the Internet, intranets, extranets, wide area networks (WANs), local area networks (LANs), wired networks, wireless networks, or other suitable networks, etc., or any combination of two or more such networks. For example, such networks may comprise satellite networks, cable networks, Ethernet networks, and other types of networks.
[0051] The computing environment 203 can include, for example, a server computer or any other system providing computing capability. Alternatively, the computing environment 203 may employ a plurality of computing devices that may be arranged, for example, in one or more server banks or computer banks or other arrangements. Such computing devices may be located in a single installation or may be distributed among many different geographical locations. For example, the computing environment 203 may include a plurality of computing devices that together may comprise a hosted computing resource, a grid computing resource, and/or any other distributed computing arrangement. In some cases, the computing environment 203 may correspond to an elastic computing resource where the allotted capacity of processing, network, storage, or other computing-related resources may vary over time.
[0052] Various applications and/or other functionality may be executed in the computing environment 203 according to various embodiments. Also, various data is stored in a data store 215 that is accessible to the computing environment 203. The data store 215 may be representative of a plurality of data stores 215 as can be appreciated. The data stored in the data store 215, for example, is associated with the operation of the various applications and/or functional entities described below. The data store 215 can include currency data 218, gaming data 221, and training data 224, among other data.
[0053] The components executed on the computing environment 203, for example, include an image service 227 and an RFID service 228, and other applications, services, processes, systems, engines, or functionality not discussed in detail herein. The image service 227 is executed to recognize gaming chips used on one or more tables 206. As an example, a stack of gaming chips can be positioned in various locations on a gaming table 206 during a wagering game. One or more images can be captured of the gaming table 206 and the image service 227 can locate and/or identify the one or more stacks of gaming chips in the images, count the gaming chips in each stack, and evaluate denominations for the gaming chips. It can be appreciated that some or all of the functionality as described with reference to the image service 227 can be executed in a computing device at the gaming table 206.
[0054] The currency data 218 can include a list of all active gaming chips including any identifiers associated with the gaming chips, such as, for example, RFID tag identifiers, barcode identifiers, visual characteristics including color information (e.g., such as color pixel values, thresholds, color patterns, etc.), and other identifiers. Active gaming chips can correspond to gaming chips indicated as currently in use in the data store 215 excluding gaming chips that are yet to be deployed, decommissioned, and/or damaged or broken. The gaming data 221 can store a history of sensor inputs received as well as any configuration, calibration, and control settings. The training data 224 can include data corresponding to transitions (e.g., transitions in color or patterns on a gaming chip) in various denominations of gaming chips, color tables for the various denominations of gaming chips, 2D views of gaming chips, and other training information. For example, the training data 224 can include data corresponding to a color table that relates chip colors, chip color patterns, and other color information (e.g., such as pixel values) to denominations (e.g., of currency, credits, prize values, etc.). As
another example, the training data 224 can include 2D views of gaming chips, such as the 2D view shown in FIG. 10.
[0055] The gaming table 206 is representative of a plurality of gaming tables that may be coupled to the network 212. The gaming table 206 can include, for example, one or more computing devices with a processor-based system such as a computer system. Such a computer system may be embodied in the form of an embedded computing device or other devices with like capability. The gaming table 206 can include one or more cameras 230, one or more sensors 233, a chip tray 236, one or more bet spots 239, a chip recycler 242, and a bill validator 245. The cameras 209 and 230 can be imaging devices 103 (FIG. 1).
[0056] Similar to cameras 209, the cameras 230 can capture images of a surface of the gaming table 206. The gaming table 206 and/or cameras 230 can send the images to the image service 227 via the network 212. The images can be sent to the image service 227 as a video stream of the surface of the gaming table 206. In some embodiments, the image can be sent based on differences from a previous frame in a video, such as based on a key frame. The image service 227 can receive images from various angles from cameras 209 and 230.
[0057] The sensors 233 can include RFID antennas, video barcode scanners, weigh scales, and other sensors. The sensors 233 can be used to identify gaming chips played on a gaming table. For example, an RFID reader can utilize an RFID antenna to read RFID information from RFID tags configured within the gaming chips. The RFID information from each RFID tag can include an identifier associated with the gaming chip. As another example, video barcode scanners can read barcode information, or other information, from barcodes or barcode tags located on the gaming chips. The barcode information for each gaming chip can include an identifier associated with the
gaming chip. In another example, weigh scales (e.g., located beneath bet spots 239, a chip tray 236, etc.) can read weights of one or more and/or each of the gaming chips. Detected weights of the gaming chips can be compared against known weights of a plurality of gaming chip denominations, and the gaming chips can be identified based on the comparisons.
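As a hedged illustration of the weight-based identification just described, the following sketch matches a measured chip weight to the closest known denomination weight; the weights and tolerance shown are hypothetical placeholders rather than values from the disclosure, and the actual values would come from stored chip data such as the currency data 218.

```python
# Hypothetical per-denomination chip weights in grams; actual values would come
# from the stored currency data 218 for the chip set in use.
KNOWN_WEIGHTS_G = {1: 8.0, 5: 8.5, 25: 9.0, 100: 10.0}

def identify_by_weight(measured_g, tolerance_g=0.2):
    """Return the denomination whose known weight is closest to the measured
    weight, or None when no known weight is within the tolerance."""
    closest = min(KNOWN_WEIGHTS_G, key=lambda d: abs(KNOWN_WEIGHTS_G[d] - measured_g))
    return closest if abs(KNOWN_WEIGHTS_G[closest] - measured_g) <= tolerance_g else None
```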
[0058] The RFID service 228 can validate RFID currency based on information read from RFID readers via the sensors 233 corresponding to an RFID antenna. An RFID antenna can be positioned at the chip tray 236, at each of the bet spots 239, at the chip recycler 242, and in other positions. The gaming table 206 can read RFID tags from RFID-enabled gaming chips using the RFID antennas. The information from the RFID tags can be stored along with data related to the RFID antenna used to read the RFID tag. For example, an identifier from one or more RFID-enabled gaming chips can be read by an RFID antenna at a particular bet spot 239. An identifier can include information that uniquely identifies each of the one or more RFID-enabled gaming chips. In one example, a gaming table 206 can read one or more RFID-enabled gaming chips via an RFID antenna. In the same example, the gaming table 206 can record and store one or more RFID identifiers that identify the one or more RFID-enabled gaming chips, and can also record and store information that identifies the RFID antenna or table area where the one or more RFID-enabled gaming chips were read. Information identifying the RFID antenna or table area can include, but is not limited to: 1) an identifier (e.g., such as a string of characters); 2) a table identifier that identifies a particular gaming table 206 into which the RFID antenna is installed; and 3) a bet spot identifier that identifies a particular bet spot 239 onto which the one or more RFID-enabled gaming chips were disposed.
[0059] The gaming table device 206 can determine that a patron placed a wager of the RFID-enabled gaming chips based on the RFID antenna corresponding to the particular bet spot 239 where the wager was placed. The gaming table device 206 can transmit a count of the gaming chips read at each of the RFID antennas. The count can include one or more identifiers from each gaming chip. In one embodiment, the gaming table 206 can perform a read of all RFID antennas at least once per game being played, and can transmit the at least one count to the RFID service 228 after each game. The gaming table 206 can perform reads of all RFID antennas several times per game. In some embodiments, the gaming table 206 automatically and repeatedly sends any detected table and/or gaming chip changes based on readings of RFID-enabled gaming chips that occur during the game. The system can verify or validate a count of gaming chips in a stack determined by the image service 227 by reading RFID tags in each of the gaming chips in the stack using an RFID reader including an RFID antenna.
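A minimal sketch, under the assumption that each RFID read yields one unique identifier per chip, of how the vision-derived count might be verified against the RFID reads for the same stack or bet spot; the function name and identifier format are illustrative.

```python
def verify_chip_count(vision_count, rfid_identifiers):
    """Return True when the vision-based count matches the number of unique
    RFID tag identifiers read for the same stack or bet spot."""
    return vision_count == len(set(rfid_identifiers))

# Example: a stack counted as 3 chips by the image service agrees with 3 unique tags.
assert verify_chip_count(3, ["tag-a", "tag-b", "tag-c"])
```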
[0060] The image service 227 can validate gaming chips using the cameras 209 and/or the cameras 230. One or more cameras 230 can be positioned on the table, and one or more cameras 209 can be positioned separate from the table 206. The cameras 230 can be positioned overhead or above the table 206. The cameras 209 can also be positioned on the table 206, such as, for example in a chip tray 236, chip recycler 242, on top of a bill validator 245, or at another position. The cameras 209 and 230 can record a video stream of various angles or segments of the table 206. For example, multiple cameras 230 can be positioned in the chip tray 236 pointing toward the bet spots 239. A camera 230 can be directed to a single bet spot 239 or a group of bet spots 239. In one embodiment, the gaming table 206 can join or stitch together video feeds from multiple cameras 230 to generate a video feed of an area, such as, for example a video feed for all
bet spots 239. Similarly, the image service 227 can join video together from cameras 209 and/or cameras 230.
[0061] The image service 227 can identify stacks of gaming chips, determine a count of the gaming chips in each stack, and determine a denomination for the gaming chips in each stack at a variety of positions on the gaming table 206. The gaming table 206 or the image service 227 can perform image recognition on frames of the video feeds to identify information for gaming chips on the gaming table 206. For example, a height of a stack of gaming chips can be determined, and the count of the chips can be calculated based on the height. In the same example, the count of the chips can be calculated based on computations relating the overall stack height to individual chip thickness. In an exemplary scenario, the image service 227 may receive an image, identify a gaming chip stack depicted in the image, and determine a stack height of 300 pixels. The image service 227 may retrieve a stored gaming chip thickness of 15 pixels. To compute a gaming chip count, the image service 227 can divide the 300 pixel stack height by the 15 pixel gaming chip thickness to obtain a gaming chip count of 20. The image service 227 can round the resulting gaming chip count when appropriate. As an example, if a stack height of 307 pixels had been determined, dividing by the 15 pixel gaming chip thickness yields 20.466, which rounds to a count of 20 gaming chips.
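A minimal sketch of the height-based count just described, using the example values from the preceding paragraph; the function name is illustrative.

```python
def count_chips_from_height(stack_height_px, chip_thickness_px):
    """Estimate the chip count by dividing stack height by per-chip thickness
    and rounding to the nearest whole chip."""
    return round(stack_height_px / chip_thickness_px)

# Values from the example above: 300 px / 15 px -> 20 chips; 307 px -> 20.466 -> 20 chips.
assert count_chips_from_height(300, 15) == 20
assert count_chips_from_height(307, 15) == 20
```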
[0062] The gaming chip thickness can be associated with a predefined capture angle at which the image was captured and also associated with a position of the gaming chip stack within the image. The gaming chip thickness can also vary based on where within the field of view for each camera 209/230 the gaming chip stacks are identified. As an example, the image service 227 can identify a thickness of 12 pixels in a first portion of a field of view for a camera 209/230 and a thickness of 15 pixels for a second portion of a field of view. In some embodiments, the image service 227 can calculate a distance and
angle to an identified gaming chip stack, and can calculate a thickness per gaming chip based on the distance and angle. In some embodiments, the image service 227 can determine a width of an identified gaming chip stack, and can calculate a thickness per gaming chip based on the width of the gaming chip stack.
[0063] The data store 215 can store a variety of gaming chip thicknesses associated with various capture angles and various gaming chip stack image positions. The data store 215 can store a variety of dimensional information for various gaming chips, such as a ratio of width to height, width to depth, and height to depth, including ratios at various camera angles and distances.
[0064] The denomination of each of the gaming chips in the stack can be determined based on various visual characteristics. In one example, a color pattern on the edge of the gaming chips can be used to determine the denomination. In another example, one or more visual security features can be used to determine the
denomination. For example, each denomination of currency can have a different holographic symbol or other visual security protection. The image service 227 or the gaming table device 206 can determine the denomination by identifying which visual security protection each gaming chip in a video feed contains.
[0065] The chip recycler 242 can operate in a similar fashion to a coin recycler. The chip recycler 242 can be used in addition to or in place of chip tray 236. At the end of the game or hand, if a dealer has collected gaming chips from players, the gaming chips can be placed into an input area, such as a funnel, hopper, or tube, and then validated (authenticated), counted, sorted, and stored by the chip recycler 242. If gaming chips are to be paid out to players, exchanged for cash, or exchanged for other gaming chips, then the gaming table 206 or a table management system or a control system executed in the computing environment 203 can instruct the chip recycler 242 how much in gaming
chips and which denominations to pay out. A chip recycler 242 within a cashier cage, a bank or vault, or kiosk (not shown) can operate in a similar fashion. A user places the gaming chips in the chip recycler 242, the chip recycler 242 processes the gaming chips, and the chip recycler 242 either automatically outputs gaming chips in other
denominations or outputs cash equal to the input value.
[0066] With reference to FIG. 3, shown is a process 300 in a flowchart according to various embodiments of the present disclosure. It is noted that embodiments described herein may be practiced using an alternative order of the steps illustrated in FIG. 3. That is, the process 300 illustrated in FIG. 3 is provided as an example only, and the embodiments may be practiced using process flows that differ from those illustrated. Additionally, it is noted that not all steps are required in every embodiment. In other words, one or more of the steps may be omitted or replaced, without departing from the spirit and scope of the embodiments. Further, steps may be performed in different orders, in parallel with one another, or omitted entirely, and/or certain additional steps may be performed without departing from the scope and spirit of the embodiments.
[0067] At box 303, the process 300 can include locating one or more stacks of gaming chips. Stacks can be composed of gaming chips with different values, colors, and other characteristics. The gaming chips may be located in a float tray, a splash tube, an RFID checkpoint device, a bet spot 239 (FIG. 2), or some other area. The splash tube can be an area to place recently played gaming chips. The RFID checkpoint device can be an RFID reader with an RFID antenna configured to read chips at a position in front of a dealer. In some embodiments, the stacks of gaming chips can be separated by a spacer or be placed in predefined areas, such as, for example, in a row of a chip tray 236. In the checkpoint, at the bet spots 239, or on a cage counter, the stacks of gaming chips can be placed in front of other stacks, therefore occluding chips partially or even
completely. By obtaining different angles from more than one camera 209 and 230, the occluded gaming chips can be seen.
[0068] At box 306, the process 300 can include counting gaming chips in each stack. The image service 227 can determine edges depicted in an image to identify a number of gaming chips in the stack. For example, a height of a stack of gaming chips depicted in an image can be determined. The stack of gaming chips can be isolated in the image, and edges between the gaming chips in the stack can be determined. A count of the chips can be calculated based on the edges.
[0069] At box 309, the process 300 can include evaluating denominations for each of the gaming chips identified. In some embodiments, the gaming chips may be sorted. Different denominations of gaming chips can have different diameters. The gaming chips may be sorted according to size, or otherwise sorted according to denomination. It can be assumed that gaming chips are sorted in specific areas, such as in a chip tray. In other areas, such as a bet spot or splash tube, the gaming chips cannot be assumed to be sorted.
[0070] With reference to FIG. 4, shown are chip transitions 400 generated using a spiking neural network (SNNT) model reconstruction process, according to various embodiments of the present disclosure. The chip transitions 400 can include one or more transitions 403, 406, and 409. The chip transitions 400 can be stored in training data 224 (FIG. 2) and can be generated during a training procedure. An image service 227 (FIG. 2) can retrieve and utilize the chip transitions 400 to perform chip identification processes described herein. In at least one embodiment, the chip transitions 400 can include mirrored versions of transitions. For example, the chip transitions 400 can include mirrored transitions 403’, 406’, and 409’ that are also generated using the spiking neural network (SNNT) model reconstruction process. Mirrored transitions can
be used to augment chip identification processes, for example, in instances where a portion of chips in a chip stack are oriented upside down relative to other chips in the stack. In some embodiments, for each stack of gaming chips, the image service 227 can alter the chip transitions 400 to be oriented to match an orientation of the stack of gaming chips.
[0071] With reference to FIG. 5, shown is an image 500 representing exemplary chip transition hits 503, 505, 507, 509, 511, 513, and 515 that may be detected by an image service 227. The image 500 is representative of a contrast enhanced grayscale image that can be captured and provided via one or more cameras 209 and/or 230, according to various embodiments of the present disclosure. In at least one embodiment, the chip transition hits 503, 505, 507, 509, 511, 513, and 515 represent areas of the image 500 where the image service 227 has identified a particular pattern, color, color pattern, edge, or other structure indicative that the area within the hit, or a portion thereof, is a portion of a chip. Each chip transition hit can be analyzed to determine if one or more gaming chips and/or one or more stacks of gaming chips are depicted within the image.
[0072] With reference to FIG. 6, shown is an image 600 representing merged hits for the contrast enhanced grayscale image 500 (FIG. 5) according to various
embodiments of the present disclosure. The image service 227 can perform one or more hit resolving processes to merge hits into a new merged hit that includes the areas of the old hits, but is also expanded to include nearby areas determined to include a chip or portion thereof. For example, the image service 227 can merge the transition hits 505 and 507 to generate a merged transition hit 603. As another example, the image service 227 can merge the transition hits 511 and 513 to generate a merged hit 605.
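One plausible way to implement the hit merging described above is to take the union of two hit rectangles when they overlap or lie within a small margin of one another; the rectangle format and the margin value are assumptions for illustration rather than the disclosed merging rule.

```python
def merge_hits(hit_a, hit_b, margin=5):
    """Merge two hit rectangles (x, y, w, h) into one bounding rectangle when
    they overlap or lie within `margin` pixels of each other; return None otherwise."""
    ax, ay, aw, ah = hit_a
    bx, by, bw, bh = hit_b
    overlap = (ax - margin <= bx + bw and bx - margin <= ax + aw and
               ay - margin <= by + bh and by - margin <= ay + ah)
    if not overlap:
        return None
    x, y = min(ax, bx), min(ay, by)
    x2, y2 = max(ax + aw, bx + bw), max(ay + ah, by + bh)
    return (x, y, x2 - x, y2 - y)
```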
[0073] In various embodiments, iterative merging of individual and merged hits into newly merged hits can augment the chip recognition processes by incrementally increasing an area of an image recognized as including a chip or portion thereof.
Increasing the recognized chip area can facilitate resolving patterns, edges, colors, and other indicia required to identify a chip and correlate the chip to a particular
denomination. For example, two hits may each include a portion of a chip-identifying pattern. An image service 227, upon analyzing the pattern portion of each hit, may fail to recognize the overall chip-identifying pattern. However, upon generating a merged hit, the merged hit may fully resolve the chip-identifying pattern such that it can be recognized by the image service 227.
[0074] With reference to FIG. 7, shown is an image 700 showing various candidate areas according to various embodiments of the present disclosure. The solid-lined candidate areas 703, 705, 711, and 713 can represent areas of the image 700 determined to possibly be background imagery. The candidate areas 703, 705, 711, and 713 can be generated by an image service 227 that analyzes patterns and other indicia in and around hit areas. The image service 227 can determine areas of the image 700 that are likely to include a chip and areas that are likely to include background imagery. The image service 227 can track the areas determined to likely include a chip or chip stack, which are indicated via generation of dash-lined candidate areas 707 and 709.
[0075] In at least one embodiment, suspected background candidate areas 703, 705, 711, and 713 can be filtered out using mirroring techniques, while the suspected chip candidate areas can be retained as suspected stacks and/or chips. For example, the image service 227 can apply a mirror model algorithm to the image 700 to assess if the candidate areas 703, 705, 711, and 713 constitute background imagery or depict an object (or portion thereof), such as a chip or chip stack. To execute the mirroring model
algorithm, the image service 227 can use the SNNT to learn an area of a background image during training. The background image can correspond to an image taken from the same or similar view as the image 700 while no chips or stacks of chips are present. The image service 227 can match the candidate areas 703, 705, 707, 709, 711, and 713 from the image 700 with the same areas from the background image. If a candidate area matches the corresponding area of the background image, the candidate area can be excluded from further chip recognition and analysis processes. For example, the image service 227 can determine that candidate areas 703, 705, 711, and 713 match corresponding areas of the background image, and, accordingly, the image service 227 can exclude the candidate areas from further analyses. In the same example, the image service 227 can determine that candidate areas 707 and 709 do not match corresponding areas of the background image, and the image service 227 can retain the candidate areas for further analysis. The image service 227 can also discard any hits in the excluded candidate areas.
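A hedged sketch of the candidate-area exclusion step: each candidate rectangle is compared against the same rectangle in a stored chip-free background image, and areas that differ by less than a threshold are excluded. The mean-absolute-difference metric and the threshold value are illustrative assumptions; the disclosure describes this comparison in terms of the trained mirror model rather than this specific metric.

```python
import cv2
import numpy as np

def matches_background(frame_bgr, background_bgr, rect, threshold=12.0):
    """Return True when the candidate area of `frame_bgr` closely matches the
    same area of the stored chip-free background image, so it can be excluded."""
    x, y, w, h = rect
    candidate = cv2.cvtColor(frame_bgr[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
    reference = cv2.cvtColor(background_bgr[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
    return float(np.mean(cv2.absdiff(candidate, reference))) < threshold
```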
[0076] With reference to FIG. 8, shown is an image 800 indicating an identified stack of gaming chips 803 according to various embodiments of the present disclosure. To identify the stack 803, the image service 227 can merge candidate areas that are not located in excluded areas, and can further process the merged candidate area (e.g., by applying mirror algorithms, etc.) to precisely identify areas of the image 800 that include the stack 803. For example, the image service 227 can merge candidate areas 707 and 709, and process the merged area to identify the stack 803. In at least one embodiment, the image service 227 can apply a size filter to hits and/or candidate areas so that the hits and/or candidate areas containing more than one stack are split into multiple hits and/or multiple candidate areas, each hit and/or candidate area encompassing one stack.
[0077] With reference to FIG. 9, shown are example images 901, 903, 905, and 907 from different stages in chip identification processes according to various embodiments of the present disclosure. In image 901, the image service 227 can estimate the top of a stack 904 of gaming chips by identifying the rim 909 between the upper face 912 of a highest gaming chip 902 and an edge 915 of the highest gaming chip 902. The image service 227 can detect transitions between the gaming chips. Once the transitions are detected, the image service 227 can determine a count of gaming chips in the stack 904. Further, the image service 227 can isolate each individual gaming chip for color analysis or other analysis.
[0078] The estimation of the top of the stack 904 relies on multiple concepts. First, the top of a stack 904 is at the bottom of an ellipse representing the upper face 912 of the highest chip 902. The region around the top of the stack transition is generally lighter above and darker below due to the lighting conditions. The upper face 912 just above the top of the stack transition is usually heterogeneous compared to the gaming chip edges. The image 901 can correspond to an identified area that includes a stack 904 of gaming chips. The image service 227 can perform horizontal edge detection on the image 901 to generate the image 903. As an example, the image service 227 can compute a horizontal edge detection using a Sobel filter or other filter to generate one or more horizontal lines emphasizing edges 906 between chips.
[0079] The image service 227 can reduce the image area of an edge 906 to one horizontal line by averaging multiple horizontal lines clustered near the same edge 906. For example, the image service 227 can perform a horizontal line and transition estimation on the image 903 and average the horizontal lines therein to generate the image 905. Within the image 905, the image service 227 can find peaks 908 in the averaged horizontal lines. The peaks 908 can represent the locations of transitions 910
between chips. In image 903, some transitions 910 may be missing due to a lack of contrast in the original image 901 that was captured by cameras 209 and/or 230 (FIG. 2). The image service 227 can fill the missing transitions 910 by estimating the chip thickness. For example, the image service 227 can identify a chip in one or more of the images 901, 903, 905, and 907, and can compute locations of chip transitions 910 based on calculations of the chip thickness (e.g., a transition 910 may occur once per chip thickness of distance along a stack 904). Further, as shown in image 907, the image service 227 can identify individual gaming chips in the original image 901 using the lines estimated in image 905. In at least one embodiment, the image service 227 can include separation lines 912 between each of the gaming chips in image 903 as shown in the image 907. The separation lines 912 can further facilitate resolving transitions 910 between the chips.
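The horizontal edge detection, row averaging, and peak finding described for images 903 and 905 might be sketched as follows; the Sobel kernel size, peak criterion, and minimum gap between peaks are illustrative assumptions rather than the claimed parameters.

```python
import cv2
import numpy as np

def estimate_chip_transitions(stack_gray, min_gap_px=5):
    """Emphasize horizontal edges with a Sobel filter, average each row into a
    single profile, and return row indices of profile peaks as candidate
    chip-to-chip transitions."""
    edges = np.abs(cv2.Sobel(stack_gray, cv2.CV_32F, 0, 1, ksize=3))
    profile = edges.mean(axis=1)
    threshold = profile.mean() + profile.std()
    peaks, last_peak = [], -min_gap_px
    for row in range(1, len(profile) - 1):
        if (profile[row] > threshold
                and profile[row] >= profile[row - 1]
                and profile[row] >= profile[row + 1]
                and row - last_peak >= min_gap_px):
            peaks.append(row)
            last_peak = row
    return peaks  # missing transitions can then be filled in from the chip thickness
```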
[0080] With reference to FIG. 10, shown are images depicting example gaming chips 1000 of various denominations according to various embodiments of the present disclosure. In some embodiments, an assumption can be made that each denomination of gaming chips 1000 corresponds to a dominant color. The image service 227 can identify a color with the highest number of pixels. The image service 227 can determine the identified color as the dominant color. In some embodiments, the image service 227 can modify an image prior to identifying the color of pixels. The image service 227 can perform a color correction and/or color enhancement algorithm based on the camera used to capture the image, lighting conditions associated with the area, or some other deficiency. As an example, the image service 227 may adjust a contrast, temperature, or other property of the image based on known sensor deficiencies in the camera. In some embodiments, the image service 227 can perform a training process involving analyzing
image captures from cameras 209/230 including items of known colors to identify necessary color correction and/or color enhancement.
[0081] The image service 227 can utilize the training data 224 to attribute a denomination to an imaged gaming chip 1001 based on the dominant color. As an example, an imaged gaming chip 1001 may have a denomination of $25 US Dollars when the dominant color is green, have a denomination of $1,000 US Dollars when the dominant color is a first shade of blue, and have a denomination of $5,000 US Dollars when the dominant color is a second shade of blue.
[0082] During a training process, a color table can be created with denominations and dominant colors. The color table can include hue, saturation, and value (HSV) values for each of the dominant colors. The color table can be stored in training data 224. The image service 227 can utilize a predefined color-value table. The predefined color-value table can be manually tuned or automatically trained for each casino environment including the specific set of gaming chips 1000 used in the casino, the lighting conditions, the ambient lighting, and other factors. The color table can include definitions of upper and lower HSV values for each color, which can be stored in the training data 224 as a color value (CV) table.
[0083] In one embodiment, the training process can include manually setting the minimum and maximum HSV values corresponding to the dominant color and storing the entered settings in training data 224. In other embodiments, the training process can include using machine learning to learn colors of bands patterned onto the chips 1000. The image service 227 can utilize a machine learning classifier to learn HSV values corresponding to the dominant colors, such as, for example, using histogram distance classification training. With machine learning, the image service 227
can analyze images of each denomination of gaming chips 1000 from various angles in a casino environment. The image service 227 can learn the colors of bands for each denomination and store the ranges of colors as a CV table. In some embodiments, machine learning can be used to generate the CV table from various images, while in others, machine learning can be used to tune and/or calibrate one or more stored standard CV tables based on processing various images depicting chips. Periodically, a CV table can be manually adjusted or automatically retrained. In at least one embodiment, a light grey zone can be used to compute a white balance in real time in order to reduce and/or remove undesired lighting effects caused by lights emitted by any signage or other light sources.
[0084] The gaming chips 1000 can have multiple colors, such as, for example, from chip inserts, from multiple injections during an injection mold process, or from different materials in a compression mold process. In some embodiments, the gaming chips 1000 can have up to three different colors. By determining a percentage of pixel counts for each of the colors on the gaming chip 1001 within a region of interest, the image service 227 can identify a denomination for the gaming chip 1001. The image service 227 can identify regions of interest, such as region 1021, to ignore the edges 1011 of the gaming chip 1001 to focus on the color of the inserts 1013, 1015, and 1017 located in the central area of the gaming chip 1001. The region of interest 1021 can be determined to correspond to a middle or central area of the gaming chip 1001. As an example, $25 denomination gaming chips can have a green pixel count between 30% and 60%, a blue pixel count between 0% and 30%, and a brown pixel count between 0% and 30%. In this example, if the pixel counts fall in those ranges for a gaming chip 1001, the image service 227 can identify the gaming chip 1001 as a $25 denomination.
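As a hedged illustration of the range check just described, the following sketch maps measured color percentages to a denomination; the $25 ranges mirror the example above, while the $100 entry and all names are hypothetical placeholders rather than stored CV table values.

```python
# Hypothetical CV-table ranges (fractions of pixels per color). The $25 entry
# follows the example ranges above; the $100 entry is a placeholder.
CV_RANGES = {
    25:  {"green": (0.30, 0.60), "blue": (0.00, 0.30), "brown": (0.00, 0.30)},
    100: {"black": (0.30, 0.60), "white": (0.00, 0.30), "gold": (0.00, 0.30)},
}

def denomination_from_percentages(percentages):
    """Return the denomination whose stored ranges contain every measured color
    percentage for the region of interest, or None when nothing matches."""
    for denomination, ranges in CV_RANGES.items():
        if all(low <= percentages.get(color, 0.0) <= high
               for color, (low, high) in ranges.items()):
            return denomination
    return None

# A region measured at 36% green, 12% blue, and 21% brown falls in the $25 ranges.
assert denomination_from_percentages({"green": 0.36, "blue": 0.12, "brown": 0.21}) == 25
```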
[0085] With reference to FIG. 11, shown is an example color histogram 1100 according to various embodiments of the present disclosure. For each located chip stack, the image service 227 can count the gaming chips in the stack. The image service 227 can determine a contour for each of the gaming chips in the stack. For each gaming chip in the stack, the image service 227 can calculate a color histogram 1100 using a CV table from training data 224. The color histogram 1100 can include a combined red, green, blue (RGB) histogram 1101, and can also include histograms for each color in an RGB color model. For example, the color histogram 1100 can include a green histogram 1103, a blue histogram 1105, and a red histogram 1107.
[0086] For each color in the CV table, the image service 227 can apply, to a chip image, a color mask, apply morphological transformations, such as erosion, closing, and other transformations, and count a number of pixels corresponding to the respective color. The image service 227 can select a dominant color based on the number of pixels. As an example, the image service 227 can select the dominant color as the color with the highest pixel count. The image service 227 can calculate a percentage of pixels for each color from a histogram 1100, 1101, 1103, and/or 1105. The calculated pixel percentages can represent color percentages that can be compared to values in a CV table in training data 224.
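A minimal sketch of the per-color masking, morphological cleanup, and pixel counting described above; the HSV bounds shown are illustrative assumptions standing in for a trained CV table.

```python
import cv2
import numpy as np

# Hypothetical HSV lower/upper bounds per CV-table color; real bounds would be
# loaded from the trained CV table in training data 224.
HSV_BOUNDS = {
    "green": ((40, 60, 60), (85, 255, 255)),
    "blue": ((95, 60, 60), (130, 255, 255)),
}

def color_pixel_counts(region_bgr):
    """Apply a mask per CV-table color, clean it with erosion and closing, and
    count the remaining pixels for each color."""
    hsv = cv2.cvtColor(region_bgr, cv2.COLOR_BGR2HSV)
    kernel = np.ones((3, 3), np.uint8)
    counts = {}
    for color, (lower, upper) in HSV_BOUNDS.items():
        mask = cv2.inRange(hsv, np.array(lower, np.uint8), np.array(upper, np.uint8))
        mask = cv2.erode(mask, kernel)
        mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
        counts[color] = int(cv2.countNonZero(mask))
    return counts  # the dominant color is max(counts, key=counts.get)
```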
[0087] The image service 227 can determine the chip value or denomination corresponding to the gaming chip with the determined dominant color. If the pixel percentage for each color falls within a range of one of the CV tables in training data 224, the image service 227 can identify the gaming chip as the corresponding denomination in training data 224. As an example, for a $25 gaming chip, the image service 227 can determine that a region has pixel percentages of 36% green, 12% blue, and 21% brown. The image service 227 can identify the gaming chip as a $25 denomination by determining that those values fall within the ranges of 30%-60% green, 0%-30% blue, and 0%-30% brown.
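A minimal sketch of the range-matching step, reusing the $25 example values from the preceding paragraphs; the table structure, names, and the idea of returning `None` on no match are illustrative assumptions rather than the stored CV tables themselves.

```python
# Illustrative only: match computed pixel percentages against per-denomination
# percentage ranges from a CV table. Values mirror the $25 example in the text.
CV_TABLE = {
    "$25": {"green": (30.0, 60.0), "blue": (0.0, 30.0), "brown": (0.0, 30.0)},
    # ... additional denominations would be learned during training
}

def identify_denomination(percentages):
    """percentages maps a color name to a percent of pixels, e.g. {'green': 36.0, ...}."""
    for denomination, ranges in CV_TABLE.items():
        if all(lo <= percentages.get(color, 0.0) <= hi for color, (lo, hi) in ranges.items()):
            return denomination
    return None

# Example from the text: 36% green, 12% blue, 21% brown -> "$25"
print(identify_denomination({"green": 36.0, "blue": 12.0, "brown": 21.0}))
```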
[0088] Turning to FIG. 12, shown is an image 1200 of an example gaming chip 1201 side by side with a 2D flat view of a side 1203 of the gaming chip 1201 according to various embodiments of the present disclosure. In at least one embodiment, the present system can compute and leverage chip dissimilarity values to support gaming chip recognition and identification. In one or more embodiments, a chip dissimilarity value can refer to a difference (e.g., a percent difference) between a control image and a test image of a gaming chip. In various embodiments, if a chip dissimilarity value is within a predefined threshold, the test image can be determined to be equivalent to the control image.
[0089] The image service 227 can calculate a chip dissimilarity for the gaming chip 1201. To calculate the chip dissimilarity, the image service 227 can compare each pixel value of the detected chip 1201 to flattened 2D views of the chip side 1203, such as a flattened 2D view 1205. During a training process, artwork for each gaming chip 1201 can be used to generate the 2D flat views 1205 of the sides 1203 of the gaming chips 1201. Images of the side 1203 of the gaming chip 1201 can also be used to generate the 2D flat views 1205. The 2D flat views 1205 can each be stored in training data 224 and associated with a denomination.
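A hedged sketch of one way a chip dissimilarity value could be computed as a percent pixel difference between a test image and a stored 2D flat (control) view; the resizing step, the 15% threshold, and the function names are assumptions, not the disclosed method.

```python
# Illustrative only: percent difference between a test chip image and a control
# 2D flat view, computed per pixel after resizing to a common shape.
import cv2
import numpy as np

def chip_dissimilarity(test_img, control_flat_view):
    control = cv2.resize(control_flat_view, (test_img.shape[1], test_img.shape[0]))
    diff = cv2.absdiff(test_img, control).astype(np.float32)
    return 100.0 * diff.mean() / 255.0        # 0% identical, 100% maximally different

def matches(test_img, control_flat_view, threshold_pct=15.0):
    """Treat the test image as equivalent to the control if dissimilarity is within the threshold."""
    return chip_dissimilarity(test_img, control_flat_view) <= threshold_pct
```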
[0090] With reference to FIG. 13, shown is an image 1300 illustrating the image service 227 (FIG. 2) performing a comparison of a detected gaming chip (not illustrated) to a 2D flat view 1303 according to various embodiments of the present disclosure. The image service 227 can identify a gaming chip from one or more images, such as, for example, those received from camera 209 or 230. The image service 227 can extract a region 1301 of the identified gaming chip corresponding to an edge thereof. The image service
227 can compare the region 1301 against regions of various 2D flat views 1303 of different denominations of gaming chips in training data 224. As an example, the region 1301 can be compared against region 1306 from training data 224.
[0091] The image service 227 can slide the region 1301 along the view 1303 to determine if a match exists between regions therein, such as, for example, the region 1306. For each position while sliding the region 1301, the image service 227 can calculate a cost function. The cost function can be a sum of absolute pixel differences between the region 1301 and the corresponding region of the view 1303. For example, the image service 227 can calculate a cost function between the region 1301 and the region 1306 by calculating and summing pixel differences between the regions. The image service 227 can store the lowest cost function calculated during sliding with respect to the views 1303 of each denomination. The image service 227 can compare the lowest cost function result for each
denomination to determine which denomination has the lowest cost function. The image service 227 can identify the detected gaming chip as having the denomination with the lowest cost function.
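A minimal sketch of the sliding comparison described above: the extracted edge region is slid along each stored 2D flat view, each offset is scored with a sum of absolute pixel differences, and the denomination with the lowest cost wins. The horizontal-only slide and function names are assumptions; a practical system might instead use OpenCV's matchTemplate, but the loop is spelled out here for clarity.

```python
# Illustrative sketch: sum-of-absolute-differences template matching of a chip
# edge region against 2D flat views of each denomination.
import numpy as np

def lowest_cost(region, flat_view):
    """Assumes the flat view is at least as tall as the region."""
    h, w = region.shape[:2]
    best = np.inf
    for x in range(flat_view.shape[1] - w + 1):            # slide horizontally along the flat view
        window = flat_view[:h, x:x + w]
        # Cast to int32 so uint8 subtraction cannot wrap around.
        cost = np.abs(region.astype(np.int32) - window.astype(np.int32)).sum()
        best = min(best, cost)
    return best

def identify_by_template(region, flat_views_by_denom):
    costs = {denom: lowest_cost(region, view) for denom, view in flat_views_by_denom.items()}
    return min(costs, key=costs.get)                        # denomination with the lowest cost
```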
[0092] In some embodiments, the image service 227 can identify a denomination of a gaming chip by comparing an image of a detected gaming chip with sub-parts of sample images in training data using deformation. In other embodiments, the image service 227 can determine denomination using one or more of a hidden Markov model (HMM), a dynamic time warping (DTW) algorithm, and other algorithms to encode variations in sizes, patterns, colorings, and other features for the gaming chips.
[0093] With reference to FIG. 14, shown is an example of a bimodal stack histogram 1400 with a first peak 1401 and a second peak 1403 according to various embodiments of the present disclosure. The image service 227 can detect chip stacks as discussed herein and identify pixel values to generate the histogram 1400. The image
service 227 can recognize gaming chip denominations by computing dominant color percentages in gaming chip images and comparing the computed percentages to stored percentages associated with specific denominations. Lighting and other factors can obfuscate true pixel values and result in a histogram 1400 that is appropriately bimodal (e.g., indicating that a gaming chip stack includes two dominant colors) but includes pixel noise that undesirably weights the computed color percentages. Because lighting and other factors can obfuscate true pixel values, the image service 227 can resolve the true pixel values by reducing the diversity of pixel values in an image (e.g., thereby decreasing the number of colors therein). The image service 227 can perform clustering techniques on an image in a manner such that only dominant colors are retained. In an image of gaming chips, the image service 227 can order the different colors therein by relative strength based on a computed dominant color histogram 1400 or a dominant color spectrum. The image service 227 can also tag each color in the histogram 1400 or spectrum with a label. The image service 227 can then determine equivalency of two or more colors in the image. Two colors can be determined not to be equivalent if the distance between their color components exceeds a threshold. The image service 227 can merge colors determined to be equivalent into a standard dominant color. The dominance of one or more colors in an image depicting gaming chips can be utilized to identify a
denomination of the gaming chips therein. For example, a light shade of blue of a gaming chip may be determined as the dominant color. The image service 227 can identify the denomination of the gaming chip by matching the dominant color to a dominant color of a particular denomination.
[0094] To reduce the range of pixel values and improve dominant color and denomination recognition processes, the image service 227 can perform methods to evaluate and combine image areas of similar color. In at least one embodiment, the
image service 227 can classify the denomination of gaming chips by performing one or more K-means clustering techniques. For example, a gaming chip stack image can be partitioned into a number of clusters 1404 to create a first clustered image 1405 based on pixel color and detected edges. In particular embodiments, the K-means process may provide optimal results if the number of clusters 1404 present is the same as a selected K value. For illustrative and descriptive purposes, a number of clusters 1404 are indicated in FIG. 14. For each cluster 1404, the image service 227 can determine the colors in the cluster 1404 that are as close as possible to each other (e.g., based on pixel value) while being as far as possible from the colors in other clusters 1404.
[0095] Each cluster 1404 can be defined based on pixels, pixel values, and a cluster centroid assigned by the image service 227. In the cluster 1404, the cluster centroid can be positioned therein to minimize the sum of the distances between the pixels of the cluster 1404 and the cluster centroid. The image service 227 can modify all of the pixels in the cluster 1404 to take the color of the cluster centroid, thereby standardizing the pixels to a single color. The image service 227 can reduce the number of distinct colors after performing cluster merging via K-means techniques. For example, the image service 227 can identify clusters 1404 that are in proximity and share substantially similar or identical color, and can merge the clusters 1404 into a merged cluster 1406.
By evaluating and merging various clusters 1404, the image service 227 can generate a second clustered image 1407. In at least one exemplary embodiment, K-means clustering and merging may be performed by generating 10 initial clusters 1404 in a CIELAB Euclidean space, identifying clusters 1404 of similar colors, and merging clusters 1404 of similar colors into 2 merged clusters 1406.
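For illustration, the following sketch performs K-means clustering in a CIELAB space with 10 initial clusters and then merges clusters whose centroids are close, as described above. It assumes OpenCV's kmeans implementation; the merge distance threshold, iteration counts, and function names are assumptions rather than part of the disclosure.

```python
# Illustrative sketch (not the patented pipeline): K-means over CIELAB pixel
# values, followed by merging clusters whose centroids are close, and recoloring
# every pixel to its merged-cluster mean color.
import cv2
import numpy as np

def cluster_and_merge(bgr_img, k=10, merge_dist=20.0):
    lab = cv2.cvtColor(bgr_img, cv2.COLOR_BGR2LAB).reshape(-1, 3).astype(np.float32)
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
    _, labels, centers = cv2.kmeans(lab, k, None, criteria, 5, cv2.KMEANS_PP_CENTERS)
    labels = labels.ravel()

    # Merge clusters whose centroids are close in CIELAB Euclidean distance (union-find).
    parent = list(range(k))
    def find(i):
        while parent[i] != i:
            i = parent[i]
        return i
    for i in range(k):
        for j in range(i + 1, k):
            if np.linalg.norm(centers[i] - centers[j]) < merge_dist:
                parent[find(j)] = find(i)
    merged = np.array([find(i) for i in range(k)])
    labels = merged[labels]

    # Every pixel takes the mean color of its merged cluster, then percentages are computed.
    for m in np.unique(labels):
        lab[labels == m] = lab[labels == m].mean(axis=0)
    out = cv2.cvtColor(lab.reshape(bgr_img.shape).astype(np.uint8), cv2.COLOR_LAB2BGR)
    percents = {int(m): 100.0 * np.mean(labels == m) for m in np.unique(labels)}
    return out, percents
```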
[0096] From a second clustered image 1407, the image service 227 can compute dominant color percentages that may be used to identify a denomination of gaming chips
included therein by comparing the computed dominant color percentages to stored color percentages associated with specific denominations. For example, from a second clustered image 1407, the image service 227 can compute dominant color percentages 1409 and 1411 to be 51.2% for a first dominant color and 48.8% for a second dominant color. The image service 227 can compare the dominant color percentages 1409 and 1411 to stored color percentages, and identify a denomination associated with dominant color percentages of about 51.2% of the first dominant color and about 48.8% of the second dominant color. By correlating the percentages to a denomination, the image service 227 can identify a denomination of the gaming chips included in the images 1405 and 1407. It will be understood by one of ordinary skill in the art that the above techniques can be repeated for individual image areas, for example, to identify multiple gaming chip denominations included in a single image.
[0097] Turning to FIG. 15, an example hardware diagram of a computing device 1500 is illustrated. Any of the image service 227, RFID service 228, cameras 209, or functionality described with respect to the gaming table 206 may be implemented, in part, using one or more elements of the computing device 1500. The computing device 1500 can include one or more of a processor 1510, a Random Access Memory (“RAM”) 1520, a Read Only Memory (“ROM”) 1530, a memory device 1540, a network interface 1550, and an Input/Output (“I/O”) interface 1560. The elements of the computing device 1500 are communicatively coupled via a bus 1502.
[0098] The processor 1510 can include an arithmetic processor, Application Specific Integrated Circuit (“ASIC”), or other types of hardware or software processors. The RAM and ROM 1520 and 1530 can include a memory that stores computer-readable instructions to be executed by the processor 1510. The memory device 1540 stores computer-readable instructions thereon that, when executed by the processor 1510, direct
the processor 1510 to execute various aspects of the present disclosure described herein. When the processor 1510 includes an ASIC, the processes described herein may be executed by the ASIC according to an embedded circuitry design of the ASIC, by firmware of the ASIC, or both an embedded circuitry design and firmware of the ASIC. As non-limiting examples, the memory device 1540 comprises one or more of an optical disc, a magnetic disc, a semiconductor memory (i.e., a semiconductor, floating-gate, or similar flash-based memory), a magnetic tape memory, a removable memory, combinations thereof, or any other known memory means for storing computer-readable instructions. The network interface 1550 can include hardware interfaces to
communicate over data networks. The I/O interface 1560 can include device input and output interfaces such as keyboard, pointing device, display, communication, and other interfaces. The bus 1502 can electrically and communicatively couple the processor 1510, the RAM 1520, the ROM 1530, the memory device 1540, the network interface 1550, and the I/O interface 1560, so that data and instructions may be communicated among them.
[0099] In operation, the processor 1510 is configured to retrieve computer-readable instructions stored on the memory device 1540, the RAM 1520, the ROM 1530, or another storage means, and copy the computer-readable instructions to the RAM 1520 or the ROM 1530 for execution, for example. The processor 1510 is further configured to execute the computer-readable instructions to implement various aspects and features of the present disclosure. For example, the processor 1510 may be adapted and configured to execute the processes described above with reference to FIG. 3, including the processes described as being performed by the image service 227 or gaming table 206. Also, the memory device 1540 may store the data stored in the database 215.
CONCLUSION
[0100] From the foregoing, it will be understood that various aspects of the processes described herein are software processes that execute on computer systems that form parts of the system. Accordingly, it will be understood that various embodiments of the system described herein are generally implemented as specially-configured computers including various computer hardware components and, in many cases, significant additional features as compared to conventional or known computers, processes, or the like, as discussed in greater detail herein. Embodiments within the scope of the present disclosure also include computer-readable media for carrying or having computer-executable instructions or data structures stored thereon. Such computer-readable media can be any available media which can be accessed by a computer, or downloadable through communication networks. By way of example, and not limitation, such computer-readable media can comprise various forms of data storage devices or media such as RAM, ROM, flash memory, EEPROM, CD-ROM, DVD, or other optical disk storage, magnetic disk storage, solid state drives (SSDs) or other data storage devices, any type of removable non-volatile memories such as secure digital (SD), flash memory, memory stick, etc., or any other medium which can be used to carry or store computer program code in the form of computer-executable instructions or data structures and which can be accessed by a computer.
[0101] When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired or wireless) to a computer, the computer properly views the connection as a computer-readable medium. Thus, any such connection is properly termed and considered a computer-readable medium. Combinations of the above should also be included within the scope of computer-readable media. Computer-executable instructions comprise, for
example, instructions and data which cause a computer to perform one specific function or a group of functions.
[0102] Those skilled in the art will understand the features and aspects of a suitable computing environment in which aspects of the disclosure may be implemented.
Although not required, some of the embodiments of the claimed inventions may be described in the context of computer-executable instructions, such as program modules or engines, as described earlier, being executed by computers in networked
environments. Such program modules are often reflected and illustrated by flow charts, sequence diagrams, exemplary screen displays, and other techniques used by those skilled in the art to communicate how to make and use such computer program modules. Generally, program modules include routines, programs, functions, objects, components, data structures, application programming interface (API) calls to other computers whether local or remote, etc. that perform particular tasks or implement particular defined data types, within the computer. Computer-executable instructions, associated data structures and/or schemas, and program modules represent examples of the program code for executing steps of the methods disclosed herein. The particular sequence of such executable instructions or associated data structures represent examples of corresponding acts for implementing the functions described in such steps.
[0103] Those skilled in the art will also appreciate that the claimed and/or described systems and methods may be practiced in network computing environments with many types of computer system configurations, including personal computers, smartphones, tablets, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, networked PCs, minicomputers, mainframe computers, and the like. Embodiments of the claimed invention are practiced in distributed computing environments where tasks are performed by local and remote
processing devices that are linked (either by hardwired links, wireless links, or by a combination of hardwired or wireless links) through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
[0104] An exemplary system for implementing various aspects of the described operations, which is not illustrated, includes a computing device including a processing unit, a system memory, and a system bus that couples various system components including the system memory to the processing unit. The computer will typically include one or more data storage devices for reading data from and writing data to. The data storage devices provide nonvolatile storage of computer-executable instructions, data structures, program modules, and other data for the computer.
[0105] Computer program code that implements the functionality described herein typically comprises one or more program modules that may be stored on a data storage device. This program code, as is known to those skilled in the art, usually includes an operating system, one or more application programs, other program modules, and program data. A user may enter commands and information into the computer through a keyboard, touch screen, pointing device, a script containing computer program code written in a scripting language, or other input devices (not shown), such as a microphone. These and other input devices are often connected to the processing unit through known electrical, optical, or wireless connections.
[0106] The computer that effects many aspects of the described processes will typically operate in a networked environment using logical connections to one or more remote computers or data sources, which are described further below. Remote computers may be another personal computer, a server, a router, a network PC, a peer device or other common network node, and typically include many or all of the elements
described above relative to the main computer system in which the inventions are embodied. The logical connections between computers include a local area network (LAN), a wide area network (WAN), virtual networks (WAN or LAN), and wireless LANs (WLAN) that are presented here by way of example and not limitation. Such networking environments are commonplace in office-wide or enterprise-wide computer networks, intranets, and the Internet.
[0107] When used in a LAN or WLAN networking environment, a computer system implementing aspects of the invention is connected to the local network through a network interface or adapter. When used in a WAN or WLAN networking environment, the computer may include a modem, a wireless link, or other mechanisms for establishing communications over the wide area network, such as the Internet. In a networked environment, program modules depicted relative to the computer, or portions thereof, may be stored in a remote data storage device. It will be appreciated that the network connections described or shown are exemplary and other mechanisms of establishing communications over wide area networks or the Internet may be used.
[0108] While various aspects have been described in the context of a preferred embodiment, additional aspects, features, and methodologies of the claimed inventions will be readily discernible from the description herein, by those of ordinary skill in the art. Many embodiments and adaptations of the disclosure and claimed inventions other than those herein described, as well as many variations, modifications, and equivalent arrangements and methodologies, will be apparent from or reasonably suggested by the disclosure and the foregoing description thereof, without departing from the substance or scope of the claims. Furthermore, any sequence(s) and/or temporal order of steps of various processes described and claimed herein are those considered to be the best mode contemplated for carrying out the claimed inventions. It should also be understood that,
although steps of various processes may be shown and described as being in a preferred sequence or temporal order, the steps of any such processes are not limited to being carried out in any particular sequence or order, absent a specific indication of such to achieve a particular intended result. In most cases, the steps of such processes may be carried out in a variety of different sequences and orders, while still falling within the scope of the claimed inventions. In addition, some steps may be carried out
simultaneously, contemporaneously, or in synchronization with other steps.
[0109] A phrase, such as “at least one of X, Y, or Z,” unless specifically stated otherwise, is to be understood with the context as used in general to present that an item, term, etc., can be either X, Y, or Z, or any combination thereof (e.g., X, Y, and/or Z). Similarly, “at least one of X, Y, and Z,” unless specifically stated otherwise, is to be understood to present that an item, term, etc., can be either X, Y, and Z, or any combination thereof (e.g., X, Y, and/or Z). Thus, as used herein, such phrases are not generally intended to, and should not, imply that certain embodiments require at least one of either X, Y, or Z to be present, but not, for example, one X and one Y. Further, such phrases should not imply that certain embodiments require each of at least one of X, at least one of Y, and at least one of Z to be present.
[0110] The embodiments were chosen and described in order to explain the principles of the claimed inventions and their practical application so as to enable others skilled in the art to utilize the inventions and various embodiments and with various modifications as are suited to the particular use contemplated. Alternative embodiments will become apparent to those skilled in the art to which the claimed inventions pertain without departing from their spirit and scope. Accordingly, the scope of the claimed inventions is defined by the appended claims rather than the foregoing description and the exemplary embodiments described therein.
Claims
1. A system comprising:
a gaming table;
at least one imaging device; and
at least one computing device in communication with the at least one imaging device, the at least one computing device being configured to at least:
receive an image from the at least one imaging device;
locate at least one stack of gaming chips in the image;
determine a count of a plurality of gaming chips in the at least one stack of gaming chips; and
determine a denomination of each of the plurality of gaming chips in the at least one stack of gaming chips.
2. The system of claim 1, wherein the at least one computing device is further configured to:
perform a horizontal edge detection on the image;
perform a horizontal line and transition estimation based at least in part on the horizontal edge detection; and
determine the count based at least in part on the horizontal line and transition estimation.
3. The system of claim 1, wherein the at least one computing device is further configured to:
determine a height of the at least one stack of gaming chips; and
determine the count of the plurality of gaming chips by dividing the height by a relative gaming chip thickness.
4. The system of claim 3, wherein the relative gaming chip thickness is associated with a capture angle of the image.
5. The system of claim 1, wherein the at least one computing device is further configured to locate the at least one stack by excluding background of the image from analysis.
6. The system of claim 5, wherein the background is excluded at least in part by applying a mirror algorithm.
7. The system of claim 1, wherein the at least one computing device is further configured to verify the count of the plurality of gaming chips in the at least one stack of gaming chips based on reading an RFID tag in each of the plurality of gaming chips in the at least one stack of gaming chips.
8. The system of claim 1, wherein the at least one computing device is further configured to:
determine a dominant color associated with at least one of the plurality of gaming chips; and
determine the denomination of the at least one of the plurality of gaming chips based at least in part on the dominant color.
9. The system of claim 1, wherein the at least one computing device is further configured to locate the at least one stack of gaming chips in the image based at least in part on a spiking neural network (SNN) model.
10. The system of claim 1, wherein the at least one computing device is further configured to generate a histogram corresponding to a region of a gaming chip of the plurality of gaming chips.
11. The system of claim 1, wherein the at least one computing device is further configured to:
determine a plurality of color percentages corresponding to counts of pixel color; and
determine that each of the plurality of color percentages falls within predefined ranges associated with one of a plurality of denominations of gaming currency.
12. The system of claim 1, wherein the at least one computing device is further configured to partition colors of pixels into clusters.
13. The system of claim 1, wherein the at least one computing device is further configured to partition colors of pixels into clusters based at least in part on K-means clustering.
14. A method comprising:
processing, via at least one computing device, at least one image to locate a stack of gaming chips;
determining, via the at least one computing device, a count of a plurality of gaming chips in the stack of gaming chips; and
determining, via the at least one computing device, a denomination of each of the plurality of gaming chips in the at least one stack of gaming chips.
15. The method of claim 14, further comprising:
performing, via the at least one computing device, a horizontal edge detection on the image;
performing, via the at least one computing device, a horizontal line and transition estimation based at least in part on the horizontal edge detection; and
determining, via the at least one computing device, the count based at least in part on the horizontal line and transition estimation.
16. The method of claim 14, further comprising:
determining, via the at least one computing device, a dominant color associated with at least one of the plurality of gaming chips; and
determining, via the at least one computing device, the denomination of the at least one of the plurality of gaming chips based at least in part on the dominant color.
17. The method of claim 14, further comprising generating, via the at least one computing device, a histogram corresponding to a region of a gaming chip of the plurality of gaming chips.
18. The method of claim 14, further comprising:
determining, via the at least one computing device, a plurality of color percentages corresponding to counts of pixel color; and
determining, via the at least one computing device, that each of the plurality of color percentages falls within predefined ranges associated with one of a plurality of denominations of gaming currency.
19. The method of claim 14, further comprising partitioning, via the at least one computing device, colors of pixels into clusters.
20. The method of claim 19, wherein the partitioning is performed based at least in part on K-means clustering.
Applications Claiming Priority (2)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201862739918P | 2018-10-02 | 2018-10-02 | |
| US62/739,918 | 2018-10-02 | | |
Publications (1)

| Publication Number | Publication Date |
|---|---|
| WO2020072664A1 (en) | 2020-04-09 |
Family ID: 70055855

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2019/054320 (WO2020072664A1) | Vision based recognition of gaming chips | 2018-10-02 | 2019-10-02 |

Country Status (1)

| Country | Link |
|---|---|
| WO (1) | WO2020072664A1 (en) |
Cited By (1)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN113546398A | 2021-07-30 | 2021-10-26 | 重庆五诶科技有限公司 | Chess and card game method and system based on artificial intelligence algorithm |
Citations (11)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5742656A | 1996-03-21 | 1998-04-21 | The Casino Software Corporation Of America | Gaming token tray employing ultrasonic token counting |
| US6567159B1 | 1999-10-13 | 2003-05-20 | Gaming Analysis, Inc. | System for recognizing a gaming chip and method of use |
| US20030220136A1 | 2002-02-05 | 2003-11-27 | Mindplay Llc | Determining gaming information |
| US20050111730A1 | 2003-11-20 | 2005-05-26 | Bin Zhang | Method and system of image segmentation using regression clustering |
| US20050164781A1 | 1995-10-05 | 2005-07-28 | Thomas Lindquist | Gambling chip recognition system |
| US20070077987A1 | 2005-05-03 | 2007-04-05 | Tangam Gaming Technology Inc. | Gaming object recognition |
| US20110052049A1 | 2009-08-26 | 2011-03-03 | Bally Gaming, Inc. | Apparatus, method and article for evaluating a stack of objects in an image |
| US20110227703A1 | 2010-03-22 | 2011-09-22 | Kotab Dominic M | Systems and methods of reading gaming chips and other stacked items |
| WO2016053521A1 | 2014-09-29 | 2016-04-07 | Bally Gaming, Inc. | Bet sensing apparatuses and methods |
| US20180075698A1 | 2016-09-12 | 2018-03-15 | Angel Playing Cards Co., Ltd. | Chip measurement system |
| US20180247134A1 | 2015-05-29 | 2018-08-30 | Arb Labs Inc. | Systems, methods and devices for monitoring betting activities |
Legal Events

| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19869445; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 19869445; Country of ref document: EP; Kind code of ref document: A1 |