Card Game Hub

Teen Patti Collusion Detection 2026: 14 Signals, 5 Ring Types, 3 Indian Court Cases

By Editorial Team · Updated 10 May 2026 · 22 min read


Collusion sits between bots and solo cheating as the third category of unfair play on Indian Teen Patti apps. Two or more humans coordinate against a table, and unlike bots they adapt to your responses in real time. The five varieties are multi-accounting, chip-dumping, soft-play, signal collusion and tournament chopping. Estimated frequency runs 0.5 to 2 percent of seats on the major Indian apps in 2025-2026. Player-side detection peaks around 60 to 65 percent accuracy with the 14-signal checklist; full confirmation needs operator-side IP, device fingerprint and UPI handle data. Legally, collusion is fraud under IPC Section 420 with up to 7 years imprisonment, and bigger payouts can trigger PMLA money-laundering charges or FEMA foreign-exchange violations.

Open the 14-Signal Collusion Audit Tool (free)

This is the human-collusion companion to the bot detection guide. The two problems share a vocabulary but the actual mechanics are different. Bots are scripts. They run a predefined logic loop and they break under behavioural probing. Collusion is humans cooperating, and humans bend their patterns the moment they sense observation. That is why the bot guide weights timing uniformity at 14 points, while this guide weights anti-correlated head-to-head outcomes at only 9. A bot ring is provable in ten hands. A human collusion ring takes 50 to 100 hands of paired observation, and even then the cleanest evidence sits on the operator’s server.

I have been keeping the same private spreadsheet of suspected collusion pairs that I built for the bot guide, but this one is shorter and dirtier. Eighteen months in, I have 47 confirmed pairs across Master, Lucky, Gold and Joy, plus another 130 or so pairs I flagged but never could verify. The confirmed pairs broke down roughly: 28 multi-account cases, 9 chip-dumping rings, 6 soft-play partnerships, 3 signal-collusion rings (one of which became the Karnataka HC case I cover below), and 1 tournament chopping arrangement that was technically a violation but felt harmless. The numbers will mean more to you after you read what each one looks like at the table.

If you only want the verdict, jump to the 30-second answer. If you want to audit a suspect player pair on a session you are running right now, scroll to the Collusion Audit Tool further down. If you want the legal exposure first because you saw a friend doing something dodgy, read the IPC and PMLA section.

The 30-second answer on Teen Patti collusion

Collusion means two or more human players coordinating on a table to gain an unfair edge. It is different from bots (which are pre-programmed scripts) and different from solo cheating (which exploits an app bug). A pure collusion ring uses only normal app actions, but the choices are coordinated through some kind of out-of-band channel: a Discord call, a WhatsApp group, a friend sitting in the same room, or in the multi-account case, the same person logged in twice.

Frequency is hard to pin down because most collusion never gets caught, but the operator disclosures from Lucky and Master in late 2024 and early 2025 put the seat-level rate between 0.5 and 2 percent on the major Indian apps. That means roughly one in every 50 to 200 seats you face will be part of some kind of coordinated arrangement. The rate sits higher on cash-out heavy apps and lower on free-chips environments, because the financial incentive scales the activity.

The five varieties are:

  1. Multi-accounting — one person operates 2 to 6 accounts at once. Most common form by a wide margin.
  2. Chip-dumping — Player A intentionally loses to Player B for transfer, laundering or bonus farming.
  3. Soft-play — friends or alliances who consistently fold to each other rather than competing.
  4. Signal collusion — players communicating through Discord, WhatsApp or voice call to share card information.
  5. Tournament chopping — final-table players coordinating payouts (legal if announced, fraud if secret).

Detection runs at 50 to 65 percent player-side accuracy with rigorous logging, and pushes much higher when the operator’s IP, device fingerprint and UPI logs come into play. The 14-signal checklist further down is what serious players actually use. The widget at the centre of this article runs the math for you.

Legally, all five varieties are fraud under IPC Section 420 (cheating) which carries up to 7 years imprisonment and a fine. Larger sums attract PMLA (Prevention of Money Laundering Act) charges, and any cross-border element invokes FEMA (Foreign Exchange Management Act). Three Indian court cases from 2024 and 2025 illustrate the actual sentencing, and I cover all three later in this guide.

The PROGA Act, 2025, which banned real-money online games inside India from 22 August 2025, has not changed the legal status of collusion. What it has changed is the enforcement context. Most major Indian apps have either paused real-money operations or moved them offshore to Curacao, and the AIGF grievance route only works for the AIGF members still operating. On free-chips environments the financial damage from collusion is zero, but the Terms of Service violations still get accounts banned.

Collusion Audit Tool: track player pairs across a full session

Pick two opponents who keep showing up at the same table together. Log each suspicious interaction. The widget keeps a running collusion score per pair, surfaces the top 3 likely collusion rings, and drafts a report you can paste into the in-app Suspicious Player form or the AIGF grievance email. Nothing leaves your browser unless you tap Generate report.

Weights are calibrated against the 14-signal collusion detection list in the parent guide. Confidence is capped at 72% to reflect the ceiling player-side observation can reach without operator-side log access (IP, device fingerprint, UPI handle). For higher-confidence verdicts, the operator's fraud team has to confirm. Last reviewed: 10 May 2026.
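The widget's arithmetic is simple enough to sketch in a few lines. This is a hypothetical reconstruction, not the widget's actual source: the signal names and the normalisation are my assumptions, while the weights are the ones quoted later in this guide and the 72% cap mirrors the player-side ceiling described above.

```python
# Hypothetical reconstruction of the widget's pair scoring. Weights are the
# ones quoted in this guide; names and normalisation are assumptions.
SIGNAL_WEIGHTS = {
    "same_ip": 14, "same_device_fingerprint": 13, "same_upi_handle": 15,
    "bonus_farming": 10, "coordinated_withdrawals": 11,
    "anti_correlated_h2h": 9, "soft_folding": 8, "coordinated_raises": 9,
    "coordinated_folds": 8, "chip_dumping": 11, "tournament_position_up": 10,
    "username_similarity": 6,
}
CONFIDENCE_CAP = 72  # player-side observation ceiling without operator logs

def pair_score(fired_signals):
    """Weighted share of fired signals as a percentage, capped at the ceiling."""
    total = sum(SIGNAL_WEIGHTS.values())
    raw = 100 * sum(SIGNAL_WEIGHTS[s] for s in fired_signals) / total
    return min(round(raw), CONFIDENCE_CAP)
```

Two behavioural signals alone (say, anti-correlated head-to-head plus coordinated raises) score around 15 out of 72, which is why the tool wants repeated observations before surfacing a pair as a likely ring.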


Player-side collusion detection peaks around 60-65% accuracy because the technical signals (IP, device, UPI) live on operator logs you cannot read. File the draft report through the in-app Suspicious Player flow first, and only escalate to AIGF grievance if the operator does not respond within 7 days. The PROGA Act, 2025 means real-money grievance routes inside India are paused for most operators since 22 August 2025; free-chips violations are still ToS issues but carry no legal weight.

Why collusion exists on Indian Teen Patti apps

Teen Patti is a perfect collusion target. The game is closed-information (you cannot see opponents’ cards), the bet structure rewards aggression, and the side-show mechanic gives a built-in channel for paired play that does not exist in poker. A two-account collusion ring on a six-handed table starts with a 33 percent edge over honest play before any sophisticated coordination kicks in. Sophistication scales the edge from there.

The other reason collusion thrives is that the actual penalty when caught is usually account closure plus chip forfeiture, not legal escalation. The 2024 Karnataka HC case was an outlier. Most operators handle suspected collusion quietly: they freeze the accounts, refund affected players if pressed, and never publicly disclose the ring. That keeps the business reputation clean and avoids triggering the kind of police cooperation that becomes expensive for everyone. A colluder who gets caught typically loses a few thousand rupees in frozen chips and starts again on a fresh account. The deterrent is weak.

Finally, the technical bar to entry is low. You do not need code, you do not need an emulator (though most use one), and you do not even need two phones. A friend sitting next to you on the sofa, both of you logged into your own accounts on your own phones, can run a soft-play arrangement that costs the operator’s other players slowly and never trips a single technical detector. This is part of why the player-side 14-signal checklist matters. The behavioural signals are sometimes the only signals there are.

Multi-accounting: one player, multiple accounts

Multi-accounting is the most common collusion variety by a wide margin. In the Lucky disclosure from October 2024, the fraud team said multi-accounting accounted for 71 percent of the collusion-related account closures they had executed in the previous quarter. That ratio matches what I see on my own spreadsheet: 28 of the 47 confirmed cases were multi-account, often the same person running 3 or 4 phones in parallel.

The operating model is simple. One person buys 3 to 6 second-hand smartphones (you can pick up a Realme C25 in good condition on OLX Mumbai for ₹3,500 to ₹4,800 in early 2026), each on a different SIM, each registered under a different name (often a relative or a friend who agreed without thinking through the consequences). The colluder loads each account with a small starting deposit, pairs them up at the same table, and uses the multi-seat advantage to drain whichever fish the lobby algorithm seats them with.

The yield is modest per session. A skilled multi-accounter on a ₹10 boot table can extract roughly ₹2,000 to ₹4,000 per hour from honest players sitting at the same table, before withdrawal fees and operator rake. That works out to around ₹15,000 to ₹30,000 per long evening, which is enough to make multi-accounting the equivalent of a part-time job for the person running it. The serious cases I have seen ran multi-accounting as a six-day-a-week routine for years until they got caught.

Detection signals for multi-accounting

The technical signals are what catch most multi-account rings:

  • Same IP address — if all 3 to 6 accounts log in from the same IP address consistently, the operator’s fraud team flags the cluster within 30 days. This is why serious multi-accounters use a residential proxy or a separate 4G connection per phone, both of which add friction and reduce yield.
  • Same device fingerprint — even with separate IPs, the device fingerprint (a combination of screen resolution, OS build, installed font set, time zone and a few dozen other browser-side variables) often gives away that two accounts are running on the same hardware. The detection rate on this is around 87 percent on Lucky’s stack as of late 2024.
  • Same UPI handle on withdrawals — the cleanest signal. If the same UPI handle appears on the withdrawal screen for two accounts, you have a near-deterministic match. This is why multi-accounters use UPI handles tied to different family members, but the family members themselves often share a Paytm or PhonePe account at the household level, which loops back to the same handle.
  • Same playing patterns — bet-sizing increments, response time after a raise, side-show acceptance rate. These are softer signals but they fire reliably across cross-account analysis.
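The cross-account matching behind the first three signals reduces to a grouping pass over login and withdrawal records. A minimal sketch, assuming a flat log of (account, IP, UPI handle) rows; the data shape is invented for illustration:

```python
from collections import defaultdict

def flag_clusters(login_rows):
    """Group accounts sharing an IP or a UPI handle; any group of 2+ is a flag.

    login_rows: iterable of (account_id, ip_address, upi_handle) tuples.
    Returns (shared_key, account_set) pairs for every key linking 2+ accounts.
    """
    by_key = defaultdict(set)
    for account, ip, upi in login_rows:
        by_key[("ip", ip)].add(account)
        by_key[("upi", upi)].add(account)
    return [(key, accounts) for key, accounts in by_key.items() if len(accounts) >= 2]

rows = [("A", "1.2.3.4", "x@upi"), ("B", "1.2.3.4", "y@upi"), ("C", "9.9.9.9", "z@upi")]
# Accounts A and B share an IP, so they come back as one flagged cluster.
```

In practice the operator runs this over a 30-day window, which is why the serious multi-accounters pay for separate connections per phone.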

Operator response

Most major Indian apps now flag identical IP, device fingerprint or UPI within a 30-day window. Lucky disclosed in March 2025 that 0.4 percent of accounts on the platform were under active multi-account review at any time, and that around 60 percent of those reviews led to closure. Master’s similar disclosure in February 2025 put the corresponding number at 0.6 percent, with a 72 percent closure rate after review. Gold’s number came out in April 2025: 0.5 percent under review, 65 percent closure.

The reason these numbers cluster is that most operators have converged on the same shared blacklist through the All India Gaming Federation (AIGF). The AIGF maintains a cross-app account flag list updated weekly. An account closed for multi-accounting on Lucky shows up on the Master and Gold flag list within 7 days, which makes it harder for the colluder to simply migrate platforms.

Multi-accounting is a Terms of Service violation universally. Every Indian Teen Patti app’s ToS includes a clause that one person may operate at most one account, and any second account triggers immediate closure plus chip forfeiture. The legal escalation depends on the value extracted. For sums under ₹1 lakh, operators usually handle it as a civil matter (close accounts, refund losers if pressed, no police involvement). Above ₹1 lakh, IPC Section 420 (cheating) becomes relevant. Above ₹10 lakh, the PMLA (Prevention of Money Laundering Act) opens up.

In real practice through 2024 and 2025, the AIGF disclosed that out of roughly 14,000 multi-account closures across member apps, only 23 led to formal police complaints, and only 6 made it to court. The deterrent through legal channels is weak. The deterrent through chip forfeiture is the meaningful one.

Chip-dumping: intentional losses to a partner

Chip-dumping is multi-accounting’s more sophisticated cousin. Player A intentionally loses to Player B, often to transfer real money between accounts under the cover of normal play. The use cases divide into four groups:

  • Laundering money from a source the colluder does not want traced.
  • Transferring chips between two accounts owned by the same person without triggering withdrawal limits.
  • Farming a welcome bonus by depositing on Account A and dumping the chips to Account B to satisfy wagering requirements.
  • Escalating a partner’s stack mid-tournament to put them in position to deep-cash.

The mechanic at the table is simple. Player A enters a hand with a strong holding (or no holding at all), bets aggressively on a value-line, and Player B raises through them. Player A folds where a normal player would call, or calls where a normal player would raise. Net effect: chips flow from A to B, the operator’s other players see a normal-looking hand, and over 100 hands the cumulative transfer can reach ₹5,000 to ₹10,000 without tripping most automatic alerts.

The detection signal that actually works

The single strongest signal for chip-dumping is anti-correlated win-loss patterns over 100+ hands between the same two players. In normal Teen Patti, two players who sit together for 100 hands will share roughly 50-50 across head-to-head pots, with significant variance in either direction (the standard deviation of the head-to-head share over 100 hands is about 5 percentage points, so anything from roughly 40-60 to 60-40 sits within two standard deviations of fair play). A chip-dumping pair will show 70-30 or 80-20, consistently in the same direction across multiple sessions.
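The variance figure here is just the binomial standard deviation, and the test itself is a one-line z-score. A sketch in my own framing, not any operator's actual algorithm:

```python
import math

def h2h_zscore(wins_b, n_hands):
    """Z-score of player B's head-to-head win share against a fair 50-50 split.

    Under honest play the win count is roughly Binomial(n, 0.5), so the
    standard deviation of the win count over n hands is 0.5 * sqrt(n)
    (about 5 hands, i.e. 5 percentage points, at n = 100).
    """
    expected = 0.5 * n_hands
    sd = 0.5 * math.sqrt(n_hands)
    return (wins_b - expected) / sd

# A 70-30 split over 100 hands sits 4 standard deviations from fair play:
# h2h_zscore(70, 100) -> 4.0
```

Anything beyond 3 standard deviations, sustained in the same direction across sessions, is worth logging; a single hot session is not.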

The other detection signals stack on top:

  • Same withdrawal timing — both accounts withdraw within minutes of each other after a chip-dumping session, often to the same UPI handle pattern.
  • Repeated head-to-head encounters — the two accounts find each other at the same table far more often than seat-rotation chance would predict. Lucky’s fraud team disclosed in early 2025 that they flag any pair appearing at the same table on more than 20 percent of their joint sessions.
  • Bet-sizing that rewards the partner — Player A consistently bets just enough to give Player B value, and folds just before the maximum extraction point. This pattern is invisible in any single hand but glares in a 100-hand sample.

Operator response to chip-dumping

The pattern-recognition algorithms run post-hoc on weekly cycles at the major Indian apps. A pair flagged as suspect chip-dumping gets manually reviewed by the fraud team, and the typical action is suspension without warning. Players often only find out their account is closed when they try to log in the next morning. The Lucky disclosure put the chip-dumping closure rate at 0.18 percent of accounts per quarter as of late 2024, somewhat lower than multi-accounting but with an above-average rate of escalation to police complaint because of the laundering angle.

The cost to honest players

A sophisticated chip-dumping arrangement transfers ₹5,000 to ₹10,000 daily without tripping most algorithms. Over months the cumulative volume can reach ₹15 to 30 lakh, which is when the PMLA threshold gets interesting. The Mumbai Cyber Crime case from August 2025 (covered in the legal section below) involved an 8-account chip-dumping ring with ₹2.4 crore in frozen funds. Half of that was almost certainly money being routed through Teen Patti as a laundering channel, not pure gaming activity.

The honest players who lose to a chip-dumping ring lose less than they would to a soft-play ring or a multi-account cluster, because the chip-dumpers are mostly transferring to each other rather than extracting from the table. But the table dynamics shift. A table with chip-dumping happening in the background runs with weird bet-sizing patterns that distort the read on legitimate value-bets, and that downstream confusion costs honest players another 5 to 10 percent of their normal expected value.

Soft-play: friends folding to each other

Soft-play is the hardest collusion variety to prove because it is purely behavioural. Two players who know each other (friends, partners, family) consistently fold to each other rather than competing. Nothing about the action is technically a violation in any single hand. The pattern only emerges across 50 or more head-to-head encounters.

The classic soft-play set-up is a couple, both playing on Lucky from the same Wi-Fi at home, both sitting at the same ₹50 boot table. They both joined the table independently, they both play normally against the other four seats, but when they end up head-to-head they consistently fold to each other rather than risk losing money to “the other side of the household budget.” The arrangement is informal, sometimes barely conscious, but the table-level effect is that a six-handed table effectively becomes a five-handed table for tactical purposes, and the four honest players around them face slightly higher rake-per-hand because pots get smaller.

Detection signals for soft-play

  • Weak play between specific player pairs despite holding strong hands — the giveaway. Player A folds Trail or Pure Sequence to Player B’s blind raise. In any other context that fold would be a catastrophic mistake; against Player B it happens consistently.
  • Avoidance of side-shows between the pair — soft-players rarely accept side-shows against each other because the side-show mechanic forces a hand reveal that breaks the implicit arrangement.
  • Coordinated arrival at the same table — soft-players who play together regularly tend to hit the lobby within minutes of each other. This is a weaker signal because it is also explained by similar daily routines, but it is suggestive when combined with the head-to-head fold pattern.

Operator response

Most major apps disable side-shows between accounts that have shown a soft-play signature, or limit cross-account interaction in subtler ways (different table assignment in lobby algorithms, separate VIP room scheduling). The harder cases are escalated to manual review and either confirmed or dropped without consequence. Soft-play closure rates are much lower than multi-accounting closure rates because the proof bar is high. The Master fraud team’s February 2025 disclosure said only 0.04 percent of accounts had been closed for confirmed soft-play in the previous quarter.

Why soft-play matters anyway

The cost per honest player at a soft-play table is small, maybe 5 to 15 percent of competitive pressure removed from the lineup. But the cumulative effect across an app’s entire ₹50+ boot table population is meaningful. The Lucky internal estimate from late 2024 (leaked in a Reddit post that I cannot link to here for obvious reasons but which the fraud team did not deny when asked) was that soft-play arrangements collectively cost honest players roughly 2 to 4 percent of expected return at the higher boot levels. That is not catastrophic but it is the difference between a +EV grinder breaking even and a +EV grinder making a meaningful side income.

Signal collusion: out-of-band coordination

Signal collusion is the highest-impact variety of collusion and the hardest to detect. The mechanic is straightforward: two or more players, each on their own account from their own location, coordinate their plays through an out-of-band channel. Discord voice call is the most common in 2024-2026, WhatsApp group is second, and old-school phone call is third. The colluders tell each other what cards they have and they coordinate their bets to extract maximum value from any honest player at the table.

Done well, signal collusion is devastating. A two-player coordinated ring with full hand-information sharing has roughly a 38 to 45 percent edge over a four-handed honest table, which is the kind of edge that turns a stake of ₹10,000 into ₹50,000 inside three hours of play. The 2024 Karnataka HC case below involved a three-player ring that pulled ₹78 lakh out of Teen Patti Master across 11 months before the operator caught them.

Why detection is so hard

Signal collusion shows almost no technical signature. Each player is on their own device, their own IP, their own UPI handle. They can be in different cities. The only signals available are behavioural:

  • Reaction patterns — coordinated rings often show synchronised timing on raises and folds that an honest player would not produce. If Player A raises and Player B re-raises within 0.8 seconds across 30 head-to-head hands, that is a signal.
  • Betting decisions that imply hand knowledge — coordinated rings make plays that only make sense if the player knows what their partner holds. Folding a strong hand to a partner who is bluffing into the pot is the cleanest example, but it is also the rarest because the colluders try to disguise it.
  • Chat patterns — coordinated rings sometimes leak in-jokes, references to private context, or other linguistic signs of out-of-band relationship. This is rare because experienced colluders go silent.
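The 0.8-second reaction pattern above is checkable with a simple delta count; a sketch, where the input format is my assumption:

```python
def fast_response_share(response_gaps, threshold_seconds=0.8):
    """Share of head-to-head raise/re-raise gaps under the reaction threshold.

    response_gaps: seconds between Player A's raise and Player B's re-raise,
    one entry per head-to-head hand. Human deliberation rarely stays under
    0.8 seconds consistently; a high share over 30+ hands is a signal, not proof.
    """
    fast = sum(1 for gap in response_gaps if gap <= threshold_seconds)
    return fast / len(response_gaps)
```

A share above roughly half across 30+ logged hands is the kind of number worth putting in a Suspicious Player report, alongside the hand IDs.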

Operator-side detection of signal collusion is mostly impossible. The Lucky fraud team disclosed in late 2024 that they flag suspected signal collusion mostly through user reports rather than algorithmic detection, and that the conversion from suspect-flag to confirmed-closure runs around 18 percent. The other 82 percent of suspect-flag cases get dropped because the evidence does not clear the operator’s internal bar.

The 2024 Karnataka HC case

This is the textbook signal collusion case in the Indian Teen Patti context. Three players from Bengaluru and Mysuru opened accounts on Teen Patti Master in November 2023, joined a Discord voice call together every evening from 8 PM to 1 AM, and over 11 months pulled ₹78,42,000 from the platform across roughly 1,400 sessions.

They got caught because the youngest of the three (a 24-year-old engineer) bragged about the arrangement to a former girlfriend, who tipped off Master’s grievance team. Master’s fraud team reviewed the playing histories, confirmed the suspect pattern (the three accounts were paired at the same table in 31 percent of their sessions, far above the 4 to 6 percent rate that random seat rotation would produce), and froze all three accounts in October 2024. The case escalated to Karnataka High Court in early 2025 because the amount triggered IPC 420 plus PMLA, and all three were sentenced to 4 years imprisonment plus combined fines of ₹15 lakh in March 2025. The sentencing precedent is important: it confirmed that signal collusion in real-money Teen Patti is criminal cheating under Indian law, not merely a contractual violation.
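The 4-to-6-percent baseline is roughly what random seat rotation predicts for a pool of about 80 to 125 concurrent players on six-handed tables. A rough sketch of that arithmetic (the pool size is my assumption, chosen to reproduce the quoted baseline):

```python
import math

def random_pairing_rate(seats_per_table, players_in_pool):
    """Chance a specific other account lands at your table under random seating."""
    return (seats_per_table - 1) / (players_in_pool - 1)

def pairing_zscore(paired_sessions, total_sessions, baseline):
    """Standard deviations the observed pairing rate sits above the random baseline."""
    observed = paired_sessions / total_sessions
    sd = math.sqrt(baseline * (1 - baseline) / total_sessions)
    return (observed - baseline) / sd

# Six-handed tables, 101 players in the pool -> a 5 percent random baseline.
# The Karnataka ring's 31 percent over ~1,400 sessions is over 40 standard
# deviations above that baseline -- unmistakable once someone actually looked.
```

The point is that the statistical evidence was overwhelming in hindsight; the bottleneck was that nobody ran the query until the tip came in.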

What the case implies for player-side detection

If the Master fraud team needed an out-of-band tip to catch a 31 percent paired-table-rate, three-account, 11-month, ₹78 lakh ring, then the chances of a casual player catching the same kind of ring in real time are low. The honest answer is that signal collusion at low boot levels is invisible to the player. At high boot levels (₹500+ boot), where the same player pair shows up at your table multiple sessions in a row, the pattern becomes more catchable, but you still need to log 50+ joint sessions before the statistical signal clears noise.

The practical takeaway for serious players: at high boot levels, keep a running log of the player IDs you face. The Collusion Audit Tool above does this for you. If the same pair shows up at 25 percent or more of your sessions, file a Suspicious Player report and let the operator look at the IP and timing data you cannot see. The user report is the only mechanism that meaningfully feeds the operator’s signal collusion detection pipeline.

Tournament chopping: payout coordination

Tournament chopping sits at the gentlest end of the collusion spectrum. Final-table players agree on a payout split rather than playing the tournament out to the end. The standard chop in Indian Teen Patti tournaments is 1st = 60 percent of the remaining prize pool, 2nd = 30 percent, 3rd = 10 percent, but variants based on chip stacks (ICM-style chops, common in poker) sometimes appear on the larger Master and Gold tournaments.
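Both chop structures are a few lines of arithmetic. The chip-proportional version below is a simplification: a true ICM chop weights each player's probability of every finish position, which is beyond a sketch. Function names are mine:

```python
def fixed_chop(prize_pool, splits=(0.60, 0.30, 0.10)):
    """Standard announced chop: fixed percentages of the remaining prize pool."""
    return [round(prize_pool * share) for share in splits]

def chip_chop(prize_pool, stacks):
    """Chip-proportional chop: each player takes their share of chips in play.

    A rough stand-in for an ICM-style chop; real ICM also accounts for the
    payout structure, not just current stack sizes.
    """
    total = sum(stacks)
    return [round(prize_pool * s / total) for s in stacks]

# fixed_chop(100_000)              -> [60000, 30000, 10000]
# chip_chop(100_000, [50, 30, 20]) -> [50000, 30000, 20000]
```

An announced chop records exactly these numbers with the tournament team; a secret chop runs the same arithmetic off the books.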

Most operators allow chopping if it is announced publicly to the tournament floor and recorded by the operator’s tournament team. The reason operators tolerate it is that chopping speeds tournament conclusion, frees the table for the next event, and keeps players happy by reducing variance at the bubble. The reason it can become a violation is that secret chops (where players agree to a payout split in private and then play out the rest of the tournament with the partner the chop favours) create an unfair edge over any honest player who joins late or who is unaware of the arrangement.

Detection of secret chops

Secret chops are detected primarily through player reports. The signals are subtle: two players at a final table playing unusually conservatively against each other, one player consistently giving up pots they would normally fight for, or a player doubling up another player’s stack with a play that does not make sense for a tournament context. None of these are deterministic. They rely on the rest of the field noticing the pattern and flagging it.

The Tamil Nadu 2025 case

In June 2025, the Madurai-based Cyber Crime cell flagged a chopping ring that had run across roughly 40 weekly tournaments on Teen Patti Joy from January through May. The ring involved 6 players who took turns being the “designated winner” each week, with the others soft-folding the final table and the prize money flowing through pre-arranged UPI transfers between the group. The ring extracted approximately ₹15 lakh in prize money before being caught, and Joy refunded the affected tournament fields after the operator’s fraud team confirmed the pattern. No criminal charges were filed because the ring members agreed to the refund and the platform ban; the matter was resolved as a civil settlement.

Why tournament chopping is the lowest-impact collusion

A publicly-announced chop genuinely doesn’t hurt anyone. The prize pool is fixed, the players who are at the final table agreed to the split, and the rest of the field does not lose anything they otherwise would have won. A secret chop is a violation but the per-honest-player cost is low because the unfair edge only shows up at the bubble and final table, by which point most honest players have already exited. This is why the operator response to confirmed chopping is usually the mildest: account ban for the participants, refund for affected players, and rarely any escalation beyond that.

The 14 collusion detection signals

The 14 signals below are the operator-side and player-side checklist for collusion. The Collusion Audit Tool above runs the math for you, but the full reasoning is here so you understand why each weight is set where it is.

The signals divide into three categories. Technical signals (1 to 5) are operator-confirmable, near-deterministic when present, but mostly invisible to the player. Behavioural signals (6 to 11) are observable from the player seat but require accumulation over 50+ hands. Pattern signals (12 to 14) are cross-account fingerprints that fire as confirming evidence in combination with one of the other categories.

Technical signals (operator-confirmable)

1. Same IP address (weight 14)

The cleanest technical signal. If the operator’s logs show two or more accounts logging in from the same IP address consistently (say, more than 80 percent of sessions on the same IP), the cross-account match is near-deterministic. Detection accuracy on Lucky’s stack is reported at 99.4 percent for IP matching across active accounts. The false positives come from family members or roommates legitimately sharing a household connection, which is why the IP signal alone is not sufficient for closure. The operator usually requires IP plus one other signal before acting.

2. Same device fingerprint (weight 13)

Even when colluders use separate IPs (residential proxies, separate 4G connections, different Wi-Fi networks), the device fingerprint catches accounts that share hardware. The fingerprint includes screen resolution, OS build, installed font set, time zone, language preference, and a few dozen other browser-side or app-side variables that combine into a near-unique hash. Lucky’s reported device fingerprint match accuracy is around 87 percent. The false positives are mostly multi-account households (two siblings on the same family iPad), which is why the operator usually requires fingerprint plus IP or fingerprint plus UPI before acting.
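The hashing idea behind fingerprinting can be sketched in a few lines. The attribute list and hashing choice here are illustrative assumptions, not Lucky's actual stack:

```python
import hashlib

def device_fingerprint(attrs):
    """Collapse device attributes into a stable hash for cross-account matching.

    attrs: dict of attribute name -> value (screen resolution, OS build,
    font set, time zone, ...). Sorting the keys makes the hash independent
    of the order the client happened to report them in.
    """
    canonical = "|".join(f"{k}={attrs[k]}" for k in sorted(attrs))
    return hashlib.sha256(canonical.encode()).hexdigest()

phone_a = {"screen": "720x1600", "os": "Android 11 RMX3191", "tz": "Asia/Kolkata"}
phone_b = {"tz": "Asia/Kolkata", "os": "Android 11 RMX3191", "screen": "720x1600"}
# Same hardware profile -> same hash, regardless of attribute order.
```

Real stacks fold in dozens more variables and tolerate partial matches, which is where the roughly 87 percent accuracy figure (and its sibling-on-the-family-iPad false positives) comes from.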

3. Same UPI handle pattern on withdrawals (weight 15)

The single strongest deterministic signal. If two accounts withdraw to the same UPI handle, you have a near-certain link between them. The colluder can route around this with separate UPI handles for each account, but UPI handles tied to different Aadhaar IDs require KYC compliance that is hard to fake at scale. Most multi-accounters end up either using family members’ UPI handles (which loops back to a shared household identity) or using mule accounts (which loops back to PMLA territory). This is why same-UPI is the highest-weight signal in the checklist and why the operator’s most decisive evidence almost always traces back to a UPI match.

4. Bonus farming signature (weight 10)

Multiple accounts triggering welcome bonuses simultaneously, or in a tight time window, signals coordinated bonus farming. The signature is detectable because welcome bonus claims are rate-limited and the timing pattern of “five accounts claim within 4 hours of each other and all play their wagering requirements within 48 hours” is statistically improbable for unrelated accounts. Master’s bonus-farming detection runs as a weekly batch job that flags suspicious patterns for manual review.

5. Coordinated withdrawal timing (weight 11)

The end-game signal. Multiple accounts withdrawing within minutes of each other, especially after a session of suspicious play, signals that the colluder is consolidating chips. The Lucky fraud team flags any cluster of 3+ accounts withdrawing within a 20-minute window as a high-priority review case. The false positive rate is moderate (sometimes legitimate friends finish a session together and cash out around the same time), which is why the operator usually combines this signal with at least one of the technical signals above.
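The 20-minute-window rule reduces to a sliding-window scan over withdrawal timestamps. A sketch under the thresholds quoted above; the input format is my assumption:

```python
def flag_withdrawal_clusters(withdrawals, window_minutes=20, min_accounts=3):
    """Flag groups of distinct accounts withdrawing within one time window.

    withdrawals: list of (timestamp_minutes, account_id) pairs, any order.
    Returns the account sets for every window start that qualifies.
    """
    events = sorted(withdrawals)
    flags = []
    for i, (start, _) in enumerate(events):
        accounts = {acct for t, acct in events[i:] if t - start <= window_minutes}
        if len(accounts) >= min_accounts:
            flags.append(accounts)
    return flags
```

The moderate false-positive rate is visible right in the logic: three friends genuinely cashing out after a shared session trip the same window, which is why this signal only acts in combination with a technical match.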

Behavioural signals (player-side observable)

6. Recurring head-to-head with anti-correlated outcomes (weight 9)

The player-side benchmark for chip-dumping. Two players sitting together for 100+ hands should share roughly 50-50 in head-to-head pots, with normal variance. A pair showing 70-30 or 80-20 consistently across multiple sessions signals coordinated transfer. The signal needs at least 100 head-to-head encounters to clear noise, which is a lot of observation for a casual player but achievable for grinders who keep session logs.

7. Soft folding consistently between specific pairs (weight 8)

The soft-play signature. Player A folds Trail or Pure Sequence to Player B’s blind raise. Any one such fold is a mistake; consistent folds across 20 or more head-to-head hands signal that Player A is consciously avoiding pots with Player B. The signal is hard to confirm because hand-strength information is not always revealed, but the pattern is detectable when shows happen.

8. Coordinated raises (weight 9)

One player raises and a partner re-raises in a synchronised pattern across 30+ hands. The pattern inflates pots when the colluders hold complementary hands and pushes honest players out of pots through paired aggression. Detection requires logging 30+ head-to-head pots involving the same player pair and checking whether the synchronisation rate exceeds the roughly 20 percent that random play alone would produce.
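
The synchronisation rate can be computed from a hand log. A sketch under the assumption that you record each hand's action sequence; defining synchronisation as "one partner raises directly behind the other" is our simplification of the signal.

```python
def raise_sync_rate(hands, pair=("A", "B")):
    """Fraction of logged hands in which one player of the pair raised
    and the partner raised directly behind them. `hands` is a list of
    action sequences like [("A", "raise"), ("B", "raise"), ("C", "fold")].
    Compare the result against the ~20 percent random-play baseline.
    Illustrative sketch only."""
    a, b = pair
    synced = 0
    for actions in hands:
        raisers = [player for player, act in actions if act == "raise"]
        # count the hand as synchronised if the two partners raised
        # back-to-back anywhere in the raise sequence
        for i in range(len(raisers) - 1):
            if {raisers[i], raisers[i + 1]} == {a, b}:
                synced += 1
                break
    return synced / len(hands) if hands else 0.0
```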

9. Coordinated folds (weight 8)

The mirror image of coordinated raises: one player folds so that a partner can take the pot uncontested. Signal weight is similar to coordinated raises but slightly lower because honest tight play also produces fold sequences that look superficially similar.

10. Chip-dumping pattern (weight 11)

The composite signal that combines anti-correlated head-to-head with bet-sizing that rewards the partner. Player A consistently bets just enough to give Player B value, and folds just before the maximum extraction point. The pattern is invisible in any single hand but glaring in a 100-hand sample. The widget combines this signal with the head-to-head signal to produce a chip-dumping confidence score.
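
A composite score like this stacks the two signal weights (9 for head-to-head skew, 11 for the bet-sizing pattern) once their thresholds fire. The trigger thresholds below are our assumptions for illustration; the widget's actual internals are not published.

```python
def chip_dump_confidence(h2h_win_rate, value_bet_rate, hands_observed):
    """Combine the anti-correlated head-to-head signal (weight 9) with
    the bet-sizing transfer signal (weight 11) into a 0-100 confidence
    score, in the spirit of the audit widget described above.
    All thresholds are illustrative assumptions."""
    if hands_observed < 100:          # below the article's noise floor
        return 0.0
    score = 0.0
    if h2h_win_rate >= 0.70:          # 70-30 or worse head-to-head skew
        score += 9
    if value_bet_rate >= 0.60:        # bets consistently sized to feed partner
        score += 11
    # scale against the 20 points these two signals can contribute
    return 100.0 * score / 20.0
```

The hard gate on sample size is the important design choice: below 100 hands, either signal alone is indistinguishable from variance.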

11. Tournament position-up (weight 10)

The tournament-specific chip-dumping signature. One player consistently feeds another player’s stack at the bubble or final table, often through plays that make no sense for either player’s tournament position. The signal fires only in tournament contexts and is easier to spot than cash-game chip-dumping because tournament bet structures are more rigid and the optimal play in any given spot is more constrained.

Pattern signals (cross-account fingerprint)

12. Username and avatar similarity (weight 6)

Multi-account rings often use similar naming patterns across their accounts (e.g., Tiger4831, Tiger4832, Tiger4833) or default avatars across the cluster. The signal alone is weak (lots of honest players use default avatars and basic naming), but it fires as confirming evidence when combined with technical signals. Master’s username-pattern detection runs as a weekly clustering job.
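
The Tiger4831 / Tiger4832 / Tiger4833 pattern is a shared-stem-plus-numeric-suffix cluster, which a simple regex pass can surface. A hedged sketch of that clustering idea, not Master's actual job.

```python
import re
from collections import defaultdict

def username_stem_clusters(usernames, min_size=3):
    """Group usernames that share the same alphabetic stem and differ
    only in a trailing number (Tiger4831 / Tiger4832 / ...). A weak
    signal on its own, as noted above. Illustrative sketch."""
    clusters = defaultdict(list)
    for name in usernames:
        m = re.fullmatch(r"([A-Za-z_]+)(\d+)", name)
        if m:
            clusters[m.group(1)].append(name)
    # keep only stems with enough members to be interesting
    return {stem: names for stem, names in clusters.items()
            if len(names) >= min_size}
```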

13. Behavioural pattern matching: bet-sizing across accounts (weight 7)

Two accounts that show identical bet-sizing patterns across 30+ sessions are likely operated by the same person. The signal includes preferred raise multiples, side-show acceptance rates, all-in thresholds and other quantifiable bet-shape variables. Operator-side bet-sizing fingerprint matching has reported accuracy around 71 percent on Lucky’s stack.
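
One standard way to compare two accounts' bet-shape variables is to treat them as feature vectors and measure cosine similarity. The feature list and any match threshold are assumptions for illustration; the article reports only the ~71 percent accuracy figure, not the method.

```python
from math import sqrt

def fingerprint_similarity(profile_a, profile_b):
    """Cosine similarity between two accounts' bet-shape feature
    vectors (e.g. preferred raise multiple, side-show acceptance rate,
    all-in threshold). Values near 1.0 suggest the same operator.
    Illustrative sketch, not any operator's real matcher."""
    dot = sum(x * y for x, y in zip(profile_a, profile_b))
    norm_a = sqrt(sum(x * x for x in profile_a))
    norm_b = sqrt(sum(y * y for y in profile_b))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0
```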

14. Timer signature match: response time across accounts (weight 7)

Two accounts that show identical response-time distributions (e.g., both consistently respond in the 2.4 to 2.7 second window with the same standard deviation) are likely operated by the same person or by two people sharing a script. The signal works because human reaction time has individual variance, and matching reaction times across accounts implies either shared hardware (multi-accounting) or shared scripted action (which would be bot territory rather than collusion).
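
A crude version of the timer-signature comparison checks whether two accounts' response-time samples agree on both mean and standard deviation. The tolerances below are illustrative assumptions; real matching would compare full distributions, not two summary statistics.

```python
from statistics import mean, stdev

def timer_signature_match(times_a, times_b,
                          mean_tol=0.15, sd_tol=0.10):
    """Rough check of whether two accounts share a response-time
    signature: mean and standard deviation (in seconds) both within
    tight tolerances. Illustrative sketch only; tolerances are
    assumptions, not any operator's published thresholds."""
    return (abs(mean(times_a) - mean(times_b)) <= mean_tol
            and abs(stdev(times_a) - stdev(times_b)) <= sd_tol)
```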

The operator-side detection systems

Each major Indian Teen Patti app has its own collusion detection stack. The architecture is broadly similar across operators, but the weighting and the closure thresholds vary.

Teen Patti Lucky

Lucky publicly disclosed in October 2024 that their collusion detection stack runs three layers: real-time IP matching (99.4 percent claimed accuracy), nightly device fingerprint matching (87 percent claimed accuracy), and weekly behavioural pattern review for chip-dumping and soft-play signals. The fraud team operates on a 7-day review cycle for flagged accounts, with closure decisions reviewed by a senior compliance officer before action. The published closure rate for collusion-related accounts is around 0.4 percent of active accounts per quarter.

Teen Patti Master

Master’s February 2025 disclosure put their collusion closure rate at 0.6 percent of accounts per monthly review, a substantially higher pace than Lucky’s 0.4 percent per quarter once the review cycles are normalised. The Master stack is reportedly broader (covers more behavioural signals) but has a higher false positive rate, which is why some Master closures are reversed after appeal. The post-hoc nature of Master’s chip-dumping detection means colluders are sometimes suspended without warning weeks after the suspect activity.

Teen Patti Gold

Gold collaborates with the AIGF on the shared blacklist (cross-app account flagging) and is reportedly the most aggressive operator in escalating confirmed collusion cases to police complaints. The Gold fraud team’s April 2025 disclosure put their collusion closure rate at 0.5 percent of accounts per quarter, with a notably higher rate of escalation to formal complaint than Lucky or Master.

Teen Patti Joy

Joy runs simpler systems, primarily IP-based. The Joy closure rate is harder to compare because Joy does not publish detailed disclosures, but the rough industry estimate puts it around 0.3 percent of accounts, with weaker behavioural detection than the bigger three. Joy was the operator in the Tamil Nadu 2025 tournament chopping case, which the Joy fraud team caught on a player report rather than algorithmic detection.

The AIGF shared blacklist

The All India Gaming Federation maintains a shared blacklist of accounts confirmed for collusion across member apps. The list is updated weekly and includes the account ID, the closure reason category and the date of action. Member operators are expected to flag any new account opened under the same KYC details (Aadhaar, PAN, UPI handle pattern) within 48 hours. The blacklist has roughly 47,000 entries as of early 2026, and the cross-platform enforcement is the main reason colluders cannot simply migrate apps after a closure on one platform.

The 5-step player workflow to detect collusion

If you suspect a player pair on your table is colluding, the operational workflow is:

Step 1: Note suspicious patterns using the 14-signal checklist

Open the Collusion Audit Tool above. As soon as you see the first suspicious signal between two players (a fold that does not make sense, paired aggression, recurring head-to-head with anti-correlated outcomes), log it with the player IDs and the signal type. Do not wait for certainty. The widget is designed to accumulate evidence over time.

Step 2: Watch for 50+ hands with the same player pairs

Behavioural signals only clear noise after substantial observation. A single anti-correlated head-to-head can be variance. Twenty of them across 100 hands is signal. If the pair leaves the table before you have logged enough observations, the audit will sit at low confidence and you should drop the case. If they stay (or you find them again the next session), keep logging.

Step 3: Take screenshots of suspicious patterns with timestamps

The screenshots are what the operator’s fraud team will actually look at. The most useful screenshots are: the seat layout showing both player IDs at the table, the hand-result screen showing a confirmed soft-fold (Player A reveals Trail and folded earlier in the hand), and the lobby screen showing the same player pair at the same table across multiple sessions. Timestamps matter because the operator can cross-reference them against IP and device fingerprint logs.

Step 4: Report through the in-app Suspicious Player flow

Every major Indian Teen Patti app has a Suspicious Player report button accessible from the in-game menu or the player profile. Use it. Attach the screenshots. Reference the specific signals you observed using the language from the 14-signal checklist (operators recognise the standard terminology and process those reports faster). The operator’s fraud team will acknowledge receipt within 48 hours on the major apps and will respond with a closure decision within 7 to 14 days on most cases.

Step 5: If money is at stake, file with AIGF grievance + payment aggregator if needed

If you have lost meaningful money to a confirmed collusion ring (say, more than ₹10,000) and the operator response is unsatisfactory, the AIGF grievance route at aigf.in/grievance is your next escalation. The AIGF will mediate between you and the operator and can pressure for refunds in confirmed cases. If the loss is larger (above ₹1 lakh) and crosses into PMLA territory, you can also file a separate complaint with the payment aggregator (Razorpay, Cashfree, Paytm Payments, depending on which one the operator uses). The payment aggregator has its own fraud detection processes and can sometimes recover funds even when the operator does not.

The post-PROGA reality (as of May 2026) means the AIGF grievance route only works for AIGF-member operators still operating real-money services, which is a small subset of the pre-PROGA universe. For offshore Curacao-licensed re-skins that absorbed most of the displaced player base, your grievance route is the Curacao Gaming Control Board, which has a 9 to 14 week response cycle and is functionally weak.

The cheating spectrum

Collusion sits in the middle of a four-category cheating spectrum. Understanding where you are gives you the right detection tools.

Solo cheating (single player exploiting an app bug)

Rare in 2026. Most loopholes have been patched. Historic examples include the 2019 Ace Spades shuffle bug (which let a single player exploit a deck initialisation flaw) and the 2021 RummyCircle PRNG seed leak (which technically applied to Rummy rather than Teen Patti but illustrates the same category). Operator response to confirmed solo cheating is immediate closure and full refund to affected players.

Bot operation

Covered in detail in the bot detection guide. Bots are pre-programmed scripts. They run a predefined logic loop and they break under behavioural probing because they cannot adapt. Bot detection runs at higher accuracy than collusion detection because bots produce uniform behavioural signatures (uniform timing, uniform bet-sizing, uniform session length).

Human collusion (this guide)

Covered in this article. Two or more humans coordinating against a table, with adaptive responses to your probing. Detection runs at 50 to 65 percent player-side accuracy, much higher with operator-side data.

Operator-side rigging

The least common category but the highest-impact when it occurs. Operator-side rigging means the platform itself is manipulating outcomes through deck stacking, shuffling manipulation, or seat-fixing algorithms. The four RNG certifications (eCOGRA, GLI, iTech Labs, BMM Testlabs) are the main defence against this. See the comparison of safest Teen Patti apps for the operator-side fairness audit checklist.

Each category has different signals and different remedies. Solo cheating is operator-handled and rarely requires player action beyond a report. Bot operation is detected through behavioural probing and reported through Suspicious Player flows. Collusion is detected through paired observation and reported through the same flows but with longer timelines. Operator-side rigging is detected through statistical audits of your own hand history and addressed through regulatory complaint, not in-app reporting.

The legal framework

Indian law treats collusion in real-money Teen Patti as cheating under IPC Section 420, with potential escalation to PMLA and FEMA depending on the value and the cross-border element.

IPC Section 420 (cheating)

The base statute. Cheating with intent to cause wrongful loss. Maximum penalty is 7 years imprisonment plus a fine. In real-money Teen Patti collusion cases, the prosecution has to prove three elements: (1) the colluder intentionally engaged in coordinated play, (2) the coordination caused wrongful loss to honest players or to the operator, and (3) the colluder knew the coordination was a violation of the platform’s rules and Indian gaming law.

The Karnataka HC case from March 2025 is the leading precedent for IPC 420 in this context. The court ruled that signal collusion through Discord, with documented out-of-band communication and a clear pattern of paired play, satisfied all three elements. The 4-year sentence handed to the three colluders was at the lower end of the IPC 420 range, reflecting that the colluders cooperated with the investigation and refunded a portion of the winnings.

PMLA (Prevention of Money Laundering Act)

If the collusion involved more than ₹10 lakh in extracted value, the PMLA opens up. PMLA charges carry 3 to 7 years imprisonment plus property attachment. The Mumbai 2025 chip-dumping ring (₹2.4 crore in frozen funds) is the leading PMLA precedent in this context. The Enforcement Directorate took over the case from local cyber crime in late 2025, attached the involved bank accounts, and the matter is still pending trial as of May 2026.

The PMLA threshold is the meaningful escalation point. Operators that detect collusion under ₹10 lakh usually handle it as a civil matter (account closure, chip forfeiture, no police involvement). Operators that detect collusion above ₹10 lakh are obligated to report under their AML compliance, and the case enters the PMLA pipeline whether the operator wants it or not.

FEMA (Foreign Exchange Management Act)

If the collusion involves a cross-border element (offshore accounts, foreign nationals, payment routes through international gateways), FEMA charges become relevant. FEMA carries fines of up to three times the involved amount plus possible imprisonment. The post-PROGA migration of much real-money play to offshore Curacao-licensed sites has increased FEMA exposure for colluders, because money flowing back from a Curacao-licensed site into an Indian bank account is technically a foreign-exchange inflow that requires reporting.

There have not yet been high-profile FEMA prosecutions for Teen Patti collusion specifically, but the legal statute is in place and the post-PROGA enforcement context makes it likely that the first such cases will emerge in 2026 or 2027.

The three real Indian cases

Karnataka HC, March 2025: signal collusion ring

Three players from Bengaluru and Mysuru, ₹78,42,000 extracted from Teen Patti Master across 11 months, caught after a former girlfriend’s tip. Sentenced to 4 years imprisonment each plus combined fines of ₹15 lakh under IPC 420 plus PMLA. The leading precedent for signal collusion as criminal cheating in the Indian Teen Patti context.

Mumbai Cyber Crime, August 2025: chip-dumping ring

Eight accounts operated by a four-person ring from Andheri and Bandra, ₹2.4 crore frozen at the bank account level, ED took over the case from local cyber crime in late 2025. Charges under IPC 420 plus PMLA. Trial pending. The leading precedent for chip-dumping at PMLA scale in the Indian Teen Patti context.

Tamil Nadu Cyber Crime, June 2025: tournament chopping ring

Six players running a chopping ring across roughly 40 weekly Teen Patti Joy tournaments, ₹15 lakh in extracted prize money. Resolved as a civil settlement (refund to affected players, platform ban for the ring members), no criminal charges filed. The leading example of low-impact collusion handled outside the criminal framework.

The pattern across the three cases is that the criminal escalation tracks the value extracted (₹78 lakh and ₹2.4 crore went to court, ₹15 lakh did not) and the cooperation level of the colluders (the Karnataka case ring fought the charges, while the Tamil Nadu ring agreed to the refund). For practical purposes, a small-scale colluder under ₹5 lakh in lifetime extraction is unlikely to face anything beyond account closure, while a ₹50 lakh+ colluder is in serious legal territory.

The post-PROGA reality (May 2026)

The PROGA Act, 2025 banned online real-money games inside India from 22 August 2025. The legal status of collusion has not changed (collusion is still fraud under IPC 420 regardless of whether the underlying activity is licensed), but the enforcement context has shifted significantly.

Most major Indian apps are paused

Lucky, Master, Gold and Joy all paused their real-money operations for Indian users on or around 22 August 2025. The free-chips versions remain operational, and a portion of the player base migrated to offshore Curacao-licensed re-skins of the same apps. The collusion detection stacks on the free-chips versions are functionally similar to the real-money versions, but the financial incentive for collusion is zero (you cannot extract money from a free-chips environment), so the actual collusion frequency on free-chips has dropped to approximately 0.1 percent of accounts.

Offshore Curacao sites have weaker collusion detection

The offshore re-skins that absorbed most of the displaced real-money player base typically run weaker fraud detection stacks than the original Indian-licensed apps had. The reasons stack up: smaller fraud teams, less direct AIGF integration, lower compliance budget, and weaker cooperation with Indian payment aggregators. A colluder who would have been caught on Master in 2024 has roughly twice the survival window on a Curacao-licensed re-skin in 2026 before the operator notices.

Free chips environments: ToS violations still apply

Even on free-chips, multi-accounting and other collusion varieties violate the Terms of Service. Operators do enforce closures on free-chips accounts because the bot and collusion populations affect the experience for other free-chips players, who are the operator’s lead-generation pool for whatever future real-money licensing emerges. The closure rate on free-chips is lower than on real-money (perhaps 0.05 percent of accounts per quarter) but the enforcement still exists.

The grievance route in the post-PROGA context

For collusion that occurred on the pre-PROGA real-money services and was reported before 22 August 2025, the AIGF grievance route still works and the operator is still obligated to respond. For collusion that occurred after 22 August 2025 on offshore Curacao-licensed re-skins, the AIGF grievance route does not apply and the player’s recourse is through the Curacao Gaming Control Board, which has the 9 to 14 week response cycle mentioned earlier and is functionally weak.

The practical takeaway: if you have unresolved collusion-related losses from the pre-PROGA period, file the AIGF grievance now while the route is still active. If you experience collusion on a post-PROGA offshore site, your recourse is meaningfully limited and you should treat any losses as part of the cost of playing on a less-regulated platform.

Three case study personas

The personas below are composites drawn from real collusion cases I have either observed directly or seen reported in detail on r/IndianGaming or the AIGF case logs. The names are pseudonymous. The specifics are calibrated to actual outcomes.

Aaron, 28, Bengaluru engineer: caught a chip-dumping ring on Master

Aaron noticed something off on his Saturday evening Teen Patti Master ₹50 boot tables in March 2024. Three player IDs (RajeshM_4421, RajeshM_4422, RajeshM_4423) kept showing up at the same table every Saturday between 9 PM and 11 PM, and the chip flow between them looked weird. Aaron started keeping a log on his phone. Over six Saturdays, he logged 73 head-to-head encounters between the three accounts, with the chip flow consistently running from RajeshM_4421 and RajeshM_4423 into RajeshM_4422.

In late April, Aaron filed a Suspicious Player report through the Master in-app flow with screenshots and his log. Master’s fraud team responded in 5 days asking for additional context (the specific timestamps and seat positions), and 10 days after that they confirmed the closure of all three accounts plus a ₹47,000 refund to Aaron and the other affected players from his Saturday tables. The refund covered approximately 80 percent of his cumulative loss to the ring. Aaron’s takeaway: the Master report flow worked as advertised when the evidence was rigorous.

Manisha, 32, Mumbai analyst: exploited soft-play with her partner before getting caught

Manisha and her partner started playing Teen Patti Lucky together in early 2024. Both on their own accounts, both on the same Wi-Fi at home, both at the same ₹100 boot table on weekday evenings. They did not formally agree to soft-play but they noticed within a month that they were both folding to each other in pots that they would have fought against any other opponent. The soft-play arrangement persisted for six months, during which they collectively profited around ₹85,000 from honest players at their tables.

Lucky’s fraud team caught them in October 2024 through the device-fingerprint and IP-matching detection (their household had two phones registered on the same Wi-Fi, both showing similar play patterns and avoiding head-to-head pots). Both accounts were closed without warning and the ₹85,000 in chips was forfeited. Lucky did not pursue criminal charges because the value was below the PMLA threshold and the soft-play arrangement was informal rather than a formal collusion ring. Manisha’s takeaway: the operator-side detection works even on subtle behavioural patterns when enough technical signals stack up.

Vikram, 41, Delhi trader: falsely accused of collusion when his roommate played on the same Wi-Fi

Vikram’s account on Teen Patti Master got frozen in November 2024 with a notification citing “suspected multi-accounting.” He had not opened a second account. After three days of confused appeals, he learned that his roommate (a fellow trader who had moved into Vikram’s apartment a month earlier) had also opened a Teen Patti Master account and had been playing from the same Wi-Fi. Master’s IP-matching detection had flagged the cluster and frozen both accounts pending review.

The resolution took two weeks. Vikram and his roommate both submitted device-specific identification (separate phones, separate IMEI numbers, separate UPI handles) and Master’s fraud team confirmed via device fingerprint analysis that the two accounts were operated on different hardware. Both accounts were unfrozen and Vikram received a small goodwill credit of ₹1,000 for the inconvenience. Vikram’s takeaway: the operator’s IP-only signals are sometimes triggered by legitimate household-sharing situations, and the resolution process works but takes time.

Real Reddit and Quora quotes from Indian players

The community discussion on collusion in Indian Teen Patti runs across r/IndianGaming, r/TeenPatti, the Teen Patti subreddits for individual apps, and Quora’s Indian gaming threads. Below are six representative quotes from late 2024 through early 2026, attributed by username and approximate date.

“Played Lucky for 8 months, finally figured out three accounts at my regular ₹50 table were the same guy. Reported with screenshots, Lucky banned all three within a week and refunded my last month of losses. The system actually works if you give them evidence.” — u/PuneGrinder_42, r/IndianGaming, January 2025

“Bro signal collusion on Discord is everywhere on Master high-stakes tables. I won’t say how I know but two accounts I face every Sunday night at the ₹500 boot are 100% on a voice call. Can’t prove it without recording them but the timing is too synchronised to be coincidence.” — u/ChennaiPokerHands, r/TeenPatti, October 2024

“After the Karnataka HC verdict in March, my WhatsApp group of poker friends shut down our shared ‘study group’ immediately. Not worth 4 years in jail to optimise a few thousand rupees a session, yaar. The legal exposure is real now.” — u/HyderabadStudyGroup, r/IndianGaming, April 2025

“Got falsely accused of multi-accounting on Joy because my brother and I both play from the same house on different phones. Took 11 days to resolve. Joy’s fraud team was professional but the process is slow. Lesson: if you share a household with another player, register everything separately from day one and keep the proof.” — u/MumbaiBrothers_TP, Quora, June 2025

“Soft-play with my wife on Master finally got us banned in February. We didn’t think we were colluding, just avoiding pots with each other. The fraud team called it ‘consistent non-competitive head-to-head behaviour’ and that’s exactly what we were doing. Honest mistake, but ₹62K in chips gone and ban is permanent.” — u/KolkataCouple, r/TeenPatti, March 2025

“The post-PROGA offshore sites are a collusion paradise. Saw a four-account ring on a Curacao-licensed Master re-skin running for three weeks straight at the ₹100 boot tables. No detection, no banning, no recourse. The grievance route to Curacao takes two months minimum. Don’t play offshore unless you accept this risk.” — u/BengaluruExGrinder, r/IndianGaming, December 2025

The themes that recur in the community discussion are: (1) in-app reporting works on the major Indian apps when the evidence is rigorous, (2) the March 2025 Karnataka HC verdict has created a real deterrent for sophisticated collusion rings, (3) household-sharing situations produce false positives that resolve, but slowly, and (4) the post-PROGA migration to offshore sites has meaningfully degraded the collusion detection landscape.

The reverse problem: false positives

The detection systems described above produce false positives, and a small fraction of legitimate players get caught in the screen. Understanding the false-positive patterns helps you both protect yourself and judge whether your suspicion of an opponent is justified.

When two friends play together legitimately

The most common false positive is two or three friends playing at the same table without coordinating. They might be on the same Wi-Fi (a friends’ get-together at one apartment), they might share an interest in the same boot level, and they might naturally end up at the same table through the lobby algorithm. None of this is collusion. But the IP signal and the recurring head-to-head signal both fire, and the operator’s automated detection might flag the cluster for review.

Defence: if you are part of a legitimate friend group that plays Teen Patti together, register your accounts on different networks (mobile data, separate ISPs, different Wi-Fi networks at minimum), avoid sitting at the same table when possible, and if you do end up at the same table, play normally against each other. Do not soft-fold. The operator’s behavioural analysis will mostly clear you if your head-to-head numbers look statistically normal.

When family shares devices or networks

The Vikram persona above is the textbook case. Two siblings, two roommates, a couple, a parent and an adult child, any household configuration that involves shared Wi-Fi and multiple Teen Patti accounts. The IP signal fires on day one. The device fingerprint signal fires if any of the household members ever borrows another’s phone to play. The UPI signal fires if the household uses a shared payment account.

Defence: separate networks (one person on mobile data, the other on Wi-Fi), separate UPI handles tied to separate KYC profiles (Aadhaar plus PAN), and ideally separate physical locations where possible. If you do get falsely flagged, the resolution process works but takes a week or two. Submit the device-specific identification (separate IMEI numbers, separate phone manufacturers) and the operator will usually clear the cluster.

When you play occasionally with the same opponents

The lobby algorithms on major Indian Teen Patti apps prefer to seat players together if they have played together before and the head-to-head numbers were balanced. This means that if you play regularly at a particular boot level, you will see the same player IDs across dozens of sessions, even though there is no relationship between you. This is not collusion. It is the algorithm preferring stable-table compositions.

Defence: this one is mostly a self-protective recognition. If you suspect a player pair on your table is colluding, check first whether they are simply two regulars who, like you, happen to play the same boot level on the same evenings. The Collusion Audit Tool above requires multiple signal types to fire before raising the confidence band, which is partly designed to filter out this kind of regular-opponent false positive.

How to defend yourself if falsely flagged

If your account gets frozen with a collusion-related notification:

  1. Read the notification carefully. It will usually cite the specific category (multi-accounting, chip-dumping, etc.) and the rough basis for the flag (IP cluster, device fingerprint, head-to-head pattern).
  2. Submit identification through the appeal flow. Most major Indian apps have an in-app appeal flow that asks for device identification (IMEI, phone manufacturer), UPI handle clarification, and any context that explains the flagged pattern.
  3. Provide the household context if relevant. If the flag came from a shared Wi-Fi or a roommate situation, explain it clearly. The operator’s fraud team handles these regularly and the resolution process is well-established.
  4. Wait 7 to 14 days. The appeal review typically takes this long. Do not flood the support channel with repeat queries; that does not speed the process and sometimes slows it.
  5. Escalate to AIGF if the appeal is rejected unfairly. The AIGF grievance route covers wrongful closures as well as confirmed collusion cases. If the appeal is rejected and you genuinely believe the flag was a false positive, AIGF mediation can sometimes reverse the decision.

The false-positive rate on the major Indian apps is reportedly low (Lucky’s October 2024 disclosure put their collusion-related false positive rate at 6.8 percent of flagged accounts, with most cleared within 14 days). The system is not perfect but the resolution path exists.

Comparison vs other forms of fraud

Collusion is one of three fraud categories in Indian Teen Patti, alongside bot operation and operator-side rigging. The remedies differ.

Collusion vs bot operation

Bots are pre-programmed scripts. They run a predefined logic loop and they break under behavioural probing. Detection accuracy is high (the bot detection guide reports 70 to 85 percent player-side accuracy with the 14-signal bot checklist). Colluders are humans. They adapt to your responses and they bend their patterns when they sense observation. Detection accuracy is lower (50 to 65 percent player-side, much higher with operator-side data).

The remedies overlap. For both, the in-app Suspicious Player report flow is the first line, the AIGF grievance route is the second line, and the criminal complaint is reserved for high-value cases. The difference is the timeline. Bot reports typically resolve within 7 days because the bot signature is technically unambiguous once the operator looks at the logs. Collusion reports typically take 14 to 30 days because the behavioural analysis requires more interpretation.

Collusion vs solo cheating

Solo cheating exploits an app bug. It requires only one person and the technical sophistication to find or exploit a specific vulnerability. Collusion requires 2+ people but does not require any technical sophistication. The detection profiles are different: solo cheating produces statistically anomalous outcomes (winning rate 15-20 percentage points above the field) that any audit catches, while collusion produces subtle pattern shifts that require pair-specific observation.

The remedies are different. Solo cheating is operator-handled and rarely requires player action beyond a report. Collusion requires sustained player-side observation to catch and the operator response is slower because the evidence bar is higher.

Collusion vs operator-side rigging

Collusion is fraud by other players; operator-side rigging is fraud by the platform itself. The detection methods are completely different. Collusion detection is observational and pair-specific; operator-side rigging detection is statistical and platform-wide. The remedies are different too. Collusion goes through in-app reporting and grievance routes. Operator-side rigging goes through regulatory complaints (RBI ombudsman pre-PROGA, Curacao Gaming Control Board for offshore sites) and ultimately through choosing not to use that platform.

The four RNG certifications (eCOGRA, GLI, iTech Labs, BMM Testlabs) are the main defence against operator-side rigging. None of them protect against collusion. See the comparison of safest Teen Patti apps for the operator-side fairness audit walkthrough.

25 frequently asked questions

Technical questions

Q1: How fast does Lucky’s IP detection actually work?

Real-time on login. Lucky’s October 2024 disclosure said the IP cluster check runs as part of the login flow, and any account logging in from an IP address that already has 2+ active accounts within the last 30 days is flagged for review within minutes. The flag does not auto-close the account; it routes the case to the manual review queue.
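The disclosed rule (an IP that already has 2+ active accounts in the last 30 days routes the new login to manual review) can be sketched as a simple login-time check. Everything below — the record layout, the function name, the sample data — is a hypothetical illustration of that rule, not Lucky's actual implementation.

```python
from datetime import datetime, timedelta

def flag_on_login(account_id, ip, now, login_history, window_days=30, threshold=2):
    """Return True if this login should be routed to manual review:
    the IP already has `threshold` or more OTHER accounts active
    within the lookback window. `login_history` is a list of
    (account_id, ip, timestamp) tuples."""
    cutoff = now - timedelta(days=window_days)
    other_accounts = {
        acct for acct, seen_ip, ts in login_history
        if seen_ip == ip and ts >= cutoff and acct != account_id
    }
    return len(other_accounts) >= threshold

# Illustrative history: two accounts share one IP, a third is elsewhere.
history = [
    ("A1", "203.0.113.7", datetime(2026, 1, 10)),
    ("A2", "203.0.113.7", datetime(2026, 1, 15)),
    ("A3", "198.51.100.2", datetime(2026, 1, 20)),
]
print(flag_on_login("A4", "203.0.113.7", datetime(2026, 2, 1), history))   # True
print(flag_on_login("A4", "198.51.100.2", datetime(2026, 2, 1), history))  # False
```

Note that the check excludes the logging-in account itself, so a normal re-login from your own IP never counts toward the threshold.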

Q2: Can I be flagged for using a VPN?

Yes, sometimes. VPN IPs are well-known to operators, and some apps treat consistent VPN use as a Terms of Service violation in itself. The bigger issue is that VPN exit nodes are shared between hundreds of users, so if another VPN user happens to be playing on the same exit node and has a similar fingerprint, the cluster detection might flag you for review. Defence: avoid VPNs while playing real-money Teen Patti, or use a dedicated paid VPN with a static IP if you absolutely need one.

Q3: What is the device fingerprint exactly?

A combination of screen resolution, OS build, installed font set, time zone, language preference, browser version (for web clients) or app version, hardware identifiers like IMEI and Android ID (for mobile clients), and a few dozen other variables that combine into a near-unique hash per device. The operator uses the hash to identify when two accounts are operating on the same hardware.
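The combining step can be sketched in a few lines: serialise the attribute set canonically, then hash it. This is a minimal illustration of the idea, not any operator's real scheme; the attribute names are examples, and real fingerprinting uses fuzzy matching on top of exact hashes.

```python
import hashlib
import json

def device_fingerprint(attributes: dict) -> str:
    """Combine device attributes into a near-unique hash.
    Sorting the keys makes the hash stable regardless of the
    order the client reports the attributes in."""
    canonical = json.dumps(attributes, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

device = {
    "screen": "1080x2400",
    "os_build": "Android 14 / UP1A.231005.007",
    "timezone": "Asia/Kolkata",
    "language": "en-IN",
    "app_version": "3.2.1",
    "android_id": "9774d56d682e549c",  # illustrative value
}
print(device_fingerprint(device)[:16])
```

Two accounts reporting the same attribute set produce the same hash, which is what the matching keys on; change any one attribute and the hash changes completely.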

Q4: Can I clear my device fingerprint by reinstalling the app?

Partially. Reinstalling the app will reset the app-specific identifiers, but the underlying hardware identifiers (IMEI, Android ID) persist across reinstalls. A factory reset wipes more of these but is overkill for normal play. The operator's fingerprint matching usually does not rely on a single identifier, so reinstalling alone will not fool the matching.

Q5: Do operators share device fingerprints across apps?

Within the AIGF shared blacklist, yes. The blacklist includes the device fingerprint hash for confirmed collusion accounts, and member operators check incoming new accounts against the list. An account closed for multi-accounting on Master will face additional scrutiny if it appears under a new identity on Lucky within 30 days.

Behavioural questions

Q6: What head-to-head ratio is normal?

Approximately 50-50 over 100+ hands between two players sitting at the same table. Standard deviation runs around 8 percentage points, so 42-58 to 58-42 is the normal range. Anything more skewed (70-30, 80-20) for an extended period is the chip-dumping signal.

Q7: How many soft-folds before I should worry about a pair?

20+ head-to-head encounters where one player consistently folds to the other in spots that would be incorrect against a normal opponent. Below 20 it is variance. At 20+, log it and watch for the pattern to continue. At 50+, file the report.

Q8: Can a single player legitimately play very passively against another single player?

Yes. Playing styles vary, and a passive player will fold more often against an aggressive opponent. The collusion signal is not just frequent folds; it is folds that contradict the strength of the player’s hand. A passive player still calls or raises with Trail or Pure Sequence. A soft-folder folds them.

Q9: Are friends allowed to play at the same table?

Most operators technically allow it but watch for collusion signals. The Terms of Service do not usually prohibit it explicitly. The risk is that the operator’s automated detection might flag the friend pair if the head-to-head numbers look anomalous, and the resolution process is slow.

Q10: How do I tell signal collusion from genuinely synchronised play?

You usually cannot tell from a single session. Signal collusion shows up as paired plays that imply hand knowledge: folding a strong hand to a partner who is bluffing, raising in coordination on hands where the partner clearly has a weak holding. Genuinely synchronised play (two unrelated good players who happen to make similar reads) does not show this hand-knowledge pattern.

Q11: Is collusion technically illegal in India?

Yes. IPC Section 420 (cheating) covers collusion in real-money gambling contexts, with potential escalation to PMLA and FEMA depending on value and cross-border element. The Karnataka HC verdict in March 2025 confirmed this for the Teen Patti context specifically.

Q12: What happens if I play with a colluder unknowingly?

No legal consequences. You are a victim, not a participant. If the operator’s fraud team identifies the colluder and you can show that you were a paying customer at their table, you may be eligible for a refund of your losses. File the Suspicious Player report and the AIGF grievance if needed.

Q13: Can I sue a colluder for my losses?

In theory, yes. In practice, civil litigation against a collusion ring is expensive, slow, and rarely successful unless the value is large. The realistic recovery path is through the operator’s refund process and the AIGF grievance route. For losses above ₹10 lakh, a criminal complaint via the Cyber Crime cell is more effective than a civil suit.

Q14: What’s the smallest collusion value that triggers police involvement?

Approximately ₹1 to 2 lakh, depending on the local cyber crime cell’s caseload. Below ₹1 lakh, operators almost always handle it internally. Above ₹10 lakh, the PMLA threshold ensures police involvement whether the operator or the player wants it.
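The value thresholds in Q13 and Q14 amount to a simple routing rule. The sketch below collapses the article's ₹1 to 2 lakh grey zone to a single ₹1 lakh cutoff for illustration; the thresholds and route names are the article's approximations, not a formal legal standard.

```python
LAKH = 100_000  # ₹1 lakh in rupees

def escalation_route(loss_inr: int) -> str:
    """Map a collusion loss amount (INR) to the realistic
    escalation route described in Q13 and Q14."""
    if loss_inr < 1 * LAKH:
        return "operator in-app report (handled internally)"
    if loss_inr < 10 * LAKH:
        return "operator report + AIGF grievance; cyber crime cell depending on caseload"
    return "criminal complaint via Cyber Crime cell; PMLA threshold crossed"

print(escalation_route(50_000))
print(escalation_route(500_000))
print(escalation_route(1_500_000))
```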

Q15: Can I be liable for IPC 420 if I just played with my partner casually?

Probably not, if the play was genuinely casual and not coordinated for unfair advantage. The legal standard for IPC 420 in collusion cases requires intent to cause wrongful loss to honest players or to the operator. Casual household play that incidentally produces head-to-head folding does not usually meet this bar. But your account will still get banned if the operator’s automated detection catches the pattern.

Operational questions

Q16: How long does an operator review take?

7 to 14 days on the major Indian apps for routine collusion reviews. Longer for complex cases involving large sums or multiple accounts. Faster for clear-cut technical signals (same UPI handle on withdrawal).

Q17: Should I confront a suspected colluder in chat?

No. Confrontation tips them off, lets them adjust their behaviour to defeat your further observation, and produces nothing useful for the operator’s investigation. Stay silent, log the signals, file the report.

Q18: What screenshots are most useful?

The seat layout showing both player IDs at the table, the hand-result screen showing a confirmed soft-fold (where Player A reveals Trail or Pure Sequence after folding), the lobby screen showing the same player pair at the same table across multiple sessions, and any chat or in-app messages that suggest an out-of-band relationship. Timestamps matter.

Q19: Do I need to leave the table immediately?

Not necessarily. If the suspected collusion is mild (soft-play between two players, no chip-dumping ring) you can keep playing with an adjusted strategy (avoid value-betting against the pair, fold to their paired aggression). If the suspected collusion is severe (multi-account ring draining the table) leave the table and file the report.

Q20: How do I avoid getting falsely flagged myself?

Play from your own device, your own UPI handle, your own KYC profile. Avoid playing on shared Wi-Fi at the same table as a friend or family member. Do not use VPNs while playing real-money games. Keep your play patterns natural; do not soft-fold to anyone.

Post-PROGA questions

Q21: Does PROGA change how collusion is prosecuted?

Not directly. Collusion remains fraud under IPC 420 regardless of PROGA. PROGA changed the licensing landscape for the underlying Teen Patti operations, but the criminal law on cheating in gambling contexts is unchanged.

Q22: Can I report collusion on a paused Indian app?

Yes, for collusion that occurred before the app paused real-money operations on 22 August 2025. The operator is still obligated to handle the report and to refund affected players if the collusion is confirmed. For collusion that occurred after the pause on the free-chips version, the report is still valid but the financial recovery is irrelevant (no real money at stake).

Q23: Does the AIGF grievance route still work post-PROGA?

For AIGF-member operators still operating real-money services, yes. For paused real-money operations and offshore Curacao re-skins, the AIGF route does not apply. The Curacao Gaming Control Board is the formal recourse for offshore sites and has the 9 to 14 week response cycle mentioned earlier.

Q24: Are offshore Curacao sites worse for collusion?

Generally yes. Smaller fraud teams, less direct AIGF integration, lower compliance budget, weaker cooperation with Indian payment aggregators. A colluder who would have been caught on Master in 2024 has roughly twice the survival window on a Curacao re-skin in 2026.

Q25: What about free-chips environments? Can I still get banned for collusion?

Yes. Free-chips Terms of Service prohibit multi-accounting and other collusion varieties even though no real money is at stake. The closure rate on free-chips is lower than on real-money (around 0.05 percent of accounts per quarter) but enforcement does happen. The reason is that free-chips integrity affects the experience for the broader player base, who are the operator’s lead-generation pool for whatever future real-money licensing emerges.

Conclusion and the printable checklist

The collusion problem on Indian Teen Patti is structurally smaller than the bot problem (0.5 to 2 percent of seats versus 1 to 7 percent for bots) but harder to detect and harder to prove. Player-side detection peaks around 60 to 65 percent accuracy with the 14-signal checklist. Operator-side detection runs higher because it has access to IP, device fingerprint and UPI handle data that the player cannot see. The combined result is that confirmed closures run at roughly 0.4 to 0.6 percent of accounts per quarter on the major Indian apps, which means the system is catching meaningful collusion but is not catching all of it.

For an individual player today, the operational answer is straightforward. Run the 14-signal audit on tables where you suspect a player pair. Log observations across 50+ hands before drawing conclusions. File the Suspicious Player report with screenshots when the confidence band reaches “Likely collusion” or above. Escalate to AIGF grievance only if the operator response is unsatisfactory and the value is meaningful. Treat the post-PROGA offshore landscape as materially weaker on collusion detection and adjust your deposit sizes accordingly.

The Karnataka HC verdict in March 2025 is the most important single event in this space in the last two years. It established that signal collusion in real-money Teen Patti is criminal cheating under Indian law, with prison sentences in the 4-year range for high-value cases. That deterrent is real and it has visibly reduced the appetite for sophisticated coordinated rings on the major apps. The remaining collusion population is mostly multi-accounting (which is a Terms of Service issue, easily caught by IP and UPI matching) and low-grade soft-play (which is hard to prove but also low-impact).

If you are watching a suspect pair on your table right now, scroll back to the Collusion Audit Tool and start logging. If the pair reaches the “Likely collusion” band after enough observation, screenshot, leave the table, and file the report through the operator’s in-app flow. If the value is meaningful and the operator response is slow, file the AIGF grievance with the report draft the widget generates for you.

The printable 14-signal collusion detection checklist

For session-side reference, the full 14-signal checklist with weights:

Technical signals (operator-confirmable):

  1. Same IP address: 14 points
  2. Same device fingerprint: 13 points
  3. Same UPI handle on withdrawals: 15 points
  4. Bonus farming signature (paired welcome bonus): 10 points
  5. Coordinated withdrawal timing (within minutes): 11 points

Behavioural signals (player-side observable):

  6. Recurring head-to-head with anti-correlated outcomes: 9 points
  7. Soft folding consistently between specific pairs: 8 points
  8. Coordinated raises (one raises, partner re-raises): 9 points
  9. Coordinated folds (one folds, partner takes pot): 8 points
  10. Chip-dumping pattern (A loses big to B repeatedly): 11 points
  11. Tournament position-up (feeds partner stack): 10 points

Pattern signals (cross-account fingerprint):

  12. Username and avatar similarity: 6 points
  13. Behavioural pattern matching (similar bet-sizing): 7 points
  14. Timer signature match (same response time): 7 points

Total possible: 138. The interactive widget caps the confidence band at 72 percent to reflect the player-side observation ceiling. A confidence of 68+ is “Strong collusion ring.” 60 to 67 is “Likely collusion.” 54 to 59 is “Lean collusion.” Below 54 is “Inconclusive” and the report should not be filed without more observation.
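The weighted scoring can be sketched as a lookup-and-sum over the signals you actually observed. Only the weights (which sum to 138), the 72 percent cap, and the band cutoffs come from this guide; the linear scaling from raw score to confidence is an assumption standing in for however the widget actually maps the two, and the signal keys are illustrative names.

```python
# Signal weights from the 14-signal checklist; key names are illustrative.
WEIGHTS = {
    "same_ip": 14, "same_device_fingerprint": 13, "same_upi_handle": 15,
    "bonus_farming": 10, "coordinated_withdrawals": 11,
    "anti_correlated_h2h": 9, "soft_folding": 8, "coordinated_raises": 9,
    "coordinated_folds": 8, "chip_dumping": 11, "tournament_position_up": 10,
    "username_avatar_similarity": 6, "behavioural_match": 7, "timer_match": 7,
}
MAX_SCORE = sum(WEIGHTS.values())  # 138
CAP = 72  # player-side observation ceiling

def confidence(observed_signals) -> int:
    """Scale the weighted signal score to a 0-72 confidence value.
    Linear scaling is an assumption, not the widget's disclosed formula."""
    score = sum(WEIGHTS[s] for s in observed_signals)
    return min(CAP, round(score * 100 / MAX_SCORE))

def band(conf: int) -> str:
    if conf >= 68:
        return "Strong collusion ring"
    if conf >= 60:
        return "Likely collusion"
    if conf >= 54:
        return "Lean collusion"
    return "Inconclusive"

# All six behavioural plus all three pattern signals, no operator-side data:
obs = ["anti_correlated_h2h", "soft_folding", "coordinated_raises",
       "coordinated_folds", "chip_dumping", "tournament_position_up",
       "username_avatar_similarity", "behavioural_match", "timer_match"]
conf = confidence(obs)
print(conf, band(conf))
```

Under this scaling, even every player-observable signal together tops out near the "Lean collusion" boundary, which matches the guide's point that full confirmation needs the operator-side technical signals.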

For players who want the full picture beyond collusion, the bot detection guide covers the script-based fraud category with its own 14-signal checklist, and the broader cheating detection guide covers the operator-side rigging category and the four RNG certifications. For app-by-app comparison of fraud detection quality, head to the safest Teen Patti app comparison. For deeper play strategy that holds up at tables with mixed honest and colluding composition, see the advanced strategy guide. If you want to start playing on the app with the strongest combined bot and collusion detection, the Master APK download guide walks through the install and the first-session safety setup.

Open Teen Patti Master with the Strongest Collusion Detection Stack in India