I was thinking about this again (because someone mentioned mahjong and so I was wondering whether it was practical to scale up) and I realised there's another big optimisation that I missed before (that can be made without changing the general idea — of course the Knuth–Fisher–Yates shuffle is still a much better algorithm overall).[…]
- For each card in turn, choose a player to deal that card to, uniformly at random (or, in the biased case, i.i.d. from a fixed non-uniform distribution over the players)
- If at any time a player has received more than 13 cards, invalidate the entire deal and start again (a rough sketch of this is below).
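To make that concrete, here's a minimal Python sketch of the basic rejection deal. The name deal_by_rejection and the weights parameter are mine, purely for illustration; it's not the BBC BASIC I'm actually running.

```python
import random

def deal_by_rejection(num_players=4, hand_size=13, weights=None):
    """Deal num_players*hand_size cards by assigning each card to a player
    i.i.d. at random, restarting the whole deal as soon as any hand overflows."""
    num_cards = num_players * hand_size
    while True:                                  # one pass per attempted deal
        hands = [[] for _ in range(num_players)]
        for card in range(num_cards):
            if weights is None:
                p = random.randrange(num_players)                    # uniform case
            else:
                p = random.choices(range(num_players), weights)[0]   # biased case
            hands[p].append(card)
            if len(hands[p]) > hand_size:
                break                            # overflow: invalidate this deal
        else:
            return hands                         # every hand has exactly hand_size cards
```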
Waiting for 1000 redeals is nothing on a modern computer, but in BBC BASIC it would be unacceptable. I wonder how fast you could do it with optimised 6502 code?
The idea is that instead of dealing 4 hands of 13 directly, you first deal the pack into 2 hands of 26, then deal each of those into 2 hands of 13. This helps because it gives the rejection sampling "save points": once you have a good split into 26s, you never have to backtrack past that point when you reject. So you expect about A+B trials to get a good deal instead of A×B, where 1/A and 1/B are respectively your chance of reaching the savepoint and your chance of getting from the savepoint to a complete deal on each attempt. (Actually in this case it's A+2B vs. A×B², because there are two identical, independent deals after the savepoint.)
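Here's the same kind of sketch for the two-stage version (again Python with made-up names, not the real code): one helper that splits a pile into two equal halves by rejection, called once for the 26/26 savepoint and then once per half for the 13/13 hands, so a rejection in the second stage only redoes that half. If I've done the arithmetic right, in the uniform case the split success probabilities are C(52,26)/2^52 ≈ 0.11 and C(26,13)/2^26 ≈ 0.15, so A ≈ 9 and B ≈ 6.5, giving roughly A+2B ≈ 22 attempted splits per deal.

```python
import random

def split_by_rejection(cards, target):
    """Split cards into two piles of size target by assigning each card to a
    pile uniformly at random; on overflow, redo only this split."""
    while True:
        piles = ([], [])
        for card in cards:
            piles[random.randrange(2)].append(card)
            if len(piles[0]) > target or len(piles[1]) > target:
                break                    # reject, but never backtrack past a savepoint
        else:
            return piles                 # both piles have exactly target cards

def deal_with_savepoints(num_cards=52):
    half_a, half_b = split_by_rejection(range(num_cards), 26)   # savepoint: 26/26 split
    return (*split_by_rejection(half_a, 13),    # two independent 13/13 splits,
            *split_by_rejection(half_b, 13))    # each retried on its own
```

As far as I can see, in the uniform case the result is distributed exactly like the one-stage deal: splitting 52 into 26+26 and then each 26 into 13+13 gives every ordered 13/13/13/13 partition the same probability, 13!⁴/52!.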
I tried this out (bbcmic.ro) and it's a factor of ten faster than my best previous attempt, completing a deal in an average of just under 0.3 seconds. Now it really is fast enough to use in an interactive implementation of a card game!
Posted by joachim — Fri Apr 25, 2025 1:18 pm