Reverse-engineering randomness
What happens when people accidentally make the unpredictable predictable
One of my favourite pieces of mathematical investigative journalism appeared in Wired in 2017. Brendan Koerner told the remarkable story of a group of Russian hackers who’d spotted something interesting with a certain brand of casino slot machines:
By early 2011, casinos throughout central and eastern Europe were logging incidents in which slots made by the Austrian company Novomatic paid out improbably large sums. Novomatic’s engineers could find no evidence that the machines in question had been tampered with, leading them to theorize that the cheaters had figured out how to predict the slots’ behavior. “Through targeted and prolonged observation of the individual game sequences as well as possibly recording individual games, it might be possible to allegedly identify a kind of ‘pattern’ in the game results,” the company admitted in a February 2011 notice to its customers.
In short, the Russians had worked out how to reverse-engineer the random number generators inside the slot machines. Or rather, the pseudo-random number generators – because computers can’t generate true randomness, they instead rely on engineering tricks to produce outputs that are near random. And in this case, it had left some machines vulnerable to a group who’d managed to derive a predictable pattern from the apparently unpredictable results.
Sometimes it can be useful to have a handle on computer-based randomness, particularly if you want your code to be reproducible. Typically, people will choose a ‘seed’ number for random generation, so that the code produces the same outputs on every run, even though those outputs looked random the first time the code ran.
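In Python, for instance, seeding rewinds the generator so the same ‘random’ sequence comes out every time (a minimal sketch):

```python
import random

random.seed(42)  # fix the seed: the draws below are now repeatable
first_run = [random.random() for _ in range(3)]

random.seed(42)  # reseeding rewinds the generator to the same starting state
second_run = [random.random() for _ in range(3)]

assert first_run == second_run  # the 'random' draws repeat exactly
```

That repeatability is exactly what you want for a reproducible analysis – and exactly what you don’t want when the output is supposed to be unguessable.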
Last year Claus Wilke had a nice article about random seeds in code, and particularly why you shouldn’t choose 42, which has ended up a very common seed number in AI-generated code:
In fact, frequently you see statements along the lines of “the random seed is arbitrary, you can use any number you want, so 42 is a fine choice.” This sentence is 100% correct, assuming you’re the only one who uses 42 and you also use it only once in your entire life.
The article noted the potential problems for scientific research projects, but I wanted to go one step further. If predictable randomness can make people money from slot machines, what could it mean for thousands of vibe-coded apps?
Codes within code
Apps often need to generate codes for users, from password resets to account verification. Typically these are 6-digit numbers. If the app has been built hastily – whether by a human or an AI agent – it might implement this internally by picking a random number between 100000 and 999999, in which case it would be possible to reverse-engineer the process.
For example, suppose you receive the code 343467. If the app used random.seed(42), it would be easy to run through the sequence of generated numbers and find that this is the 1502nd code issued – and hence to generate the codes that will follow: 583550, 768891, 366314 etc.
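A minimal sketch of the attack, assuming (as above) that the app draws each code with Python’s `random.randint(100000, 999999)` from a single generator seeded with 42 – the function name here is illustrative:

```python
import random

CODE_RANGE = (100_000, 999_999)

def predict_next_codes(observed, seed=42, max_search=100_000, n_next=3):
    """Step through the seeded PRNG stream looking for an observed
    6-digit code; if found, return its position in the stream and
    the codes the app would issue next.

    Illustrative only -- assumes the app draws codes with
    random.randint(100000, 999999) from one generator seeded once."""
    rng = random.Random(seed)
    for position in range(1, max_search + 1):
        if rng.randint(*CODE_RANGE) == observed:
            # The generator is now past the observed code, so the
            # next draws are exactly what the app will hand out.
            return position, [rng.randint(*CODE_RANGE) for _ in range(n_next)]
    return None, []
```

Any one code a real user receives pins down the app’s position in the stream (up to occasional collisions, since codes can repeat), after which everything that follows is predictable.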
There are various situations where this might crop up in vibe-coded apps1:
Magic login links. A site might email a link with a ‘login without password’ token. If that token comes from predictable randomness, someone could guess the login links for other users.
Username recovery links. Even if a system isn’t exposing passwords directly, a predictable account recovery token could still let an attacker get into the account recovery process.
File download URLs. Apps may generate ‘private’ document links or invoice URLs using random-looking IDs. If those IDs are predictable, an attacker may be able to access other users’ files.
Unlisted meeting or collaboration codes. An app may issue private webinar links, game lobby codes, or shared whiteboard sessions. If the codes are generated predictably, private spaces could stop being private.
Referral or offer codes. A retailer might generate codes for referrals or discounts. If these are produced predictably, it could hand revenue to an attacker who hasn’t earned it.
Vibe coding tools already have a certain slot machine-like ‘intermittent reward’ feel to them: put in some tokens and hope for a good outcome; if unsuccessful, pull the handle again. But they may end up echoing slot machines in another way, by illustrating what can potentially go wrong if your randomness isn’t as unpredictable as you think.
You may spot that there are other issues with using a random number generator for some of these things (e.g. no guarantee the codes are unique), but that doesn’t mean an AI agent would avoid random number generation as a first-pass attempt.
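The fix, at least in Python, is to draw security-sensitive values from a cryptographically secure source rather than a seeded general-purpose generator. A sketch using the standard `secrets` module (the function names are my own, for illustration):

```python
import secrets

def issue_code() -> str:
    """6-digit verification code drawn from the OS's CSPRNG:
    there is no seed, so no stream for an attacker to replay."""
    return f"{secrets.randbelow(1_000_000):06d}"  # zero-padded 000000-999999

def issue_link_token() -> str:
    """URL-safe token for magic links or 'private' download URLs."""
    return secrets.token_urlsafe(24)  # 24 random bytes -> 32-char token
```

For short numeric codes you still need rate limiting and expiry (a 6-digit space is small enough to brute-force), but at least the sequence itself gives nothing away.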


I'm guessing since the author of the article you linked to is a scientist, they don't care about security and just need a prng that gives them the same random-looking values every time they run their code. but no one who needs secure random values should _ever_ use the standard prng in languages like javascript or python. it doesn't matter _what_ you seed it with, it's trivial to calculate the seed given a certain number of outputs. use a csprng library or read from /dev/urandom
The folks who are likely to use 42 as a seed are significantly less likely to vibe code. I gave a Douglas Adams-themed keynote on the economic impact of AI, and only 20% of the audience got my jokes.