Counting the cost of my ‘major keying error’
Receiving a demand for a parking fine is always annoying. Even more so when you know you paid for a ticket. But there it was: the letter from Euro Car Parks demanded payment, and they had photographs to prove it. It took me 10 minutes to send evidence showing that, at the time and place in question, the ticket machine had charged the price of an evening’s parking to my debit card.
After pondering this for a while, Euro Car Parks took a different tack: it withdrew the demand for payment of a fine, and, instead, demanded a £20 administrative fee for all the trouble I had caused it. My crime, it turns out, was that I had only entered the first four characters of my vehicle registration. This “major keying error” violated the car park’s terms and conditions.
But such mishaps merely spark curiosity. Why do “major keying errors” occur and is there anything we can do to prevent them?
In May, a pair of UK regulators fined Citigroup more than £60mn for several failures of risk control, most spectacularly when a trader planned to sell $58mn of shares but, in a major keying error, issued an order to sell $444,000mn of shares instead. Some of this order was blocked, but the remainder was more than enough to fleetingly crash stock markets across Europe.
The system made such an error unnervingly easy: the trader typed a number into the wrong box, asking the system to sell 58 million units instead of $58mn worth of units. Each unit was worth thousands of dollars, and there’s the problem. It is a bad idea to have a share trading system that lets you accidentally sell nearly half a trillion dollars’ worth of shares — which goes some way to explaining the £62mn fine. (How I wish regulators could be persuaded to levy such a magnificent fine on Euro Car Parks.)
What can be done to prevent such horrors? One possibility is to tell a system’s users not to make any mistakes. This seems to be the position of Euro Car Parks, and it is not wholly satisfactory. Nobody plans to enter the wrong registration number when paying for parking and, no doubt, Citigroup traders endeavour not to accidentally sell half a trillion dollars’ worth of shares. But mistakes will be made.
An alternative is to program the software to notice the mistake. Euro Car Parks could have flashed up a message saying “you have only entered four characters, are you sure that’s right?” Even “LOL, sucker, you’ll hear from our lawyers” would have served as a warning.
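For the technically minded, a check of that sort amounts to a few lines of code. The sketch below is mine, not Euro Car Parks’, and the assumption that a full UK registration runs to seven characters is mine too; the point is only that a machine can notice a suspiciously short entry and ask before silently accepting it.

```python
def check_registration(entry: str) -> str:
    """Hypothetical sanity check for a parking machine's keypad entry.

    UK registrations normally run from two to seven characters, so a
    very short entry is legal on some older plates but suspicious
    enough to ask the driver to look again rather than accept it.
    """
    plate = entry.replace(" ", "").upper()
    if not (2 <= len(plate) <= 7) or not plate.isalnum():
        return "reject: that does not look like a registration at all"
    if len(plate) < 7:
        return f"confirm: you entered only {len(plate)} characters; is that right?"
    return "ok"


print(check_registration("AB12"))      # only four characters entered
print(check_registration("AB12 CDE"))  # a full current-format plate
```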
Citi’s system did flash up 711 warnings, of which only the first 18 lines were visible. That is only slightly better than no warnings at all, because trigger-happy warnings tend to be ignored as a matter of habit. And the Citi warnings must have been somewhat obscured by the fact that the system sometimes defaulted to assuming that shares had a unit price of -1, which means that if you mistakenly type 58 million units instead of $58mn, the system might tell you you’re selling -$58mn rather than the more obviously unnerving figure of, ahem, $444,000mn.
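Again for the technically minded, here is a rough sketch in Python of the kind of pre-trade check that might have helped. Every name, price and threshold below is invented for illustration and describes nothing about Citi’s actual system; the idea is simply to refuse a nonsense reference price and to demand explicit confirmation once an order’s total value becomes implausibly large.

```python
def check_order(quantity: float, unit_price: float,
                max_notional: float = 100_000_000) -> str:
    """Return 'ok', 'confirm' or 'reject' for a proposed sell order.

    Hypothetical pre-trade check: never accept a placeholder price,
    and force a human sign-off when the order's total value exceeds
    a plausibility threshold (here, an arbitrary $100mn).
    """
    if unit_price <= 0:
        # A defaulted price of -1 slips past a naive check and makes a
        # $444bn order look like a harmless -$58mn one.
        return "reject: reference price is missing or non-positive"

    notional = quantity * unit_price
    if notional > max_notional:
        return (f"confirm: order value ${notional:,.0f} exceeds "
                f"${max_notional:,.0f}; require explicit sign-off")
    return "ok"


# Illustrative numbers: at roughly $7,655 a unit, the intended $58mn
# trade is about 7,577 units, while 58 million units is about $444bn.
print(check_order(quantity=7_577, unit_price=7_655.0))       # ok
print(check_order(quantity=58_000_000, unit_price=7_655.0))  # confirm
print(check_order(quantity=58_000_000, unit_price=-1))       # reject
```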
We can take comfort that this is not the most costly keying error in history. In fact it is not even Citigroup’s most costly keying error this decade. In 2020, the bank accidentally transferred $900mn of its own money to some creditors of Revlon, the cosmetics firm, again because of a software system that made such a slip all too easy. Some of those creditors decided to keep the money, on the grounds that Revlon did indeed owe it to them. US regulators fined Citi $400mn for having deficient systems.
We may laugh, but when a system requires perfection from operators, the consequences can be tragic. Nancy Leveson, an MIT professor who specialises in software safety, has documented an infamous case: the Therac-25, a radiation-therapy machine used in the 1980s that could fire high-energy beams of either electrons or X-rays into patients.
The type of beam matters. The X-ray beam was fired through a “flattener”, which spread the beam over the treatment area but also absorbed much of its energy. If the X-ray beam was somehow fired with the flattener out of position, disaster would result.
Disaster resulted. In one case, at a Texas hospital in 1986, the operator typed “x” for the X-ray beam, then realised she had meant to type “e” for the electron beam, and swiftly moved the cursor back to correct the entry. The hidden flaw in the system was that rapid edits could bewilder it. If such an edit was made during the eight seconds it took to set everything up, the machine could be left primed to fire at X-ray strength with the flattener out of position, while the software lost track of its true configuration.
The upshot? A beam of X-ray intensity was fired without the flattener, delivering an extreme dose of radiation. The computer then told the operator that only a low dose had been administered, and invited her to press “P” to proceed with a second attempt. The patient, suffering burning pains, was already trying to get off the treatment table when he was hit by the second beam. It was later estimated that he had received around 100 times the intended dose. He lost the use of his arm, was paralysed by radiation burns to his spine, and died five months later from numerous complications. It was not the only fatal accident involving the Therac-25. A major keying error, indeed.
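The design lesson can also be put in a few lines of code. The sketch below is not the Therac-25’s real software and its names are invented; it simply illustrates the defensive habit the machine’s designers omitted, which is to re-check the hardware’s sensed, physical state at the moment of firing rather than trusting whatever the software believed when the prescription was typed in.

```python
from dataclasses import dataclass


@dataclass
class MachineState:
    """Hypothetical snapshot of the machine, read from sensors at fire time."""
    mode: str                 # "xray" or "electron", as the software believes it
    flattener_in_place: bool  # actual, sensed position of the flattener
    beam_current_high: bool   # actual, sensed beam strength setting


def safe_to_fire(state: MachineState) -> bool:
    """Interlock: refuse to fire unless the sensed hardware matches the mode.

    A high-strength beam without the flattener in the way is the lethal
    combination, so it is checked directly rather than inferred from
    what the operator typed eight seconds earlier.
    """
    if state.beam_current_high and not state.flattener_in_place:
        return False
    if state.mode == "xray" and not state.flattener_in_place:
        return False
    if state.mode == "electron" and state.beam_current_high:
        return False
    return True


# The Texas accident, roughly: the rapid edit left the beam at X-ray
# strength while the flattener never rotated into position.
print(safe_to_fire(MachineState(mode="electron",
                                flattener_in_place=False,
                                beam_current_high=True)))  # False: do not fire
```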
There is no such thing as a foolproof computer system, but software can be designed to fail gracefully or disgracefully. On reflection, perhaps £20 wasn’t such an extortionate fee for a lesson in life.
Written for and first published in the Financial Times on 12 July 2024.
Loyal readers might enjoy the book that started it all, The Undercover Economist.
I’ve set up a storefront on Bookshop in the United States and the United Kingdom. Links to Bookshop and Amazon may generate referral fees.


