🧾 When the Machines Were Louder Than the Mistakes
Part 1: From Cardboard to Code — Joining the Machine
In the early 1970s, I joined the fledgling Driver and Vehicle Licensing Centre (DVLC) as a fresh-faced 19-year-old with two A-levels and absolutely no interest in working with computers. I’d tried FORTRAN in school using graphite-sensed punched cards and had a miserable time of it — so when the Civil Service interview panel in London asked if I’d be interested in computing, I said “No.”
A few years later, I found myself standing in the middle of a massive data centre still under construction in Swansea, surrounded by tape drives, gas turbines, and hammer printers — and helping to build one of the UK’s largest early national IT systems.
The DVLC system was designed around tape-based batch processing. At its heart were magnetic tapes running at 1600 BPI (later 6250 BPI), each one a file in its own right. The system tracked millions of vehicle and driver records, with 340 tapes for vehicle data and 240 for drivers, processed in overnight cycles. If a single tape broke, the system could skip that segment and continue, avoiding full re-runs. Resilience by necessity.
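For anyone who thinks in code rather than reels, here is a minimal modern sketch of that skip-and-continue idea. It isn’t the original COBOL and JCL, and every name in it (run_overnight_cycle, process_reel) is invented, but it shows why one torn tape didn’t mean re-running the whole night’s work.

```python
# Sketch of segment-skipping batch processing: each tape reel is an
# independent unit of work, so one bad reel does not force a full re-run.
# All names here are hypothetical, not the original DVLC design.

def run_overnight_cycle(reels, process_reel):
    """Process every reel; log and skip any reel that fails to read."""
    skipped = []
    for reel in reels:
        try:
            process_reel(reel)           # unpack, apply updates, write new reel
        except IOError as err:           # e.g. a torn or unreadable tape
            skipped.append((reel, err))  # note it for a later catch-up run
            continue                     # carry on with the rest of the cycle
    return skipped                       # resilience by necessity

# Example: 340 vehicle reels, a couple unreadable on the night.
if __name__ == "__main__":
    def fake_process(reel):
        if reel in {17, 204}:
            raise IOError("tape read error")
    missed = run_overnight_cycle(range(340), fake_process)
    print(f"{len(missed)} reel(s) skipped for re-processing")
```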
Each update was a delicate orchestration of variable-length, variable-format records, which had to be unpacked into fixed-layout structures for COBOL processing and then repacked before writing back out. Every byte counted: adding one byte to the average record length added 11 minutes to processing time, and multiplied across hundreds of reels, that was days of extra work.
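If you want to picture that unpack/repack step, here is a toy Python version of the idea. The real thing was COBOL working on tape blocks; the field names, widths, and type codes below are invented, but the round trip (expand to a fixed layout, process, strip the padding before writing back) is the shape of it, and the fixed widths are exactly where those expensive extra bytes came from.

```python
# Toy illustration of the unpack/repack step: variable-length, variable-format
# input records are expanded into a fixed layout that downstream code can walk
# field by field, then repacked for the output reel. The field names, widths,
# and type codes are invented for the example.

def unpack(record: bytes) -> dict:
    """Variable format: 2-byte type code, then '|'-separated fields."""
    rec_type = record[:2].decode("ascii")
    fields = record[2:].decode("ascii").split("|")
    # Pad every field out to its fixed width so downstream code can use
    # fixed offsets -- the price is extra bytes on every record written back.
    layout = {"VRM": 10, "MAKE": 15, "MODEL": 20}
    fixed = {"TYPE": rec_type}
    for (name, width), value in zip(layout.items(), fields):
        fixed[name] = value.ljust(width)[:width]
    return fixed

def repack(fixed: dict) -> bytes:
    """Strip the padding again before writing the output reel."""
    body = "|".join(v.strip() for k, v in fixed.items() if k != "TYPE")
    return (fixed["TYPE"] + body).encode("ascii")

rec = b"V1ABC123|MORRIS|MARINA 1.3"
assert repack(unpack(rec)) == rec   # round-trip: nothing lost, no bytes wasted on tape
```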
Part 2: Paper, People, and Punch Cards
The real story wasn’t just in the hardware; it was in the people. Data didn’t come from the cloud or online forms. It came from 220 local offices, each with handwritten records and local staff who understood their regions better than any machine ever would. Those clerks were asked to transcribe thousands of records onto new data entry forms, while still keeping the old system running.
They did this knowing full well they were working themselves out of a job.
The process wasn’t frictionless. Forms came in with errors, public queries arrived by post, and the clerks handling those enquiries? They had no online access to the masterfiles. Every lookup required a paper form, a tape-based query job, and a printed response returned days later. A time and motion study showed that while a transaction took three weeks to complete, only about 30 seconds of it was actual manual handling. After cutover, the local office records were stored on shelving in an aircraft hangar, a retrieval and filing nightmare.
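To see why the turnaround was weeks rather than seconds, picture the enquiry job as a single sequential pass over the master tape. This little sketch is a modern stand-in, with invented names and record shapes, not the real job: enquiries were batched up, the whole file was read end to end, and the answers went out as print.

```python
# Sketch of a tape-era enquiry run: queries are batched and answered in one
# sequential pass over the master file, because there is no random access.
# Names and record shapes are invented for illustration.

def answer_enquiries(master_records, enquiries):
    """One sequential pass; every enquiry waits for the whole run."""
    wanted = {q["registration"]: q for q in enquiries}
    responses = []
    for rec in master_records:          # read the tape end to end
        q = wanted.get(rec["registration"])
        if q:
            responses.append({"enquiry_id": q["id"], "keeper": rec["keeper"]})
    return responses                    # printed and posted back days later

master = [{"registration": "ABC123", "keeper": "J. Smith"},
          {"registration": "XYZ789", "keeper": "M. Jones"}]
queries = [{"id": 1, "registration": "XYZ789"}]
print(answer_enquiries(master, queries))
```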
The system was built for throughput, not responsiveness, and that distinction shaped the experience for staff and the public for years.
Part 3: Print Like You Mean It
At the end of every batch cycle came the thunder: sixteen 160-column barrel hammer printers, shaking the floor with mechanical fury as they fired off vehicle tax discs, licence confirmations, and system reports on fanfold paper.
And these weren’t just pieces of paper.
The vehicle tax discs were printed on photogravure paper, circularly perforated, each one serial-numbered and audited like currency. Changing the layout or security features required coordination between IT, print operations, suppliers, and civil service policy. Some print jobs ran from offline spool tapes, meaning the spooling system was literally a physical spool.
Every misprint had to be accounted for. Every wrecked disc recorded. At that stage, we weren’t just handling data; we were manufacturing trust.
☕ The Epilogue
I often joke that I just made the coffee. But in truth, I had a front-row seat to the birth of modern digital government, in an age where the systems we built had weight, heat, noise and impact.
The Synology NAS sitting on my shelf today, silently syncing terabytes of data across two sites, is orders of magnitude more powerful than the entire DVLC setup we built with tapes and turbines.
But it owes everything to what we learned back then: how to build systems that don’t just process data — they handle people, policy, and pressure.