1804

The Jacquard Loom

Joseph Marie Jacquard patents a loom that uses punched cards to control which threads are raised and lowered. Hole or no hole. Up or down. Binary before we called it binary.

Before Jacquard, creating patterned fabric required a skilled weaver and an assistant called a "draw boy" who manually raised threads according to the pattern. With punched cards, an unskilled worker could produce intricate patterns automatically. The same loom could produce unlimited designs just by changing cards.

In 1839, a woven silk portrait of Jacquard himself required 24,000 punched cards. Each card had over 1,000 hole positions.
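
Treat each hole position as one bit and the portrait becomes, in modern terms, a sizable binary file. A back-of-the-envelope sketch in Python, taking 1,000 positions per card as a floor:

    # Back-of-the-envelope: the Jacquard portrait as raw binary data.
    # Assumes one bit per hole position; 1,000 positions is a floor.
    cards = 24_000
    positions_per_card = 1_000

    bits = cards * positions_per_card
    print(f"{bits:,} bits = {bits // 8:,} bytes")
    # 24,000,000 bits = 3,000,000 bytes: about 3 MB of pattern data, in 1839.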

1837

Babbage's Analytical Engine

Charles Babbage designs the Analytical Engine, a mechanical general-purpose computer that would use punched cards for input and programming. He was directly inspired by the Jacquard loom. The machine was never completed, but its design anticipated nearly every element of modern computing: input, processing, memory, and output.

1843

Ada Lovelace

Ada Lovelace, often called the first programmer, writes extensive notes on Babbage's Analytical Engine. She recognized that the machine could manipulate symbols of any kind, not just numbers. Her observation remains one of the most elegant descriptions of computational abstraction ever written:

"The Analytical Engine weaves algebraical patterns just as the Jacquard loom weaves flowers and leaves."

1890

Hollerith's Census Machine

Herman Hollerith uses punched cards to tabulate the U.S. Census, reducing processing time from eight years to one. His company eventually becomes IBM. The punched card, Jacquard's invention for weaving silk, is now processing data about millions of people.

1947

The First "Bug"

Engineers at Harvard find a moth stuck in a relay of the Mark II computer. They tape it into the logbook with the note: "First actual case of bug being found." Grace Hopper and the team help popularize the term "debugging."

The term "bug" for a technical problem actually predates this. Thomas Edison used it in 1878. But the moth in the relay became the iconic story of computing errors, and it established a culture: when something goes wrong, you find the bug, you fix it, you write it down.

1952

Grace Hopper's Compiler

Grace Hopper writes one of the first compilers, the A-0 System. Her idea: humans should write code in something closer to natural language, and let the machine translate it to binary. She coined the term "compiler" for a program that does this translation. This is abstraction as a deliberate design choice.
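
A-0 itself worked by stitching together prewritten subroutines from a tape library, but the core idea, a table that maps human-readable statements onto machine operations, fits in a few lines of Python. The statement forms and opcodes below are invented for illustration:

    # A toy sketch of the compiler idea: look up each human-readable
    # statement and emit numeric machine operations. The statements and
    # opcodes are invented; the real A-0 linked subroutines from tape.
    OPCODES = {"ADD": 0x01, "SUB": 0x02, "PRINT": 0x0F}

    def compile_line(line):
        op, *args = line.split()
        return [OPCODES[op]] + [int(a) for a in args]

    for line in ["ADD 2 3", "PRINT 0"]:
        print(line, "->", compile_line(line))
    # ADD 2 3 -> [1, 2, 3]
    # PRINT 0 -> [15, 0]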

Her long-term vision was that people would one day write code in plain English and let sophisticated compilers handle the rest. She was describing, in 1952, something remarkably close to what large language models do today.

1958

The Integrated Circuit

Jack Kilby at Texas Instruments demonstrates the first integrated circuit. A complete circuit fabricated on a single piece of semiconductor. This is the physical foundation that makes modern computing possible, and it introduces a new layer of abstraction: you no longer think about individual transistors, you think about logic gates.
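
That gate layer shows what abstraction buys. Once a single gate, NAND, is trusted, every other gate is a composition of it, and the transistors disappear from view. A minimal Python sketch; that NAND suffices is a standard result of Boolean logic, not anything specific to Kilby's chip:

    # Once NAND is trusted as a primitive, every other gate is a
    # composition of it. The transistors underneath stop mattering.
    def nand(a, b):
        return not (a and b)

    def not_(a):    return nand(a, a)
    def and_(a, b): return not_(nand(a, b))
    def or_(a, b):  return nand(not_(a), not_(b))
    def xor_(a, b): return and_(or_(a, b), nand(a, b))

    # Truth table for XOR, built without touching a transistor:
    for a in (False, True):
        for b in (False, True):
            print(int(a), int(b), "->", int(xor_(a, b)))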

1969-1972

C and UNIX

Ken Thompson and Dennis Ritchie build UNIX at Bell Labs, and Ritchie creates the C programming language; UNIX is soon rewritten in it. C gives programmers a way to write code that is close to the hardware but portable across machines. UNIX gives them an operating system that abstracts away the hardware entirely. You write code for UNIX, not for a specific machine.

1991

Python

Guido van Rossum releases Python. Its explicit goal: readability and simplicity. Python abstracts away memory management, type declarations, and most of the tedious bookkeeping that C requires. You can write print("Hello, World!") and it just works. The cost is performance. The benefit is that humans can read and write it quickly.
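
That one line really is the whole program:

    # The complete program. No headers, no main(), no compile step,
    # no memory to manage. The interpreter does the bookkeeping.
    print("Hello, World!")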

1994

The Pentium FDIV Bug

Professor Thomas Nicely discovers that Intel's Pentium processor sometimes gives wrong answers for floating-point division. The cause: five missing entries in a lookup table of just over a thousand values. Intel ended up offering no-questions-asked replacements. The cost: $475 million.

This is a story about errors at the hardware layer, and about what happens when error handling fails. The bug affected about 1 in 9 billion random divisions. Rare, but real.
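
The most widely circulated failing case was 4195835 divided by 3145727, which a flawed Pentium got wrong from the fourth decimal digit onward. A quick check, safe to run on any modern machine:

    # The classic FDIV check. On a correct FPU the second expression
    # is exactly 0.0; a flawed Pentium famously returned 256.
    x, y = 4195835.0, 3145727.0
    print(x / y)            # ~1.3338204491 (a flawed chip gave ~1.3337390689)
    print(x - (x / y) * y)  # 0.0 on correct hardware; 256.0 on a flawed one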

2003

The Cosmic Ray Election

An election in Schaerbeek, Belgium, gives one candidate exactly 4,096 extra votes. That is 2 to the 12th power. A perfect power of two. The official investigation concluded that a single bit had spontaneously flipped in the voting machine's memory, most plausibly struck by a cosmic ray. The error was caught only because the vote total exceeded the number of eligible voters.
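
In binary terms, gaining exactly 4,096 votes means one thing: bit 12 of a counter flipped from 0 to 1. The arithmetic, with a hypothetical vote count:

    # A single-event upset: XOR with (1 << 12) flips exactly one bit.
    # The vote count is hypothetical, for illustration.
    true_total = 1_437
    corrupted = true_total ^ (1 << 12)
    print(corrupted - true_total)  # 4096, a perfect power of two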

2017-Present

Large Language Models

The Transformer architecture enables a new kind of abstraction. For a growing class of tasks, you no longer write the code at all. You describe what you want in natural language, and the model generates the code. Grace Hopper's 1952 vision, realized through a fundamentally different mechanism.

The debate now: these systems are probabilistic, not deterministic. They might give different answers to the same question. Critics say this makes them unreliable. But the history suggests another reading.

At every point in this timeline, there were errors. And at every point, people built systems to handle them.

Next: Errors →