If you don’t know where you’ve been, how do you know where you’re going? A love of tech is one of the things that binds our team, so I wanted to delve into it. In these two posts, I’m going to attempt to whizz you through 6,000 years of computing history.

For most people alive today, computing has been omnipresent. But only a generation ago it was very much a fringe technology.

Like many innovations, it burned slowly but surely in the background before exploding into the mainstream. But how long has computing been in the background?

Having worked for many years representing technology brands, I like to think I have a good awareness of technological milestones, but I have to confess I was surprised by how old computing is.

Naturally, Wikipedia is a good place to start.

The first undisputed entry is listed for 4000 BC. A ‘quipu’ is a knotted string used for counting by the ancestors of the Tiwanaku people of South America. Much later, in 2500 BC, the Babylonians invented the enduring abacus. These fragments of our ancient history represent humanity’s first known forays into using machines and tools to lessen the burden of computing hard sums. However, the real step change in utility came in the form of gears.

**Differential Gears**

In 910 BC, in China, the ‘south-pointing chariot’ was invented. It was the first known geared mechanism to use a differential gear. The chariot was a two-wheeled vehicle carrying a pointing figure connected to the wheels via differential gearing. Through careful selection of wheel size, track and gear ratios, the figure on top of the chariot always pointed in the same direction, however the chariot turned. This heralded an era of differential gear innovation. Around 125 BC, the Corinthian colony of Syracuse is thought to have built the Antikythera mechanism – an analogue computer using differential gears, capable of tracking the relative positions of heavenly bodies.

**Arabian Astronomy**

In 1015 AD, Abū Rayhān al-Bīrūnī invented the planisphere, another analogue computer. He also invented the first mechanical lunisolar calendar, which employed a gear train and eight gear wheels. This was an early example of a fixed-wired knowledge processing machine. In the same century, two other Arab astronomers invented and built analogue computers to model celestial positions and the dimensions of the earth.

In 1206, Al-Jazari invented numerous automata and other technological innovations. One of these was a design for a programmable humanoid mannequin: this seems to have been the first serious, scientific (as opposed to magical) plan for a robot. He also invented the ‘castle clock’, an astronomical clock considered to be the earliest programmable analogue computer.

In 1235, Persian astronomer Abi Bakr made a brass astrolabe with a geared calendar movement, based on Abū Rayhān al-Bīrūnī’s mechanical calendar analogue computer. It still exists, in remarkable condition, in the Museum of the History of Science in Oxford.

In 1416, Jamshīd al-Kāshī invented the Plate of Conjunctions, an analogue computing instrument used to determine the time of day at which planetary conjunctions would occur, and to perform linear interpolation. He also invented a mechanical ‘planetary computer’ which he called the Plate of Zones, which could *graphically* solve several planetary problems, including predicting the true positions in longitude of the Sun and Moon.

**European Calculators**

Sometime after this, devices emerged in Europe that used gears to help humans calculate complex sums. In 1493, Leonardo da Vinci produced drawings of a device consisting of interlocking cogwheels which can be interpreted as a mechanical calculator capable of addition and subtraction. Later, in 1617, Scotsman John Napier invented Napier’s bones – essentially a box of carefully crafted rods that allows users to reduce multiplication to addition, and division to subtraction.
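For the technically minded, the core idea behind Napier’s bones – multi-digit multiplication reduced to single-digit lookups plus addition – can be sketched in a few lines of Python. This is an illustrative sketch of the principle, not a reproduction of Napier’s actual rod layout:

```python
# A sketch of the principle behind Napier's bones: multiplication
# reduced to single-digit table lookups plus addition.

# Each "rod" carries the multiples 1x..9x of its digit,
# just as the physical rods did.
RODS = {d: [d * m for m in range(1, 10)] for d in range(10)}

def bones_multiply(n: int, digit: int) -> int:
    """Multiply n by a single digit (1-9) using only rod lookups and addition."""
    total = 0
    for place, ch in enumerate(reversed(str(n))):
        # Read the partial product off the rod for this digit of n
        partial = RODS[int(ch)][digit - 1]
        total += partial * 10 ** place  # shift into place and add
    return total

print(bones_multiply(4829, 7))  # → 33803
```

A multi-digit multiplier is handled the same way: one pass per digit, with the shifted partial results summed – again, nothing but lookups and addition.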

It wasn’t until 1642, when the Frenchman Blaise Pascal invented the ‘machine arithmétique’, that the age of mechanical calculator production started. Later dubbed the ‘Pascaline’, this calculator could only perform addition and subtraction. In 1685, in what would become Germany, Gottfried Leibniz published an article describing a machine that used wheels with movable teeth which, when coupled to a Pascaline, could perform all four arithmetic operations. In 1775, Englishman Charles Stanhope, 3rd Earl Stanhope, built Leibniz’s machine, which resembled what we now recognise as a calculator, and in 1851 Frenchman Charles Xavier Thomas de Colmar brought the first mass-produced calculator – the Arithmometer – to market.

**Polynomials and Punchcards**

Around this time, programming was being born. There is evidence from medieval times of sporadic work in this area. For example, in 850 the Banū Mūsā brothers invented an automatic flute player, which appears to have been the first programmable machine. But it wasn’t until Frenchman Joseph-Marie Jacquard developed the Jacquard Loom in 1804 that the wave of innovation started. The Jacquard Loom was essentially a loom controlled by punched cards: different cards produced different designs.

As Charles Xavier Thomas de Colmar was refining his calculator design (it took him 30 years to go from design to mass production), over in England Charles Babbage designed his first mechanical computer. These machines would be a step up from the calculators of previous centuries in that they could handle polynomials – far more complex calculations than simple arithmetic. In 1832, Babbage and Joseph Clement produced a prototype segment of his ‘difference engine’, which could tabulate quadratic polynomials. If completed, the engine would have been the size of a room.
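How could gears tabulate polynomials? The difference engine mechanised the method of finite differences: for a quadratic, the second difference between successive values is constant, so the whole table can be generated by repeated addition alone. A minimal Python sketch of the method (the maths the engine embodied, not a model of the machine itself):

```python
def tabulate_quadratic(a, b, c, n):
    """Tabulate f(x) = a*x^2 + b*x + c for x = 0..n-1 using only addition,
    via the method of finite differences that Babbage's engine mechanised."""
    f = c          # f(0)
    d1 = a + b     # first difference: f(1) - f(0)
    d2 = 2 * a     # second difference, constant for a quadratic
    table = []
    for _ in range(n):
        table.append(f)
        f += d1    # advance to the next value by addition only
        d1 += d2   # the first difference itself grows by a constant
    return table

print(tabulate_quadratic(2, 3, 1, 5))  # → [1, 6, 15, 28, 45]
```

No multiplication appears inside the loop – exactly the property that made the scheme practical to build out of adding wheels.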

The next year, Babbage conceived and began to build the next version of his design – the Analytical Engine. It was designed to be powered by steam and to complete multiplications or divisions in 2–4 minutes. Crucially, it was programmable through punchcards – the original read-only memory. Babbage was not alone in his work on the Analytical Engine. Augusta Ada King, known more famously by her title, Countess of Lovelace, was a key partner in its development. Ada Lovelace was the first to recognise that the machine had applications beyond pure calculation, and she published the first algorithm intended to be carried out by such a machine. As a result, she is often regarded as the first computer programmer.

Interestingly, that’s where the term ‘patching’ comes from. When programmers had to fix their programmes, they would patch over some of the holes in the card with pieces of card or tape to change the instructions given to the machine. In the first half of the 20th century, software suppliers distributed patches on paper tape or spare punched cards.

In 1853, Swedish inventor Per Georg Scheutz, together with his son Edvard, completed the first full-scale difference engine. They called it the Tabulating Machine, and it produced printed output. A second machine was built to the order of the British Government in 1857.

**IBM**

The 1880 US census took seven years to complete because all the processing had to be done by hand from paper forms. With population growth, the next one threatened to take even longer, so a competition was held to find a better method. It was won by a Census Department employee called Herman Hollerith, who used punched cards to record the data of each citizen. Using an electromechanical tabulator, he cut the time it took to process the 1890 census to six years, even though the population had grown. Hollerith went on to form the Tabulating Machine Company in 1896, and in 1906 he introduced a tabulator with a plugboard that could be rewired to adapt the machine for different applications.

In 1911, his company was acquired by the Computing-Tabulating-Recording Company (CTR), which was renamed International Business Machines (IBM) in 1924. In 1928, IBM standardised on punched cards with 80 columns of data and rectangular holes. Widely known as IBM Cards, they dominated the data processing industry for almost half a century. In 1931, IBM introduced the IBM 601 Multiplying Punch, an electromechanical machine that could read two numbers, up to eight digits long, from a card and punch their product onto the same card. In 1934, Wallace Eckert of Columbia University connected an IBM 285 Tabulator, an IBM 016 Duplicating Punch and an IBM 601 Multiplying Punch with a cam-controlled sequencer switch of his own design. The combined system was used to automate the integration of differential equations.

It’s hard to believe that this was only the beginning – we had not yet entered the era of binary computing. It is also striking that this era of nascent computer development was such a global affair, charting innovation across continents, cultures and genders. We were, however, about to enter the terrible years of the Second World War, which would have a significant part to play.

Look out for the second part of this post later this year when we will look at how computing has continued to evolve with our civilisation.
