10 Reasons Why the Computer Was Invented

Few of us can imagine life without access to our computers and the many ways they make our lives easier. From online shopping and social networking to simple word processing and organization of data, computers have essentially become crucial to our sanity and survival in the 21st century.  

What’s more interesting, however, may not necessarily be what these innovative tools do for us today, but rather what we find when we take a step back in time – a step way back to the 19th century, to around 1822. What drove the need for a computer then, when the very first ideas came about, and to whom do we owe homage for coming up with such a revolutionary invention? We’ll take a bit of a trip back in time to answer these questions and find out what our current-day computer was like in its humble beginnings.

The History of Computer Technology

Why was the computer invented? The computer was invented in order to automate mathematical calculations that were previously completed by people. Charles Babbage is considered to be the “father” of the computer. Babbage was a mathematician, philosopher, inventor and mechanical engineer who saw a need for an automated system that would negate human error in computation. 

The Beginning of Computer Programming

But it’s not just Babbage to whom we need to give credit for the capabilities we often take for granted as we log on each day to our desktop and laptop devices. When we think of a computer, we often consider a keyboard, monitor, and all that goes on inside without our awareness or even comprehension. We simply hit a key and expect the computer to perform a function. And happily, most of the time, it does – thanks to some very intricate and precise computer programming.

Without the programming behind the machinery, we would have a useless device at our disposal. Babbage’s concept of an automated machine was only the first step in bringing his ideas to fruition.  He also needed to find a way in which to program the hardware to perform the tasks we would ask of it.  

See, prior to Babbage’s notions, computers were not actually the hardware and software we know them to be today. A “computer” was a job title. And the job was performed by a person who, essentially, computed numbers all day long.  

It’s fairly simple to communicate with a human computer and tell them what to do. The transaction is not nearly as simple with a machine. Hence the need not only for the machine itself, but also for the programming that happened behind the scenes – the instructions that would dictate what it should do.

Here is where we must introduce a young woman named Augusta Ada Byron.  She is often credited as the first computer programmer, recognizing that Babbage’s ideas had applications beyond what he initially hoped to accomplish. Together, these two masterful minds shaped the foundation for what we understand today as computer science and technology (Kim & Toole, 1999).

Why Were Computers Needed?

In the 1800s, printed mathematical tables or logs, which were essentially very long lists of numbers showing the results of calculations, were completed by the “human computers” mentioned earlier. It was likely one of the most painful, least glamorous jobs of the 19th century. People sat, hour after hour, performing calculations by hand and recording them in books. Think of it as the world’s longest math class. Not exactly what we’d envision as an exciting job.

But these calculations were vitally important. Understanding them and the data retrieved from their outcomes was central to navigation, science, engineering, and mathematics (“Charles Babbage,” n.d.).

What Charles Babbage realized when faced with logs he knew to be fraught with error was that human computers are fallible, fickle creatures. Errors occurred in transcription as well as calculation (VanderLeest & Nyhoff, 2005). And those errors were often carried over into other sets of calculations, creating a very complicated and convoluted mess. That can be a problem, whether you are attempting to map out the navigation for your next voyage across the ocean, calculating the sum of taxes to be collected, or simply assessing how much food supply remains in storage after a season of use.

Think of it this way: imagine you are mapping out computations for navigating your next trip across the ocean to trade goods. This was, of course, common in the 19th century. Your calculations, unbeknownst to you, are riddled with errors. Not only are you not getting where you want to go, you are quite literally, and dangerously, lost at sea.  If the same occurred today with air travel computations, for example, we’d find ourselves in a disastrous mess, crossing paths with other planes at dangerous intervals – not to mention landing ourselves in the middle of nowhere. 

Babbage’s “aha moment” happened when he came to understand that the work of what he called “unskilled computers,” aka people, could be taken over completely by machinery, which would not only increase reliability and eliminate human error but also speed up the process and increase efficiency. Sounds distinctly familiar when we consider the automation of labor today!

So, with his ideas in hand, and presumably after quite a lot of trial and error along the way, in 1822 Babbage followed through with his outrageous notion of automating these computations and began building what he called the “difference engine” (“Charles Babbage,” n.d.).

The First Computer: Babbage’s “Difference Engine”

Babbage’s difference engine was designed to calculate a series of values automatically. It sounds an awful lot like a calculator, and in a sense, it was. It was intended to generate mathematical tables, just like those logs completed by the “human computers” mentioned earlier, and to automate the steps necessary to calculate the data. It was a simple device, however: it could only add and subtract, which limited it to tabulating polynomial equations (Kim & Toole, 1999).

Nonetheless, it was certainly innovative since up to this point, while physical labor was beginning to be moved to automated machines, nobody had considered such an idea for “mental labor” (VanderLeest & Nyhoff, 2005).

Babbage’s ideas worked something like this:
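To make the idea concrete, here is a minimal modern sketch of the “method of finite differences” that the engine mechanized – written in present-day Python, which of course Babbage never had; the function names, the example polynomial, and the way the table is seeded are simply choices made for this illustration. For a polynomial of degree n, the n-th differences are constant, so once a handful of starting values are set up, every further table entry can be produced with nothing but addition, one “turn of the crank” at a time.

```python
# Sketch of the method of finite differences (modern Python, not Babbage's design).

def forward_differences(seed_values):
    """Given p(x0), p(x0+1), ..., return [p(x0), Δp(x0), Δ²p(x0), ...]."""
    diffs, row = [seed_values[0]], list(seed_values)
    while len(row) > 1:
        row = [b - a for a, b in zip(row, row[1:])]
        diffs.append(row[0])
    return diffs

def tabulate(seed_values, count):
    """Extend the table to `count` entries using addition only."""
    d = forward_differences(seed_values)
    table = [d[0]]
    for _ in range(count - 1):
        for i in range(len(d) - 1):   # one "turn of the crank"
            d[i] += d[i + 1]
        table.append(d[0])
    return table

# Example: tabulate p(x) = x**2 + x + 41 for x = 0, 1, 2, ...
p = lambda x: x * x + x + 41
seeds = [p(x) for x in range(3)]      # a degree-2 polynomial needs 3 seed values
print(tabulate(seeds, 8))             # [41, 43, 47, 53, 61, 71, 83, 97]
```

Once the three seed values are in place, every later entry comes from additions alone – exactly the kind of repetitive, mechanical work a machine handles better than a tired human computer.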

Calculating polynomial equations of this kind was the most complicated work the difference engine could accomplish. What we need to remember, however, is that a machine that produced an accurate result with each turn of the crank solved a major issue with human computers: it removed the risk of error and delivered results quite a bit faster than a person was capable of producing them (VanderLeest & Nyhoff, 2005).

Limitations of Babbage’s Difference Engine

As with many innovative ideas, Babbage recognized the limitations of his machine, and in the absence of funding, the difference engine unfortunately never came to full fruition. Still, by 1833, Babbage had already begun thinking about how to improve his design and the functionality of the machine. Around the same time, he befriended an integral player in the journey of the first computer, Miss Augusta Ada Byron, whom we mentioned earlier as a key figure in computer programming and in understanding and applying Babbage’s designs (Kim & Toole, 1999).

Babbage’s New Concept: The Analytical Engine 

It is here that we’ll need to bring Miss Ada Byron back to the forefront as we progress through Babbage’s second undertaking. If you remember, Ada is recognized as the first, or at least one of the first, computer programmers. She was the daughter of Lord Byron, whom many English majors may recognize as an influential figure in the world of poetry. Ada’s desire, however, was not to follow in her father’s literary footsteps but to embrace her mother’s wishes that she pursue math and science (Kim & Toole, 1999).

Ada was fascinated by Babbage’s few early publications about his difference engine, and the 17-year-old soon became captivated by his work. The two quickly became friends after meeting in 1833. Babbage began sharing with Ada his ideas for a new machine, one that would surpass the difference engine and come to be remarkably similar in architecture to today’s modern computer, despite also never having been built to completion (Kim & Toole, 1999).

The Beginning of Computer Technology

Most of us are familiar with some basic computer technology concepts, such as “memory” or “CPU” (central processing unit). It is in these basic ideas that the plans for the Analytical Engine became the foundation for what we understand as part of computer processing and programming today.  

If you are reading this on your computer, your central processing unit is what provides instruction to your computer, telling it what to do in basic arithmetic, logic, controlling, and “input/output” operations. It is, in a sense, the “brain” of your computer.  Whatever “input” you deliver is the data or signal the computer needs in order to provide you with the “output” or action you require. A simple example is the keyboard and monitor combination. The keyboard is the input device (with your control), and the monitor is the output device (“Output,” n.d.). 
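As a toy illustration of that cycle (the squaring task and the variable names here are invented purely for the example), the keyboard supplies the input, a line of arithmetic plays the part of the processing step, and the screen receives the output:

```python
number = int(input("Enter a number: "))    # input device: the keyboard
square = number * number                   # the processing step in between
print(f"{number} squared is {square}")     # output device: the monitor
```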

It sounds a lot simpler than it is, and thankfully we don’t think much about what goes on behind our computer screens. We tend to simply assume the machine will do whatever we ask it to do; and, when it doesn’t – well, we’ve all been there before!  

The Punched Card Pattern and the Jacquard Loom

The idea of “input/output” for data processing didn’t exactly originate with Babbage’s analytical engine. It was an idea he borrowed from the Jacquard loom, another invention of the 19th century, which wove patterns into fabric using punched cards. The reason Jacquard’s loom was so innovative in its use of punched cards was that it allowed a single machine to do multiple things, simply by changing the pattern on the cards. Prior to this, machines could only accomplish a singular task (Korner, 2014).

Babbage recognized that using punched cards would allow nearly any algebraic equation to be computed automatically – not only the addition and subtraction of the difference engine (Kim & Toole, 1999).

Babbage devised a plan for a simple punched-card reader to program the machine’s data input. He concluded that the analytical engine could contain a memory unit, called the “store,” and an arithmetic unit, called the “mill.” The output would be delivered on an automatically printed page, and the machine would be capable of performing addition, subtraction, multiplication, and division to an accuracy of 20 decimal places (“A Brief History of Computers,” n.d.).
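To picture how a store, a mill, and a deck of cards fit together, here is a deliberately over-simplified sketch; the card format, the variable names, and the handful of operations are invented for this illustration and are not Babbage’s actual card scheme:

```python
# A toy "analytical engine": a store (memory), a mill (arithmetic), and cards (program).

store = {"V0": 7, "V1": 5, "V2": 0}            # the store: named variables

def mill(op, a, b):                            # the mill: performs the arithmetic
    return {"+": a + b, "-": a - b, "*": a * b, "/": a / b}[op]

cards = [                                      # one "operation card" per step
    ("*", "V0", "V1", "V2"),                   # V2 <- V0 * V1
    ("+", "V2", "V0", "V2"),                   # V2 <- V2 + V0
]

for op, src1, src2, dst in cards:              # read the cards in order
    store[dst] = mill(op, store[src1], store[src2])

print(store["V2"])                             # the "printed page": 42
```

Swapping in a different deck of cards changes what the machine computes without changing the machine itself – the same trick Jacquard’s loom used to change its weaving patterns.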

Not only could the analytical engine theoretically perform these basic functions, it would also be able to repeat a set of instructions based on certain conditions. This idea became the foundation for what is known today as “conditional branching,” the familiar logical construct “if x, then y.” In 1840, Babbage presented his theories to a group of mathematicians and engineers in Turin, Italy, with the hope that others would take up his novel ideas (Kim & Toole, 1999).
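In modern code, “repeat a set of instructions based on certain conditions” looks something like the loop below; the particular sum being computed is a generic example, not anything taken from Babbage’s plans:

```python
total, x = 0, 1
while True:
    total += x        # the repeated instruction
    x += 1
    if x > 10:        # the condition: "if x, then stop repeating"
        break
print(total)          # 55, the sum 1 + 2 + ... + 10
```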

Ada Byron’s Influence in Computer Programming

Ada Byron, now Countess of Lovelace after marrying William King, the earl of Lovelace, continued her own work in the fields of math and science. She persisted in following Babbage’s ideas for his analytical engine, quietly working through her own theories. She decided to translate into English a paper written by a young mathematician named Luigi Federico Menabrea, who had attended Babbage’s presentation in 1840. Menabrea titled his paper “Sketch of the Analytical Engine” (Kim & Toole, 1999).

While Babbage continued to write down his plans for the Analytical Engine, he encouraged Ada to annotate her translation of Menabrea’s work, which resulted in notes twice as long as Menabrea’s original article. She and Babbage continued to collaborate, bringing together their findings, with Ada focusing primarily on the idea of programming using Jacquard’s punched cards.

Ada recognized that punched cards allowed for the most complicated of patterns: for Jacquard’s loom, patterns woven into fabric; for the Analytical Engine, the most complicated of algebraic patterns, which could be used to perform calculations automatically (Kim & Toole, 1999).

She took this somewhat groundbreaking insight and went on to create a program for computing Bernoulli numbers, a sequence of numbers that appears throughout higher mathematics. She succeeded in doing so, with a minor mathematical flaw here and there. What her results demonstrated was that the analytical engine was indeed capable of conditional branching (if x, then y) and of repeating sets of instructions based on multiple conditions. It was the most complex program yet written, far more complicated than anything Babbage had produced (Kim & Toole, 1999).
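For a sense of what that program computed, here is a modern sketch that generates Bernoulli numbers from their standard recurrence. It is written in present-day Python rather than Ada’s table-of-operations notation, and the function name and indexing convention are choices made for this example, not a reconstruction of her actual program:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return B_0 .. B_n as exact fractions, using the standard recurrence."""
    B = [Fraction(1)]                                   # B_0 = 1
    for m in range(1, n + 1):
        s = sum(Fraction(comb(m + 1, j)) * B[j] for j in range(m))
        B.append(-s / (m + 1))                          # solve for B_m
    return B

print([str(b) for b in bernoulli(8)])
# ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42', '0', '-1/30']
```

The loop body is one small block of instructions repeated for each new number – precisely the looping-and-branching behavior Ada showed the analytical engine could support.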

“The Analytical Engine weaves algebraical patterns just as the Jacquard loom weaves flowers and leaves.”  -Ada Byron, Countess of Lovelace

Ultimately, Ada published the first paper to discuss at length the idea of computer programming – and it remained the only one in existence for the next century. What set her apart from Babbage, and what she continued to focus on, was her ability to analyze the extent to which Babbage’s engine could reuse code and branch to different instructions based on conditions – the modern-day concept of conditional branching (Kim & Toole, 1999).

Final Thoughts

Ada and Babbage continued to correspond and work together, though there seems to be some controversy over who discovered what first, and to what degree. Whatever the case, what we have is a collaboration between the two in writing several computer programs, small and large, for the analytical engine – the first conception of a computer designed to carry out computations from programmed instructions via input and output (Kim & Toole, 1999).

While the analytical engine was never fully developed, the documented plans for the machine’s capabilities became the foundation for what we understand today as computer programming and our modern-day machine. 

And while the computers we currently use have far surpassed what Babbage and Byron could likely have anticipated for a device that could successfully complete mathematical computations, we certainly owe thanks to their ingenious ideas for changing “computer” from a job title to a device we depend on for nearly every aspect of our lives in the 21st century.

References

A Brief History of Computers. (n.d.). Retrieved November 6, 2019, from http://www.cs.uah.edu/~rcoleman/Common/History/History.html.

Charles Babbage. (2019, November 5). Retrieved from https://en.wikipedia.org/wiki/Charles_Babbage.

Kim, E. E., & Toole, B. A. (1999, May). Ada and the First Computer. Retrieved November 6, 2019, from https://www.academia.edu/9440440/Ada_and_the_First_Computer.

Korner, T. (2014, April 22). Why Was The Computer Invented When It Was? Retrieved from https://plus.maths.org/content/why-was-computer-invented-when-it-was.

Output. (2019, September 6). Retrieved November 6, 2019, from https://en.wikipedia.org/wiki/Input/output.

VanderLeest, S. H., & Nyhoff, J. (2005). Chapter 2: The Anatomy of the Computer. Retrieved from https://cs.calvin.edu/activities/books/rit/chapter2/history/human.htm.