In the modern age, we take computers for granted, but this wasn't always the case. Where did these incredible machines come from, and how did they develop over history?
This work is shared under the following license: Creative Commons BY-SA-NC
What is needed to run this unit?
Do not try and force this. What areas of other subjects might this unit reflect on or discuss? For IB, consider links with ToK.
What was successful? What needs changing? Alternative Assessments and Lesson Ideas? What other Differentiation Ideas/Plans could be used?
Any CC attribution, thanks, credit, etc.
- When we think of technology, computers and other digital machines are probably the first things that come to mind.
- Their advanced nature and importance to society mean that we can't stop thinking about them. They are certainly some of the most pervasive technologies in our lives.
- To a student like you, computers have simply always existed, but that wasn't the case for your grandparents, or even your parents.
- In this unit, we'll explore how this disruptive technology went from massive machines built to win wars to the modern conveniences we have today.
- Here's a look at the major points in computing history:
Don't worry, there's no quiz! You don't have to memorise this - just check out the interesting parts.
Infographic by Nowsourcing
Well, it depends on who you ask. Some would say that the first device ever designed for computing counts, which would make the answer the abacus.
However, that was still a manually operated device. In those days, computers were people: they did complicated maths by hand and put the results into tables, which were collected in books and used for everything from artillery warfare to tax collection.
The first device ever conceived that resembles a modern computer was something called the Analytical Engine, designed by Charles Babbage in the 1830s to automate general mathematical computation. This device was intended as a successor to Babbage’s first project, the Difference Engine, a machine that could compute several sets of numbers and make hard-copy tables of the results. Ada Lovelace, who later wrote a program for the Analytical Engine, is now recognised as the first-ever computer programmer for her work.
The impetus for the creation of the Difference Engine was Babbage’s discovery of some unpublished work done for the French government, which had commissioned numerous human computers to create tables converting measurements from the old French units into the new metric system. Babbage wondered if, at the height of the Industrial Revolution, a machine could be built to automate such a process. Eight years into the Engine’s development, Babbage had a mostly functional prototype… and no money.
A part of the Difference Engine
Undeterred, Babbage pressed on, intending to create an even more powerful machine, one capable of proper multiplication and division instead of brute-forcing those operations through long chains of addition and subtraction. This was the Analytical Engine, and its design featured several parts that still appear, at least in principle, in every modern computer, most notably a central processing unit (or CPU) and memory. He didn’t call them that, but the concepts he came up with still hold true to this day. Punch cards, an essential component of early computers, also featured in his design. Finally, in true Industrial Revolution fashion, the whole thing would be steam-powered. Unfortunately, Babbage still had no money, and the machine was never built during his lifetime. Nobody in Babbage's time thought computers were going to go anywhere - how wrong they were!
A video of a modern reconstruction of the Difference Engine in action
The first digital computers emerged during the 20th century, in the lead-up to World War 2. These computers tended to be electromechanical, with electrical switches driving mechanical relays. One of the earliest examples was the Z2, developed by German engineer Konrad Zuse in 1939. This was followed in 1941 by the Z3, the world’s first programmable, fully automatic digital computer. The Z3 used a binary rather than decimal number system, and was later shown to be Turing-complete (i.e. able to simulate a universal Turing machine, and so in principle perform any calculation one of those could). This concept is named after Alan Turing, one of the founding fathers of modern computing.
In 1936, Alan Turing developed the principle of the modern computer in a paper entitled On Computable Numbers. In it, he described what came to be known as a Turing machine, a theoretical device that could carry out any computation that can be expressed as an algorithm. When the Second World War started, he was brought in by the British government to a place called Bletchley Park, with the intent that he would help to decode enciphered German messages created by the supposedly uncrackable Enigma machine. To this end, he adapted a design created by Polish cryptanalysts and produced an electro-mechanical device called a bombe. This was later superseded by the Colossus computer, a machine designed by British engineer Tommy Flowers, at Turing’s recommendation. Colossus was used to break the even more advanced Lorenz cipher that the Germans adopted during the war. Many of humanity’s greatest technological advances came from wartime necessity, and the computer was no different.
Some estimates say that Turing's work shortened World War 2 by two years and saved fourteen million lives. Unfortunately, his work was never officially recognised during his lifetime, as he was a gay man, at a time when homosexuality was a crime in the UK. He was persecuted for his sexuality, and committed suicide at the age of 41, decades before the computer technology he had theorised and pioneered would truly hit its stride.
Here's a video from Crash Course explaining more about Turing and his work, if you want.
In the last section, we talked about the Colossus computer, the machine Tommy Flowers built to help end World War Two. Thing is, if you looked at it, you probably wouldn't recognise it as a computer in the modern sense.
Colossus was big. Really big. It was seven feet tall, seventeen feet wide and eleven feet deep - about 1,300 cubic feet in total, the size of a living room. It weighed five tonnes - more than the weight of two Range Rovers! Its 2,500 valves, 100 logic gates, 1,000 resistors and seven kilometres of wiring required eight kilowatts of power. By comparison, your school laptop, designed just 75 years later, uses about twelve watts. That's an improvement by a factor of roughly 670!
Of course, Colossus was still a partially mechanical machine. The first fully electronic computer, ENIAC, was built in 1945 and put to work in 1946. It was designed for the US Army to calculate artillery firing tables, although it was also used to determine the viability of thermonuclear bombs! Being purely electronic, it could calculate a thousand times faster than electro-mechanical computers: a trajectory that would take a human 20 hours took ENIAC just 30 seconds, making it 2,400 times faster than a human at that particular job.
It cost half a million US dollars to build - about $6.3 million in today's money. Even a seriously beefy gaming PC like an Alienware Aurora R7 clocks in at a mere $1,200 - over 5,000 times cheaper. The ENIAC also ran at only 100 kHz. By comparison, the Alienware's 4 GHz processing speed makes it 40,000 times faster. Multiply those together and that's a cost-to-performance improvement of over 200 million times!
At this point, computers didn't have what we would recognise today as RAM - random-access memory. The first computer with RAM was the Manchester Baby, an experimental computer that paved the way for the first-ever commercially available computers. The Baby had one kilobit of RAM (one-eighth of a kilobyte). The eight gigabytes of RAM in the modern Alienware computer I was comparing the ENIAC to are an improvement by a factor of sixty-four million.
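If you'd like to check the comparison figures above for yourself, here's a quick Python sketch. The modern-machine numbers (12 watts, $1,200, 4 GHz, 8 GB) are just the rough values used in this section, not precise measurements:

```python
# Sanity-checking the comparisons in this section.
# The modern figures (12 W, $1,200, 4 GHz, 8 GB) are the rough
# values quoted above, not exact measurements.

colossus_power_w = 8_000          # eight kilowatts
laptop_power_w = 12
print(round(colossus_power_w / laptop_power_w))   # ~670x less power

eniac_cost_usd = 6_300_000        # ~$6.3 million in today's money
pc_cost_usd = 1_200
cost_factor = eniac_cost_usd / pc_cost_usd        # ~5,250x cheaper

eniac_clock_hz = 100_000          # 100 kHz
pc_clock_hz = 4_000_000_000       # 4 GHz
speed_factor = pc_clock_hz / eniac_clock_hz       # 40,000x faster

# Cheaper AND faster multiply together for cost-to-performance.
print(round(cost_factor * speed_factor))          # ~210 million

baby_ram_bits = 1024              # the Manchester Baby's one kilobit
pc_ram_bits = 8 * 1024**3 * 8     # eight gigabytes, in bits
print(pc_ram_bits // baby_ram_bits)               # ~64 million
```

Notice how the biggest jumps come from multiplying improvements together: being thousands of times cheaper and tens of thousands of times faster compounds into a factor in the hundreds of millions.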
In the early days, you couldn't actually buy a computer. They were exclusively for use by governments and whoever else could afford to build one - which, given how expensive they were, wasn't many people. The first commercial computers, the British Ferranti Mark 1 and the American UNIVAC, went on sale in 1951. They were very expensive - UNIVAC prices rose to over a million dollars as more were sold, and only very rich companies like General Electric, Du Pont and Westinghouse could afford them. Most were sold to branches of the US government, the most famous being the one purchased by the US Census Bureau.
This computer was used by the Census Bureau in conjunction with the television broadcaster CBS, to predict the results of the 1952 US presidential election. Using a sample of just 1% of the voting population, it famously - and correctly - predicted that Republican candidate Dwight D. Eisenhower would win by a landslide. This event brought computers into the public eye in America, and furthered interest and development of computing technology.
The final evolutionary link between the hulking beasts of yesteryear and the slick, modern machines of today is the mainframe. These were massive, powerful computers designed to be used by businesses. IBM were the masters of these machines, and one of their most important inventions in this field was the System/360, a family of computers large and small (relatively, of course) that all shared the same instruction set. This allowed a company that bought a lower-end System/360 computer to upgrade to a better one without having to rewrite all their code.
By the time the 1970s rolled around, computers were more widespread than ever before, but they were still big, bulky machines that required advanced knowledge to use. They were also very expensive, to the point where they couldn't reasonably be owned by a single person. The true dream was a computer that was much smaller and cheaper, and that could be used by anyone. The first step towards this was the workstation.
This funky-looking thing was the Xerox Alto. Xerox is a company known today for printers and other document-related products, but in the 1970s, their engineers developed this computer for single-user use. Interestingly, it featured a screen in portrait orientation - it was made by a document company, after all. That screen was also the first computer screen to show off a graphical user interface, or GUI, the cornerstone of all modern desktop operating systems - a decade before mass-market computers with GUIs became available.
The Alto became well-known in Silicon Valley, America's tech hub, as its GUI was seen as the future of computing. One of its biggest fans was Steve Jobs - you know, from Apple - whose visits to see the computer inspired the creation of the first Macintosh desktop computers.
Speaking of Steve Jobs, want to get hands-on with some of his earliest work? Well, you can, right from the comfort of C108!
At the back of the room, you'll find an Apple II, as well as another type of early personal computer, a Sinclair ZX Spectrum. Released in 1977 and 1982, respectively, these are a bit more advanced than Colossus or UNIVAC, but still primitive compared to modern hardware. They don't work, but you can have a play around with them and see how they would have felt to use.
Another interesting part of computer history that you can try for yourself is punch cards. Before computers had electronic data input or storage, data was represented by cards with holes punched in them: different patterns of holes corresponded to different letters and numbers. You can have a play around with this website, see how it works, and create your own personal punch card. Write a message about something you learned in this unit, turn it into a punch card and download it as an image file. That image will be your final evidence for this unit. Good job!
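If you're curious how a card column actually encoded a character, the scheme used on classic IBM cards was called the Hollerith code. Here's a simplified Python sketch of its letter-and-digit rules (real cards also had punch combinations for punctuation and other symbols, which are left out here):

```python
# Simplified Hollerith punched-card code: each character occupies one
# card column, and the set of punched rows in that column identifies it.

def punch_column(ch):
    """Return the list of rows punched for one character (letters/digits only)."""
    ch = ch.upper()
    if ch.isdigit():
        return [ch]                                   # digits punch their own row, 0-9
    if "A" <= ch <= "I":
        return ["12", str(ord(ch) - ord("A") + 1)]    # zone row 12 + digit rows 1-9
    if "J" <= ch <= "R":
        return ["11", str(ord(ch) - ord("J") + 1)]    # zone row 11 + digit rows 1-9
    if "S" <= ch <= "Z":
        return ["0", str(ord(ch) - ord("S") + 2)]     # zone row 0 + digit rows 2-9
    return []                                         # everything else: left unpunched here

for ch in "HELLO":
    print(ch, punch_column(ch))
# prints e.g.: H ['12', '8']
```

Each character gets one column: a digit punches its own row, while a letter punches one "zone" row (12, 11 or 0) plus one digit row - which is why a card reader only needed to sense hole positions, never shapes.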