
The Evolution Of Computer | Generations of Computer

The development of computers has been a remarkable journey spanning several centuries, defined by a series of inventions and advancements made by great scientists and engineers. Because of their work, we now enjoy the latest technology in computer systems.

Today we have laptops, desktop computers, notebooks, and more, which make our lives easier and, most importantly, let us communicate with anyone from anywhere in the world.

So, in today’s blog, I want you to explore the journey of computers with me.

Note: If you haven’t read our History of Computer blog yet, please read that first and then come back here.

Let’s look at the evolution of computers, generation by generation.

COMPUTER GENERATIONS

Computer generations are essential to understanding the evolution of computing technology. They divide computer history into periods marked by substantial advancements in hardware, software, and computing capabilities. The first period began around 1940 with the first generation of computers. Let us see…


Generations of computer

Computers are classified into five generations:

  • First Generation Computer (1940-1956)
  • Second Generation Computer (1956-1963)
  • Third Generation Computer(1964-1971)
  • Fourth Generation Computer(1971-Present)
  • Fifth Generation Computer(Present and Beyond)
Computer Generation | Period | Based on
First generation of computers | 1940-1956 | Vacuum tubes
Second generation of computers | 1956-1963 | Transistors
Third generation of computers | 1964-1971 | Integrated Circuits (ICs)
Fourth generation of computers | 1971-Present | Microprocessors
Fifth generation of computers | Present and Beyond | AI (Artificial Intelligence)

1. FIRST GENERATION COMPUTER: Vacuum Tubes (1940-1956)


The first generation of computers is characterized by the use of vacuum tubes. The vacuum tube was invented in 1904 by the British engineer John Ambrose Fleming. A vacuum tube is an electronic device used to control the flow of electric current in a vacuum. Vacuum tubes were also used in CRT (cathode ray tube) TVs, radios, etc.


The first general-purpose programmable electronic computer was the ENIAC (Electronic Numerical Integrator and Computer), which was completed in 1945 and introduced to the public on February 14, 1946. It was built by two American engineers, J. Presper Eckert and John W. Mauchly, at the University of Pennsylvania.


The ENIAC occupied a space of roughly 30 by 50 feet, weighed about 30 tons, contained around 18,000 vacuum tubes, 70,000 resistors, and 10,000 capacitors, and required about 150,000 watts of electricity, which made it very expensive to operate.

Later, Eckert and Mauchly developed the first commercially successful computer, the UNIVAC (UNIVersal Automatic Computer), in 1951.

Examples: ENIAC (Electronic Numerical Integrator and Computer), EDVAC (Electronic Discrete Variable Automatic Computer), and UNIVAC-1 (UNIVersal Automatic Computer-1).


  • These computers were designed using vacuum tubes.
  • They had a simple architecture.
  • They could calculate data in milliseconds.
  • They were used mainly for scientific purposes.

DISADVANTAGES

  • These computers were very costly.
  • They were very large and took up a lot of space and electricity.
  • Their speed was very slow.
  • They were not suitable for commercial purposes.
  • They generated a great deal of heat, so constant cooling was needed to operate them.

2. SECOND GENERATION COMPUTER: Transistors (1956-1963)


The second generation of computers is characterized by the use of transistors. The transistor was invented in 1947 by three American physicists: John Bardeen, Walter Brattain, and William Shockley.


A transistor is a semiconductor device used to amplify or switch electronic signals, or to open or close a circuit. It was invented at Bell Labs, and the transistor became the key ingredient of all digital circuits, including computers.

The invention of the transistor replaced the bulky vacuum tubes of the first generation of computers.

Transistors perform the same functions as vacuum tubes, except that electrons move through solid semiconductor material instead of through a vacuum. Transistors are made of semiconducting materials, and they control the flow of electricity.

Second-generation computers were smaller, faster, and less expensive than first-generation computers. They also supported high-level programming languages, including FORTRAN (1956), ALGOL (1958), and COBOL (1959).

Examples: PDP-8 (Programmed Data Processor-8), the IBM 1400 series (International Business Machines 1400 series), the IBM 7090 series, and the CDC 3600 (Control Data Corporation 3600 series).


ADVANTAGES:

  • Smaller in size compared to first-generation computers
  • Used less electricity
  • Produced less heat than first-generation computers
  • Offered better speed

DISADVANTAGES:

  • Still costly and not very versatile
  • Still too expensive for widespread commercial use
  • Cooling was still needed
  • Punched cards were used for input
  • Each computer was built for a particular purpose

3. THIRD GENERATION COMPUTER: Integrated Circuits (1964-1971)


The third generation of computers is characterized by the use of integrated circuits, developed independently around 1958-59 by two American engineers, Jack Kilby and Robert Noyce. An integrated circuit is a set of electronic circuits on a small flat piece of semiconductor material, normally silicon. Transistors were miniaturized and placed on these silicon chips, which drastically increased the efficiency and speed of computers.


These ICs (integrated circuits) are popularly known as chips. A single IC has many transistors, resistors, and capacitors built on a single slice of silicon.

This development made computers smaller, cheaper, and more capable, with larger memory and faster processing. These computers were also efficient and reliable.

High-level languages such as Pascal, PL/1, FORTRAN-II to IV, COBOL, ALGOL-68, and BASIC (Beginner's All-purpose Symbolic Instruction Code) were developed during this period.

Examples: NCR 395 (National Cash Register 395), the IBM 360 and 370 series, and the Burroughs B6500.


  • These computers were smaller in size compared to previous generations
  • They consumed less energy and were more reliable
  • More versatile
  • They produced less heat than previous generations
  • They were used for commercial as well as general purposes
  • These computers used fans for heat discharge to prevent damage
  • This generation increased the storage capacity of computers
  • A cooling system was still needed
  • They were still very costly
  • Sophisticated technology was required to manufacture integrated circuits
  • IC chips were not easy to maintain
  • Performance degraded when large applications were executed

4. FOURTH GENERATION OF COMPUTER: Microprocessor (1971-Present)


The fourth generation of computers is characterized by the use of the microprocessor. The first microprocessor, the Intel 4004 CPU, was developed in 1971 by four inventors: Marcian (Ted) Hoff, Masatoshi Shima, Federico Faggin, and Stanley Mazor.


A microprocessor contains all the circuits required to perform arithmetic, logic, and control functions on a single chip. Because of microprocessors, fourth-generation computers offer more data-processing capacity than equivalent-sized third-generation computers. The microprocessor made it possible to place the CPU (central processing unit) on a single chip, so these computers are also known as microcomputers. The personal computer is a fourth-generation computer. This is also the period in which computer networks evolved.

Examples: the Apple II and the Altair 8800.


  • These computers are smaller in size and much more reliable compared to earlier generations
  • Heating issues are almost negligible
  • No air conditioning is required
  • All types of high-level languages can be used in this generation
  • They are also used for general purposes
  • They are less expensive and portable
  • Fans are required to operate these kinds of computers
  • The latest technology is required to make microprocessors and complex software
  • These computers are highly sophisticated
  • Advanced technology is also required to make the ICs (integrated circuits)

5. FIFTH GENERATION OF COMPUTERS (Present and beyond)

The fifth generation of computers is based on AI (artificial intelligence). Artificial intelligence is the branch of computer science concerned with making computers behave like humans and allowing them to make their own decisions. Currently, no computer exhibits full artificial intelligence (that is, the ability to fully simulate human behavior).


In the fifth generation of computers, VLSI (Very Large Scale Integration) and ULSI (Ultra Large Scale Integration) technology are used, and the speed of these computers is extremely high. This generation introduced machines with hundreds of processors that could all work on different parts of a single program. The development of even more powerful computers is still in progress. It has been predicted that such computers will be able to communicate with their users in natural spoken language.
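To give a rough idea of what "working on different parts of a single program" means, here is a small illustrative Python sketch (an addition of mine, not something from the original post) that splits one calculation across four worker processes and then combines their results:

# Illustrative sketch only: split one computation across several processes,
# so different processors work on different parts of the same program.
from multiprocessing import Pool

def partial_sum(chunk):
    # Work on one slice of the overall problem.
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    chunks = [data[i::4] for i in range(4)]   # divide the work four ways
    with Pool(processes=4) as pool:           # one worker process per chunk
        total = sum(pool.map(partial_sum, chunks))
    print(total)                              # matches the single-process result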

Computers of this generation also use high-level languages such as C, C++, Java, etc.

Examples: desktop computers, laptops, notebooks, MacBooks, etc. These are the computers we use today.


  • These computers are smaller in size and more compatible
  • These computers are much cheaper
  • They are used for general purposes
  • Higher technology is used
  • Development of true artificial intelligence is in progress
  • Advancement in parallel processing and superconductor technology
  • They tend to be sophisticated and complex tools
  • They push the limits of transistor density

Frequently Asked Questions

How many computer generations are there?

There are mainly five generations:

  • First Generation Computer (1940-1956)
  • Second Generation Computer (1956-1963)
  • Third Generation Computer (1964-1971)
  • Fourth Generation Computer (1971-Present)
  • Fifth Generation Computer (Present and Beyond)

Which technology characterized the first generation of computers?

Vacuum Tubes

What is the fifth generation of computers?

The fifth generation of computers is based entirely on artificial intelligence. It is predicted that these computers will be able to communicate with their users in natural spoken language.

What is the latest computer generation?

The latest generation of computers is the fifth generation, which is based on artificial intelligence.

Who is the inventor of the Integrated Circuit?

Robert Noyce and Jack Kilby.

What is the full form of ENIAC?

ENIAC stands for "Electronic Numerical Integrator and Computer".



A brief history of computers

by Chris Woodford . Last updated: January 19, 2023.

Computers truly came into their own as great inventions in the last two decades of the 20th century. But their history stretches back more than 2500 years to the abacus: a simple calculator made from beads and wires, which is still used in some parts of the world today. The difference between an ancient abacus and a modern computer seems vast, but the principle—making repeated calculations more quickly than the human brain—is exactly the same.

Read on to learn more about the history of computers—or take a look at our article on how computers work .


Photo: A model of one of the world's first computers (the Difference Engine invented by Charles Babbage) at the Computer History Museum in Mountain View, California, USA. Photo by Cory Doctorow published on Flickr in 2020 under a Creative Commons (CC BY-SA 2.0) licence.

Cogs and Calculators

It is a measure of the brilliance of the abacus, invented in the Middle East circa 500 BC, that it remained the fastest form of calculator until the middle of the 17th century. Then, in 1642, aged only 18, French scientist and philosopher Blaise Pascal (1623–1662) invented the first practical mechanical calculator, the Pascaline, to help his tax-collector father do his sums. The machine had a series of interlocking cogs (gear wheels with teeth around their outer edges) that could add and subtract decimal numbers. Several decades later, in 1671, German mathematician and philosopher Gottfried Wilhelm Leibniz (1646–1716) came up with a similar but more advanced machine. Instead of using cogs, it had a "stepped drum" (a cylinder with teeth of increasing length around its edge), an innovation that survived in mechanical calculators for 300 years. The Leibniz machine could do much more than Pascal's: as well as adding and subtracting, it could multiply, divide, and work out square roots. Another pioneering feature was the first memory store or "register."

Apart from developing one of the world's earliest mechanical calculators, Leibniz is remembered for another important contribution to computing: he was the man who invented binary code, a way of representing any decimal number using only the two digits zero and one. Although Leibniz made no use of binary in his own calculator, it set others thinking. In 1854, a little over a century after Leibniz had died, Englishman George Boole (1815–1864) used the idea to invent a new branch of mathematics called Boolean algebra. [1] In modern computers, binary code and Boolean algebra allow computers to make simple decisions by comparing long strings of zeros and ones. But, in the 19th century, these ideas were still far ahead of their time. It would take another 50–100 years for mathematicians and computer scientists to figure out how to use them (find out more in our articles about calculators and logic gates ).
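As a small, purely illustrative aside (not part of the original article), the short Python sketch below shows the idea in miniature: decimal numbers represented as strings of zeros and ones, then combined bit by bit with Boolean (AND/OR) logic—the kind of simple yes/no decision a computer's circuits make.

def to_binary(n, width=8):
    # Represent a decimal number using only the digits 0 and 1.
    return format(n, f"0{width}b")

a, b = 42, 23
print(to_binary(a))      # 00101010
print(to_binary(b))      # 00010111

# Boolean algebra applied to the underlying bits: each output bit is a
# simple decision (1 only if both, or either, input bit is 1).
print(to_binary(a & b))  # AND -> 00000010
print(to_binary(a | b))  # OR  -> 00111111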

Artwork: Pascaline: Two details of Blaise Pascal's 17th-century calculator. Left: The "user interface": the part where you dial in numbers you want to calculate. Right: The internal gear mechanism. Picture courtesy of US Library of Congress .

Engines of Calculation

Neither the abacus, nor the mechanical calculators constructed by Pascal and Leibniz really qualified as computers. A calculator is a device that makes it quicker and easier for people to do sums—but it needs a human operator. A computer, on the other hand, is a machine that can operate automatically, without any human help, by following a series of stored instructions called a program (a kind of mathematical recipe). Calculators evolved into computers when people devised ways of making entirely automatic, programmable calculators.

Photo: Punched cards: Herman Hollerith perfected the way of using punched cards and paper tape to store information and feed it into a machine. Here's a drawing from his 1889 patent Art of Compiling Statistics (US Patent#395,782), showing how a strip of paper (yellow) is punched with different patterns of holes (orange) that correspond to statistics gathered about people in the US census. Picture courtesy of US Patent and Trademark Office.

The first person to attempt this was a rather obsessive, notoriously grumpy English mathematician named Charles Babbage (1791–1871). Many regard Babbage as the "father of the computer" because his machines had an input (a way of feeding in numbers), a memory (something to store these numbers while complex calculations were taking place), a processor (the number-cruncher that carried out the calculations), and an output (a printing mechanism)—the same basic components shared by all modern computers. During his lifetime, Babbage never completed a single one of the hugely ambitious machines that he tried to build. That was no surprise. Each of his programmable "engines" was designed to use tens of thousands of precision-made gears. It was like a pocket watch scaled up to the size of a steam engine , a Pascal or Leibniz machine magnified a thousand-fold in dimensions, ambition, and complexity. For a time, the British government financed Babbage—to the tune of £17,000, then an enormous sum. But when Babbage pressed the government for more money to build an even more advanced machine, they lost patience and pulled out. Babbage was more fortunate in receiving help from Augusta Ada Byron (1815–1852), Countess of Lovelace, daughter of the poet Lord Byron. An enthusiastic mathematician, she helped to refine Babbage's ideas for making his machine programmable—and this is why she is still, sometimes, referred to as the world's first computer programmer. [2] Little of Babbage's work survived after his death. But when, by chance, his notebooks were rediscovered in the 1930s, computer scientists finally appreciated the brilliance of his ideas. Unfortunately, by then, most of these ideas had already been reinvented by others.

Artwork: Charles Babbage (1791–1871). Picture from The Illustrated London News, 1871, courtesy of US Library of Congress .

Babbage had intended that his machine would take the drudgery out of repetitive calculations. Originally, he imagined it would be used by the army to compile the tables that helped their gunners to fire cannons more accurately. Toward the end of the 19th century, other inventors were more successful in their effort to construct "engines" of calculation. American statistician Herman Hollerith (1860–1929) built one of the world's first practical calculating machines, which he called a tabulator, to help compile census data. Then, as now, a census was taken each decade but, by the 1880s, the population of the United States had grown so much through immigration that a full-scale analysis of the data by hand was taking seven and a half years. The statisticians soon figured out that, if trends continued, they would run out of time to compile one census before the next one fell due. Fortunately, Hollerith's tabulator was an amazing success: it tallied the entire census in only six weeks and completed the full analysis in just two and a half years. Soon afterward, Hollerith realized his machine had other applications, so he set up the Tabulating Machine Company in 1896 to manufacture it commercially. A few years later, it changed its name to the Computing-Tabulating-Recording (C-T-R) company and then, in 1924, acquired its present name: International Business Machines (IBM).

Photo: Keeping count: Herman Hollerith's late-19th-century census machine (blue, left) could process 12 separate bits of statistical data each minute. Its compact 1940 replacement (red, right), invented by Eugene M. La Boiteaux of the Census Bureau, could work almost five times faster. Photo by Harris & Ewing courtesy of US Library of Congress .

Bush and the bomb

Photo: Dr Vannevar Bush (1890–1974). Picture by Harris & Ewing, courtesy of US Library of Congress .

The history of computing remembers colorful characters like Babbage, but others who played important—if supporting—roles are less well known. At the time when C-T-R was becoming IBM, the world's most powerful calculators were being developed by US government scientist Vannevar Bush (1890–1974). In 1925, Bush made the first of a series of unwieldy contraptions with equally cumbersome names: the New Recording Product Integraph Multiplier. Later, he built a machine called the Differential Analyzer, which used gears, belts, levers, and shafts to represent numbers and carry out calculations in a very physical way, like a gigantic mechanical slide rule. Bush's ultimate calculator was an improved machine named the Rockefeller Differential Analyzer, assembled in 1935 from 320 km (200 miles) of wire and 150 electric motors . Machines like these were known as analog calculators—analog because they stored numbers in a physical form (as so many turns on a wheel or twists of a belt) rather than as digits. Although they could carry out incredibly complex calculations, it took several days of wheel cranking and belt turning before the results finally emerged.

Impressive machines like the Differential Analyzer were only one of several outstanding contributions Bush made to 20th-century technology. Another came as the teacher of Claude Shannon (1916–2001), a brilliant mathematician who figured out how electrical circuits could be linked together to process binary code with Boolean algebra (a way of comparing binary numbers using logic) and thus make simple decisions. During World War II, President Franklin D. Roosevelt appointed Bush chairman first of the US National Defense Research Committee and then director of the Office of Scientific Research and Development (OSRD). In this capacity, he was in charge of the Manhattan Project, the secret $2-billion initiative that led to the creation of the atomic bomb. One of Bush's final wartime contributions was to sketch out, in 1945, an idea for a memory-storing and sharing device called Memex that would later inspire Tim Berners-Lee to invent the World Wide Web . [3] Few outside the world of computing remember Vannevar Bush today—but what a legacy! As a father of the digital computer, an overseer of the atom bomb, and an inspiration for the Web, Bush played a pivotal role in three of the 20th-century's most far-reaching technologies.

Photo: "A gigantic mechanical slide rule": A differential analyzer pictured in 1938. Picture courtesy of and © University of Cambridge Computer Laboratory, published with permission via Wikimedia Commons under a Creative Commons (CC BY 2.0) licence.


The first modern computers

The World War II years were a crucial period in the history of computing, when powerful gargantuan computers began to appear. Just before the outbreak of the war, in 1938, German engineer Konrad Zuse (1910–1995) constructed his Z1, the world's first programmable binary computer, in his parents' living room. [4] The following year, American physicist John Atanasoff (1903–1995) and his assistant, electrical engineer Clifford Berry (1918–1963), built a more elaborate binary machine that they named the Atanasoff Berry Computer (ABC). It was a great advance—1000 times more accurate than Bush's Differential Analyzer. These were the first machines that used electrical switches to store numbers: when a switch was "off", it stored the number zero; flipped over to its other, "on", position, it stored the number one. Hundreds or thousands of switches could thus store a great many binary digits (although binary is much less efficient in this respect than decimal, since it can take up to ten binary digits to store a three-digit decimal number). These machines were digital computers: unlike analog machines, which stored numbers using the positions of wheels and rods, they stored numbers as digits.
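As a quick illustrative check of that storage point (an addition, not from the original article), this tiny Python snippet counts how many binary digits—that is, how many on/off switches—different decimal numbers need:

for n in (9, 99, 255, 999):
    # bit_length() reports how many binary digits the number occupies.
    print(n, "decimal ->", n.bit_length(), "bits ->", format(n, "b"))

# 9   -> 4 bits  -> 1001
# 99  -> 7 bits  -> 1100011
# 255 -> 8 bits  -> 11111111
# 999 -> 10 bits -> 1111100111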

The first large-scale digital computer of this kind appeared in 1944 at Harvard University, built by mathematician Howard Aiken (1900–1973). Sponsored by IBM, it was variously known as the Harvard Mark I or the IBM Automatic Sequence Controlled Calculator (ASCC). A giant of a machine, stretching 15m (50ft) in length, it was like a huge mechanical calculator built into a wall. It must have sounded impressive, because it stored and processed numbers using "clickety-clack" electromagnetic relays (electrically operated magnets that automatically switched lines in telephone exchanges)—no fewer than 3304 of them. Impressive they may have been, but relays suffered from several problems: they were large (that's why the Harvard Mark I had to be so big); they needed quite hefty pulses of power to make them switch; and they were slow (it took time for a relay to flip from "off" to "on" or from 0 to 1).

Photo: An analog computer being used in military research in 1949. Picture courtesy of NASA on the Commons (where you can download a larger version).

Most of the machines developed around this time were intended for military purposes. Like Babbage's never-built mechanical engines, they were designed to calculate artillery firing tables and chew through the other complex chores that were then the lot of military mathematicians. During World War II, the military co-opted thousands of the best scientific minds: recognizing that science would win the war, Vannevar Bush's Office of Scientific Research and Development employed 10,000 scientists from the United States alone. Things were very different in Germany. When Konrad Zuse offered to build his Z2 computer to help the army, they couldn't see the need—and turned him down.

On the Allied side, great minds began to make great breakthroughs. In 1943, a team of mathematicians based at Bletchley Park near London, England (including Alan Turing) built a computer called Colossus to help them crack secret German codes. Colossus was the first fully electronic computer. Instead of relays, it used a better form of switch known as a vacuum tube (also known, especially in Britain, as a valve). The vacuum tube, each one about as big as a person's thumb (earlier ones were very much bigger) and glowing red hot like a tiny electric light bulb, had been invented in 1906 by Lee de Forest (1873–1961), who named it the Audion. This breakthrough earned de Forest his nickname as "the father of radio" because their first major use was in radio receivers , where they amplified weak incoming signals so people could hear them more clearly. [5] In computers such as the ABC and Colossus, vacuum tubes found an alternative use as faster and more compact switches.

Just like the codes it was trying to crack, Colossus was top-secret and its existence wasn't confirmed until after the war ended. As far as most people were concerned, vacuum tubes were pioneered by a more visible computer that appeared in 1946: the Electronic Numerical Integrator And Calculator (ENIAC). The ENIAC's inventors, two scientists from the University of Pennsylvania, John Mauchly (1907–1980) and J. Presper Eckert (1919–1995), were originally inspired by Bush's Differential Analyzer; years later Eckert recalled that ENIAC was the "descendant of Dr Bush's machine." But the machine they constructed was far more ambitious. It contained nearly 18,000 vacuum tubes (nine times more than Colossus), was around 24 m (80 ft) long, and weighed almost 30 tons. ENIAC is generally recognized as the world's first fully electronic, general-purpose, digital computer. Colossus might have qualified for this title too, but it was designed purely for one job (code-breaking); since it couldn't store a program, it couldn't easily be reprogrammed to do other things.

Photo: Sir Maurice Wilkes (left), his collaborator William Renwick, and the early EDSAC-1 electronic computer they built in Cambridge, pictured around 1947/8. Picture courtesy of and © University of Cambridge Computer Laboratory, published with permission via Wikimedia Commons under a Creative Commons (CC BY 2.0) licence.

ENIAC was just the beginning. Its two inventors formed the Eckert Mauchly Computer Corporation in the late 1940s. Working with a brilliant Hungarian mathematician, John von Neumann (1903–1957), who was based at Princeton University, they then designed a better machine called EDVAC (Electronic Discrete Variable Automatic Computer). In a key piece of work, von Neumann helped to define how the machine stored and processed its programs, laying the foundations for how all modern computers operate. [6] After EDVAC, Eckert and Mauchly developed UNIVAC 1 (UNIVersal Automatic Computer) in 1951. They were helped in this task by a young, largely unknown American mathematician and Naval reserve named Grace Murray Hopper (1906–1992), who had originally been employed by Howard Aiken on the Harvard Mark I. Like Herman Hollerith's tabulator over 50 years before, UNIVAC 1 was used for processing data from the US census. It was then manufactured for other users—and became the world's first large-scale commercial computer.

Machines like Colossus, the ENIAC, and the Harvard Mark I compete for significance and recognition in the minds of computer historians. Which one was truly the first great modern computer? All of them and none: these—and several other important machines—evolved our idea of the modern electronic computer during the key period between the late 1930s and the early 1950s. Among those other machines were pioneering computers put together by English academics, notably the Manchester/Ferranti Mark I, built at Manchester University by Frederic Williams (1911–1977) and Thomas Kilburn (1921–2001), and the EDSAC (Electronic Delay Storage Automatic Calculator), built by Maurice Wilkes (1913–2010) at Cambridge University. [7]

Photo: Control panel of the UNIVAC 1, the world's first large-scale commercial computer. Photo by Cory Doctorow published on Flickr in 2020 under a Creative Commons (CC BY-SA 2.0) licence.

The microelectronic revolution

Vacuum tubes were a considerable advance on relay switches, but machines like the ENIAC were notoriously unreliable. The modern term for a problem that holds up a computer program is a "bug." Popular legend has it that this word entered the vocabulary of computer programmers sometime in the 1950s when moths, attracted by the glowing lights of vacuum tubes, flew inside machines like the ENIAC, caused a short circuit, and brought work to a juddering halt. But there were other problems with vacuum tubes too. They consumed enormous amounts of power: the ENIAC used about 2000 times as much electricity as a modern laptop. And they took up huge amounts of space. Military needs were driving the development of machines like the ENIAC, but the sheer size of vacuum tubes had now become a real problem. ABC had used 300 vacuum tubes, Colossus had 2000, and the ENIAC had 18,000. The ENIAC's designers had boasted that its calculating speed was "at least 500 times as great as that of any other existing computing machine." But developing computers that were an order of magnitude more powerful still would have needed hundreds of thousands or even millions of vacuum tubes—which would have been far too costly, unwieldy, and unreliable. So a new technology was urgently required.

The solution appeared in 1947 thanks to three physicists working at Bell Telephone Laboratories (Bell Labs). John Bardeen (1908–1991), Walter Brattain (1902–1987), and William Shockley (1910–1989) were then helping Bell to develop new technology for the American public telephone system, so the electrical signals that carried phone calls could be amplified more easily and carried further. Shockley, who was leading the team, believed he could use semiconductors (materials such as germanium and silicon that allow electricity to flow through them only when they've been treated in special ways) to make a better form of amplifier than the vacuum tube. When his early experiments failed, he set Bardeen and Brattain to work on the task for him. Eventually, in December 1947, they created a new form of amplifier that became known as the point-contact transistor. Bell Labs credited Bardeen and Brattain with the transistor and awarded them a patent. This enraged Shockley and prompted him to invent an even better design, the junction transistor, which has formed the basis of most transistors ever since.

Like vacuum tubes, transistors could be used as amplifiers or as switches. But they had several major advantages. They were a fraction the size of vacuum tubes (typically about as big as a pea), used no power at all unless they were in operation, and were virtually 100 percent reliable. The transistor was one of the most important breakthroughs in the history of computing and it earned its inventors the world's greatest science prize, the 1956 Nobel Prize in Physics . By that time, however, the three men had already gone their separate ways. John Bardeen had begun pioneering research into superconductivity , which would earn him a second Nobel Prize in 1972. Walter Brattain moved to another part of Bell Labs.

William Shockley decided to stick with the transistor, eventually forming his own corporation to develop it further. His decision would have extraordinary consequences for the computer industry. With a small amount of capital, Shockley set about hiring the best brains he could find in American universities, including young electrical engineer Robert Noyce (1927–1990) and research chemist Gordon Moore (1929–). It wasn't long before Shockley's idiosyncratic and bullying management style upset his workers. In 1957, eight of them—including Noyce and Moore—left Shockley Transistor to found a company of their own, Fairchild Semiconductor, just down the road. Thus began the growth of "Silicon Valley," the part of California centered on Palo Alto, where many of the world's leading computer and electronics companies have been based ever since. [8]

It was in Fairchild's California building that the next breakthrough occurred—although, somewhat curiously, it also happened at exactly the same time in the Dallas laboratories of Texas Instruments. In Dallas, a young engineer from Kansas named Jack Kilby (1923–2005) was considering how to improve the transistor. Although transistors were a great advance on vacuum tubes, one key problem remained. Machines that used thousands of transistors still had to be hand wired to connect all these components together. That process was laborious, costly, and error prone. Wouldn't it be better, Kilby reflected, if many transistors could be made in a single package? This prompted him to invent the "monolithic" integrated circuit (IC) , a collection of transistors and other components that could be manufactured all at once, in a block, on the surface of a semiconductor. Kilby's invention was another step forward, but it also had a drawback: the components in his integrated circuit still had to be connected by hand. While Kilby was making his breakthrough in Dallas, unknown to him, Robert Noyce was perfecting almost exactly the same idea at Fairchild in California. Noyce went one better, however: he found a way to include the connections between components in an integrated circuit, thus automating the entire process.

Photo: An integrated circuit from the 1980s. This is an EPROM chip (effectively a forerunner of flash memory , which you could only erase with a blast of ultraviolet light).

Mainframes, minis, and micros

Photo: An IBM 704 mainframe pictured at NASA in 1958. Designed by Gene Amdahl, this scientific number cruncher was the successor to the 701 and helped pave the way to arguably the most important IBM computer of all time, the System/360, which Amdahl also designed. Photo courtesy of NASA .

Photo: The control panel of DEC's classic 1965 PDP-8 minicomputer. Photo by Cory Doctorow published on Flickr in 2020 under a Creative Commons (CC BY-SA 2.0) licence.

Integrated circuits, as much as transistors, helped to shrink computers during the 1960s. In 1943, IBM boss Thomas Watson had reputedly quipped: "I think there is a world market for about five computers." Just two decades later, the company and its competitors had installed around 25,000 large computer systems across the United States. As the 1960s wore on, integrated circuits became increasingly sophisticated and compact. Soon, engineers were speaking of large-scale integration (LSI), in which hundreds of components could be crammed onto a single chip, and then very large-scale integration (VLSI), when the same chip could contain thousands of components.

The logical conclusion of all this miniaturization was that, someday, someone would be able to squeeze an entire computer onto a chip. In 1968, Robert Noyce and Gordon Moore had left Fairchild to establish a new company of their own. With integration very much in their minds, they called it Integrated Electronics or Intel for short. Originally they had planned to make memory chips, but when the company landed an order to make chips for a range of pocket calculators, history headed in a different direction. A couple of their engineers, Federico Faggin (1941–) and Marcian Edward (Ted) Hoff (1937–), realized that instead of making a range of specialist chips for a range of calculators, they could make a universal chip that could be programmed to work in them all. Thus was born the general-purpose, single chip computer or microprocessor—and that brought about the next phase of the computer revolution.

Personal computers

By 1974, Intel had launched a popular microprocessor known as the 8080 and computer hobbyists were soon building home computers around it. The first was the MITS Altair 8800, built by Ed Roberts. With its front panel covered in red LED lights and toggle switches, it was a far cry from modern PCs and laptops. Even so, it sold by the thousand and earned Roberts a fortune. The Altair inspired a Californian electronics wizard named Steve Wozniak (1950–) to develop a computer of his own. "Woz" is often described as the hacker's "hacker"—a technically brilliant and highly creative engineer who pushed the boundaries of computing largely for his own amusement. In the mid-1970s, he was working at the Hewlett-Packard computer company in California, and spending his free time tinkering away as a member of the Homebrew Computer Club in the Bay Area.

After seeing the Altair, Woz used a 6502 microprocessor (made by an Intel rival, Mos Technology) to build a better home computer of his own: the Apple I. When he showed off his machine to his colleagues at the club, they all wanted one too. One of his friends, Steve Jobs (1955–2011), persuaded Woz that they should go into business making the machine. Woz agreed so, famously, they set up Apple Computer Corporation in a garage belonging to Jobs' parents. After selling 175 of the Apple I for the devilish price of $666.66, Woz built a much better machine called the Apple ][ (pronounced "Apple Two"). While the Altair 8800 looked like something out of a science lab, and the Apple I was little more than a bare circuit board, the Apple ][ took its inspiration from such things as Sony televisions and stereos: it had a neat and friendly looking cream plastic case. Launched in April 1977, it was the world's first easy-to-use home "microcomputer." Soon home users, schools, and small businesses were buying the machine in their tens of thousands—at $1298 a time. Two things turned the Apple ][ into a really credible machine for small firms: a disk drive unit, launched in 1978, which made it easy to store data; and a spreadsheet program called VisiCalc, which gave Apple users the ability to analyze that data. In just two and a half years, Apple sold around 50,000 of the machine, quickly accelerating out of Jobs' garage to become one of the world's biggest companies. Dozens of other microcomputers were launched around this time, including the TRS-80 from Radio Shack (Tandy in the UK) and the Commodore PET. [9]

Apple's success selling to businesses came as a great shock to IBM and the other big companies that dominated the computer industry. It didn't take a VisiCalc spreadsheet to figure out that, if the trend continued, upstarts like Apple would undermine IBM's immensely lucrative business market selling "Big Blue" computers. In 1980, IBM finally realized it had to do something and launched a highly streamlined project to save its business. One year later, it released the IBM Personal Computer (PC), based on an Intel 8088 microprocessor, which rapidly reversed the company's fortunes and stole the market back from Apple.

The PC was successful essentially for one reason. All the dozens of microcomputers that had been launched in the 1970s—including the Apple ][—were incompatible. All used different hardware and worked in different ways. Most were programmed using a simple, English-like language called BASIC, but each one used its own flavor of BASIC, which was tied closely to the machine's hardware design. As a result, programs written for one machine would generally not run on another one without a great deal of conversion. Companies who wrote software professionally typically wrote it just for one machine and, consequently, there was no software industry to speak of.

In 1976, Gary Kildall (1942–1994), a teacher and computer scientist, and one of the founders of the Homebrew Computer Club, had figured out a solution to this problem. Kildall wrote an operating system (a computer's fundamental control software) called CP/M that acted as an intermediary between the user's programs and the machine's hardware. With a stroke of genius, Kildall realized that all he had to do was rewrite CP/M so it worked on each different machine. Then all those machines could run identical user programs—without any modification at all—inside CP/M. That would make all the different microcomputers compatible at a stroke. By the early 1980s, Kildall had become a multimillionaire through the success of his invention: the first personal computer operating system. Naturally, when IBM was developing its personal computer, it approached him hoping to put CP/M on its own machine. Legend has it that Kildall was out flying his personal plane when IBM called, so missed out on one of the world's greatest deals. But the truth seems to have been that IBM wanted to buy CP/M outright for just $200,000, while Kildall recognized his product was worth millions more and refused to sell. Instead, IBM turned to a young programmer named Bill Gates (1955–). His then tiny company, Microsoft, rapidly put together an operating system called DOS, based on a product called QDOS (Quick and Dirty Operating System), which they acquired from Seattle Computer Products. Some believe Microsoft and IBM cheated Kildall out of his place in computer history; Kildall himself accused them of copying his ideas. Others think Gates was simply the shrewder businessman. Either way, the IBM PC, powered by Microsoft's operating system, was a runaway success.

Yet IBM's victory was short-lived. Cannily, Bill Gates had sold IBM the rights to one flavor of DOS (PC-DOS) and retained the rights to a very similar version (MS-DOS) for his own use. When other computer manufacturers, notably Compaq and Dell, started making IBM-compatible (or "cloned") hardware, they too came to Gates for the software. IBM charged a premium for machines that carried its badge, but consumers soon realized that PCs were commodities: they contained almost identical components—an Intel microprocessor, for example—no matter whose name they had on the case. As IBM lost market share, the ultimate victors were Microsoft and Intel, who were soon supplying the software and hardware for almost every PC on the planet. Apple, IBM, and Kildall made a great deal of money—but all failed to capitalize decisively on their early success. [10]

Photo: Personal computers threatened companies making large "mainframes" like this one. Picture courtesy of NASA on the Commons (where you can download a larger version).

The user revolution

Fortunately for Apple, it had another great idea. One of the Apple II's strongest suits was its sheer "user-friendliness." For Steve Jobs, developing truly easy-to-use computers became a personal mission in the early 1980s. What truly inspired him was a visit to PARC (Palo Alto Research Center), a cutting-edge computer laboratory then run as a division of the Xerox Corporation. Xerox had started developing computers in the early 1970s, believing they would make paper (and the highly lucrative photocopiers Xerox made) obsolete. One of PARC's research projects was an advanced $40,000 computer called the Xerox Alto. Unlike most microcomputers launched in the 1970s, which were programmed by typing in text commands, the Alto had a desktop-like screen with little picture icons that could be moved around with a mouse: it was the very first graphical user interface (GUI, pronounced "gooey")—an idea conceived by Alan Kay (1940–) and now used in virtually every modern computer. The Alto borrowed some of its ideas, including the mouse , from 1960s computer pioneer Douglas Engelbart (1925–2013).

Photo: During the 1980s, computers started to converge on the same basic "look and feel," largely inspired by the work of pioneers like Alan Kay and Douglas Engelbart. Photographs in the Carol M. Highsmith Archive, courtesy of US Library of Congress , Prints and Photographs Division.

Back at Apple, Jobs launched his own version of the Alto project to develop an easy-to-use computer called PITS (Person In The Street). This machine became the Apple Lisa, launched in January 1983—the first widely available computer with a GUI desktop. With a retail price of $10,000, over three times the cost of an IBM PC, the Lisa was a commercial flop. But it paved the way for a better, cheaper machine called the Macintosh that Jobs unveiled a year later, in January 1984. With its memorable launch ad for the Macintosh inspired by George Orwell's novel 1984 , and directed by Ridley Scott (director of the dystopic movie Blade Runner ), Apple took a swipe at IBM's monopoly, criticizing what it portrayed as the firm's domineering—even totalitarian—approach: Big Blue was really Big Brother. Apple's ad promised a very different vision: "On January 24, Apple Computer will introduce Macintosh. And you'll see why 1984 won't be like '1984'." The Macintosh was a critical success and helped to invent the new field of desktop publishing in the mid-1980s, yet it never came close to challenging IBM's position.

Ironically, Jobs' easy-to-use machine also helped Microsoft to dislodge IBM as the world's leading force in computing. When Bill Gates saw how the Macintosh worked, with its easy-to-use picture-icon desktop, he launched Windows, an upgraded version of his MS-DOS software. Apple saw this as blatant plagiarism and filed a $5.5 billion copyright lawsuit in 1988. Four years later, the case collapsed with Microsoft effectively securing the right to use the Macintosh "look and feel" in all present and future versions of Windows. Microsoft's Windows 95 system, launched three years later, had an easy-to-use, Macintosh-like desktop and MS-DOS running behind the scenes.

Photo: The IBM Blue Gene/P supercomputer at Argonne National Laboratory: one of the world's most powerful computers. Picture courtesy of Argonne National Laboratory published on Wikimedia Commons in 2009 under a Creative Commons Licence .

From nets to the Internet

Standardized PCs running standardized software brought a big benefit for businesses: computers could be linked together into networks to share information. At Xerox PARC in 1973, electrical engineer Bob Metcalfe (1946–) developed a new way of linking computers "through the ether" (empty space) that he called Ethernet. A few years later, Metcalfe left Xerox to form his own company, 3Com, to help companies realize "Metcalfe's Law": computers become useful the more closely connected they are to other people's computers. As more and more companies explored the power of local area networks (LANs), so, as the 1980s progressed, it became clear that there were great benefits to be gained by connecting computers over even greater distances—into so-called wide area networks (WANs).

Photo: Computers aren't what they used to be: they're much less noticeable because they're much more seamlessly integrated into everyday life. Some are "embedded" into household gadgets like coffee makers or televisions . Others travel round in our pockets in our smartphones—essentially pocket computers that we can program simply by downloading "apps" (applications).

Today, the best known WAN is the Internet—a global network of individual computers and LANs that links up hundreds of millions of people. The history of the Internet is another story, but it began in the 1960s when four American universities launched a project to connect their computer systems together to make the first WAN. Later, with funding from the Department of Defense, that network became a bigger project called ARPANET (Advanced Research Projects Agency Network). In the mid-1980s, the US National Science Foundation (NSF) launched its own WAN called NSFNET. The convergence of all these networks produced what we now call the Internet later in the 1980s. Shortly afterward, the power of networking gave British computer programmer Tim Berners-Lee (1955–) his big idea: to combine the power of computer networks with the information-sharing idea Vannevar Bush had proposed in 1945. Thus was born the World Wide Web—an easy way of sharing information over a computer network, which made possible the modern age of cloud computing (where anyone can access vast computing power over the Internet without having to worry about where or how their data is processed). It's Tim Berners-Lee's invention that brings you this potted history of computing today!

And now where?


  • Supercomputers : How do the world's most powerful computers work?

Other websites

There are lots of websites covering computer history. Here are just a few favorites worth exploring!

  • The Computer History Museum : The website of the world's biggest computer museum in California.
  • The Computing Age : A BBC special report into computing past, present, and future.
  • Charles Babbage at the London Science Museum : Lots of information about Babbage and his extraordinary engines. [Archived via the Wayback Machine]
  • IBM History : Many fascinating online exhibits, as well as inside information about the part IBM inventors have played in wider computer history.
  • Wikipedia History of Computing Hardware : covers similar ground to this page.
  • Computer history images : A small but interesting selection of photos.
  • Transistorized! : The history of the invention of the transistor from PBS.
  • Intel Museum : The story of Intel's contributions to computing from the 1970s onward.

There are some superb computer history videos on YouTube and elsewhere; here are three good ones to start you off:

  • The Difference Engine : A great introduction to Babbage's Difference Engine from Doron Swade, one of the world's leading Babbage experts.
  • The ENIAC : A short Movietone news clip about the completion of the world's first programmable electronic computer.
  • A tour of the Computer History Museum : Dag Spicer gives us a tour of the world's most famous computer museum, in California.


Text copyright © Chris Woodford 2006, 2023. All rights reserved. Full copyright notice and terms of use .


The history of computing is both evolution and revolution


Head, Department of Computing & Information Systems, The University of Melbourne

Disclosure statement

Justin Zobel does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.


This month marks the 60th anniversary of the first computer in an Australian university. The University of Melbourne took possession of the machine from CSIRO and on June 14, 1956, the recommissioned CSIRAC was formally switched on. Six decades on, our series Computing turns 60 looks at how things have changed.

It is a truism that computing continues to change our world. It shapes how objects are designed, what information we receive, how and where we work, and who we meet and do business with. And computing changes our understanding of the world around us and the universe beyond.

For example, while computers were initially used in weather forecasting as no more than an efficient way to assemble observations and do calculations, today our understanding of weather is almost entirely mediated by computational models.

Another example is biology. Where once research was done entirely in the lab (or in the wild) and then captured in a model, it often now begins in a predictive model, which then determines what might be explored in the real world.

The transformation that is due to computation is often described as digital disruption . But an aspect of this transformation that can easily be overlooked is that computing has been disrupting itself.

Evolution and revolution

Each wave of new computational technology has tended to lead to new kinds of systems, new ways of creating tools, new forms of data, and so on, which have often overturned their predecessors. What has seemed to be evolution is, in some ways, a series of revolutions.

But the development of computing technologies is more than a chain of innovation – a process that’s been a hallmark of the physical technologies that shape our world.

For example, there is a chain of inspiration from waterwheel, to steam engine, to internal combustion engine. Underlying this is a process of enablement. The industry of steam engine construction yielded the skills, materials and tools used in construction of the first internal combustion engines.

In computing, something richer is happening where new technologies emerge, not only by replacing predecessors, but also by enveloping them. Computing is creating platforms on which it reinvents itself, reaching up to the next platform.

Getting connected

Arguably, the most dramatic of these innovations is the web. During the 1970s and 1980s, there were independent advances in the availability of cheap, fast computing, of affordable disk storage and of networking.


Compute and storage were taken up in personal computers, which at that stage were standalone, used almost entirely for gaming and word processing. At the same time, networking technologies became pervasive in university computer science departments, where they enabled, for the first time, the collaborative development of software.

This was the emergence of a culture of open-source development, in which widely spread communities not only used common operating systems, programming languages and tools, but collaboratively contributed to them.

As networks spread, tools developed in one place could be rapidly promoted, shared and deployed elsewhere. This dramatically changed the notion of software ownership, of how software was designed and created, and of who controlled the environments we use.

The networks themselves became more uniform and interlinked, creating the global internet, a digital traffic infrastructure. Increases in computing power meant there was spare capacity for providing services remotely.

The falling cost of disk meant that system administrators could set aside storage to host repositories that could be accessed globally. The internet was thus used not just for email and chat forums (known then as news groups) but, increasingly, as an exchange mechanism for data and code.

This was in strong contrast to the systems used in business at that time, which were customised, isolated, and rigid.

With hindsight, the confluence of networking, compute and storage at the start of the 1990s, coupled with the open-source culture of sharing, seems almost miraculous. An environment ready for something remarkable, but without even a hint of what that thing might be.

The ‘superhighway’

It was to enhance this environment that Al Gore, then a US senator and soon to be vice president, proposed the “information superhighway” in 1992, before any major commercial or social uses of the internet had appeared.


Meanwhile, in 1990, researchers at CERN, including Tim Berners-Lee, created a system for storing documents and publishing them to the internet, which they called the world wide web.

As knowledge of this system spread on the internet (transmitted by the new model of open-source software systems), people began using it via increasingly sophisticated browsers. They also began to write documents specifically for online publication – that is, web pages.

As web pages became interactive and resources moved online, the web became a platform that has transformed society. But it also transformed computing.

With the emergence of the web came the decline of the importance of the standalone computer, dependent on local storage.

We all connect

The value of these systems is due to another confluence: the arrival on the web of vast numbers of users. For example, without behaviours to learn from, search engines would not work well, so human actions have become part of the system.

There are (contentious) narratives of ever-improving technology, but also an entirely unarguable narrative of computing itself being transformed by becoming so deeply embedded in our daily lives.

This is, in many ways, the essence of big data. Computing is being fed by human data streams: traffic data, airline trips, banking transactions, social media and so on.

The challenges of the discipline have been dramatically changed by this data, and also by the fact that the products of the data (such as traffic control and targeted marketing) have immediate impacts on people.

Software that runs robustly on a single computer is very different from software that interacts rapidly and continuously with the human world. This has created the need for new kinds of technologies and experts, in ways not even remotely anticipated by the researchers who created the technologies that led to this transformation.

Decisions that were once made by hand-coded algorithms are increasingly made by systems that learn from data. Whole fields of study may become obsolete.
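
As a small illustration of the shift, the Python sketch below contrasts the two styles of decision on a toy spam-filtering task; the rule, the messages and the word-count "learning" are all invented for the example and stand in for no particular system.

```python
# A toy contrast, not any real system: the same decision ("is this message
# spam?") made first by a hand-written rule, then by word scores learned
# from a few labelled examples.

def hand_coded(message: str) -> bool:
    # A human wrote this rule directly.
    text = message.lower()
    return "winner" in text or "free money" in text

def learn_word_weights(examples):
    """Score each word by how often it appears in spam versus non-spam."""
    weights = {}
    for text, is_spam in examples:
        for word in text.lower().split():
            weights[word] = weights.get(word, 0) + (1 if is_spam else -1)
    return weights

def learned(message: str, weights) -> bool:
    return sum(weights.get(w, 0) for w in message.lower().split()) > 0

training = [
    ("claim your free prize now", True),
    ("you are a winner claim now", True),
    ("meeting moved to tuesday", False),
    ("lunch on friday", False),
]
weights = learn_word_weights(training)

message = "claim your prize today"
print(hand_coded(message))         # False: the rule's author never anticipated this wording
print(learned(message, weights))   # True: the learned scores generalise from the examples
```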

The discipline does indeed disrupt itself. And as the next wave of technology arrives (immersive environments? digital implants? aware homes?), it will happen again.


The Modern History of Computing

Historically, computers were human clerks who calculated in accordance with effective methods. These human computers did the sorts of calculation nowadays carried out by electronic computers, and many thousands of them were employed in commerce, government, and research establishments. The term computing machine , used increasingly from the 1920s, refers to any machine that does the work of a human computer, i.e., any machine that calculates in accordance with effective methods. During the late 1940s and early 1950s, with the advent of electronic computing machines, the phrase ‘computing machine’ gradually gave way simply to ‘computer’, initially usually with the prefix ‘electronic’ or ‘digital’. This entry surveys the history of these machines.

  • Analog Computers
  • The Universal Turing Machine
  • Electromechanical versus Electronic Computation
  • Turing's Automatic Computing Engine
  • The Manchester Machine
  • ENIAC and EDVAC
  • Other Notable Early Computers
  • High-Speed Memory
  • Works Cited
  • Other Internet Resources
  • Related Entries

Charles Babbage was Lucasian Professor of Mathematics at Cambridge University from 1828 to 1839 (a post formerly held by Isaac Newton). Babbage's proposed Difference Engine was a special-purpose digital computing machine for the automatic production of mathematical tables (such as logarithm tables, tide tables, and astronomical tables). The Difference Engine consisted entirely of mechanical components — brass gear wheels, rods, ratchets, pinions, etc. Numbers were represented in the decimal system by the positions of 10-toothed metal wheels mounted in columns. Babbage exhibited a small working model in 1822. He never completed the full-scale machine that he had designed but did complete several fragments. The largest — one ninth of the complete calculator — is on display in the London Science Museum. Babbage used it to perform serious computational work, calculating various mathematical tables. In 1990, Babbage's Difference Engine No. 2 was finally built from Babbage's designs and is also on display at the London Science Museum.
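
The Difference Engine automated the well-known method of differences: once the starting value and the successive differences of a polynomial have been set on its columns of wheels, every further table entry can be produced by addition alone. The Python sketch below illustrates that arithmetic, not Babbage's mechanism; the quadratic and the register layout are chosen purely for the example.

```python
# An illustration of the method of differences, the arithmetic the Difference
# Engine mechanised. Only repeated addition is used, mirroring the adding
# gear wheels.

def difference_table(initial, steps):
    """initial lists f(0) and its successive differences D1(0), D2(0), ...
    For a degree-n polynomial the n-th difference is constant, so n+1
    numbers determine the whole table."""
    regs = list(initial)                  # one register ('column of wheels') each
    table = []
    for _ in range(steps):
        table.append(regs[0])
        for i in range(len(regs) - 1):    # add each difference into the one above it
            regs[i] += regs[i + 1]
    return table

# f(x) = x*x + x + 41: f(0) = 41, first difference = 2, second difference = 2.
print(difference_table([41, 2, 2], 6))    # [41, 43, 47, 53, 61, 71]
```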

The Swedes Georg and Edvard Scheutz (father and son) constructed a modified version of Babbage's Difference Engine. Three were made, a prototype and two commercial models, one of these being sold to an observatory in Albany, New York, and the other to the Registrar-General's office in London, where it calculated and printed actuarial tables.

Babbage's proposed Analytical Engine, considerably more ambitious than the Difference Engine, was to have been a general-purpose mechanical digital computer. The Analytical Engine was to have had a memory store and a central processing unit (or ‘mill’) and would have been able to select from among alternative actions consequent upon the outcome of its previous actions (a facility nowadays known as conditional branching). The behaviour of the Analytical Engine would have been controlled by a program of instructions contained on punched cards connected together with ribbons (an idea that Babbage had adopted from the Jacquard weaving loom). Babbage emphasised the generality of the Analytical Engine, saying ‘the conditions which enable a finite machine to make calculations of unlimited extent are fulfilled in the Analytical Engine’ (Babbage [1994], p. 97).

Babbage worked closely with Ada Lovelace, daughter of the poet Byron, after whom the modern programming language Ada is named. Lovelace foresaw the possibility of using the Analytical Engine for non-numeric computation, suggesting that the Engine might even be capable of composing elaborate pieces of music.

A large model of the Analytical Engine was under construction at the time of Babbage's death in 1871 but a full-scale version was never built. Babbage's idea of a general-purpose calculating engine was never forgotten, especially at Cambridge, and was on occasion a lively topic of mealtime discussion at the war-time headquarters of the Government Code and Cypher School, Bletchley Park, Buckinghamshire, birthplace of the electronic digital computer.

Analog computers

The earliest computing machines in wide use were not digital but analog. In analog representation, properties of the representational medium ape (or reflect or model) properties of the represented state-of-affairs. (In obvious contrast, the strings of binary digits employed in digital representation do not represent by means of possessing some physical property — such as length — whose magnitude varies in proportion to the magnitude of the property that is being represented.) Analog representations form a diverse class. Some examples: the longer a line on a road map, the longer the road that the line represents; the greater the number of clear plastic squares in an architect's model, the greater the number of windows in the building represented; the higher the pitch of an acoustic depth meter, the shallower the water. In analog computers, numerical quantities are represented by, for example, the angle of rotation of a shaft or a difference in electrical potential. Thus the output voltage of the machine at a time might represent the momentary speed of the object being modelled.

As the case of the architect's model makes plain, analog representation may be discrete in nature (there is no such thing as a fractional number of windows). Among computer scientists, the term ‘analog’ is sometimes used narrowly, to indicate representation of one continuously-valued quantity by another (e.g., speed by voltage). As Brian Cantwell Smith has remarked:

‘Analog’ should … be a predicate on a representation whose structure corresponds to that of which it represents … That continuous representations should historically have come to be called analog presumably betrays the recognition that, at the levels at which it matters to us, the world is more foundationally continuous than it is discrete. (Smith [1991], p. 271)

James Thomson, brother of Lord Kelvin, invented the mechanical wheel-and-disc integrator that became the foundation of analog computation (Thomson [1876]). The two brothers constructed a device for computing the integral of the product of two given functions, and Kelvin described (although did not construct) general-purpose analog machines for integrating linear differential equations of any order and for solving simultaneous linear equations. Kelvin's most successful analog computer was his tide predicting machine, which remained in use at the port of Liverpool until the 1960s. Mechanical analog devices based on the wheel-and-disc integrator were in use during World War I for gunnery calculations. Following the war, the design of the integrator was considerably improved by Hannibal Ford (Ford [1919]).

Stanley Fifer reports that the first semi-automatic mechanical analog computer was built in England by the Manchester firm of Metropolitan Vickers prior to 1930 (Fifer [1961], p. 29); however, I have so far been unable to verify this claim. In 1931, Vannevar Bush, working at MIT, built the differential analyser, the first large-scale automatic general-purpose mechanical analog computer. Bush's design was based on the wheel and disc integrator. Soon copies of his machine were in use around the world (including, at Cambridge and Manchester Universities in England, differential analysers built out of kit-set Meccano, the once popular engineering toy).

It required a skilled mechanic equipped with a lead hammer to set up Bush's mechanical differential analyser for each new job. Subsequently, Bush and his colleagues replaced the wheel-and-disc integrators and other mechanical components by electromechanical, and finally by electronic, devices.

A differential analyser may be conceptualised as a collection of ‘black boxes’ connected together in such a way as to allow considerable feedback. Each box performs a fundamental process, for example addition, multiplication of a variable by a constant, and integration. In setting up the machine for a given task, boxes are connected together so that the desired set of fundamental processes is executed. In the case of electrical machines, this was done typically by plugging wires into sockets on a patch panel (computing machines whose function is determined in this way are referred to as ‘program-controlled’).

Since all the boxes work in parallel, an electronic differential analyser solves sets of equations very quickly. Against this has to be set the cost of massaging the problem to be solved into the form demanded by the analog machine, and of setting up the hardware to perform the desired computation. A major drawback of analog computation is the higher cost, relative to digital machines, of an increase in precision. During the 1960s and 1970s, there was considerable interest in ‘hybrid’ machines, where an analog section is controlled by and programmed via a digital section. However, such machines are now a rarity.
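
To make the "black box" picture concrete, the Python sketch below wires a single integrator box to a "multiply by a constant" box and feeds the output back to the input, a classic analyser set-up for dy/dt = -y. It is a crude digital stand-in, with small time steps in place of continuously rotating shafts, intended only to show that connecting the boxes is the programming.

```python
# A digital stand-in for one differential-analyser set-up: an integrator box
# whose output, passed through a 'multiply by a constant' box, is fed back to
# its own input. Wired this way the machine solves dy/dt = -y. The real
# analyser did this continuously; here the feedback loop is approximated with
# many small time steps.

class Integrator:
    def __init__(self, initial_value):
        self.value = initial_value

    def step(self, rate, dt):
        self.value += rate * dt            # accumulate the incoming rate

def multiplier(k):
    return lambda x: k * x                 # the 'multiply by a constant' box

y = Integrator(initial_value=1.0)          # y(0) = 1
negate = multiplier(-1.0)

dt = 0.001
for _ in range(1000):                      # integrate from t = 0 to t = 1
    y.step(negate(y.value), dt)            # feedback: the integrator's input is -y

print(round(y.value, 3))                   # ~0.368, close to exp(-1)
```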

In 1936, at Cambridge University, Turing invented the principle of the modern computer. He described an abstract digital computing machine consisting of a limitless memory and a scanner that moves back and forth through the memory, symbol by symbol, reading what it finds and writing further symbols (Turing [1936]). The actions of the scanner are dictated by a program of instructions that is stored in the memory in the form of symbols. This is Turing's stored-program concept, and implicit in it is the possibility of the machine operating on and modifying its own program. (In London in 1947, in the course of what was, so far as is known, the earliest public lecture to mention computer intelligence, Turing said, ‘What we want is a machine that can learn from experience’, adding that the ‘possibility of letting the machine alter its own instructions provides the mechanism for this’ (Turing [1947], p. 393).) Turing's computing machine of 1936 is now known simply as the universal Turing machine. Cambridge mathematician Max Newman remarked that right from the start Turing was interested in the possibility of actually building a computing machine of the sort that he had described (Newman in interview with Christopher Evans, in Evans [197?]).
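
A minimal simulation makes the scanner-and-table picture concrete. The Python sketch below is not Turing's own notation: it simply represents the tape as an unbounded mapping from positions to symbols, and the program as a table keyed by the current state and the scanned symbol.

```python
# A minimal sketch of the kind of machine Turing described: an unbounded tape
# of symbols, a scanner that reads and writes one square at a time, and a
# table of instructions (the 'program') saying, for each state and scanned
# symbol, what to write, which way to move, and which state to enter next.

def run_turing_machine(program, tape, state="start"):
    """program: {(state, symbol): (write, move, next_state)}; move is -1, 0 or +1."""
    cells = dict(enumerate(tape))        # sparse, effectively unbounded tape
    head = 0
    while state != "halt":
        symbol = cells.get(head, "_")    # '_' is the blank symbol
        write, move, state = program[(state, symbol)]
        cells[head] = write
        head += move
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# Example program: scan right, flipping 0s and 1s, halt at the first blank.
invert = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
    ("start", "_"): ("_",  0, "halt"),
}
print(run_turing_machine(invert, "10110"))   # prints 01001
```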

From the start of the Second World War Turing was a leading cryptanalyst at the Government Code and Cypher School, Bletchley Park. Here he became familiar with Thomas Flowers' work involving large-scale high-speed electronic switching (described below). However, Turing could not turn to the project of building an electronic stored-program computing machine until the cessation of hostilities in Europe in 1945.

During the wartime years Turing did give considerable thought to the question of machine intelligence. Colleagues at Bletchley Park recall numerous off-duty discussions with him on the topic, and at one point Turing circulated a typewritten report (now lost) setting out some of his ideas. One of these colleagues, Donald Michie (who later founded the Department of Machine Intelligence and Perception at the University of Edinburgh), remembers Turing talking often about the possibility of computing machines (1) learning from experience and (2) solving problems by means of searching through the space of possible solutions, guided by rule-of-thumb principles (Michie in interview with Copeland, 1995). The modern term for the latter idea is ‘heuristic search’, a heuristic being any rule-of-thumb principle that cuts down the amount of searching required in order to find a solution to a problem. At Bletchley Park Turing illustrated his ideas on machine intelligence by reference to chess. Michie recalls Turing experimenting with heuristics that later became common in chess programming (in particular minimax and best-first).

Further information about Turing and the computer, including his wartime work on codebreaking and his thinking about artificial intelligence and artificial life, can be found in Copeland 2004.

With some exceptions (including Babbage's purely mechanical engines, and the finger-powered National Accounting Machine), early digital computing machines were electromechanical. That is to say, their basic components were small, electrically-driven, mechanical switches called ‘relays’. These operate relatively slowly, whereas the basic components of an electronic computer — originally vacuum tubes (valves) — have no moving parts save electrons and so operate extremely fast. Electromechanical digital computing machines were built before and during the second world war by (among others) Howard Aiken at Harvard University, George Stibitz at Bell Telephone Laboratories, Turing at Princeton University and Bletchley Park, and Konrad Zuse in Berlin. To Zuse belongs the honour of having built the first working general-purpose program-controlled digital computer. This machine, later called the Z3, was functioning in 1941. (A program-controlled computer, as opposed to a stored-program computer, is set up for a new task by re-routing wires, by means of plugs etc.)

Relays were too slow and unreliable a medium for large-scale general-purpose digital computation (although Aiken made a valiant effort). It was the development of high-speed digital techniques using vacuum tubes that made the modern computer possible.

The earliest extensive use of vacuum tubes for digital data-processing appears to have been by the engineer Thomas Flowers, working in London at the British Post Office Research Station at Dollis Hill. Electronic equipment designed by Flowers in 1934, for controlling the connections between telephone exchanges, went into operation in 1939, and involved between three and four thousand vacuum tubes running continuously. In 1938–1939 Flowers worked on an experimental electronic digital data-processing system, involving a high-speed data store. Flowers' aim, achieved after the war, was that electronic equipment should replace existing, less reliable, systems built from relays and used in telephone exchanges. Flowers did not investigate the idea of using electronic equipment for numerical calculation, but has remarked that at the outbreak of war with Germany in 1939 he was possibly the only person in Britain who realized that vacuum tubes could be used on a large scale for high-speed digital computation. (See Copeland 2006 for more information on Flowers' work.)

The earliest comparable use of vacuum tubes in the U.S. seems to have been by John Atanasoff at what was then Iowa State College (now University). During the period 1937–1942 Atanasoff developed techniques for using vacuum tubes to perform numerical calculations digitally. In 1939, with the assistance of his student Clifford Berry, Atanasoff began building what is sometimes called the Atanasoff-Berry Computer, or ABC, a small-scale special-purpose electronic digital machine for the solution of systems of linear algebraic equations. The machine contained approximately 300 vacuum tubes. Although the electronic part of the machine functioned successfully, the computer as a whole never worked reliably, errors being introduced by the unsatisfactory binary card-reader. Work was discontinued in 1942 when Atanasoff left Iowa State.

The first fully functioning electronic digital computer was Colossus, used by the Bletchley Park cryptanalysts from February 1944.

From very early in the war the Government Code and Cypher School (GC&CS) was successfully deciphering German radio communications encoded by means of the Enigma system, and by early 1942 about 39,000 intercepted messages were being decoded each month, thanks to electromechanical machines known as ‘bombes’. These were designed by Turing and Gordon Welchman (building on earlier work by Polish cryptanalysts).

During the second half of 1941, messages encoded by means of a totally different method began to be intercepted. This new cipher machine, code-named ‘Tunny’ by Bletchley Park, was broken in April 1942 and current traffic was read for the first time in July of that year. Based on binary teleprinter code, Tunny was used in preference to Morse-based Enigma for the encryption of high-level signals, for example messages from Hitler and members of the German High Command.

The need to decipher this vital intelligence as rapidly as possible led Max Newman to propose in November 1942 (shortly after his recruitment to GC&CS from Cambridge University) that key parts of the decryption process be automated, by means of high-speed electronic counting devices. The first machine designed and built to Newman's specification, known as the Heath Robinson, was relay-based with electronic circuits for counting. (The electronic counters were designed by C.E. Wynn-Williams, who had been using thyratron tubes in counting circuits at the Cavendish Laboratory, Cambridge, since 1932 [Wynn-Williams 1932].) Installed in June 1943, Heath Robinson was unreliable and slow, and its high-speed paper tapes were continually breaking, but it proved the worth of Newman's idea. Flowers recommended that an all-electronic machine be built instead, but he received no official encouragement from GC&CS. Working independently at the Post Office Research Station at Dollis Hill, Flowers quietly got on with constructing the world's first large-scale programmable electronic digital computer. Colossus I was delivered to Bletchley Park in January 1944.

By the end of the war there were ten Colossi working round the clock at Bletchley Park. From a cryptanalytic viewpoint, a major difference between the prototype Colossus I and the later machines was the addition of the so-called Special Attachment, following a key discovery by cryptanalysts Donald Michie and Jack Good. This broadened the function of Colossus from ‘wheel setting’ — i.e., determining the settings of the encoding wheels of the Tunny machine for a particular message, given the ‘patterns’ of the wheels — to ‘wheel breaking’, i.e., determining the wheel patterns themselves. The wheel patterns were eventually changed daily by the Germans on each of the numerous links between the German Army High Command and Army Group commanders in the field. By 1945 there were as many as 30 links in total. About ten of these were broken and read regularly.

Colossus I contained approximately 1600 vacuum tubes and each of the subsequent machines approximately 2400 vacuum tubes. Like the smaller ABC, Colossus lacked two important features of modern computers. First, it had no internally stored programs. To set it up for a new task, the operator had to alter the machine's physical wiring, using plugs and switches. Second, Colossus was not a general-purpose machine, being designed for a specific cryptanalytic task involving counting and Boolean operations.

F.H. Hinsley, official historian of GC&CS, has estimated that the war in Europe was shortened by at least two years as a result of the signals intelligence operation carried out at Bletchley Park, in which Colossus played a major role. Most of the Colossi were destroyed once hostilities ceased. Some of the electronic panels ended up at Newman's Computing Machine Laboratory in Manchester (see below), all trace of their original use having been removed. Two Colossi were retained by GC&CS (renamed GCHQ following the end of the war). The last Colossus is believed to have stopped running in 1960.

Those who knew of Colossus were prohibited by the Official Secrets Act from sharing their knowledge. Until the 1970s, few had any idea that electronic computation had been used successfully during the second world war. In 1970 and 1975, respectively, Good and Michie published notes giving the barest outlines of Colossus. By 1983, Flowers had received clearance from the British Government to publish a partial account of the hardware of Colossus I. Details of the later machines and of the Special Attachment, the uses to which the Colossi were put, and the cryptanalytic algorithms that they ran, have only recently been declassified. (For the full account of Colossus and the attack on Tunny see Copeland 2006.)

To those acquainted with the universal Turing machine of 1936, and the associated stored-program concept, Flowers' racks of digital electronic equipment were proof of the feasibility of using large numbers of vacuum tubes to implement a high-speed general-purpose stored-program computer. The war over, Newman lost no time in establishing the Royal Society Computing Machine Laboratory at Manchester University for precisely that purpose. A few months after his arrival at Manchester, Newman wrote as follows to the Princeton mathematician John von Neumann (February 1946):

I am … hoping to embark on a computing machine section here, having got very interested in electronic devices of this kind during the last two or three years. By about eighteen months ago I had decided to try my hand at starting up a machine unit when I got out. … I am of course in close touch with Turing.

Turing and Newman were thinking along similar lines. In 1945 Turing joined the National Physical Laboratory (NPL) in London, his brief to design and develop an electronic stored-program digital computer for scientific work. (Artificial Intelligence was not far from Turing's thoughts: he described himself as ‘building a brain’ and remarked in a letter that he was ‘more interested in the possibility of producing models of the action of the brain than in the practical applications to computing’.) John Womersley, Turing's immediate superior at NPL, christened Turing's proposed machine the Automatic Computing Engine, or ACE, in homage to Babbage's Difference Engine and Analytical Engine.

Turing's 1945 report ‘Proposed Electronic Calculator’ gave the first relatively complete specification of an electronic stored-program general-purpose digital computer. The report is reprinted in full in Copeland 2005.

The first electronic stored-program digital computer to be proposed in the U.S. was the EDVAC (see below). The ‘First Draft of a Report on the EDVAC’ (May 1945), composed by von Neumann, contained little engineering detail, in particular concerning electronic hardware (owing to restrictions in the U.S.). Turing's ‘Proposed Electronic Calculator’, on the other hand, supplied detailed circuit designs and specifications of hardware units, specimen programs in machine code, and even an estimate of the cost of building the machine (£11,200). ACE and EDVAC differed fundamentally from one another; for example, ACE employed distributed processing, while EDVAC had a centralised structure.

Turing saw that speed and memory were the keys to computing. Turing's colleague at NPL, Jim Wilkinson, observed that Turing ‘was obsessed with the idea of speed on the machine’ [Copeland 2005, p. 2]. Turing's design had much in common with today's RISC architectures and it called for a high-speed memory of roughly the same capacity as an early Macintosh computer (enormous by the standards of his day). Had Turing's ACE been built as planned it would have been in a different league from the other early computers. However, progress on Turing's Automatic Computing Engine ran slowly, due to organisational difficulties at NPL, and in 1948 a ‘very fed up’ Turing (Robin Gandy's description, in interview with Copeland, 1995) left NPL for Newman's Computing Machine Laboratory at Manchester University. It was not until May 1950 that a small pilot model of the Automatic Computing Engine, built by Wilkinson, Edward Newman, Mike Woodger, and others, first executed a program. With an operating speed of 1 MHz, the Pilot Model ACE was for some time the fastest computer in the world.

Sales of DEUCE, the production version of the Pilot Model ACE, were buoyant — confounding the suggestion, made in 1946 by the Director of the NPL, Sir Charles Darwin, that ‘it is very possible that … one machine would suffice to solve all the problems that are demanded of it from the whole country’ [Copeland 2005, p. 4]. The fundamentals of Turing's ACE design were employed by Harry Huskey (at Wayne State University, Detroit) in the Bendix G15 computer (Huskey in interview with Copeland, 1998). The G15 was arguably the first personal computer; over 400 were sold worldwide. DEUCE and the G15 remained in use until about 1970. Another computer deriving from Turing's ACE design, the MOSAIC, played a role in Britain's air defences during the Cold War period; other derivatives include the Packard-Bell PB250 (1961). (More information about these early computers is given in [Copeland 2005].)

The earliest general-purpose stored-program electronic digital computer to work was built in Newman's Computing Machine Laboratory at Manchester University. The Manchester ‘Baby’, as it became known, was constructed by the engineers F.C. Williams and Tom Kilburn, and performed its first calculation on 21 June 1948. The tiny program, stored on the face of a cathode ray tube, was just seventeen instructions long. A much enlarged version of the machine, with a programming system designed by Turing, became the world's first commercially available computer, the Ferranti Mark I. The first to be completed was installed at Manchester University in February 1951; in all about ten were sold, in Britain, Canada, Holland and Italy.

The fundamental logico-mathematical contributions by Turing and Newman to the triumph at Manchester have been neglected, and the Manchester machine is nowadays remembered as the work of Williams and Kilburn. Indeed, Newman's role in the development of computers has never been sufficiently emphasised (due perhaps to his thoroughly self-effacing way of relating the relevant events).

It was Newman who, in a lecture in Cambridge in 1935, introduced Turing to the concept that led directly to the Turing machine: Newman defined a constructive process as one that a machine can carry out (Newman in interview with Evans, op. cit.). As a result of his knowledge of Turing's work, Newman became interested in the possibilities of computing machinery in, as he put it, ‘a rather theoretical way’. It was not until Newman joined GC&CS in 1942 that his interest in computing machinery suddenly became practical, with his realisation that the attack on Tunny could be mechanised. During the building of Colossus, Newman tried to interest Flowers in Turing's 1936 paper (the birthplace of the stored-program concept), but Flowers did not make much of Turing's arcane notation. There is no doubt that by 1943, Newman had firmly in mind the idea of using electronic technology in order to construct a stored-program general-purpose digital computing machine.

In July of 1946 (the month in which the Royal Society approved Newman's application for funds to found the Computing Machine Laboratory), Freddie Williams, working at the Telecommunications Research Establishment, Malvern, began the series of experiments on cathode ray tube storage that was to lead to the Williams tube memory. Williams, until then a radar engineer, explains how it was that he came to be working on the problem of computer memory:

[O]nce [the German Armies] collapsed … nobody was going to care a toss about radar, and people like me … were going to be in the soup unless we found something else to do. And computers were in the air. Knowing absolutely nothing about them I latched onto the problem of storage and tackled that. (Quoted in Bennett 1976.)

Newman learned of Williams' work, and with the able help of Patrick Blackett, Langworthy Professor of Physics at Manchester and one of the most powerful figures in the University, was instrumental in the appointment of the 35 year old Williams to the recently vacated Chair of Electro-Technics at Manchester. (Both were members of the appointing committee (Kilburn in interview with Copeland, 1997).) Williams immediately had Kilburn, his assistant at Malvern, seconded to Manchester. To take up the story in Williams' own words:

[N]either Tom Kilburn nor I knew the first thing about computers when we arrived in Manchester University. We'd had enough explained to us to understand what the problem of storage was and what we wanted to store, and that we'd achieved, so the point now had been reached when we'd got to find out about computers … Newman explained the whole business of how a computer works to us. (F.C. Williams in interview with Evans [1976])

Elsewhere Williams is explicit concerning Turing's role and gives something of the flavour of the explanation that he and Kilburn received:

Tom Kilburn and I knew nothing about computers, but a lot about circuits. Professor Newman and Mr A.M. Turing … knew a lot about computers and substantially nothing about electronics. They took us by the hand and explained how numbers could live in houses with addresses and how if they did they could be kept track of during a calculation. (Williams [1975], p. 328)

It seems that Newman must have used much the same words with Williams and Kilburn as he did in an address to the Royal Society on 4th March 1948:

Professor Hartree … has recalled that all the essential ideas of the general-purpose calculating machines now being made are to be found in Babbage's plans for his analytical engine. In modern times the idea of a universal calculating machine was independently introduced by Turing … [T]he machines now being made in America and in this country … [are] in certain general respects … all similar. There is provision for storing numbers, say in the scale of 2, so that each number appears as a row of, say, forty 0's and 1's in certain places or "houses" in the machine. … Certain of these numbers, or "words" are read, one after another, as orders. In one possible type of machine an order consists of four numbers, for example 11, 13, 27, 4. The number 4 signifies "add", and when control shifts to this word the "houses" H11 and H13 will be connected to the adder as inputs, and H27 as output. The numbers stored in H11 and H13 pass through the adder, are added, and the sum is passed on to H27. The control then shifts to the next order. In most real machines the process just described would be done by three separate orders, the first bringing [H11] (=content of H11) to a central accumulator, the second adding [H13] into the accumulator, and the third sending the result to H27; thus only one address would be required in each order. … A machine with storage, with this automatic-telephone-exchange arrangement and with the necessary adders, subtractors and so on, is, in a sense, already a universal machine. (Newman [1948], pp. 271–272)

Following this explanation of Turing's three-address concept (source 1, source 2, destination, function) Newman went on to describe program storage (‘the orders shall be in a series of houses X1, X2, …’) and conditional branching. He then summed up:

From this highly simplified account it emerges that the essential internal parts of the machine are, first, a storage for numbers (which may also be orders). … Secondly, adders, multipliers, etc. Thirdly, an "automatic telephone exchange" for selecting "houses", connecting them to the arithmetic organ, and writing the answers in other prescribed houses. Finally, means of moving control at any stage to any chosen order, if a certain condition is satisfied, otherwise passing to the next order in the normal sequence. Besides these there must be ways of setting up the machine at the outset, and extracting the final answer in useable form. (Newman [1948], pp. 273–4)

In a letter written in 1972 Williams described in some detail what he and Kilburn were told by Newman:

About the middle of the year [1946] the possibility of an appointment at Manchester University arose and I had a talk with Professor Newman who was already interested in the possibility of developing computers and had acquired a grant from the Royal Society of £30,000 for this purpose. Since he understood computers and I understood electronics the possibilities of fruitful collaboration were obvious. I remember Newman giving us a few lectures in which he outlined the organisation of a computer in terms of numbers being identified by the address of the house in which they were placed and in terms of numbers being transferred from this address, one at a time, to an accumulator where each entering number was added to what was already there. At any time the number in the accumulator could be transferred back to an assigned address in the store and the accumulator cleared for further use. The transfers were to be effected by a stored program in which a list of instructions was obeyed sequentially. Ordered progress through the list could be interrupted by a test instruction which examined the sign of the number in the accumulator. Thereafter operation started from a new point in the list of instructions. This was the first information I received about the organisation of computers. … Our first computer was the simplest embodiment of these principles, with the sole difference that it used a subtracting rather than an adding accumulator. (Letter from Williams to Randell, 1972; in Randell [1972], p. 9)
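
The organisation Newman outlined is easy to mimic in a few lines of modern code. The Python sketch below is a loose illustration, not the Manchester machine's actual order code: numbers live in addressed "houses", a subtracting accumulator does the arithmetic (as in Williams' account), instructions are obeyed in sequence, and a test instruction branches on the sign of the accumulator.

```python
# A loose sketch of the organisation Newman described, not a real order code:
# numbers live in addressed 'houses' (the store), a subtracting accumulator
# does the arithmetic, instructions are obeyed in sequence, and a test
# instruction branches when the accumulator is negative.

def run(program, store):
    acc, pc = 0, 0
    while True:
        op, addr = program[pc]
        pc += 1
        if op == "LOADNEG":   acc = -store[addr]             # acc := -[addr]
        elif op == "SUB":     acc -= store[addr]             # acc := acc - [addr]
        elif op == "STORE":   store[addr] = acc              # [addr] := acc
        elif op == "TESTNEG": pc = addr if acc < 0 else pc   # the 'test instruction'
        elif op == "STOP":    return store

# store[2] := store[0] + store[1]; if the sum is negative, store 0 instead.
program = [
    ("LOADNEG", 0), ("SUB", 1), ("STORE", 2),   # store[2] = -(a + b)
    ("LOADNEG", 2), ("STORE", 2),               # store[2] = a + b
    ("TESTNEG", 7),                             # sum negative? jump past the STOP
    ("STOP", 0),
    ("LOADNEG", 3), ("STORE", 2), ("STOP", 0),  # store[2] = 0 (house 3 holds 0)
]
print(run(program, {0: 7, 1: 5, 2: 0, 3: 0})[2])    # 12
print(run(program, {0: 7, 1: -9, 2: 0, 3: 0})[2])   # 0
```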

Turing's early input to the developments at Manchester, hinted at by Williams in his above-quoted reference to Turing, may have been via the lectures on computer design that Turing and Wilkinson gave in London during the period December 1946 to February 1947 (Turing and Wilkinson [1946–7]). The lectures were attended by representatives of various organisations planning to use or build an electronic computer. Kilburn was in the audience (Bowker and Giordano [1993]). (Kilburn usually said, when asked from where he obtained his basic knowledge of the computer, that he could not remember (letter from Brian Napper to Copeland, 2002); for example, in a 1992 interview he said: ‘Between early 1945 and early 1947, in that period, somehow or other I knew what a digital computer was … Where I got this knowledge from I've no idea’ (Bowker and Giordano [1993], p. 19).)

Whatever role Turing's lectures may have played in informing Kilburn, there is little doubt that credit for the Manchester computer — called the ‘Newman-Williams machine’ in a contemporary document (Huskey 1947) — belongs not only to Williams and Kilburn but also to Newman, and that the influence on Newman of Turing's 1936 paper was crucial, as was the influence of Flowers' Colossus.

The first working AI program, a draughts (checkers) player written by Christopher Strachey, ran on the Ferranti Mark I in the Manchester Computing Machine Laboratory. Strachey (at the time a teacher at Harrow School and an amateur programmer) wrote the program with Turing's encouragement and utilising the latter's recently completed Programmers' Handbook for the Ferranti. (Strachey later became Director of the Programming Research Group at Oxford University.) By the summer of 1952, the program could, Strachey reported, ‘play a complete game of draughts at a reasonable speed’. (Strachey's program formed the basis for Arthur Samuel's well-known checkers program.) The first chess-playing program, also, was written for the Manchester Ferranti, by Dietrich Prinz; the program first ran in November 1951. Designed for solving simple problems of the mate-in-two variety, the program would examine every possible move until a solution was found. Turing started to program his ‘Turochamp’ chess-player on the Ferranti Mark I, but never completed the task. Unlike Prinz's program, the Turochamp could play a complete game (when hand-simulated) and operated not by exhaustive search but under the guidance of heuristics.

The first fully functioning electronic digital computer to be built in the U.S. was ENIAC, constructed at the Moore School of Electrical Engineering, University of Pennsylvania, for the Army Ordnance Department, by J. Presper Eckert and John Mauchly. Completed in 1945, ENIAC was somewhat similar to the earlier Colossus, but considerably larger and more flexible (although far from general-purpose). The primary function for which ENIAC was designed was the calculation of tables used in aiming artillery. ENIAC was not a stored-program computer, and setting it up for a new job involved reconfiguring the machine by means of plugs and switches. For many years, ENIAC was believed to have been the first functioning electronic digital computer, Colossus being unknown to all but a few.

In 1944, John von Neumann joined the ENIAC group. He had become ‘intrigued’ (Goldstine's word, [1972], p. 275) with Turing's universal machine while Turing was at Princeton University during 1936–1938. At the Moore School, von Neumann emphasised the importance of the stored-program concept for electronic computing, including the possibility of allowing the machine to modify its own program in useful ways while running (for example, in order to control loops and branching). Turing's paper of 1936 (‘On Computable Numbers, with an Application to the Entscheidungsproblem’) was required reading for members of von Neumann's post-war computer project at the Institute for Advanced Study, Princeton University (letter from Julian Bigelow to Copeland, 2002; see also Copeland [2004], p. 23). Eckert appears to have realised independently, and prior to von Neumann's joining the ENIAC group, that the way to take full advantage of the speed at which data is processed by electronic circuits is to place suitably encoded instructions for controlling the processing in the same high-speed storage devices that hold the data itself (documented in Copeland [2004], pp. 26–7). In 1945, while ENIAC was still under construction, von Neumann produced a draft report, mentioned previously, setting out the ENIAC group's ideas for an electronic stored-program general-purpose digital computer, the EDVAC (von Neumann [1945]). The EDVAC was completed six years later, but not by its originators, who left the Moore School to build computers elsewhere. Lectures held at the Moore School in 1946 on the proposed EDVAC were widely attended and contributed greatly to the dissemination of the new ideas.

Von Neumann was a prestigious figure and he made the concept of a high-speed stored-program digital computer widely known through his writings and public addresses. As a result of his high profile in the field, it became customary, although historically inappropriate, to refer to electronic stored-program digital computers as ‘von Neumann machines’.

The Los Alamos physicist Stanley Frankel, responsible with von Neumann and others for mechanising the large-scale calculations involved in the design of the atomic bomb, has described von Neumann's view of the importance of Turing's 1936 paper, in a letter:

I know that in or about 1943 or ‘44 von Neumann was well aware of the fundamental importance of Turing's paper of 1936 … Von Neumann introduced me to that paper and at his urging I studied it with care. Many people have acclaimed von Neumann as the "father of the computer" (in a modern sense of the term) but I am sure that he would never have made that mistake himself. He might well be called the midwife, perhaps, but he firmly emphasized to me, and to others I am sure, that the fundamental conception is owing to Turing, in so far as not anticipated by Babbage … Both Turing and von Neumann, of course, also made substantial contributions to the "reduction to practice" of these concepts but I would not regard these as comparable in importance with the introduction and explication of the concept of a computer able to store in its memory its program of activities and of modifying that program in the course of these activities. (Quoted in Randell [1972], p. 10)

Other notable early stored-program electronic digital computers were:

  • EDSAC, 1949, built at Cambridge University by Maurice Wilkes
  • BINAC, 1949, built by Eckert's and Mauchly's Electronic Control Co., Philadelphia (opinions differ over whether BINAC ever actually worked)
  • Whirlwind I, 1949, Digital Computer Laboratory, Massachusetts Institute of Technology, Jay Forrester
  • SEAC, 1950, US Bureau of Standards Eastern Division, Washington D.C., Samuel Alexander, Ralph Slutz
  • SWAC, 1950, US Bureau of Standards Western Division, Institute for Numerical Analysis, University of California at Los Angeles, Harry Huskey
  • UNIVAC, 1951, Eckert-Mauchly Computer Corporation, Philadelphia (the first computer to be available commercially in the U.S.)
  • the IAS computer, 1952, Institute for Advanced Study, Princeton University, Julian Bigelow, Arthur Burks, Herman Goldstine, von Neumann, and others (thanks to von Neumann's publishing the specifications of the IAS machine, it became the model for a group of computers known as the Princeton Class machines; the IAS computer was also a strong influence on the IBM 701)
  • IBM 701, 1952, International Business Machine's first mass-produced electronic stored-program computer.

The EDVAC and ACE proposals both advocated the use of mercury-filled tubes, called ‘delay lines’, for high-speed internal memory. This form of memory is known as acoustic memory. Delay lines had initially been developed for echo cancellation in radar; the idea of using them as memory devices originated with Eckert at the Moore School. Here is Turing's description:

It is proposed to build "delay line" units consisting of mercury … tubes about 5′ long and 1″ in diameter in contact with a quartz crystal at each end. The velocity of sound in … mercury … is such that the delay will be 1.024 ms. The information to be stored may be considered to be a sequence of 1024 ‘digits’ (0 or 1) … These digits will be represented by a corresponding sequence of pulses. The digit 0 … will be represented by the absence of a pulse at the appropriate time, the digit 1 … by its presence. This series of pulses is impressed on the end of the line by one piezo-crystal, it is transmitted down the line in the form of supersonic waves, and is reconverted into a varying voltage by the crystal at the far end. This voltage is amplified sufficiently to give an output of the order of 10 volts peak to peak and is used to gate a standard pulse generated by the clock. This pulse may be again fed into the line by means of the transmitting crystal, or we may feed in some altogether different signal. We also have the possibility of leading the gated pulse to some other part of the calculator, if we have need of that information at the time. Making use of the information does not of course preclude keeping it also. (Turing [1945], p. 375)

Mercury delay line memory was used in EDSAC, BINAC, SEAC, Pilot Model ACE, EDVAC, DEUCE, and full-scale ACE (1958). The chief advantage of the delay line as a memory medium was, as Turing put it, that delay lines were "already a going concern" (Turing [1947], p. 380). The fundamental disadvantages of the delay line were that random access is impossible and, moreover, the time taken for an instruction, or number, to emerge from a delay line depends on where in the line it happens to be.
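
The character of delay-line storage is easy to model. In the Python sketch below (the eight-word line and the word labels are invented for illustration), the store is a recirculating queue: a word can only be read at the instant it emerges, so the waiting time depends entirely on where in the line it happens to be. Turing's own figures, 1024 digits circulating in 1.024 ms, imply one digit per microsecond, the 1 MHz rate at which the Pilot Model ACE ran.

```python
# A sketch of delay-line storage as a recirculating queue: one word emerges
# per tick and is re-fed into the other end, and a word can only be read or
# rewritten at the moment it emerges.

from collections import deque

class DelayLine:
    def __init__(self, words):
        self.line = deque(words)          # contents circulating through the mercury

    def tick(self):
        """One word emerges at the far end and is re-fed into the near end."""
        self.line.rotate(-1)

    def read(self, index):
        """Wait for word `index` to emerge; return (value, ticks waited)."""
        waited = 0
        while self.line[0][0] != index:
            self.tick()
            waited += 1
        return self.line[0][1], waited

line = DelayLine([(i, f"word-{i}") for i in range(8)])   # a toy 8-word line
print(line.read(1))   # ('word-1', 1): emerges almost immediately
print(line.read(0))   # ('word-0', 7): just missed, so wait nearly a full circulation
```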

In order to minimize waiting-time, Turing arranged for instructions to be stored not in consecutive positions in the delay line, but in relative positions selected by the programmer in such a way that each instruction would emerge at exactly the time it was required, in so far as this was possible. Each instruction contained a specification of the location of the next. This system subsequently became known as ‘optimum coding’. It was an integral feature of every version of the ACE design. Optimum coding made for difficult and untidy programming, but the advantage in terms of speed was considerable. Thanks to optimum coding, the Pilot Model ACE was able to do a floating point multiplication in 3 milliseconds (Wilkes's EDSAC required 4.5 milliseconds to perform a single fixed point multiplication).
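
The gain from optimum coding can be illustrated with a toy timing model; the line length, execution time and instruction count below are invented for the example and are not ACE's real figures. If instructions are stored in consecutive positions, each fetch arrives just after its word has gone past and must wait most of a revolution; if each instruction is placed the right number of slots ahead of its predecessor, it emerges exactly when it is wanted.

```python
# A toy timing model of 'optimum coding'. One word emerges from the N-word
# line every tick; after an instruction emerges, the machine executes it for
# EXEC ticks and then wants the next one.

N, EXEC, COUNT = 32, 3, 10          # words in the line, execution time, instruction count

def total_wait(slots):
    """Total ticks spent waiting for instructions stored at the given slots."""
    time, waited = 0, 0
    for slot in slots:
        delay = (slot - time) % N   # the word in `slot` emerges whenever time % N == slot
        waited += delay
        time += delay + EXEC        # wait for the instruction, then execute it
    return waited

consecutive = [k % N for k in range(COUNT)]            # instructions in adjacent slots
optimum     = [(k * EXEC) % N for k in range(COUNT)]   # each placed EXEC slots further on

print(total_wait(consecutive))   # 270: each fetch just misses and waits most of a revolution
print(total_wait(optimum))       # 0: every instruction emerges exactly when it is wanted
```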

In the Williams tube or electrostatic memory, previously mentioned, a two-dimensional rectangular array of binary digits was stored on the face of a commercially-available cathode ray tube. Access to data was immediate. Williams tube memories were employed in the Manchester series of machines, SWAC, the IAS computer, and the IBM 701, and a modified form of Williams tube in Whirlwind I (until replacement by magnetic core in 1953).

Drum memories, in which data was stored magnetically on the surface of a metal cylinder, were developed on both sides of the Atlantic. The initial idea appears to have been Eckert's. The drum provided reasonably large quantities of medium-speed memory and was used to supplement a high-speed acoustic or electrostatic memory. In 1949, the Manchester computer was successfully equipped with a drum memory; this was constructed by the Manchester engineers on the model of a drum developed by Andrew Booth at Birkbeck College, London.

The final major event in the early history of electronic computation was the development of magnetic core memory. Jay Forrester realised that the hysteresis properties of magnetic core (normally used in transformers) lent themselves to the implementation of a three-dimensional solid array of randomly accessible storage points. In 1949, at the Massachusetts Institute of Technology, he began to investigate this idea empirically. Forrester's early experiments with metallic core soon led him to develop the superior ferrite core memory. A computer similar to the Whirlwind I was then built as a test vehicle for a ferrite core memory; this Memory Test Computer was completed in 1953. (The Memory Test Computer was used in 1954 for the first simulations of neural networks, by Belmont Farley and Wesley Clark of MIT's Lincoln Laboratory; see Copeland and Proudfoot [1996].)

Once the absolute reliability, relative cheapness, high capacity and permanent life of ferrite core memory became apparent, core soon replaced other forms of high-speed memory. The IBM 704 and 705 computers (announced in May and October 1954, respectively) brought core memory into wide use.

Works Cited

  • Babbage, C. (ed. by Campbell-Kelly, M.), 1994, Passages from the Life of a Philosopher, New Brunswick: Rutgers University Press.
  • Bennett, S., 1976, ‘F.C. Williams: his contribution to the development of automatic control’, National Archive for the History of Computing, University of Manchester, England. (A typescript based on interviews with Williams in 1976.)
  • Bowker, G., and Giordano, R., 1993, ‘Interview with Tom Kilburn’, Annals of the History of Computing, 15: 17–32.
  • Copeland, B.J. (ed.), 2004, The Essential Turing, Oxford University Press.
  • Copeland, B.J. (ed.), 2005, Alan Turing's Automatic Computing Engine: The Master Codebreaker's Struggle to Build the Modern Computer, Oxford University Press.
  • Copeland, B.J., and others, 2006, Colossus: The Secrets of Bletchley Park's Codebreaking Computers, Oxford University Press.
  • Copeland, B.J., and Proudfoot, D., 1996, ‘On Alan Turing's Anticipation of Connectionism’, Synthese, 108: 361–377.
  • Evans, C., 197?, interview with M.H.A. Newman, in ‘The Pioneers of Computing: an Oral History of Computing’, London: Science Museum.
  • Fifer, S., 1961, Analog Computation: Theory, Techniques, Applications, New York: McGraw-Hill.
  • Ford, H., 1919, ‘Mechanical Movement’, Official Gazette of the United States Patent Office, October 7, 1919: 48.
  • Goldstine, H., 1972, The Computer from Pascal to von Neumann, Princeton University Press.
  • Huskey, H.D., 1947, ‘The State of the Art in Electronic Digital Computing in Britain and the United States’, in Copeland [2005].
  • Newman, M.H.A., 1948, ‘General Principles of the Design of All-Purpose Computing Machines’, Proceedings of the Royal Society of London (Series A), 195: 271–274.
  • Randell, B., 1972, ‘On Alan Turing and the Origins of Digital Computers’, in Meltzer, B., and Michie, D. (eds), Machine Intelligence 7, Edinburgh: Edinburgh University Press.
  • Smith, B.C., 1991, ‘The Owl and the Electric Encyclopaedia’, Artificial Intelligence, 47: 251–288.
  • Thomson, J., 1876, ‘On an Integrating Machine Having a New Kinematic Principle’, Proceedings of the Royal Society of London, 24: 262–265.
  • Turing, A.M., 1936, ‘On Computable Numbers, with an Application to the Entscheidungsproblem’, Proceedings of the London Mathematical Society (Series 2), 42 (1936–37): 230–265; reprinted in The Essential Turing (Copeland [2004]).
  • Turing, A.M., 1945, ‘Proposed Electronic Calculator’, in Alan Turing's Automatic Computing Engine (Copeland [2005]).
  • Turing, A.M., 1947, ‘Lecture on the Automatic Computing Engine’, in The Essential Turing (Copeland [2004]).
  • Turing, A.M., and Wilkinson, J.H., 1946–7, ‘The Turing-Wilkinson Lecture Series (1946–7)’, in Alan Turing's Automatic Computing Engine (Copeland [2005]).
  • von Neumann, J., 1945, ‘First Draft of a Report on the EDVAC’, in Stern, N., From ENIAC to UNIVAC: An Appraisal of the Eckert-Mauchly Computers, Bedford, Mass.: Digital Press (1981), pp. 181–246.
  • Williams, F.C., 1975, ‘Early Computers at Manchester University’, The Radio and Electronic Engineer, 45: 237–331.
  • Wynn-Williams, C.E., 1932, ‘A Thyratron "Scale of Two" Automatic Counter’, Proceedings of the Royal Society of London (Series A), 136: 312–324.

Further Reading

  • Copeland, B.J., 2004, ‘Colossus — Its Origins and Originators’, Annals of the History of Computing, 26: 38–45.
  • Metropolis, N., Howlett, J., and Rota, G.C. (eds), 1980, A History of Computing in the Twentieth Century, New York: Academic Press.
  • Randell, B. (ed.), 1982, The Origins of Digital Computers: Selected Papers, Berlin: Springer-Verlag.
  • Williams, M.R., 1997, A History of Computing Technology, Los Alamitos: IEEE Computer Society Press.
Other Internet Resources

  • The Turing Archive for the History of Computing
  • The Alan Turing Home Page
  • Australian Computer Museum Society
  • The Bletchley Park Home Page
  • Charles Babbage Institute
  • Computational Logic Group at St. Andrews
  • The Computer Conservation Society (UK)
  • CSIRAC (a.k.a. CSIR MARK I) Home Page
  • Frode Weierud's CryptoCellar
  • Logic and Computation Group at Penn
  • National Archive for the History of Computing
  • National Cryptologic Museum

Related Entries

computability and complexity | recursive functions | Turing, Alan | Turing machines

Copyright © 2006 by B. Jack Copeland


Computers actually date back to the 1930s. Here's how they've changed.

  • From the 1930s to today, the computer has changed dramatically. 
  • The first modern computer, the Z1, was built in the 1930s; it was followed by machines so large they took up entire rooms.
  • In the 1960s, computers moved from purely professional use toward personal use, when the first desktop computer was sold to the general public.
  • In the 1980s, Apple introduced the Macintosh, and it has remained a major force in the industry ever since with its desktops, laptops, and tablets.

Although computers seem like a relatively modern invention, computing dates back to the early 1800s.

Throughout computing history, there has not been a lone inventor or a single first computer. The invention of the computer was incremental, with dozens of scientists and mathematicians building on their predecessors. The modern computer, however, can be traced back to the 1930s.

Keep reading to learn how the computer has changed throughout the decades. 

The 1930s marked the beginning of calculating machines, which were considered the first programmable computers.


Konrad Zuse built what became known as the first programmable computer, the Z1, between 1936 and 1938 in his parents' living room in Berlin. He assembled metal plates, pins, and old film into a machine that could easily add and subtract. Although his early models were destroyed in World War II, Zuse is often credited with creating the first programmable digital computer.

In the 1940s, computers took up entire rooms, like the ENIAC, which was once called a "mathematical robot."


John Mauchly and J. Presper Eckert built the ENIAC during World War II to help the Army with ballistics calculations. The machine could perform thousands of calculations each second. The large-scale ENIAC weighed 30 tons and needed a 1,500-square-foot room to house the 40 cabinets, 6,000 switches, and 18,000 vacuum tubes that made up the machine.

Some call this invention the beginning of the computer age.

In the 1950s, computers were used mainly for scientific and engineering research, like the JOHNNIAC, which was once described as a "helpful assistant" for mathematicians.


The JOHNNIAC was completed in 1954 and was used by RAND researchers. The massive machine weighed just over two tons with over 5,000 vacuum tubes. This early computer operated for 13 years or 51,349 hours before being dismantled. 

In the 1960s, everything changed when the Programma 101 became the first desktop computer sold to the average consumer.


Up until 1965, computers were reserved for mathematicians and engineers in a lab setting. The Programma 101 changed everything by offering the general public a desktop computer that anyone could use. The 65-pound machine was the size of a typewriter and had 37 keys and a printer built in. 

The Italian invention ushered in the idea of the personal computer that would last to this day. 

As personal computers became popular in the 1970s, the Xerox Alto helped pave the way for Steve Jobs' Apple.


The Xerox Alto was created in the '70s as a personal computer that could print documents and send emails. What was most notable about the computer was its design, which included a mouse, keyboard, and screen. This state-of-the-art design would later influence Apple designs in the following decade. 

The Alto computers were also designed to be kid-friendly so that everyone — no matter the age — could operate a personal computer. 

In the '80s, Apple's Macintosh was described as a game-changer for the computer industry.


When Steve Jobs introduced the first Macintosh computer in 1984 , Consumer Reports called it a "dazzling display of technical wizardry." Like the Xerox Alto, the Macintosh had a keyboard, a mouse, and a small 9-inch screen. The computer — which weighed in at 22 pounds and cost $2,495 — was applauded for its interface of windows and icons. 

As the '90s marked a period of self-expression, Apple released the famous iMac G3, which was customizable.


The iMac G3 was launched in 1998 after Steve Jobs' return to Apple in 1997. The computer quickly became known for its Bondi blue, clear casing. The 38-pound iMac included USB ports, a keyboard, and a mouse. It was meant to be portable and customizable. 

The company sold 800,000 computers in the first five months, saving Apple from extinction. The iMac is also notable because it was the first time Apple used the lowercase "i" to name its products, explaining it stood for "internet," "innovation," and "individuality." 

In the early 2000s, laptops became increasingly popular, especially after Apple launched its MacBook Air.


In 2008, Steve Jobs slid the first MacBook Air from a manila envelope and shocked the audience at Apple's Macworld with how thin the laptop was. Measuring only 0.76-inch thick, the expertly designed laptop changed the industry forever. Apple got rid of the CD drive and only included a USB port and a headphone jack. At the time, the minimalistic device cost $1,799. 

Today, computers come in all shapes and sizes, including tablets.


Today's most innovative computers are tablets, which are simple touchscreens without a keyboard or a mouse. Although tablet sales are on the decline, 33 million tablets were sold in 2018.

The market is also filled with other computer models, including the MacBook Pro, iMac, Dell XPS, and iPhones. 



Modern (1940s-present)

History of Computers

Chandler Little and Ben Greene

Introduction

Modern technology first started evolving when electricity came into everyday use. One of the biggest inventions of the 20th century was the computer, and it has gone through many changes and improvements since its creation. The last two decades have seen more advancement in computing than in almost any other technology; computers have touched nearly every level of learning in our lives and look set to keep doing so for decades to come. Computers have become a focal point of everyday life and will continue to be one for the foreseeable future. One important company that has shaped the computer industry is Apple, Inc., which was founded in the final quarter of the 20th century. Because it is one of the primary computer providers in American society, a history of the company is included in this chapter to give specific information about a business that has propelled the growth of computer popularity.

The Evolution of Computers

Computers have come a long way from their creation. The first computer was conceived in 1822 by Charles Babbage as a purely mechanical calculating machine, reportedly weighing some 700 pounds, far larger than computers today. Computers have vastly decreased in size since the invention of the transistor in 1947, which revolutionized computing by replacing bulky vacuum tubes with smaller components that made computers more compact and also more reliable. This led to an era of rapid technological advancement and to the development of integrated circuits, microprocessors, and eventually the smaller, lighter personal computers that have become indispensable in modern society. For example, most laptops today weigh in the range of two to eight pounds. A picture of one of the first computers can be seen in Figure 1. There has also been a great deal of movement in the data storage side of computing. The very first hard drive was created in 1956; it had a capacity of 5 MB and weighed in at 550 pounds.

Today hard drives are becoming smaller, and we see them weighing anywhere from a couple of ounces to a couple of pounds. As files have become more complex, the need for storage in computers has increased drastically; today a single game can take up to 100 GB. To put that in perspective, 5 MB is only 0.005 GB, so a 100 GB game needs roughly 20,000 times the capacity of that first drive. The hard drives we have today reach sizes of 10 TB and larger, where one TB is 1,000 GB. The evolution of the hard drive can be seen in Figure 2. As the world of computers keeps progressing, the general trend is to make machines smaller while delivering a generational step up in performance. These improvements shorten the daily tasks of many users (such as teachers, researchers, and doctors), making their work quicker and easier to accomplish. New software is also constantly being developed, and as a result we are seeing strides in staying connected to others through social media, messaging platforms, and other means of communication. The downside to this growing dependence on computers is that technological failures or damage can create major setbacks on any given day.
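
To make the storage comparison concrete, here is a quick back-of-the-envelope sketch in Python. It is illustrative only: it simply restates the figures above using decimal units (1 GB = 1,000 MB, 1 TB = 1,000 GB).

```python
# Back-of-the-envelope comparison of the 1956 drive (5 MB), a 100 GB game
# install, and a 10 TB modern hard drive, using decimal (SI) units.

FIRST_DRIVE_MB = 5        # capacity of the first hard drive, 1956
GAME_GB = 100             # a large game install today
MODERN_DRIVE_TB = 10      # a common large desktop drive today

first_drive_gb = FIRST_DRIVE_MB / 1000        # 0.005 GB
game_mb = GAME_GB * 1000                      # 100,000 MB
modern_drive_gb = MODERN_DRIVE_TB * 1000      # 10,000 GB

print(f"First hard drive: {first_drive_gb} GB")
print(f"A 100 GB game needs {game_mb / FIRST_DRIVE_MB:,.0f}x the first drive's capacity")
print(f"A 10 TB drive holds {modern_drive_gb / first_drive_gb:,.0f}x the first drive's capacity")
```

Run as-is, it reports that a single 100 GB game needs about 20,000 times the capacity of that first 5 MB drive, and a 10 TB drive holds about two million times as much.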

Relation to STS

The development of computers as a prevalent form of technology has had a profound impact on society as a whole in the United States. Computers are now ubiquitous, playing a crucial role in various aspects of everyday life. They are utilized in most classrooms across the country, facilitating learning and enhancing educational experiences for students of all ages. In the workplace, computers have revolutionized business operations, streamlining processes, increasing efficiency, and enabling remote work capabilities. Additionally, computers have become indispensable tools in households across America. According to the U.S. Census Bureau’s American Community Survey, a staggering 92% of households reported having at least one type of computer (2021). This statistic underscores the widespread integration of computers into the fabric of American life. The impact of computers extends beyond mere accessibility. They have transformed communication, allowing people to connect instantaneously through email, social media, and video conferencing platforms. Additionally, computers have revolutionized entertainment, providing access to a vast array of digital content, from streaming services to video games.

Overall, the pervasive presence of computers underscores their monumental impact on Americans’ lives, shaping how we learn, work, communicate, and entertain ourselves. As technology continues to evolve, the influence of computers on society is expected to grow even further, driving transformative changes across various domains.

Missing Voices in Computer History

The evolution of computers has happened at a fast rate, and when that happens some people's contributions get left out. The group most often left out of computing history is women. Grace Hopper is one of the most influential people in computing, yet her work is rarely covered in the classroom. In the 1950s, Hopper was a senior mathematician on the UNIVAC (UNIVersal Automatic Computer) team, where she created the very first compiler (Cassel, 2016). This was a massive accomplishment for anyone in the field because it established the idea that programming languages need not be tied to a specific computer but can be used on any machine. That single feature was one of the main driving forces behind computing becoming as robust and powerful as it is today. Grace Hopper's work should be discussed not only in engineering courses but in general classes as well. Students need to hear that a woman was a driving force behind the evolution of computing. Talking about this may encourage more women to join the computing field; right now only 25% of jobs in the computing sector are held by women (Cassel, 2016). With a more diverse workforce in computing, we can expect new ideas and features that were never thought of before.

During the evolution of computers, many people's contributions to development and algorithms have been left out. With the worldwide push toward gender equality in the coming years, the disparity between the credit given to women and to men should shrink to a negligible amount. As computers continue to evolve, the world of STS will need to evolve with them to adapt to changes in technology. If it does not, some great creations in the computer sector will be neglected; a notable current example is VR (Virtual Reality), with its high entry-level price and the motion sickness that can come with it.

History of Apple, Inc.

In American society today, two primary operating systems dominate: Windows and macOS. macOS is the operating system for Apple computers, which were estimated to cover about 16.1% of the U.S. personal computer market in the fourth quarter of 2023, according to a study from Gartner (2024). Apple Inc. was founded on April 1, 1976, by Steve Jobs and Steve Wozniak, who wanted to make computers more user-friendly and accessible to individuals. Their vision was to revolutionize the computer industry. They started by building the Apple I in Jobs' garage, then introduced the Apple II, which featured color graphics and propelled the company's growth. However, internal conflicts and the departure of key figures like Jobs and Wozniak led to a period of struggle in the 1980s and early 1990s. Jobs returned to Apple in 1997 and initiated transformative changes, including an alliance with Microsoft and the launch of groundbreaking products like the iBook and iPod. Apple continued to expand its product line, and the introduction of the iPhone in 2007 marked a new era of success, propelling the company to become the second most valuable in the world. Apple has maintained a strong position in the technology market throughout this period by continuously improving its Macintosh computers and by adapting to new technological changes.

[Figure: one of the original Apple computers]

Apple’s Macintosh computers have changed quite a lot throughout the company’s history. The Macintosh 128k (figure 3) was the very first Apple computer, released on January 24, 1984. It had a 9-inch black and white display with 128KB of RAM (computer memory) and operated on MacOS 1.0. The next important release was in April 1995 with several variations of the Macintosh Performa which had 500MB to 1 GB of memory. Interestingly, the multiple models of this computer ended up competing with each other and were discontinued. This led to the iMac G3 in August 1998 which sported a futuristic design with multiple color options for the back of the computer as well as USB ports, 4GB of memory, and built-in speakers. iMac G3 was the beginning of MacOS 8. In 2007, the iMac went through a major redesign with melded glass and aluminum as the material and a widescreen display. The newer Mac’s continue to be built slimmer, with faster processors, better displays, and more storage (Mingis and Moreau, 2021).

Missing Voices within Apple

In a male-dominated field, it is very possible for women's impacts to be drowned out of technological histories. Within Apple's business specifically, several women made a large difference in its progress. Susan Kare, for example, was the first designer of Apple's icons, like the stopwatch and paintbrush, that helped Apple establish the Mac. Another woman with a large contribution to Apple was Joanna Hoffman. She was the fifth person to join the Macintosh team in 1980 and "wrote the first draft of the User Interface Guidelines for the Mac and figured out how to pitch the computer at the education markets" in the beginning of Apple's existence (Evans, 2016).

Throughout this chapter, the importance of computers as a catalyst for advancement in our society is evident. Computers have clearly evolved in many ways from their inception to the present day, and people from many different backgrounds have played an important part in that evolution.

How has the advancement in technology improved your life?

A brief history of computers – unipi.it. (n.d.). Retrieved November 7, 2022, from https://digitaltools.labcd.unipi.it/wp-content/uploads/2021/05/A-brief-history-of-computers.pdf

Cassel, L. (2016, December 15). "Op-Ed: 25 Years After Computing Pioneer Grace Hopper's Death, We Still Have Work to Do." USNEWS.com. Accessed via Nexis Uni database from Clemson University.

Evans, J. (2016, March 8). 10 Women Who Made Apple Great. Computerworld. https://www.computerworld.com/article/3041874/10-women-who-made-apple-great.html

Gartner. (2024, January 10). Gartner Says Worldwide PC Shipments Increased 0.3% in Fourth Quarter of 2023 but Declined 14.8% for the Year. Gartner. https://www.gartner.com/en/newsroom/press-releases/01-10-2024-gartner-says-worldwide-pc-shipments-increased-zero-point-three-percent-in-fourth-quarter-of-2023-but-declined-fourteen-point-eight-percent-for-the-year

Kleiman, K., & Saklayen, N. (2018, April 19). These 6 pioneering women helped create modern computers. ideas.ted.com. Retrieved September 26, 2021, from https://ideas.ted.com/how-i-discovered-six-pioneering-women-who-helped-create-modern-computers-and-why-we-should-never-forget-them/

Mingis, K., & Moreau, S. (2021, April 28). The Evolution of the Macintosh – and the iMac. Computerworld. https://www.computerworld.com/article/1617841/evolution-of-macintosh-and-imac.html

Richardson, A. (2023, April). The Founding of Apple Computer, Inc. Library of Congress. https://guides.loc.gov/this-month-in-business-history/april/apple-computer-founded

Thompson, C. (2019, June 1). The gendered history of human computers. Smithsonian.com. Retrieved September 26, 2021, from https://www.smithsonianmag.com/science-nature/history-human-computers-180972202/

Women in Computing and Women in Engineering honored for promoting girls in STEM. (2017, May 26). US Official News. Accessed via Nexis Uni database from Clemson University.

Zimmermann, K. A. (2017, September 7). History of computers: A brief timeline. LiveScience. Retrieved September 26, 2021, from https://www.livescience.com/20718-computer-history.html

"Gene Amdahl's first computer" by Erik Pitti is licensed under CC BY 2.0.

"First hard drives" by gabrielsaldana is licensed under CC BY 2.0.

Sailko. (2017). Neo Preistoria Exhibition (Milan 2016). Wikipedia. https://en.wikipedia.org/wiki/Macintosh_128K#/media/File:Computer_macintosh_128k,_1984_(all_about_Apple_onlus).jpg

AI ACKNOWLEDGMENT

I acknowledge the use of ChatGPT to generate additional content for this chapter.

Prompts and uses:

I entered the following prompt: Summarize the history of Apple in 7 sentences based on that prompt [the Library of Congress article].

Use: I modified the output to add more information from the article that I found relevant. I also adjusted the wording to make it fit the style of the rest of the chapter.

I entered the following prompt: The development of computers as a new, prevalent form of technology has majorly impacted society as a whole in the United States, as they are involved in several aspects of everyday life. Computers are used in some form in most classrooms in the country, in most workplaces, and in most households. Specifically, 92% of households in the U.S. Census Bureau’s American Community Survey reported having at least one type of computer. This is a simple statistic that shows the monumental impact of computers in Americans’ lives.

Use: I used the output to expand upon this paragraph; after entering the prompt, ChatGPT added several sentences and reworded some of the previously written content. I then removed some of the added information from ChatGPT and made the output more concise.

I entered the following prompt: Give me 3 more sentences to add to the following prompt. I am trying to talk about the history of computers and how they were invented. The prompt begins now- Computers have come a long way from their creation. This first computer was created in 1822 by Charles Babbage. This computer was created with a series of vacuum tubes and weighed a total of 700 pounds, which is much larger than the computers we see today. For example, most laptops weigh in a range of two to eight pounds.

After getting this output, I wrote this prompt: Give me two more sentences to add to that.

Use: I used the 5 sentences of the output to select the information that I wanted and add it to the content that was already in the book in order to provide more detail to the reduction in sizes of computers over time.

To the extent possible under law, Chandler Little and Ben Greene have waived all copyright and related or neighboring rights to Science Technology and Society a Student Led Exploration , except where otherwise noted.


History of computers: A brief timeline

The history of computers began with primitive designs in the early 19th century and went on to change the world during the 20th century.


The history of computers goes back over 200 years. At first theorized by mathematicians and entrepreneurs, mechanical calculating machines were designed and built during the 19th century to solve increasingly complex number-crunching challenges. Advances in technology enabled ever more complex computers by the early 20th century, and computers became larger and more powerful.

Today, computers are almost unrecognizable from designs of the 19th century, such as Charles Babbage's Analytical Engine — or even from the huge computers of the 20th century that occupied whole rooms, such as the Electronic Numerical Integrator and Computer (ENIAC).

Here's a brief history of computers, from their primitive number-crunching origins to the powerful modern-day machines that surf the Internet, run games and stream multimedia. 

19th century

1801: Joseph Marie Jacquard, a French merchant and inventor, invents a loom that uses punched wooden cards to automatically weave fabric designs. Early computers would use similar punch cards.

1821: English mathematician Charles Babbage conceives of a steam-driven calculating machine that would be able to compute tables of numbers. Funded by the British government, the project, called the "Difference Engine," fails due to the lack of technology at the time, according to the University of Minnesota.

1843: Ada Lovelace, an English mathematician and the daughter of poet Lord Byron, writes the world's first computer program. According to Anna Siffert, a professor of theoretical mathematics at the University of Münster in Germany, Lovelace writes the first program while translating a paper on Babbage's Analytical Engine from French into English. "She also provides her own comments on the text. Her annotations, simply called 'notes,' turn out to be three times as long as the actual transcript," Siffert wrote in an article for The Max Planck Society. "Lovelace also adds a step-by-step description for computation of Bernoulli numbers with Babbage's machine — basically an algorithm — which, in effect, makes her the world's first computer programmer." Bernoulli numbers are a sequence of rational numbers often used in computation.
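
Lovelace's notes include a step-by-step procedure for generating Bernoulli numbers on the Analytical Engine. Purely as a modern illustration of the same computation, and not a transcription of her actual method or notation, a short Python sketch using the standard recurrence looks like this:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return Bernoulli numbers B_0..B_n as exact fractions, using the
    recurrence sum_{k=0}^{m} C(m+1, k) * B_k = 0 for m >= 1."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        s = sum(Fraction(comb(m + 1, k)) * B[k] for k in range(m))
        B.append(-s / (m + 1))
    return B

for i, b in enumerate(bernoulli(8)):
    print(f"B_{i} = {b}")    # 1, -1/2, 1/6, 0, -1/30, 0, 1/42, 0, -1/30
```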

Babbage's Analytical Engine

1853: Swedish inventor Per Georg Scheutz and his son Edvard design the world's first printing calculator. The machine is significant for being the first to "compute tabular differences and print the results," according to Uta C. Merzbach's book, " Georg Scheutz and the First Printing Calculator " (Smithsonian Institution Press, 1977).

1890: Herman Hollerith designs a punch-card system to help calculate the 1890 U.S. Census. The machine saves the government several years of calculations, and the U.S. taxpayer approximately $5 million, according to Columbia University. Hollerith later establishes a company that will eventually become International Business Machines Corporation (IBM).

Early 20th century

1931: At the Massachusetts Institute of Technology (MIT), Vannevar Bush invents and builds the Differential Analyzer, the first large-scale automatic general-purpose mechanical analog computer, according to Stanford University . 

1936: Alan Turing , a British scientist and mathematician, presents the principle of a universal machine, later called the Turing machine, in a paper called "On Computable Numbers…" according to Chris Bernhardt's book " Turing's Vision " (The MIT Press, 2017). Turing machines are capable of computing anything that is computable. The central concept of the modern computer is based on his ideas. Turing is later involved in the development of the Turing-Welchman Bombe, an electro-mechanical device designed to decipher Nazi codes during World War II, according to the UK's National Museum of Computing . 
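
To give a flavor of what a Turing machine is, here is a deliberately tiny simulator in Python. It is an illustrative sketch of the general idea only, a finite table of state transitions acting on a tape, not Turing's original formalism; the example machine simply flips every bit of its input and halts.

```python
# Minimal Turing machine simulator (illustrative sketch only).
# A machine is a table: (state, symbol) -> (new_symbol, move, new_state).

def run(table, tape, state="start", blank="_", max_steps=1000):
    cells = dict(enumerate(tape))            # sparse tape indexed by position
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        new_symbol, move, state = table[(state, symbol)]
        cells[head] = new_symbol
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# Example machine: flip 0s and 1s until the blank after the input is reached.
flip_bits = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run(flip_bits, "10110"))   # prints 01001_ (the trailing blank is rewritten)
```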

1937: John Vincent Atanasoff, a professor of physics and mathematics at Iowa State University, submits a grant proposal to build the first electric-only computer, without using gears, cams, belts or shafts.

original garage where Bill Hewlett and Dave Packard started their business

1939: David Packard and Bill Hewlett found the Hewlett Packard Company in Palo Alto, California. The pair decide the name of their new company by the toss of a coin, and Hewlett-Packard's first headquarters are in Packard's garage, according to MIT . 

1941: German inventor and engineer Konrad Zuse completes his Z3 machine, the world's earliest digital computer, according to Gerard O'Regan's book " A Brief History of Computing " (Springer, 2021). The machine was destroyed during a bombing raid on Berlin during World War II. Zuse fled the German capital after the defeat of Nazi Germany and later released the world's first commercial digital computer, the Z4, in 1950, according to O'Regan. 

1941: Atanasoff and his graduate student, Clifford Berry, design the first digital electronic computer in the U.S., called the Atanasoff-Berry Computer (ABC). This marks the first time a computer is able to store information in its main memory, and it is capable of performing one operation every 15 seconds, according to the book "Birthing the Computer" (Cambridge Scholars Publishing, 2016).

1945: Two professors at the University of Pennsylvania, John Mauchly and J. Presper Eckert, design and build the Electronic Numerical Integrator and Computer (ENIAC). The machine is the first "automatic, general-purpose, electronic, decimal, digital computer," according to Edwin D. Reilly's book "Milestones in Computer Science and Information Technology" (Greenwood Press, 2003). 

Computer technicians operating the ENIAC

1946: Mauchly and Eckert leave the University of Pennsylvania and receive funding from the Census Bureau to build the UNIVAC, the first commercial computer for business and government applications.

1947: William Shockley, John Bardeen and Walter Brattain of Bell Laboratories invent the transistor . They discover how to make an electric switch with solid materials and without the need for a vacuum.

1949: A team at the University of Cambridge develops the Electronic Delay Storage Automatic Calculator (EDSAC), "the first practical stored-program computer," according to O'Regan. "EDSAC ran its first program in May 1949 when it calculated a table of squares and a list of prime numbers," O'Regan wrote. In November 1949, scientists with the Council of Scientific and Industrial Research (CSIR), now called CSIRO, build Australia's first digital computer, called the Council for Scientific and Industrial Research Automatic Computer (CSIRAC). CSIRAC is the first digital computer in the world to play music, according to O'Regan.
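
EDSAC's earliest runs are usually described as producing a table of squares and a list of prime numbers. As a point of comparison with 1949, those same two calculations take only a few lines of modern Python; this is a trivial re-creation, not EDSAC code.

```python
# The two calculations attributed to EDSAC's first runs: squares and primes.

def squares(n):
    """Table of squares for 1..n."""
    return [(i, i * i) for i in range(1, n + 1)]

def primes(limit):
    """List of primes up to limit, via a simple sieve of Eratosthenes."""
    sieve = [True] * (limit + 1)
    sieve[0:2] = [False, False]
    for i in range(2, int(limit ** 0.5) + 1):
        if sieve[i]:
            step = range(i * i, limit + 1, i)
            sieve[i * i:limit + 1:i] = [False] * len(step)
    return [i for i, is_prime in enumerate(sieve) if is_prime]

print(squares(10))
print(primes(50))
```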

Late 20th century

1953: Grace Hopper develops the first computer language, which eventually becomes known as COBOL, an acronym for COmmon Business-Oriented Language, according to the National Museum of American History. Hopper is later dubbed the "First Lady of Software" in her posthumous Presidential Medal of Freedom citation. Thomas Johnson Watson Jr., son of IBM CEO Thomas Johnson Watson Sr., conceives the IBM 701 EDPM to help the United Nations keep tabs on Korea during the war.

1954: John Backus and his team of programmers at IBM publish a paper describing their newly created FORTRAN programming language, an acronym for FORmula TRANslation, according to MIT .

1958: Jack Kilby and Robert Noyce unveil the integrated circuit, known as the computer chip. Kilby is later awarded the Nobel Prize in Physics for his work.

1968: Douglas Engelbart reveals a prototype of the modern computer at the Fall Joint Computer Conference, San Francisco. His presentation, called "A Research Center for Augmenting Human Intellect" includes a live demonstration of his computer, including a mouse and a graphical user interface (GUI), according to the Doug Engelbart Institute . This marks the development of the computer from a specialized machine for academics to a technology that is more accessible to the general public.

The first computer mouse, invented in 1963 by Douglas C. Engelbart

1969: Ken Thompson, Dennis Ritchie and a group of other developers at Bell Labs produce UNIX, an operating system that made "large-scale networking of diverse computing systems — and the internet — practical," according to Bell Labs. The team behind UNIX continued to develop the operating system using the C programming language, which they also optimized. 

1970: The newly formed Intel unveils the Intel 1103, the first dynamic random-access memory (DRAM) chip.

1971: A team of IBM engineers led by Alan Shugart invents the "floppy disk," enabling data to be shared among different computers.

1972: Ralph Baer, a German-American engineer, releases Magnavox Odyssey, the world's first home game console, in September 1972 , according to the Computer Museum of America . Months later, entrepreneur Nolan Bushnell and engineer Al Alcorn with Atari release Pong, the world's first commercially successful video game. 

1973: Robert Metcalfe, a member of the research staff for Xerox, develops Ethernet for connecting multiple computers and other hardware.

1975: The magazine cover of the January issue of "Popular Electronics" highlights the Altair 8800 as the "world's first minicomputer kit to rival commercial models." After seeing the magazine issue, two "computer geeks," Paul Allen and Bill Gates, offer to write software for the Altair, using the new BASIC language. On April 4, after the success of this first endeavor, the two childhood friends form their own software company, Microsoft.

1977: The Commodore Personal Electronic Transactor (PET) is released onto the home computer market, featuring an MOS Technology 8-bit 6502 microprocessor, which controls the screen, keyboard and cassette player. The PET is especially successful in the education market, according to O'Regan.

1976: Steve Jobs and Steve Wozniak co-found Apple Computer on April Fool's Day. They unveil Apple I, the first computer with a single-circuit board and ROM (Read Only Memory), according to MIT .

Apple I computer 1976

1977: Radio Shack begins its initial production run of 3,000 TRS-80 Model 1 computers — disparagingly known as the "Trash 80" — priced at $599, according to the National Museum of American History. Within a year, the company takes 250,000 orders for the computer, according to the book "How TRS-80 Enthusiasts Helped Spark the PC Revolution" (The Seeker Books, 2007).

1977: The first West Coast Computer Faire is held in San Francisco. Jobs and Wozniak present the Apple II computer at the Faire, which includes color graphics and features an audio cassette drive for storage.

1978: VisiCalc, the first computerized spreadsheet program, is introduced.

1979: MicroPro International, founded by software engineer Seymour Rubenstein, releases WordStar, the world's first commercially successful word processor. WordStar is programmed by Rob Barnaby, and includes 137,000 lines of code, according to Matthew G. Kirschenbaum's book " Track Changes: A Literary History of Word Processing " (Harvard University Press, 2016).

1981: "Acorn," IBM's first personal computer, is released onto the market at a price point of $1,565, according to IBM. Acorn uses the MS-DOS operating system from Windows. Optional features include a display, printer, two diskette drives, extra memory, a game adapter and more.

A worker using an Acorn computer by IBM, 1981

1983: The Apple Lisa, standing for "Local Integrated Software Architecture" but also the name of Steve Jobs' daughter, according to the National Museum of American History ( NMAH ), is the first personal computer to feature a GUI. The machine also includes a drop-down menu and icons. Also this year, the Gavilan SC is released and is the first portable computer with a flip-form design and the very first to be sold as a "laptop."

1984: The Apple Macintosh is announced to the world during a Super Bowl advertisement. The Macintosh is launched with a retail price of $2,500, according to the NMAH. 

1985: As a response to the Apple Lisa's GUI, Microsoft releases Windows in November 1985, the Guardian reported. Meanwhile, Commodore announces the Amiga 1000.

1989: Tim Berners-Lee, a British researcher at the European Organization for Nuclear Research ( CERN ), submits his proposal for what would become the World Wide Web. His paper details his ideas for Hyper Text Markup Language (HTML), the building blocks of the Web. 

1993: The Pentium microprocessor advances the use of graphics and music on PCs.

1996: Sergey Brin and Larry Page develop the Google search engine at Stanford University.

1997: Microsoft invests $150 million in Apple, which at the time is struggling financially.  This investment ends an ongoing court case in which Apple accused Microsoft of copying its operating system. 

1999: Wi-Fi, the abbreviated term for "wireless fidelity," is developed, initially covering a distance of up to 300 feet (91 meters), Wired reported. 

21st century

2001: Mac OS X, later renamed OS X then simply macOS, is released by Apple as the successor to its standard Mac Operating System. OS X goes through 16 different versions, each with "10" as its title, and the first nine iterations are nicknamed after big cats, with the first being codenamed "Cheetah," TechRadar reported.  

2003: AMD's Athlon 64, the first 64-bit processor for personal computers, is released to customers. 

2004: The Mozilla Corporation launches Mozilla Firefox 1.0. The Web browser is one of the first major challenges to Internet Explorer, owned by Microsoft. During its first five years, Firefox exceeded a billion downloads by users, according to the Web Design Museum . 

2005: Google buys Android, a Linux-based mobile phone operating system.

2006: The MacBook Pro from Apple hits the shelves. The Pro is the company's first Intel-based, dual-core mobile computer. 

2009: Microsoft launches Windows 7 on July 22. The new operating system features the ability to pin applications to the taskbar, scatter windows away by shaking another window, easy-to-access jumplists, easier previews of tiles and more, TechRadar reported .  

Apple CEO Steve Jobs holds the iPad during the launch of Apple's new tablet computing device in San Francisco

2010: The iPad, Apple's flagship handheld tablet, is unveiled.

2011: Google releases the Chromebook, which runs on Google Chrome OS.

2015: Apple releases the Apple Watch. Microsoft releases Windows 10.

2016: The first reprogrammable quantum computer was created. "Until now, there hasn't been any quantum-computing platform that had the capability to program new algorithms into their system. They're usually each tailored to attack a particular algorithm," said study lead author Shantanu Debnath, a quantum physicist and optical engineer at the University of Maryland, College Park.

2017: The Defense Advanced Research Projects Agency (DARPA) is developing a new "Molecular Informatics" program that uses molecules as computers. "Chemistry offers a rich set of properties that we may be able to harness for rapid, scalable information storage and processing," Anne Fischer, program manager in DARPA's Defense Sciences Office, said in a statement. "Millions of molecules exist, and each molecule has a unique three-dimensional atomic structure as well as variables such as shape, size, or even color. This richness provides a vast design space for exploring novel and multi-value ways to encode and process data beyond the 0s and 1s of current logic-based, digital architectures."

2019: A team at Google became the first to demonstrate quantum supremacy — creating a quantum computer that could feasibly outperform the most powerful classical computer — albeit for a very specific problem with no practical real-world application. The team described the computer, dubbed "Sycamore," in a paper published that same year in the journal Nature. Achieving quantum advantage – in which a quantum computer solves a problem with real-world applications faster than the most powerful classical computer – is still a ways off. 

2022: The first exascale supercomputer, and the world's fastest, Frontier, went online at the Oak Ridge Leadership Computing Facility (OLCF) in Tennessee. Built by Hewlett Packard Enterprise (HPE) at a cost of $600 million, Frontier uses nearly 10,000 AMD EPYC 7453 64-core CPUs alongside nearly 40,000 AMD Radeon Instinct MI250X GPUs. This machine ushered in the era of exascale computing, which refers to systems that can reach more than one exaFLOP of power – used to measure the performance of a system. Only one machine – Frontier – is currently capable of reaching such levels of performance, and it is being used as a tool to aid scientific discovery.

What is the first computer in history?

Charles Babbage's Difference Engine, designed in the 1820s, is considered the first "mechanical" computer in history, according to the Science Museum in the U.K . Powered by steam with a hand crank, the machine calculated a series of values and printed the results in a table. 

What are the five generations of computing?

The "five generations of computing" is a framework for assessing the entire history of computing and the key technological advancements throughout it. 

The first generation, spanning the 1940s to the 1950s, covered vacuum tube-based machines. The second then progressed to incorporate transistor-based computing between the 50s and the 60s. In the 60s and 70s, the third generation gave rise to integrated circuit-based computing. We are now in between the fourth and fifth generations of computing, which are microprocessor-based and AI-based computing.

What is the most powerful computer in the world?

As of November 2023, the most powerful computer in the world is the Frontier supercomputer. The machine, which can reach a performance level of up to 1.102 exaFLOPS, ushered in the age of exascale computing in 2022 when it went online at Tennessee's Oak Ridge Leadership Computing Facility (OLCF).

There is, however, a potentially more powerful supercomputer waiting in the wings in the form of the Aurora supercomputer, which is housed at the Argonne National Laboratory (ANL) outside of Chicago.  Aurora went online in November 2023. Right now, it lags far behind Frontier, with performance levels of just 585.34 petaFLOPS (roughly half the performance of Frontier), although it's still not finished. When work is completed, the supercomputer is expected to reach performance levels higher than 2 exaFLOPS.

What was the first killer app?

Killer apps are widely understood to be those so essential that they are core to the technology they run on. There have been many through the years, from Word for Windows in 1989 to iTunes in 2001 to social media apps like WhatsApp in more recent years.

Several pieces of software may stake a claim to be the first killer app, but there is a broad consensus that VisiCalc, a spreadsheet program created by VisiCorp and originally released for the Apple II in 1979, holds that title. Steve Jobs even credits this app for propelling the Apple II to become the success it was, according to co-creator Dan Bricklin .

Additional resources

  • Fortune: A Look Back At 40 Years of Apple
  • The New Yorker: The First Windows
  • "A Brief History of Computing" by Gerard O'Regan (Springer, 2021)


History of Computers: From Abacus to Modern PC Essay


In history, computers were only machines that performed calculations. This changed over time as more sophisticated machines were developed to perform more general tasks (Null & Lobur, p. 34). The modern computer is a result of advances in technology and of the need to quantify and record numbers and language. Papyrus was used to make records and write numbers.

Among the first counting aids was the abacus, which helped early man to count (History of Computers, para. 1). It was people, however, who were first regarded as computers, because machines were later developed to perform the functions that had been assigned to people: the name "computer" was originally a job title for people who did calculations (An Illustrated History of Computers, part 1). The word "computer" is said to come from the Latin term for a person who computes (Rojas & Hashagen, p. 1). According to Webster's dictionary, a computer is an electronic device that can be programmed to store, retrieve and process data (Ceruzzi, p. 351).

Operations Back When People Were Regarded As Computers

Classification and Development of Computers

Computers can be classified by their technology, their use, how they operated and the era in which they were in use (Rojas, p. 1). According to Rojas (p. 3), computers fall into two broad classes: early electronically programmed machines and those developed after the electronic stored-program concept. Calculators were among the early machines; an example is the Harvard Mark 1 (Ceruzzi, p. 351).

The Harvard Mark 1.

Early man needed a way to count and do calculations. Between 1000 BC and 500 BC, he used the abacus, which had movable beads, for calculations (The History of The Computer, para. 2). The mathematician Charles Babbage later proposed the construction of a machine, the Difference Engine, which could calculate and print mathematical tables (The History of The Computer, para. 3). In 1979 the United States Department of Defense honored Ada Byron Lovelace by naming a programming language, Ada, after her. Lovelace had written what is regarded as the first computer program, building on Babbage's ideas to make them a reality, and she saw the machine's potential to produce music and graphs (The History of The Computer, para. 4).

The Old Abacus

George Boole was a professor of mathematics who wrote An Investigation of the Laws of Thought; he was later recognized as a founder of computer science (A Brief History of Computers & Networks, part 1). A punch-card tabulating machine that used electric power was developed by Herman Hollerith of MIT. In the late 19th century, William Burroughs introduced a calculator that could print, though it was a manual machine; he later improved it so that it could use electricity.

A differential analyzer was built by Vannevar Bush of MIT in 1925. It could handle simple calculus, though it was not very accurate; the machine was made of gears and shafts. Konrad Zuse was a German engineer who built a calculator to handle the calculations he faced daily, and in 1938 he produced a programmable calculator. In 1936, at Iowa State, John Vincent Atanasoff started developing a digital computer and came up with the ABC as a way of solving linear equations (History of Computers, para. 11).

The Enigma was another notable machine; the Germans were using this complex electromechanical encoder in 1937. In the same year, George Stibitz came up with a model that could solve more complex calculations. The British later built the Colossus Mark 1 to help break German codes (History of Computers, para. 13).

In 1943 at the University of Pennsylvania, development of the Electronic Numerical Integrator and Computer (ENIAC) was begun by Mauchly and Presper Eckert of the Moore School. In 1944 the Harvard Mark 1 was introduced and was then used by the U.S. Navy; it used paper tape for information storage. IBM came up with the 701, which became the company's first commercially successful computer, and languages like FORTRAN, LISP and COBOL were developed for use with computers. In 1958 a transistor-powered computer was introduced by a team headed by Seymour Cray; this was also the period in which the integrated circuit was developed by Kilby and Noyce, and computers soon used integrated circuits instead of transistors (A Brief History of Computers & Networks, part 2).

The IBM 701 computer.

These first computers were room-sized and were considered quite powerful. IBM then introduced the System/360, which was designed for business purposes and was also used to demonstrate one of the first time-sharing systems (TSS). The first microcomputers were used to manage telephone lines. A major development of this era was a joint MIT and Bell Labs design for networked systems with shared resources; Bell Labs eventually parted ways with the project and went on to create the UNIX operating system. After UNIX came the ARPANET, and Alan Kay put forward ideas that would later shape Apple's operating systems. This is the era in which he proposed the design of the personal computer.

A group of engineers who were not happy with all these developments planned to form their own company, and named it Intel, in 1968 (A Brief History of Computers & Networks, part 2). A pocket calculator was introduced by Texas Instruments, Xerox popularized the mouse, and proposals were brought forward to develop the local area network.

The first personal computer was marketed in kit form with 256 bytes of memory; the machine used a BASIC interpreter developed by Bill Gates and others. Apple followed the trend and also advertised personal computers in the same kit form; these computers comprised a monitor and keyboard. A few years passed, and the personal computer took center stage on the American scene as many computer companies were formed.

Many of these companies did not survive for long; they vanished. By 1977 stores were selling personal computers, and some of them exist today. Companies have kept reducing the size of the personal computer while improving its performance, and there is also an effort to reduce prices to make the machines affordable and maximize sales. After a failed first attempt, IBM introduced a successful personal computer in 1981 (A Brief History of Computers & Networks, part 2).

IBM Personal Computer.

The shrinking size and price of computers have made them universal devices that have changed human lives more than almost any other development. Although it is difficult to say which computer was the first to be developed, this paper has tried to show their history, because it is important to know who started these developments, and why and where they began (Rojas & Hashagen, p. 13).

Works Cited

A Brief History of Computers and Networks. 2010. Web.

An Illustrated History of Computers, Part 1. 2010. Web.

Ceruzzi, P. E. A History of Modern Computing, 2nd edition. MIT Press, Cambridge, MA. 2003.

History of Computers. 2010. Web.

IBM 701, Layout for a 701 Installation. 2010. Web.

IBM Personal Computer. 2010. Web.

Null, L., & Lobur, J. The Essentials of Computer Organization & Architecture. Jones & Bartlett, Sudbury, MA. 2006.

Rojas, R., & Hashagen, U. The First Computers: History and Architectures. MIT Press, Cambridge. 2002.

The History of The Computer. 2010. Web.



September 1, 2009


The Origin of Computing

The information age began with the realization that machines could emulate the power of minds

By Martin Campbell-Kelly

In the standard story, the computer’s evolution has been brisk and short. It starts with the giant machines warehoused in World War II–era laboratories. Microchips shrink them onto desktops, Moore’s Law predicts how powerful they will become, and Microsoft capitalizes on the software. Eventually small, inexpensive devices appear that can trade stocks and beam video around the world. That is one way to approach the history of computing—the history of solid-state electronics in the past 60 years.

But computing existed long before the transistor. Ancient astronomers developed ways to predict the motion of the heavenly bodies. The Greeks deduced the shape and size of Earth. Taxes were summed; distances mapped. Always, though, computing was a human pursuit. It was arithmetic, a skill like reading or writing that helped a person make sense of the world.

The age of computing sprang from the abandonment of this limitation. Adding machines and cash registers came first, but equally critical was the quest to organize mathematical computations using what we now call “programs.” The idea of a program first arose in the 1830s, a century before what we traditionally think of as the birth of the computer. Later, the modern electronic computers that came out of World War II gave rise to the notion of the universal computer—a machine capable of any kind of information processing, even including the manipulation of its own programs. These are the computers that power our world today. Yet even as computer technology has matured to the point where it is omnipresent and seemingly limitless, researchers are attempting to use fresh insights from the mind, biological systems and quantum physics to build wholly new types of machines.


The Difference Engine

In 1790, shortly after the start of the French Revolution, the government decided that the republic required a new set of maps to establish a fair system of property taxation. It also ordered a switch from the old imperial system of measurements to the new metric system. To facilitate all the conversions, the French ordnance survey office began to compute an exhaustive collection of mathematical tables.

In the 18th century, however, computations were done by hand. A factory floor of between 60 and 80 human computers added and subtracted sums to fill in line after line of the tables for the survey’s Tables du Cadastre project. It was grunt work, demanding no special skills above basic numeracy and literacy. In fact, most computers were hairdressers who had lost their jobs—aristocratic hairstyles being the sort of thing that could endanger one’s neck in revolutionary France.

The project took about 10 years to complete, but by then, the war-torn republic did not have the funds necessary to publish the work. The manuscript languished in the Académie des Sciences for decades. Then, in 1819, a promising young British scientist named Charles Babbage would view it on a visit to Paris. Babbage was 28 at the time; three years earlier he had been elected to the Royal Society, the most prominent scientific organization in Britain. He was also very knowledgeable about the world of human computers—at various times he personally supervised the construction of astronomical and actuarial tables.

On his return to England, Babbage decided he would replicate the French project not with human computers but with machinery. England at the time was in the throes of the Industrial Revolution. Jobs that had been done by human or animal labor were falling to the efficiency of the machine. Babbage saw the power of this world of steam and brawn, of interchangeable parts and mechanization, and realized that it could replace not just muscle but the work of minds.

He proposed the construction of his Calculating Engine in 1822 and secured government funding in 1824. For the next decade he immersed himself in the world of manufacturing, seeking the best technologies with which to construct his engine.

In 1833 Babbage celebrated his annus mirabilis. That year he not only produced a functioning model of his calculating machine (which he called the Difference Engine) but also published his classic Economy of Machinery and Manufactures, establishing his reputation as the world’s leading industrial economist. He held Saturday evening soirees at his home in Devonshire Street in London, which were attended by the front rank of society. At these gatherings the model Difference Engine was placed on display as a conversation piece.

A year later Babbage abandoned the Difference Engine for a much grander vision that he called the Analytical Engine. Whereas the Difference Engine had been limited to the single task of table making, the Analytical Engine would be capable of any mathematical calculation. Like a modern computer, it would have a processor that performed arithmetic (the “mill”), memory to hold numbers (the “store”), and the ability to alter its function via user input, in this case by punched cards. In short, it was a computer conceived in Victorian technology .

Babbage’s decision to abandon the Difference Engine for the Analytical Engine was not well received, however, and the government demurred to supply him with additional funds. Undeterred, he produced thousands of pages of detailed notes and machine drawings in the hope that the government would one day fund construction. It was not until the 1970s, well into the computer age, that modern scholars studied these papers for the first time. The Analytical Engine was, as one of those scholars remarked, almost like looking at a modern computer designed on another planet.

The Dark Ages

Babbage’s vision, in essence, was digital computing. Like today’s devices, such machines manipulate numbers (or digits) according to a set of instructions and produce a precise numerical result.

Yet after Babbage’s failure, computation entered what English mathematician L. J. Comrie called the Dark Age of digital computing—a period that lasted into World War II. During this time, computation was done primarily with so-called analog computers, machines that model a system using a mechanical analog. Suppose, for example, an astronomer would like to predict the time of an event such as a solar eclipse. To do this digitally, she would numerically solve Kepler’s laws of motion. She could also create an analog computer, a model solar system made of gears and levers (or a simple electronic circuit) that would allow her to “run” time into the future.

Before World War II, the most sophisticated practical analog computing instrument was the Differential Analyzer, developed by Vannevar Bush at the Massachusetts Institute of Technology in 1929. At that time, the U.S. was investing heavily in rural electrification, and Bush was investigating electrical transmission. Such problems could be encoded in ordinary differential equations, but these were very time-consuming to solve. The Differential Analyzer allowed for an approximate solution without any numerical processing. The machine was physically quite large—it filled a good-size laboratory—and was something of a Rube Goldberg construction of gears and rotating shafts. To “program” the machine, technicians connected the various subunits of the device using screwdrivers, spanners and lead hammers. Though laborious to set up, once done the apparatus could solve in minutes equations that would take several days by hand. A dozen copies of the machine were built in the U.S. and England.

One of these copies made its way to the U.S. Army’s Aberdeen Proving Ground in Maryland, the facility responsible for readying field weapons for deployment. To aim artillery at a target of known range, soldiers had to set the vertical and horizontal angles (the elevation and azimuth) of the barrel so that the fired shell would follow the desired parabolic trajectory—soaring skyward before dropping onto the target. They selected the angles out of a firing table that contained numerous entries for various target distances and geographic conditions.

Every entry in the firing table required the integration of an ordinary differential equation. An on-site team of 200 human computers would take two to three days to do each calculation by hand. The Differential Analyzer, in contrast, would need only about 20 minutes.
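
As a rough illustration of what a single firing-table entry involves, the sketch below numerically integrates a point-mass trajectory with simple quadratic drag and then searches for the elevation angle that drops the shell at a given range. The muzzle velocity, drag constant, and target range are invented for the example; real firing tables were built from far more detailed ballistic models and measured data.

```python
import math

G = 9.81        # gravity, m/s^2
K = 2e-5        # invented drag constant, 1/m
V0 = 450.0      # invented muzzle velocity, m/s

def landing_range(elevation_deg, dt=0.01):
    """Integrate the trajectory in small time steps until the shell lands."""
    vx = V0 * math.cos(math.radians(elevation_deg))
    vy = V0 * math.sin(math.radians(elevation_deg))
    x = y = 0.0
    while not (y <= 0.0 and vy < 0.0):                  # until it comes back down
        speed = math.hypot(vx, vy)
        ax, ay = -K * speed * vx, -G - K * speed * vy   # drag opposes motion
        vx, vy = vx + ax * dt, vy + ay * dt
        x, y = x + vx * dt, y + vy * dt
    return x

def elevation_for_range(target_m, lo=1.0, hi=45.0):
    """Bisect for the elevation whose computed range matches the target."""
    for _ in range(40):
        mid = (lo + hi) / 2.0
        if landing_range(mid) < target_m:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

print(f"elevation for a 6,000 m target: {elevation_for_range(6000.0):.2f} degrees")
```

Each table entry repeats this kind of integration for a different range and set of conditions, which is why the backlog of table calculations mounted so quickly during the war.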

Everything is Change

On December 7, 1941, Japanese forces attacked the U.S. Naval base at Pearl Harbor. The U.S. was at war. Mobilization meant the army needed ever more firing tables, each of which contained about 3,000 entries. Even with the Differential Analyzer, the backlog of calculations at Aberdeen was mounting.

Eighty miles up the road from Aberdeen, the Moore School of Electrical Engineering at the University of Pennsylvania had its own differential analyzer. In the spring of 1942 a 35-year-old instructor at the school named John W. Mauchly had an idea for how to speed up calculations: construct an “electronic computor” [sic] that would use vacuum tubes in place of the mechanical components. Mauchly, a bespectacled, theoretically minded individual, probably would not have been able to build the machine on his own. But he found his complement in an energetic young researcher at the school named J. Presper (“Pres”) Eckert, who had already shown sparks of engineering genius.

A year after Mauchly made his original proposal, following various accidental and bureaucratic delays, it found its way to Lieutenant Herman Goldstine, a 30-year-old Ph.D. in mathematics from the University of Chicago who was the technical liaison officer between Aberdeen and the Moore School. Within days Goldstine got the go-ahead for the project. Construction of the ENIAC—for Electronic Numerical Integrator and Computer—began on April 9, 1943. It was Eckert’s 23rd birthday.

Many engineers had serious doubts about whether the ENIAC would ever be successful. Conventional wisdom held that the life of a vacuum tube was about 3,000 hours, and the ENIAC’s initial design called for 5,000 tubes. At that failure rate, the machine would not function for more than a few minutes before a broken tube put it out of action. Eckert, however, understood that the tubes tended to fail under the stress of being turned on or off; he knew that this was why radio stations never turned off their transmission tubes. If tubes were operated significantly below their rated voltage, they would last longer still. (The total number of tubes would grow to 18,000 by the time the machine was complete.)
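
The engineers’ worry is easy to quantify. Assuming, as a simplification added here, that tube failures are independent and occur at a constant average rate, a machine with N tubes suffers a failure roughly N times as often as a single tube:

```python
# Back-of-the-envelope reliability estimate (simplifying assumption:
# independent tube failures at a constant average rate). With a mean tube
# life of 3,000 hours, the whole machine fails, on average, every
# 3,000 / N hours.
TUBE_LIFE_HOURS = 3_000

for n_tubes in (5_000, 18_000):
    machine_mtbf_minutes = TUBE_LIFE_HOURS / n_tubes * 60
    print(f"{n_tubes:>6} tubes -> one failure roughly every "
          f"{machine_mtbf_minutes:.0f} minutes")
# 5,000 tubes  -> one failure roughly every 36 minutes
# 18,000 tubes -> one failure roughly every 10 minutes
```

Running the tubes gently, never power-cycling them and keeping them well below their rated voltage, was Eckert’s way of pushing the effective tube life far beyond that conventional 3,000-hour figure.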

Eckert and his team completed the ENIAC in two and a half years. The finished machine was an engineering tour de force, a 30-ton behemoth that consumed 150 kilowatts of power. The machine could perform 5,000 additions per second and compute a trajectory in less time than a shell took to reach its real-life target. It was also a prime example of the role that serendipity often plays in invention: although the Moore School was not then a leading computing research facility, it happened to be in the right location at the right time with the right people.

Yet the ENIAC was finished in 1945, too late to help in the war effort. It was also limited in its capabilities. It could store only up to 20 numbers at a time. Programming the machine took days and required manipulating a patchwork of cables that resembled the inside of a busy telephone exchange. Moreover, the ENIAC was designed to solve ordinary differential equations. Some challenges—notably, the calculations required for the Manhattan Project—required the solution of partial differential equations.

John von Neumann was a consultant to the Manhattan Project when he learned of the ENIAC on a visit to Aberdeen in the summer of 1944. Born in 1903 into a wealthy Hungarian banking family, von Neumann was a mathematical prodigy who tore through his education. By 23 he had become the youngest ever privatdocent (the approximate equivalent of an associate professor) at the University of Berlin. In 1930 he emigrated to the U.S., where he joined Albert Einstein and Kurt Gödel as one of the first faculty members of the Institute for Advanced Study in Princeton, N.J. He became a naturalized U.S. citizen in 1937.

Von Neumann quickly recognized the power of the machine’s computation, and in the several months after his visit to Aberdeen, he joined in meetings with Eckert, Mauchly, Goldstine and Arthur Burks—another Moore School instructor—to hammer out the design of a successor machine, the Electronic Discrete Variable Automatic Computer, or EDVAC.

The EDVAC was a huge improvement over the ENIAC. Von Neumann introduced the ideas and nomenclature of Warren McCulloch and Walter Pitts, neuroscientists who had developed a theory of the logical operations of the brain (this is where we get the term computer “memory”). He thought of the machine as being made of five core parts: Memory held not just numerical data but also the instructions for operation. An arithmetic unit performed arithmetic operations. An input “organ” enabled the transfer of programs and data into memory, and an output organ recorded the results of computation. Finally, a control unit coordinated the entire system.

This layout, or architecture, makes it possible to change the computer’s program without altering the physical structure of the machine. Programs were held in memory and could be modified in a trice. Moreover, a program could manipulate its own instructions. This feature would not only enable von Neumann to solve his partial differential equations, it would confer a powerful flexibility that forms the very heart of modern computer science.
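
A toy sketch can make that five-part layout concrete. The opcodes and memory layout below are invented for illustration and have nothing to do with the EDVAC’s actual order code; the point is only that one memory holds both program and data, a control loop fetches and decodes instructions, and an arithmetic step executes them.

```python
# Toy stored-program machine (invented opcodes, for illustration only).
LOAD, ADD, STORE, PRINT, HALT = range(5)

def run(memory):
    acc = 0                                    # accumulator (arithmetic unit)
    pc = 0                                     # program counter (control unit)
    while True:
        op, arg = memory[pc], memory[pc + 1]   # fetch from the shared memory
        pc += 2
        if op == LOAD:    acc = memory[arg]
        elif op == ADD:   acc += memory[arg]
        elif op == STORE: memory[arg] = acc
        elif op == PRINT: print(memory[arg])   # the "output organ"
        elif op == HALT:  return

# Program and data sit side by side in the same memory:
# load cell 14, add cell 15, store the sum in cell 16, print it.
memory = [LOAD, 14, ADD, 15, STORE, 16, PRINT, 16, HALT, 0,
          0, 0, 0, 0,          # unused cells
          2, 3, 0]             # data: cells 14, 15, 16
run(memory)                    # prints 5
```

Because the program lives in the same memory as the data, a STORE aimed at one of the program’s own cells would rewrite an instruction, which is the self-modifying flexibility described above.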

In June 1945 von Neumann wrote his classic First Draft of a Report on the EDVAC on behalf of the group. In spite of its unfinished status, it was rapidly circulated among the computing cognoscenti with two consequences. First, there never was a second draft. Second, von Neumann ended up with most of the credit for the invention.

Machine Evolution

The subsequent 60-year diffusion of the computer within society is a long story that has to be told in another place. Perhaps the single most remarkable development was that the computer—originally designed for mathematical calculations—turned out, with the right software, to be infinitely adaptable to different uses, from business data processing to personal computing to the construction of a global information network.

We can think of computer development as having taken place along three vectors—hardware, software and architecture. The improvements in hardware over the past 50 years are legendary. Bulky electronic tubes gave way in the late 1950s to “discrete” transistors—that is, single transistors individually soldered into place. In the mid-1960s microcircuits connected several transistors—then hundreds of transistors, then thousands of transistors—on a silicon “chip.” The microprocessor, developed in the early 1970s, held a complete computer processing unit on a chip. The microprocessor gave rise to the PC and now controls devices ranging from sprinkler systems to ballistic missiles.

The challenges of software were more subtle. In 1947 and 1948 von Neumann and Goldstine produced a series of reports called Planning and Coding of Problems for an Electronic Computing Instrument. In these reports they set down dozens of routines for mathematical computation with the expectation that some lowly “coder” would be able to effortlessly convert them into working programs. It was not to be. The process of writing programs and getting them to work was excruciatingly difficult. The first to make this discovery was Maurice Wilkes, the University of Cambridge computer scientist who had created the first practical stored-program computer. In his Memoirs, Wilkes ruefully recalled the very moment in 1949 when “the realization came over me with full force that a good part of the remainder of my life was going to be spent in finding the errors in my own programs.”

He and others at Cambridge developed a method of writing computer instructions in a symbolic form that made the whole job easier and less error prone. The computer would take this symbolic language and then convert it into binary. IBM introduced the programming language Fortran in 1957, which greatly simplified the writing of scientific and mathematical programs. At Dartmouth College in 1964, educator John G. Kemeny and computer scientist Thomas E. Kurtz invented Basic, a simple but mighty programming language intended to democratize computing and bring it to the entire undergraduate population. With Basic even schoolkids—the young Bill Gates among them—could begin to write their own programs.
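
The symbolic approach can be gestured at with a toy translator, again invented here rather than taken from any real machine’s order code: mnemonics a person can read are converted mechanically into the numbers the machine stores.

```python
# Toy assembler sketch (invented mnemonics): each symbolic line becomes a
# pair of numbers, the form in which a stored-program machine keeps it.
OPCODES = {"LOAD": 0, "ADD": 1, "STORE": 2, "PRINT": 3, "HALT": 4}

def assemble(source):
    program = []
    for line in source.strip().splitlines():
        mnemonic, _, operand = line.strip().partition(" ")
        program += [OPCODES[mnemonic], int(operand or 0)]
    return program

source = """
LOAD 14
ADD 15
STORE 16
PRINT 16
HALT
"""
print(assemble(source))   # [0, 14, 1, 15, 2, 16, 3, 16, 4, 0]
```

The output is exactly the numeric program the toy machine sketched earlier runs; languages such as Fortran and Basic pushed the same idea much further, translating whole formulas and statements instead of one instruction at a time.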

In contrast, computer architecture—that is, the logical arrangement of subsystems that make up a computer—has barely evolved. Nearly every machine in use today shares its basic architecture with the stored program computer of 1945. The situation mirrors that of the gasoline-powered automobile—the years have seen many technical refinements and efficiency improvements in both, but the basic design is largely the same. And although it is certainly possible to design a radically better device, both have achieved what historians of technology call “closure.” Investments over the decades have produced such excellent gains that no one has had a compelling reason to invest in an alternative.

Yet there are multiple possibilities for radical evolution. For example, in the 1980s interest ran high in so-called massively parallel machines, which contained thousands of computing elements operating simultaneously, designed for computationally intensive tasks such as weather forecasting and atomic weapons research. Computer scientists have also looked to the human brain for inspiration. We now know that the brain is not a general-purpose computer made from gray matter. Rather it contains specialized processing centers for different tasks, such as face recognition or speech understanding. Scientists are harnessing these ideas in “neural networks” for applications such as automobile license plate identification and iris recognition. They could be the next step in a centuries-old process: embedding the powers of the mind in the guts of a machine.

Encyclopedia Britannica

What is a computer?

A computer is a machine that can store and process information. Most computers rely on a binary system, which uses two variables, 0 and 1, to complete tasks such as storing data, calculating algorithms, and displaying information. Computers come in many different shapes and sizes, from handheld smartphones to supercomputers weighing more than 300 tons.

Who invented the computer?

Many people throughout history are credited with developing early prototypes that led to the modern computer. During World War II, physicist John Mauchly, engineer J. Presper Eckert, Jr., and their colleagues at the University of Pennsylvania designed the first programmable general-purpose electronic digital computer, the Electronic Numerical Integrator and Computer (ENIAC).

What is the most powerful computer in the world?

As of November 2021, the most powerful computer in the world was the Japanese supercomputer Fugaku, developed by RIKEN and Fujitsu. It has been used to run COVID-19 simulations.

How do programming languages work?

Popular modern programming languages, such as JavaScript and Python, support multiple programming paradigms. Functional programming, which uses mathematical functions to produce outputs from given inputs, is one of the more common ways code is used to provide instructions for a computer.
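
A small example of that functional style in Python (the functions are invented for illustration): pure functions turn inputs into outputs and are composed, rather than updating shared state step by step.

```python
from functools import reduce

def square(n: int) -> int:        # pure: same input always gives same output
    return n * n

def is_even(n: int) -> bool:
    return n % 2 == 0

numbers = range(1, 11)
sum_of_even_squares = reduce(
    lambda total, n: total + n,                   # fold the squares into a sum
    map(square, filter(is_even, numbers)),        # 4, 16, 36, 64, 100
    0,
)
print(sum_of_even_squares)                        # 220
```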

What can computers do?

The most powerful computers can perform extremely complex tasks, such as simulating nuclear weapon experiments and predicting the development of climate change. Quantum computers, machines that can handle a large number of calculations through quantum parallelism (derived from superposition), would be able to take on even more complex tasks.

Are computers conscious?

A computer’s ability to gain consciousness is a widely debated topic. Some argue that consciousness depends on self-awareness and the ability to think, which means that computers are conscious because they recognize their environment and can process data. Others believe that human consciousness can never be replicated by physical processes.

What is the impact of computer artificial intelligence (AI) on society?

Computer artificial intelligence’s impact on society is widely debated. Many argue that AI improves the quality of everyday life by doing routine and even complicated tasks better than humans can, making life simpler, safer, and more efficient. Others argue that AI poses dangerous privacy risks, exacerbates racism by standardizing people, and costs workers their jobs, leading to greater unemployment.

computer, device for processing, storing, and displaying information.

Computer once meant a person who did computations, but now the term almost universally refers to automated electronic machinery. The first section of this article focuses on modern digital electronic computers and their design, constituent parts, and applications. The second section covers the history of computing. For details on computer architecture, software, and theory, see computer science.

Computing basics

The first computers were used primarily for numerical calculations. However, as any information can be numerically encoded, people soon realized that computers are capable of general-purpose information processing. Their capacity to handle large amounts of data has extended the range and accuracy of weather forecasting. Their speed has allowed them to make decisions about routing telephone connections through a network and to control mechanical systems such as automobiles, nuclear reactors, and robotic surgical tools. They are also cheap enough to be embedded in everyday appliances and to make clothes dryers and rice cookers “smart.” Computers have allowed us to pose and answer questions that were difficult to pursue in the past. These questions might be about DNA sequences in genes, patterns of activity in a consumer market, or all the uses of a word in texts that have been stored in a database. Increasingly, computers can also learn and adapt as they operate by using processes such as machine learning.

Computers also have limitations, some of which are theoretical. For example, there are undecidable propositions whose truth cannot be determined within a given set of rules, such as the logical structure of a computer. Because no universal algorithmic method can exist to identify such propositions, a computer asked to obtain the truth of such a proposition will (unless forcibly interrupted) continue indefinitely—a condition known as the “halting problem.” (See Turing machine.) Other limitations reflect current technology. For example, although computers have progressed greatly in terms of processing data and using artificial intelligence algorithms, they are limited by their incapacity to think in a more holistic fashion. Computers may imitate humans—quite effectively, even—but imitation may not replace the human element in social interaction. Ethical concerns also limit computers, because computers rely on data, rather than a moral compass or human conscience, to make decisions.
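
The standard way to see why no universal halting test can exist is a short self-reference argument, sketched below in Python. The `halts` function is hypothetical: the argument shows that no correct implementation of it could ever be written.

```python
def halts(program, data) -> bool:
    """Hypothetical oracle that always answers correctly.
    No such general procedure can exist; this stub only marks its place."""
    raise NotImplementedError

def paradox(program):
    # Do the opposite of whatever `halts` predicts about running
    # `program` on its own source.
    if halts(program, program):
        while True:          # predicted to halt, so loop forever
            pass
    return "done"            # predicted to loop, so halt immediately

# Asking halts(paradox, paradox) has no consistent answer: if it says
# "halts," paradox loops forever; if it says "loops," paradox halts.
# The contradiction is why the halting problem is undecidable.
```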

Technician at the system console of a UNIVAC 1100/83 computer at the Fleet Analysis Center, Corona Annex, Naval Weapons Station, Seal Beach, California, June 1, 1981, with magnetic tape drives in the background.

Analog computers

Analog computers use continuous physical magnitudes to represent quantitative information. At first they represented quantities with mechanical components (see differential analyzer and integrator), but after World War II voltages were used; by the 1960s digital computers had largely replaced them. Nonetheless, analog computers, and some hybrid digital-analog systems, continued in use through the 1960s in tasks such as aircraft and spaceflight simulation.

One advantage of analog computation is that it may be relatively simple to design and build an analog computer to solve a single problem. Another advantage is that analog computers can frequently represent and solve a problem in “real time”; that is, the computation proceeds at the same rate as the system being modeled by it. Their main disadvantages are that analog representations are limited in precision—typically a few decimal places but fewer in complex mechanisms—and general-purpose devices are expensive and not easily programmed.

Digital computers

In contrast to analog computers, digital computers represent information in discrete form, generally as sequences of 0s and 1s (binary digits, or bits). The modern era of digital computers began in the late 1930s and early 1940s in the United States, Britain, and Germany. The first devices used switches operated by electromagnets (relays). Their programs were stored on punched paper tape or cards, and they had limited internal data storage.
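
A quick illustration of that discrete, binary representation, using Python’s built-in conversions (the values chosen are arbitrary):

```python
# Everything a digital computer stores is ultimately a sequence of bits.
n = 202
print(bin(n))                  # '0b11001010'  (the integer 202 as bits)
print(int("11001010", 2))      # 202           (and back again)

text = "Hi"
bits = [format(byte, "08b") for byte in text.encode("ascii")]
print(bits)                    # ['01001000', '01101001']  ('H' and 'i' as bytes)
```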

During the 1950s and ’60s, Unisys (maker of the UNIVAC computer), International Business Machines Corporation (IBM), and other companies made large, expensive computers of increasing power. They were used by major corporations and government research laboratories, typically as the sole computer in the organization. In 1959 the IBM 1401 computer rented for $8,000 per month (early IBM machines were almost always leased rather than sold), and in 1964 the largest IBM S/360 computer cost several million dollars.

These computers came to be called mainframes, though the term did not become common until smaller computers were built. Mainframe computers were characterized by having (for their time) large storage capabilities, fast components, and powerful computational abilities. They were highly reliable, and, because they frequently served vital needs in an organization, they were sometimes designed with redundant components that let them survive partial failures. Because they were complex systems, they were operated by a staff of systems programmers, who alone had access to the computer. Other users submitted “batch jobs” to be run one at a time on the mainframe.

Such systems remain important today, though they are no longer the sole, or even primary, central computing resource of an organization, which will typically have hundreds or thousands of personal computers (PCs). Mainframes now provide high-capacity data storage for Internet servers, or, through time-sharing techniques, they allow hundreds or thousands of users to run programs simultaneously. Because of their current roles, these computers are now called servers rather than mainframes.


Essay on Generation of Computer

Students are often asked to write an essay on Generation of Computer in their schools and colleges. And if you’re also looking for the same, we have created 100-word, 250-word, and 500-word essays on the topic.

Let’s take a look…

100 Words Essay on Generation of Computer

Introduction

Computers are an essential part of our lives. They have evolved over time, leading to five generations. Each generation is defined by a significant technological development.

First Generation (1940-1956)

The first generation of computers used vacuum tubes for circuitry and magnetic drums for memory. They were large, slow, expensive, and produced a lot of heat.

Second Generation (1956-1963)

Transistors replaced vacuum tubes in the second generation. Computers became smaller, faster, cheaper, more energy-efficient, and reliable.

Third Generation (1964-1971)

The third generation introduced integrated circuits, combining many transistors onto a single chip. Computers became even smaller, faster, and more reliable.

Fourth Generation (1971-Present)

The fourth generation was built on microprocessors, which placed an entire processing unit on a single chip and made personal computers possible.

Fifth Generation (Present and Beyond)

The fifth generation focuses on artificial intelligence and aims to create computers that can process natural language and are capable of learning and self-organization.

250 Words Essay on Generation of Computer

The evolution of computers has been a transformative journey. From the rudimentary first generation to the sophisticated fifth generation, computers have drastically changed, shaping society and technology along the way.

First Generation (1940-1956): Vacuum Tubes

The first generation of computers used vacuum tubes for circuitry and magnetic drums for memory. These machines were enormous, consumed massive power, and required constant cooling. However, they laid the foundation for modern computing, introducing binary code and stored programs.

Second Generation (1956-1963): Transistors

Transistors replaced vacuum tubes, leading to smaller, faster, cheaper, and more reliable computers. This generation also introduced the concept of a programming language, allowing for more complex tasks.

Third Generation (1964-1971): Integrated Circuits

The third generation saw the advent of integrated circuits, miniaturizing transistors by embedding them into silicon chips. This led to further reduction in size and cost while increasing speed and reliability. High-level programming languages like FORTRAN and COBOL were born.

Fourth Generation (1971-Present): Microprocessors

The microprocessor placed a complete processing unit on a single chip, making computers small and affordable enough to reach homes and offices as personal computers.

Fifth Generation (Present and Beyond): Artificial Intelligence

The fifth generation, still in its infancy, aims to create computers that can process natural language and have artificial intelligence capabilities. The goal is to develop machines that can understand, learn, and respond like a human.

The generational advancement of computers is a testament to human ingenuity and innovation. Each generation has brought us closer to creating machines that not only augment human capability but also possess the potential to mimic human intelligence.

500 Words Essay on Generation of Computer

The First Generation (1940-1956): Vacuum Tubes

The first generation of computers was characterized by the use of vacuum tubes. These machines were enormous, occupying entire rooms, and were prone to overheating. Their programming was done in machine language, a low-level language. Despite their size and inefficiency, these computers laid the groundwork for modern computing and marked the beginning of the digital age.

The Second Generation (1956-1963): Transistors

Transistors replaced vacuum tubes in the second generation of computers, resulting in smaller, faster, and more reliable machines. This generation also saw the introduction of assembly language, which was easier to understand and use than machine language. Computers became more accessible, leading to increased commercial use.

The Third Generation (1964-1971): Integrated Circuits

Integrated circuits packed many transistors onto a single silicon chip, shrinking computers further while making them faster, cheaper, and more reliable, and supporting high-level languages such as FORTRAN and COBOL.

The Fourth Generation (1971-Present): Microprocessors

The fourth generation of computers heralded the era of microprocessors. A single chip now contained thousands of ICs, leading to the development of personal computers. The advent of graphical user interfaces, the mouse, and the internet revolutionized the way we interact with computers. This generation also saw the rise of object-oriented programming, which has become the standard in software development.

The Fifth Generation (Present and Beyond): Artificial Intelligence

The fifth and current generation of computers is characterized by artificial intelligence (AI) and quantum computing. AI enables machines to learn and make decisions, while quantum computing promises to solve complex problems exponentially faster than classical computers. This generation aims to create computers that can understand, learn, and respond to natural language, a significant leap in human-computer interaction.

The journey of computer evolution is a testament to human ingenuity and innovation. From the colossal vacuum tube machines of the first generation to the AI-driven systems of today, each generation of computers has brought us closer to creating machines that can match, and perhaps one day surpass, human cognitive abilities. As we stand on the brink of a new era in computing, the possibilities are as exciting as they are limitless.




The Evolution of Computers

Proceedings of the 5th International Conference on Communication and Information Processing (ICCIP), 2023

Sudarshan Kakad, Vaibhav Kale, Atharv Kakare, Aditya Kalokhe, and Archana Yewale (Nutan College of Engineering and Research, Talegaon Dabhade, Pune, India)

Date Written: June 9, 2023

Abstract: The evolution of computers has been one of the most transformative journeys in human history. This review paper provides a comprehensive examination of the development of computers from their early beginnings to the present day, shedding light on the major milestones, technological breakthroughs, and their profound implications for society. The paper commences with an exploration of the origins of computers, from the mechanical calculators of the 19th century to the emergence of early electronic computers in the mid-20th century. It highlights significant contributions by pioneers such as Charles Babbage, Alan Turing, and John von Neumann, who laid the foundations for the digital computing era. The subsequent sections focus on the revolutionary advancements that have shaped the evolution of computers, including the advent of transistors and integrated circuits, which led to the development of smaller, faster, and more powerful machines. The rise of personal computers in the 1970s and 1980s democratized computing, empowering individuals and businesses alike. The review also covers the progression of computer architectures, from the mainframe to the minicomputer. Overall, this review paper provides a comprehensive overview of the evolution of computers, capturing the key milestones, technological advancements, and societal implications. It serves as a valuable resource for researchers, educators, and technology enthusiasts seeking to understand the transformative journey of computers.

Keywords: computers, evolution, history, inventions

