History of computers: A brief timeline

The history of computers began with primitive designs in the early 19th century and went on to change the world during the 20th century.


The history of computers goes back more than 200 years. First theorized by mathematicians and entrepreneurs, mechanical calculating machines were designed and built during the 19th century to solve increasingly complex number-crunching challenges. Advancing technology enabled ever more complex computers by the early 20th century, and computers became larger and more powerful.

Today, computers are almost unrecognizable from designs of the 19th century, such as Charles Babbage's Analytical Engine — or even from the huge computers of the 20th century that occupied whole rooms, such as the Electronic Numerical Integrator and Calculator.  

Here's a brief history of computers, from their primitive number-crunching origins to the powerful modern-day machines that surf the Internet, run games and stream multimedia. 

19th century

1801: Joseph Marie Jacquard, a French merchant and inventor, invents a loom that uses punched wooden cards to automatically weave fabric designs. Early computers would use similar punch cards.

1821: English mathematician Charles Babbage conceives of a steam-driven calculating machine that would be able to compute tables of numbers. Funded by the British government, the project, called the "Difference Engine," fails due to the lack of technology at the time, according to the University of Minnesota.
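The Difference Engine's underlying trick, the method of finite differences, is simple enough to sketch: an nth-degree polynomial has constant nth differences, so a whole table of its values can be produced by addition alone, which is exactly what a machine of gears and wheels can do. A minimal Python illustration of the principle (an illustration only, not a model of Babbage's mechanism):

```python
# Method of finite differences: with the initial differences of a
# polynomial in hand, every further table value needs only addition.
def difference_table(initial_diffs, steps):
    """initial_diffs: [f(0), first difference, second difference, ...]
    for a polynomial; returns the values f(0), f(1), ..., f(steps)."""
    diffs = list(initial_diffs)
    values = [diffs[0]]
    for _ in range(steps):
        # Add each difference into the one below it, lowest order first,
        # just as the engine's columns of wheels added into one another.
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
        values.append(diffs[0])
    return values

# Tabulate f(x) = x**2 + x + 41: f(0) = 41, first difference 2, second 2.
table = difference_table([41, 2, 2], 5)  # [41, 43, 47, 53, 61, 71]
```

Not one multiplication is performed, yet the table matches f(x) = x² + x + 41 exactly.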

1843: Ada Lovelace, an English mathematician and the daughter of poet Lord Byron, writes the world's first computer program. According to Anna Siffert, a professor of theoretical mathematics at the University of Münster in Germany, Lovelace writes the first program while translating a paper on Babbage's Analytical Engine from French into English. "She also provides her own comments on the text. Her annotations, simply called 'notes,' turn out to be three times as long as the actual transcript," Siffert wrote in an article for The Max Planck Society. "Lovelace also adds a step-by-step description for computation of Bernoulli numbers with Babbage's machine — basically an algorithm — which, in effect, makes her the world's first computer programmer." Bernoulli numbers are a sequence of rational numbers often used in computation.
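Bernoulli numbers can be generated from a standard recurrence, which gives a feel for the kind of step-by-step computation Lovelace described. The sketch below is a modern illustration using that recurrence, not a reconstruction of Lovelace's actual procedure for the Analytical Engine:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the Bernoulli numbers B_0 .. B_n (convention B_1 = -1/2),
    using the recurrence sum_{k=0}^{m} C(m+1, k) * B_k = 0 for m >= 1."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        # Solve the recurrence for B_m from all earlier values.
        s = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(-s / (m + 1))
    return B

# bernoulli(6) -> [1, -1/2, 1/6, 0, -1/30, 0, 1/42]
```

Exact rational arithmetic (`Fraction`) matters here: the numbers quickly develop large denominators that floating point would mangle.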

Babbage's Analytical Engine

1853: Swedish inventor Per Georg Scheutz and his son Edvard design the world's first printing calculator. The machine is significant for being the first to "compute tabular differences and print the results," according to Uta C. Merzbach's book, "Georg Scheutz and the First Printing Calculator" (Smithsonian Institution Press, 1977).

1890: Herman Hollerith designs a punch-card system to help calculate the 1890 U.S. Census. The machine saves the government several years of calculations and U.S. taxpayers approximately $5 million, according to Columbia University. Hollerith later establishes a company that will eventually become International Business Machines Corporation (IBM).

Early 20th century

1931: At the Massachusetts Institute of Technology (MIT), Vannevar Bush invents and builds the Differential Analyzer, the first large-scale automatic general-purpose mechanical analog computer, according to Stanford University.

1936: Alan Turing, a British scientist and mathematician, presents the principle of a universal machine, later called the Turing machine, in a paper called "On Computable Numbers…" according to Chris Bernhardt's book "Turing's Vision" (The MIT Press, 2017). Turing machines are capable of computing anything that is computable. The central concept of the modern computer is based on his ideas. Turing is later involved in the development of the Turing-Welchman Bombe, an electro-mechanical device designed to decipher Nazi codes during World War II, according to the UK's National Museum of Computing.
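A Turing machine is simple enough to simulate in a few lines: a tape, a read/write head, and a table of rules. The sketch below uses our own rule format (state, symbol) → (write, move, next state); the two-rule machine at the end flips every bit of its input and halts:

```python
def run_turing_machine(rules, tape, state="start", blank="_", max_steps=1000):
    """rules: {(state, symbol): (write, move, next_state)}; move is -1, 0 or +1."""
    tape = dict(enumerate(tape))  # sparse tape: cell index -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        write, move, state = rules[(state, symbol)]
        tape[head] = write
        head += move
    return "".join(tape[i] for i in sorted(tape)).strip(blank)

# A two-rule machine that inverts every bit, then halts at the first blank.
flip = {
    ("start", "0"): ("1", 1, "start"),
    ("start", "1"): ("0", 1, "start"),
    ("start", "_"): ("_", 0, "halt"),
}
```

Calling `run_turing_machine(flip, "1011")` yields `"0100"`. The deeper point of Turing's paper is that one such machine, given the right rule table on its tape, can simulate any other.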

1937: John Vincent Atanasoff, a professor of physics and mathematics at Iowa State University, submits a grant proposal to build the first electric-only computer, without using gears, cams, belts or shafts.

original garage where Bill Hewlett and Dave Packard started their business

1939: David Packard and Bill Hewlett found the Hewlett-Packard Company in Palo Alto, California. The pair decide the name of their new company by the toss of a coin, and Hewlett-Packard's first headquarters are in Packard's garage, according to MIT.

1941: German inventor and engineer Konrad Zuse completes his Z3 machine, the world's earliest digital computer, according to Gerard O'Regan's book "A Brief History of Computing" (Springer, 2021). The machine is destroyed during a bombing raid on Berlin during World War II. Zuse flees the German capital as the war draws to a close and later releases the world's first commercial digital computer, the Z4, in 1950, according to O'Regan.

1941: Atanasoff and his graduate student, Clifford Berry, design the first digital electronic computer in the U.S., called the Atanasoff-Berry Computer (ABC). This marks the first time a computer is able to store information in its main memory; the machine is capable of performing one operation every 15 seconds, according to the book "Birthing the Computer" (Cambridge Scholars Publishing, 2016).

1945: Two professors at the University of Pennsylvania, John Mauchly and J. Presper Eckert, design and build the Electronic Numerical Integrator and Calculator (ENIAC). The machine is the first "automatic, general-purpose, electronic, decimal, digital computer," according to Edwin D. Reilly's book "Milestones in Computer Science and Information Technology" (Greenwood Press, 2003). 

Computer technicians operating the ENIAC

1946: Mauchly and Eckert leave the University of Pennsylvania and receive funding from the Census Bureau to build the UNIVAC, the first commercial computer for business and government applications.

1947: William Shockley, John Bardeen and Walter Brattain of Bell Laboratories invent the transistor. They discover how to make an electric switch with solid materials and without the need for a vacuum.

1949: A team at the University of Cambridge develops the Electronic Delay Storage Automatic Calculator (EDSAC), "the first practical stored-program computer," according to O'Regan. "EDSAC ran its first program in May 1949 when it calculated a table of squares and a list of prime numbers," O'Regan wrote. In November 1949, scientists with the Council for Scientific and Industrial Research (CSIR), now called CSIRO, build Australia's first digital computer, the Council for Scientific and Industrial Research Automatic Computer (CSIRAC). CSIRAC is the first digital computer in the world to play music, according to O'Regan.
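As a measure of how far hardware has come, the two computations EDSAC first ran fit in a few lines of modern Python (our code, of course, not EDSAC's order code):

```python
def squares(n):
    """A table of squares, the first computation EDSAC ever ran."""
    return [x * x for x in range(1, n + 1)]

def primes_up_to(n):
    """A list of primes via the sieve of Eratosthenes,
    an algorithm far older than EDSAC itself."""
    sieve = [True] * (n + 1)
    sieve[0:2] = [False, False]
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            for multiple in range(p * p, n + 1, p):
                sieve[multiple] = False
    return [p for p, is_prime in enumerate(sieve) if is_prime]
```

`squares(5)` gives `[1, 4, 9, 16, 25]`; `primes_up_to(20)` gives `[2, 3, 5, 7, 11, 13, 17, 19]`.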

Late 20th century

1953: Grace Hopper develops the first computer language, which eventually becomes known as COBOL, an acronym for COmmon Business-Oriented Language, according to the National Museum of American History. Hopper is later dubbed the "First Lady of Software" in her posthumous Presidential Medal of Freedom citation. Thomas Johnson Watson Jr., son of IBM CEO Thomas Johnson Watson Sr., conceives the IBM 701 EDPM to help the United Nations keep tabs on Korea during the Korean War.

1954: John Backus and his team of programmers at IBM publish a paper describing their newly created FORTRAN programming language, an acronym for FORmula TRANslation, according to MIT.

1958: Jack Kilby and Robert Noyce independently invent the integrated circuit, commonly known as the computer chip. Kilby is later awarded the Nobel Prize in Physics for his work.

1968: Douglas Engelbart reveals a prototype of the modern computer at the Fall Joint Computer Conference in San Francisco. His presentation, called "A Research Center for Augmenting Human Intellect," includes a live demonstration of his computer, including a mouse and a graphical user interface (GUI), according to the Doug Engelbart Institute. This marks the evolution of the computer from a specialized machine for academics to a technology that is more accessible to the general public.

The first computer mouse, invented in 1963 by Douglas C. Engelbart

1969: Ken Thompson, Dennis Ritchie and a group of other developers at Bell Labs produce UNIX, an operating system that made "large-scale networking of diverse computing systems — and the internet — practical," according to Bell Labs. The team behind UNIX continued to develop the operating system using the C programming language, which they also created.

1970: The newly formed Intel unveils the Intel 1103, the first Dynamic Random-Access Memory (DRAM) chip.

1971: A team of IBM engineers led by Alan Shugart invents the "floppy disk," enabling data to be shared among different computers.

1972: Ralph Baer, a German-American engineer, releases the Magnavox Odyssey, the world's first home game console, in September 1972, according to the Computer Museum of America. Months later, entrepreneur Nolan Bushnell and engineer Al Alcorn of Atari release Pong, the world's first commercially successful video game.

1973: Robert Metcalfe, a member of the research staff for Xerox, develops Ethernet for connecting multiple computers and other hardware.

1975: The cover of the January issue of "Popular Electronics" highlights the Altair 8800 as the "world's first minicomputer kit to rival commercial models." After seeing the magazine issue, two "computer geeks," Paul Allen and Bill Gates, offer to write software for the Altair, using the new BASIC language. On April 4, after the success of this first endeavor, the two childhood friends form their own software company, Microsoft.

1976: Steve Jobs and Steve Wozniak co-found Apple Computer on April Fool's Day. They unveil the Apple I, the first computer with a single-circuit board and ROM (Read Only Memory), according to MIT.

Apple I computer 1976

1977: The Commodore Personal Electronic Transactor (PET) is released onto the home computer market, featuring an MOS Technology 8-bit 6502 microprocessor, which controls the screen, keyboard and cassette player. The PET is especially successful in the education market, according to O'Regan.

1977: Radio Shack begins its initial production run of 3,000 TRS-80 Model 1 computers — disparagingly known as the "Trash 80" — priced at $599, according to the National Museum of American History. Within a year, the company takes 250,000 orders for the computer, according to the book "How TRS-80 Enthusiasts Helped Spark the PC Revolution" (The Seeker Books, 2007).

1977: The first West Coast Computer Faire is held in San Francisco. Jobs and Wozniak present the Apple II computer at the Faire; it includes color graphics and features an audio cassette drive for storage.

1978: VisiCalc, the first computerized spreadsheet program, is introduced.

1979: MicroPro International, founded by software engineer Seymour Rubenstein, releases WordStar, the world's first commercially successful word processor. WordStar is programmed by Rob Barnaby, and includes 137,000 lines of code, according to Matthew G. Kirschenbaum's book "Track Changes: A Literary History of Word Processing" (Harvard University Press, 2016).

1981: "Acorn," IBM's first personal computer, is released onto the market at a price point of $1,565, according to IBM. Acorn uses the MS-DOS operating system from Microsoft. Optional features include a display, printer, two diskette drives, extra memory, a game adapter and more.

A worker using an Acorn computer by IBM, 1981

1983: The Apple Lisa, standing for "Local Integrated Software Architecture" but also the name of Steve Jobs' daughter, according to the National Museum of American History (NMAH), is the first personal computer to feature a GUI. The machine also includes a drop-down menu and icons. Also this year, the Gavilan SC is released and is the first portable computer with a flip-form design and the very first to be sold as a "laptop."

1984: The Apple Macintosh is announced to the world during a Super Bowl advertisement. The Macintosh is launched with a retail price of $2,500, according to the NMAH.

1985: As a response to the Apple Lisa's GUI, Microsoft releases Windows in November 1985, the Guardian reported. Meanwhile, Commodore announces the Amiga 1000.

1989: Tim Berners-Lee, a British researcher at the European Organization for Nuclear Research (CERN), submits his proposal for what would become the World Wide Web. His paper details his ideas for Hyper Text Markup Language (HTML), the building blocks of the Web.

1993: The Pentium microprocessor advances the use of graphics and music on PCs.

1996: Sergey Brin and Larry Page develop the Google search engine at Stanford University.

1997: Microsoft invests $150 million in Apple, which at the time is struggling financially.  This investment ends an ongoing court case in which Apple accused Microsoft of copying its operating system. 

1999: Wi-Fi, a term often expanded as "wireless fidelity," is developed, initially covering a distance of up to 300 feet (91 meters), Wired reported.

21st century

2001: Mac OS X, later renamed OS X and then simply macOS, is released by Apple as the successor to its standard Mac operating system. OS X goes through 16 versions, each carrying "10" in its title, and the first nine are nicknamed after big cats, beginning with "Cheetah," TechRadar reported.

2003: AMD's Athlon 64, the first 64-bit processor for personal computers, is released to customers. 

2004: The Mozilla Corporation launches Mozilla Firefox 1.0. The Web browser is one of the first major challengers to Microsoft's Internet Explorer. During its first five years, Firefox exceeds a billion downloads by users, according to the Web Design Museum.

2005: Google buys Android, a Linux-based mobile phone operating system.

2006: The MacBook Pro from Apple hits the shelves. The Pro is the company's first Intel-based, dual-core mobile computer. 

2009: Microsoft launches Windows 7 on July 22. The new operating system features the ability to pin applications to the taskbar, minimize other windows by shaking one, easy-to-access jump lists, easier previews of taskbar tiles and more, TechRadar reported.

Apple CEO Steve Jobs holds the iPad during the launch of Apple's new tablet computing device in San Francisco

2010: The iPad, Apple's flagship handheld tablet, is unveiled.

2011: Google releases the Chromebook, which runs on Google Chrome OS.

2015: Apple releases the Apple Watch. Microsoft releases Windows 10.

2016: The first reprogrammable quantum computer is created. "Until now, there hasn't been any quantum-computing platform that had the capability to program new algorithms into their system. They're usually each tailored to attack a particular algorithm," said study lead author Shantanu Debnath, a quantum physicist and optical engineer at the University of Maryland, College Park.

2017: The Defense Advanced Research Projects Agency (DARPA) is developing a new "Molecular Informatics" program that uses molecules as computers. "Chemistry offers a rich set of properties that we may be able to harness for rapid, scalable information storage and processing," Anne Fischer, program manager in DARPA's Defense Sciences Office, said in a statement. "Millions of molecules exist, and each molecule has a unique three-dimensional atomic structure as well as variables such as shape, size, or even color. This richness provides a vast design space for exploring novel and multi-value ways to encode and process data beyond the 0s and 1s of current logic-based, digital architectures."

2019: A team at Google becomes the first to demonstrate quantum supremacy, creating a quantum computer that could feasibly outperform the most powerful classical computer, albeit on a very specific problem with no practical real-world application. The team describes the computer, dubbed "Sycamore," in a paper that same year in the journal Nature. Achieving quantum advantage, in which a quantum computer solves a problem with real-world applications faster than the most powerful classical computer, is still a ways off.

2022: Frontier, the first exascale supercomputer and the world's fastest, goes online at the Oak Ridge Leadership Computing Facility (OLCF) in Tennessee. Built by Hewlett Packard Enterprise (HPE) at a cost of $600 million, Frontier uses nearly 10,000 AMD EPYC 64-core CPUs alongside nearly 40,000 AMD Instinct MI250X GPUs. The machine ushers in the era of exascale computing, which refers to systems that can perform more than one exaFLOP, or a quintillion floating-point operations per second. Frontier is currently the only machine capable of such performance, and it is being used as a tool to aid scientific discovery.

What is the first computer in history?

Charles Babbage's Difference Engine, designed in the 1820s, is considered the first "mechanical" computer in history, according to the Science Museum in the U.K. Operated by turning a hand crank, the machine calculated a series of values and printed the results in a table.

What are the five generations of computing?

The "five generations of computing" is a framework for assessing the entire history of computing and the key technological advancements throughout it. 

The first generation, spanning the 1940s to the 1950s, covered vacuum tube-based machines. The second incorporated transistor-based computing from the 1950s to the 1960s. In the 1960s and '70s, the third generation gave rise to integrated circuit-based computing. We are now between the fourth and fifth generations of computing: microprocessor-based and AI-based computing.

What is the most powerful computer in the world?

As of November 2023, the most powerful computer in the world is the Frontier supercomputer. The machine, which can reach a performance level of up to 1.102 exaFLOPS, ushered in the age of exascale computing in 2022 when it went online at Tennessee's Oak Ridge Leadership Computing Facility (OLCF).

There is, however, a potentially more powerful supercomputer waiting in the wings: the Aurora supercomputer, housed at the Argonne National Laboratory (ANL) outside of Chicago. Aurora went online in November 2023 but is not yet complete; at 585.34 petaFLOPS, it currently delivers roughly half of Frontier's performance. When work is finished, the supercomputer is expected to exceed 2 exaFLOPS.

What was the first killer app?

Killer apps are widely understood to be applications so compelling that they drive adoption of the technology they run on. There have been many over the years, from Word for Windows in 1989 to iTunes in 2001 to social media apps such as WhatsApp more recently.

Several pieces of software may stake a claim to being the first killer app, but there is broad consensus that VisiCalc, a spreadsheet program created by Dan Bricklin and Bob Frankston and originally released for the Apple II in 1979, holds that title. Steve Jobs even credited the app with propelling the Apple II to the success it became, according to co-creator Dan Bricklin.

Additional resources

  • Fortune: A Look Back At 40 Years of Apple
  • The New Yorker: The First Windows
  • "A Brief History of Computing" by Gerard O'Regan (Springer, 2021)


Timothy is Editor in Chief of print and digital magazines All About History and History of War . He has previously worked on sister magazine All About Space , as well as photography and creative brands including Digital Photographer and 3D Artist . He has also written for How It Works magazine, several history bookazines and has a degree in English Literature from Bath Spa University . 


Encyclopedia Britannica


History of computing


A computer might be described with deceptive simplicity as “an apparatus that performs routine calculations automatically.” Such a definition would owe its deceptiveness to a naive and narrow view of calculation as a strictly mathematical process. In fact, calculation underlies many activities that are not normally thought of as mathematical. Walking across a room, for instance, requires many complex, albeit subconscious, calculations. Computers, too, have proved capable of solving a vast array of problems, from balancing a checkbook to even—in the form of guidance systems for robots—walking across a room.

Before the true power of computing could be realized, therefore, the naive view of calculation had to be overcome. The inventors who labored to bring the computer into the world had to learn that the thing they were inventing was not just a number cruncher, not merely a calculator. For example, they had to learn that it was not necessary to invent a new computer for every new calculation and that a computer could be designed to solve numerous problems, even problems not yet imagined when the computer was built. They also had to learn how to tell such a general problem-solving computer what problem to solve. In other words, they had to invent programming.

They had to solve all the heady problems of developing such a device, of implementing the design, of actually building the thing. The history of the solving of these problems is the history of the computer. That history is covered in this section, and links are provided to entries on many of the individuals and companies mentioned. In addition, see the articles computer science and supercomputer .

Early history

Computer precursors

The earliest known calculating device is probably the abacus. It dates back at least to 1100 BCE and is still in use today, particularly in Asia. Now, as then, it typically consists of a rectangular frame with thin parallel rods strung with beads. Long before any systematic positional notation was adopted for the writing of numbers, the abacus assigned different units, or weights, to each rod. This scheme allowed a wide range of numbers to be represented by just a few beads and, together with the invention of zero in India, may have inspired the invention of the Hindu-Arabic number system. In any case, abacus beads can be readily manipulated to perform the common arithmetical operations—addition, subtraction, multiplication, and division—that are useful for commercial transactions and in bookkeeping.

The abacus is a digital device; that is, it represents values discretely. A bead is either in one predefined position or another, representing unambiguously, say, one or zero.

Calculating devices took a different turn when John Napier, a Scottish mathematician, published his discovery of logarithms in 1614. As any person can attest, adding two 10-digit numbers is much simpler than multiplying them together, and the transformation of a multiplication problem into an addition problem is exactly what logarithms enable. This simplification is possible because of the following logarithmic property: the logarithm of the product of two numbers is equal to the sum of the logarithms of the numbers. By 1624, tables with 14 significant digits were available for the logarithms of numbers from 1 to 20,000, and scientists quickly adopted the new labor-saving tool for tedious astronomical calculations.
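That property, log(xy) = log x + log y, is easy to see in action. The sketch below trades one multiplication for an addition plus two "table lookups" (here, library calls), which is exactly the bargain Napier's tables offered:

```python
import math

def multiply_via_logs(x, y):
    """Multiply two positive numbers using only logarithms and addition,
    the labor-saving bargain behind Napier's tables and the slide rule."""
    return math.exp(math.log(x) + math.log(y))

# multiply_via_logs(1234.0, 5678.0) equals 7,006,652 to within floating-point error.
```

A slide rule does the same thing physically: its scales are marked logarithmically, so sliding one against the other adds lengths, and adding lengths multiplies numbers.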

Most significant for the development of computing, the transformation of multiplication into addition greatly simplified the possibility of mechanization. Analog calculating devices based on Napier's logarithms—representing digital values with analogous physical lengths—soon appeared. In 1620 Edmund Gunter, the English mathematician who coined the terms cosine and cotangent, built a device for performing navigational calculations: the Gunter scale, or, as navigators simply called it, the gunter. About 1632 an English clergyman and mathematician named William Oughtred built the first slide rule, drawing on Napier's ideas. That first slide rule was circular, but Oughtred also built the first rectangular one in 1633. The analog devices of Gunter and Oughtred had various advantages and disadvantages compared with digital devices such as the abacus. What is important is that the consequences of these design decisions were being tested in the real world.


In 1623 the German astronomer and mathematician Wilhelm Schickard built the first calculator. He described it in a letter to his friend the astronomer Johannes Kepler, and in 1624 he wrote again to explain that a machine he had commissioned to be built for Kepler was, apparently along with the prototype, destroyed in a fire. He called it a Calculating Clock, which modern engineers have been able to reproduce from details in his letters. Even general knowledge of the clock had been temporarily lost when Schickard and his entire family perished during the Thirty Years' War.

But Schickard may not have been the true inventor of the calculator. A century earlier, Leonardo da Vinci sketched plans for a calculator that were sufficiently complete and correct for modern engineers to build a calculator on their basis.


The first calculator or adding machine to be produced in any quantity and actually used was the Pascaline, or Arithmetic Machine, designed and built by the French mathematician-philosopher Blaise Pascal between 1642 and 1644. It could only do addition and subtraction, with numbers being entered by manipulating its dials. Pascal invented the machine for his father, a tax collector, so it was the first business machine too (if one does not count the abacus). He built 50 of them over the next 10 years.


In 1671 the German mathematician-philosopher Gottfried Wilhelm von Leibniz designed a calculating machine called the Step Reckoner. (It was first built in 1673.) The Step Reckoner expanded on Pascal's ideas and did multiplication by repeated addition and shifting.
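"Repeated addition and shifting" can be captured in a few lines. In the conceptual sketch below (our illustration, not a model of Leibniz's stepped-drum mechanism), multiplying by ten stands in for the Step Reckoner's movable carriage:

```python
def multiply(a, b):
    """Multiply non-negative integers by repeated addition and shifting."""
    total = 0
    shifted = a
    while b:
        for _ in range(b % 10):  # add the shifted operand once per digit value
            total += shifted
        b //= 10
        shifted *= 10            # the "shift": move one decimal place left
    return total
```

For 1234 × 56, the machine adds 1234 six times, shifts, then adds 12340 five times: 69,104 from additions alone.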

Leibniz was a strong advocate of the binary number system. Binary numbers are ideal for machines because they require only two digits, which can easily be represented by the on and off states of a switch. When computers became electronic, the binary system was particularly appropriate because an electrical circuit is either on or off. This meant that on could represent true, off could represent false, and the flow of current would directly represent the flow of logic.
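The switch analogy is concrete: any whole number can be held in a row of on/off switches. A small sketch:

```python
def to_switches(n):
    """Represent a non-negative integer as a row of on/off switch states
    (bits), most significant first."""
    bits = []
    while n:
        bits.append(n % 2)  # is the lowest-value switch on or off?
        n //= 2
    return bits[::-1] or [0]

# Thirteen needs only four switches: 8 + 4 + 0 + 1 -> [1, 1, 0, 1]
```

Each additional switch doubles the range that can be represented, which is why a modest row of circuits can hold enormous numbers.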

Leibniz was prescient in seeing the appropriateness of the binary system in calculating machines, but his machine did not use it. Instead, the Step Reckoner represented numbers in decimal form, as positions on 10-position dials. Even decimal representation was not a given: in 1668 Samuel Morland invented an adding machine specialized for British money—a decidedly nondecimal system.

Pascal’s, Leibniz’s, and Morland’s devices were curiosities, but with the Industrial Revolution of the 18th century came a widespread need to perform repetitive operations efficiently. With other activities being mechanized, why not calculation? In 1820 Charles Xavier Thomas de Colmar of France effectively met this challenge when he built his Arithmometer, the first commercial mass-produced calculating device. It could perform addition, subtraction, multiplication, and, with some more elaborate user involvement, division. Based on Leibniz’s technology, it was extremely popular and sold for 90 years. In contrast to the modern calculator’s credit-card size, the Arithmometer was large enough to cover a desktop.

Calculators such as the Arithmometer remained a fascination after 1820, and their potential for commercial use was well understood. Many other mechanical devices built during the 19th century also performed repetitive functions more or less automatically, but few had any application to computing. There was one major exception: the Jacquard loom, invented in 1804–05 by a French weaver, Joseph-Marie Jacquard.


The Jacquard loom was a marvel of the Industrial Revolution. A textile-weaving loom, it could also be called the first practical information-processing device. The loom worked by tugging various-colored threads into patterns by means of an array of rods. By inserting a card punched with holes, an operator could control the motion of the rods and thereby alter the pattern of the weave. Moreover, the loom was equipped with a card-reading device that slipped a new card from a pre-punched deck into place every time the shuttle was thrown, so that complex weaving patterns could be automated.

What was extraordinary about the device was that it transferred the design process from a labor-intensive weaving stage to a card-punching stage. Once the cards had been punched and assembled, the design was complete, and the loom implemented the design automatically. The Jacquard loom, therefore, could be said to be programmed for different patterns by these decks of punched cards.

For those intent on mechanizing calculations, the Jacquard loom provided important lessons: the sequence of operations that a machine performs could be controlled to make the machine do something quite different; a punched card could be used as a medium for directing the machine; and, most important, a device could be directed to perform different tasks by feeding it instructions in a sort of language—i.e., making the machine programmable.
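These lessons can be made concrete with a toy "loom" in software, in which the deck of punched cards (here just tuples of 0s and 1s) entirely determines the woven pattern; swapping decks reprograms the machine. This is a hypothetical sketch: the function and the hole notation are invented for illustration.

```python
# Each "card" is a row of holes: 1 = hole, so the rod lifts its thread.
# The deck, not the loom, encodes the design.
def weave(deck):
    rows = []
    for card in deck:    # the card reader advances one card per throw
        rows.append(''.join('#' if hole else '.' for hole in card))
    return rows

diamond = [
    (0, 1, 0),
    (1, 0, 1),
    (0, 1, 0),
]
for row in weave(diamond):
    print(row)
# .#.
# #.#
# .#.
```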

It is not too great a stretch to say that, in the Jacquard loom, programming was invented before the computer. The close relationship between the device and the program became apparent some 20 years later, with Charles Babbage’s invention of the first computer.


The Modern History of Computing

Historically, computers were human clerks who calculated in accordance with effective methods. These human computers did the sorts of calculation nowadays carried out by electronic computers, and many thousands of them were employed in commerce, government, and research establishments. The term computing machine , used increasingly from the 1920s, refers to any machine that does the work of a human computer, i.e., any machine that calculates in accordance with effective methods. During the late 1940s and early 1950s, with the advent of electronic computing machines, the phrase ‘computing machine’ gradually gave way simply to ‘computer’, initially usually with the prefix ‘electronic’ or ‘digital’. This entry surveys the history of these machines.



Charles Babbage was Lucasian Professor of Mathematics at Cambridge University from 1828 to 1839 (a post formerly held by Isaac Newton). Babbage's proposed Difference Engine was a special-purpose digital computing machine for the automatic production of mathematical tables (such as logarithm tables, tide tables, and astronomical tables). The Difference Engine consisted entirely of mechanical components — brass gear wheels, rods, ratchets, pinions, etc. Numbers were represented in the decimal system by the positions of 10-toothed metal wheels mounted in columns. Babbage exhibited a small working model in 1822. He never completed the full-scale machine that he had designed but did complete several fragments. The largest — one ninth of the complete calculator — is on display in the London Science Museum. Babbage used it to perform serious computational work, calculating various mathematical tables. In 1991, Babbage's Difference Engine No. 2 was finally built from Babbage's designs and is also on display at the London Science Museum.
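The Difference Engine exploited the fact that a polynomial of degree n has constant nth differences, so once the initial differences are set up, an entire table can be produced by addition alone. A sketch of the method (the function and its interface are ours; x² + x + 41 is a favourite demonstration example for difference engines):

```python
def tabulate(poly, n):
    """Tabulate the first n values of a polynomial the Difference
    Engine's way: seed the difference registers, then use nothing
    but addition (an illustrative sketch)."""
    def f(x):
        return sum(c * x**k for k, c in enumerate(poly))
    degree = len(poly) - 1
    # Seed: f(0), f(1), ..., reduced to forward differences at 0.
    col = [f(i) for i in range(degree + 1)]
    regs = []
    while col:
        regs.append(col[0])
        col = [col[i + 1] - col[i] for i in range(len(col) - 1)]
    # From here on, one "turn of the crank" is additions only.
    values = []
    for _ in range(n):
        values.append(regs[0])
        for k in range(degree):
            regs[k] += regs[k + 1]
    return values

# f(x) = x^2 + x + 41, coefficients given lowest power first
print(tabulate([41, 1, 1], 6))   # [41, 43, 47, 53, 61, 71]
```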

The Swedes Georg and Edvard Scheutz (father and son) constructed a modified version of Babbage's Difference Engine. Three were made, a prototype and two commercial models, one of these being sold to an observatory in Albany, New York, and the other to the Registrar-General's office in London, where it calculated and printed actuarial tables.

Babbage's proposed Analytical Engine, considerably more ambitious than the Difference Engine, was to have been a general-purpose mechanical digital computer. The Analytical Engine was to have had a memory store and a central processing unit (or ‘mill’) and would have been able to select from among alternative actions consequent upon the outcome of its previous actions (a facility nowadays known as conditional branching). The behaviour of the Analytical Engine would have been controlled by a program of instructions contained on punched cards connected together with ribbons (an idea that Babbage had adopted from the Jacquard weaving loom). Babbage emphasised the generality of the Analytical Engine, saying ‘the conditions which enable a finite machine to make calculations of unlimited extent are fulfilled in the Analytical Engine’ (Babbage [1994], p. 97).

Babbage worked closely with Ada Lovelace, daughter of the poet Byron, after whom the modern programming language Ada is named. Lovelace foresaw the possibility of using the Analytical Engine for non-numeric computation, suggesting that the Engine might even be capable of composing elaborate pieces of music.

A large model of the Analytical Engine was under construction at the time of Babbage's death in 1871 but a full-scale version was never built. Babbage's idea of a general-purpose calculating engine was never forgotten, especially at Cambridge, and was on occasion a lively topic of mealtime discussion at the war-time headquarters of the Government Code and Cypher School, Bletchley Park, Buckinghamshire, birthplace of the electronic digital computer.

Analog computers

The earliest computing machines in wide use were not digital but analog. In analog representation, properties of the representational medium ape (or reflect or model) properties of the represented state-of-affairs. (In obvious contrast, the strings of binary digits employed in digital representation do not represent by means of possessing some physical property — such as length — whose magnitude varies in proportion to the magnitude of the property that is being represented.) Analog representations form a diverse class. Some examples: the longer a line on a road map, the longer the road that the line represents; the greater the number of clear plastic squares in an architect's model, the greater the number of windows in the building represented; the higher the pitch of an acoustic depth meter, the shallower the water. In analog computers, numerical quantities are represented by, for example, the angle of rotation of a shaft or a difference in electrical potential. Thus the output voltage of the machine at a time might represent the momentary speed of the object being modelled.

As the case of the architect's model makes plain, analog representation may be discrete in nature (there is no such thing as a fractional number of windows). Among computer scientists, the term ‘analog’ is sometimes used narrowly, to indicate representation of one continuously-valued quantity by another (e.g., speed by voltage). As Brian Cantwell Smith has remarked:

‘Analog’ should … be a predicate on a representation whose structure corresponds to that of which it represents … That continuous representations should historically have come to be called analog presumably betrays the recognition that, at the levels at which it matters to us, the world is more foundationally continuous than it is discrete. (Smith [1991], p. 271)

James Thomson, brother of Lord Kelvin, invented the mechanical wheel-and-disc integrator that became the foundation of analog computation (Thomson [1876]). The two brothers constructed a device for computing the integral of the product of two given functions, and Kelvin described (although did not construct) general-purpose analog machines for integrating linear differential equations of any order and for solving simultaneous linear equations. Kelvin's most successful analog computer was his tide predicting machine, which remained in use at the port of Liverpool until the 1960s. Mechanical analog devices based on the wheel-and-disc integrator were in use during World War I for gunnery calculations. Following the war, the design of the integrator was considerably improved by Hannibal Ford (Ford [1919]).

Stanley Fifer reports that the first semi-automatic mechanical analog computer was built in England by the Manchester firm of Metropolitan Vickers prior to 1930 (Fifer [1961], p. 29); however, I have so far been unable to verify this claim. In 1931, Vannevar Bush, working at MIT, built the differential analyser, the first large-scale automatic general-purpose mechanical analog computer. Bush's design was based on the wheel and disc integrator. Soon copies of his machine were in use around the world (including, at Cambridge and Manchester Universities in England, differential analysers built out of kit-set Meccano, the once popular engineering toy).

It required a skilled mechanic equipped with a lead hammer to set up Bush's mechanical differential analyser for each new job. Subsequently, Bush and his colleagues replaced the wheel-and-disc integrators and other mechanical components by electromechanical, and finally by electronic, devices.

A differential analyser may be conceptualised as a collection of ‘black boxes’ connected together in such a way as to allow considerable feedback. Each box performs a fundamental process, for example addition, multiplication of a variable by a constant, and integration. In setting up the machine for a given task, boxes are connected together so that the desired set of fundamental processes is executed. In the case of electrical machines, this was done typically by plugging wires into sockets on a patch panel (computing machines whose function is determined in this way are referred to as ‘program-controlled’).
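The arrangement of boxes with feedback can be mimicked in software: to solve dy/dt = -y, the integrator's negated output is fed back as its own input. In this crude sketch, Euler steps stand in for the wheel-and-disc integrator, and all names are ours.

```python
import math

def integrate(deriv, y0, t_end, dt=1e-4):
    """One "integrator box": accumulates its input signal over time."""
    y, t = y0, 0.0
    while t < t_end:
        y += deriv(y) * dt   # the box adds up increments of its input
        t += dt
    return y

# Feedback wiring: the integrator's input is the negation of its output,
# so y(t) approximates exp(-t) with y(0) = 1.
y1 = integrate(lambda y: -y, 1.0, 1.0)
print(round(y1, 3), round(math.exp(-1), 3))   # both round to 0.368
```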

Since all the boxes work in parallel, an electronic differential analyser solves sets of equations very quickly. Against this has to be set the cost of massaging the problem to be solved into the form demanded by the analog machine, and of setting up the hardware to perform the desired computation. A major drawback of analog computation is the higher cost, relative to digital machines, of an increase in precision. During the 1960s and 1970s, there was considerable interest in ‘hybrid’ machines, where an analog section is controlled by and programmed via a digital section. However, such machines are now a rarity.

The universal Turing machine

In 1936, at Cambridge University, Turing invented the principle of the modern computer. He described an abstract digital computing machine consisting of a limitless memory and a scanner that moves back and forth through the memory, symbol by symbol, reading what it finds and writing further symbols (Turing [1936]). The actions of the scanner are dictated by a program of instructions that is stored in the memory in the form of symbols. This is Turing's stored-program concept, and implicit in it is the possibility of the machine operating on and modifying its own program. (In London in 1947, in the course of what was, so far as is known, the earliest public lecture to mention computer intelligence, Turing said, ‘What we want is a machine that can learn from experience’, adding that the ‘possibility of letting the machine alter its own instructions provides the mechanism for this’ (Turing [1947] p. 393).) Turing's computing machine of 1936 is now known simply as the universal Turing machine. Cambridge mathematician Max Newman remarked that right from the start Turing was interested in the possibility of actually building a computing machine of the sort that he had described (Newman in interview with Christopher Evans in Evans [197?]).
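Turing's scanner and its table of instructions are easy to simulate. The sketch below runs a small, hypothetical machine that increments a binary number on the tape: each rule maps (state, scanned symbol) to (symbol to write, move, next state). The rule table and function are invented for illustration.

```python
def run(tape, rules, state='right', pos=0):
    """Minimal Turing-machine scanner moving over an unbounded tape."""
    tape = dict(enumerate(tape))
    while state != 'halt':
        sym = tape.get(pos, '_')             # '_' is a blank square
        write, move, state = rules[(state, sym)]
        tape[pos] = write
        pos += 1 if move == 'R' else -1
    return ''.join(tape[i] for i in sorted(tape)).strip('_')

rules = {
    ('right', '0'): ('0', 'R', 'right'),     # scan right to the end
    ('right', '1'): ('1', 'R', 'right'),
    ('right', '_'): ('_', 'L', 'carry'),     # past the end: turn back
    ('carry', '1'): ('0', 'L', 'carry'),     # 1 + carry = 0, carry on
    ('carry', '0'): ('1', 'L', 'halt'),      # absorb the carry
    ('carry', '_'): ('1', 'L', 'halt'),      # overflow: new leading 1
}
print(run('1011', rules))   # 1100
```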

From the start of the Second World War Turing was a leading cryptanalyst at the Government Code and Cypher School, Bletchley Park. Here he became familiar with Thomas Flowers' work involving large-scale high-speed electronic switching (described below). However, Turing could not turn to the project of building an electronic stored-program computing machine until the cessation of hostilities in Europe in 1945.

During the wartime years Turing did give considerable thought to the question of machine intelligence. Colleagues at Bletchley Park recall numerous off-duty discussions with him on the topic, and at one point Turing circulated a typewritten report (now lost) setting out some of his ideas. One of these colleagues, Donald Michie (who later founded the Department of Machine Intelligence and Perception at the University of Edinburgh), remembers Turing talking often about the possibility of computing machines (1) learning from experience and (2) solving problems by means of searching through the space of possible solutions, guided by rule-of-thumb principles (Michie in interview with Copeland, 1995). The modern term for the latter idea is ‘heuristic search’, a heuristic being any rule-of-thumb principle that cuts down the amount of searching required in order to find a solution to a problem. At Bletchley Park Turing illustrated his ideas on machine intelligence by reference to chess. Michie recalls Turing experimenting with heuristics that later became common in chess programming (in particular minimax and best-first).
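The minimax idea Turing experimented with can be shown on a toy game tree, with leaves standing for heuristic evaluations of positions (an illustrative sketch; the tree and its scores are invented):

```python
# Minimax on a toy game tree: internal nodes are lists of children,
# leaves are heuristic scores for the maximising player.
def minimax(node, maximising=True):
    if isinstance(node, int):       # leaf: heuristic evaluation
        return node
    scores = [minimax(child, not maximising) for child in node]
    return max(scores) if maximising else min(scores)

# Three moves for the maximiser, each answered by the minimiser.
tree = [[3, 5], [2, 9], [0, 7]]
print(minimax(tree))   # 3: the best outcome the maximiser can force
```

A heuristic such as best-first ordering would prune this search by exploring promising branches first; here the tiny tree is searched exhaustively.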

Further information about Turing and the computer, including his wartime work on codebreaking and his thinking about artificial intelligence and artificial life, can be found in Copeland 2004.

Electromechanical versus electronic computation

With some exceptions — including Babbage's purely mechanical engines, and the finger-powered National Accounting Machine — early digital computing machines were electromechanical. That is to say, their basic components were small, electrically-driven, mechanical switches called ‘relays’. These operate relatively slowly, whereas the basic components of an electronic computer — originally vacuum tubes (valves) — have no moving parts save electrons and so operate extremely fast. Electromechanical digital computing machines were built before and during the second world war by (among others) Howard Aiken at Harvard University, George Stibitz at Bell Telephone Laboratories, Turing at Princeton University and Bletchley Park, and Konrad Zuse in Berlin. To Zuse belongs the honour of having built the first working general-purpose program-controlled digital computer. This machine, later called the Z3, was functioning in 1941. (A program-controlled computer, as opposed to a stored-program computer, is set up for a new task by re-routing wires, by means of plugs etc.)

Relays were too slow and unreliable a medium for large-scale general-purpose digital computation (although Aiken made a valiant effort). It was the development of high-speed digital techniques using vacuum tubes that made the modern computer possible.

The earliest extensive use of vacuum tubes for digital data-processing appears to have been by the engineer Thomas Flowers, working in London at the British Post Office Research Station at Dollis Hill. Electronic equipment designed by Flowers in 1934, for controlling the connections between telephone exchanges, went into operation in 1939, and involved between three and four thousand vacuum tubes running continuously. In 1938–1939 Flowers worked on an experimental electronic digital data-processing system, involving a high-speed data store. Flowers' aim, achieved after the war, was that electronic equipment should replace existing, less reliable, systems built from relays and used in telephone exchanges. Flowers did not investigate the idea of using electronic equipment for numerical calculation, but has remarked that at the outbreak of war with Germany in 1939 he was possibly the only person in Britain who realized that vacuum tubes could be used on a large scale for high-speed digital computation. (See Copeland 2006 for more information on Flowers' work.)

The earliest comparable use of vacuum tubes in the U.S. seems to have been by John Atanasoff at what was then Iowa State College (now University). During the period 1937–1942 Atanasoff developed techniques for using vacuum tubes to perform numerical calculations digitally. In 1939, with the assistance of his student Clifford Berry, Atanasoff began building what is sometimes called the Atanasoff-Berry Computer, or ABC, a small-scale special-purpose electronic digital machine for the solution of systems of linear algebraic equations. The machine contained approximately 300 vacuum tubes. Although the electronic part of the machine functioned successfully, the computer as a whole never worked reliably, errors being introduced by the unsatisfactory binary card-reader. Work was discontinued in 1942 when Atanasoff left Iowa State.

The first fully functioning electronic digital computer was Colossus, used by the Bletchley Park cryptanalysts from February 1944.

From very early in the war the Government Code and Cypher School (GC&CS) was successfully deciphering German radio communications encoded by means of the Enigma system, and by early 1942 about 39,000 intercepted messages were being decoded each month, thanks to electromechanical machines known as ‘bombes’. These were designed by Turing and Gordon Welchman (building on earlier work by Polish cryptanalysts).

During the second half of 1941, messages encoded by means of a totally different method began to be intercepted. This new cipher machine, code-named ‘Tunny’ by Bletchley Park, was broken in April 1942 and current traffic was read for the first time in July of that year. Based on binary teleprinter code, Tunny was used in preference to Morse-based Enigma for the encryption of high-level signals, for example messages from Hitler and members of the German High Command.

The need to decipher this vital intelligence as rapidly as possible led Max Newman to propose in November 1942 (shortly after his recruitment to GC&CS from Cambridge University) that key parts of the decryption process be automated, by means of high-speed electronic counting devices. The first machine designed and built to Newman's specification, known as the Heath Robinson, was relay-based with electronic circuits for counting. (The electronic counters were designed by C.E. Wynn-Williams, who had been using thyratron tubes in counting circuits at the Cavendish Laboratory, Cambridge, since 1932 [Wynn-Williams 1932].) Installed in June 1943, Heath Robinson was unreliable and slow, and its high-speed paper tapes were continually breaking, but it proved the worth of Newman's idea. Flowers recommended that an all-electronic machine be built instead, but he received no official encouragement from GC&CS. Working independently at the Post Office Research Station at Dollis Hill, Flowers quietly got on with constructing the world's first large-scale programmable electronic digital computer. Colossus I was delivered to Bletchley Park in January 1944.

By the end of the war there were ten Colossi working round the clock at Bletchley Park. From a cryptanalytic viewpoint, a major difference between the prototype Colossus I and the later machines was the addition of the so-called Special Attachment, following a key discovery by cryptanalysts Donald Michie and Jack Good. This broadened the function of Colossus from ‘wheel setting’ — i.e., determining the settings of the encoding wheels of the Tunny machine for a particular message, given the ‘patterns’ of the wheels — to ‘wheel breaking’, i.e., determining the wheel patterns themselves. The wheel patterns were eventually changed daily by the Germans on each of the numerous links between the German Army High Command and Army Group commanders in the field. By 1945 there were as many as 30 links in total. About ten of these were broken and read regularly.

Colossus I contained approximately 1600 vacuum tubes and each of the subsequent machines approximately 2400 vacuum tubes. Like the smaller ABC, Colossus lacked two important features of modern computers. First, it had no internally stored programs. To set it up for a new task, the operator had to alter the machine's physical wiring, using plugs and switches. Second, Colossus was not a general-purpose machine, being designed for a specific cryptanalytic task involving counting and Boolean operations.

F.H. Hinsley, official historian of GC&CS, has estimated that the war in Europe was shortened by at least two years as a result of the signals intelligence operation carried out at Bletchley Park, in which Colossus played a major role. Most of the Colossi were destroyed once hostilities ceased. Some of the electronic panels ended up at Newman's Computing Machine Laboratory in Manchester (see below), all trace of their original use having been removed. Two Colossi were retained by GC&CS (renamed GCHQ following the end of the war). The last Colossus is believed to have stopped running in 1960.

Those who knew of Colossus were prohibited by the Official Secrets Act from sharing their knowledge. Until the 1970s, few had any idea that electronic computation had been used successfully during the second world war. In 1970 and 1975, respectively, Good and Michie published notes giving the barest outlines of Colossus. By 1983, Flowers had received clearance from the British Government to publish a partial account of the hardware of Colossus I. Details of the later machines and of the Special Attachment, the uses to which the Colossi were put, and the cryptanalytic algorithms that they ran, have only recently been declassified. (For the full account of Colossus and the attack on Tunny see Copeland 2006.)

To those acquainted with the universal Turing machine of 1936, and the associated stored-program concept, Flowers' racks of digital electronic equipment were proof of the feasibility of using large numbers of vacuum tubes to implement a high-speed general-purpose stored-program computer. The war over, Newman lost no time in establishing the Royal Society Computing Machine Laboratory at Manchester University for precisely that purpose. A few months after his arrival at Manchester, Newman wrote as follows to the Princeton mathematician John von Neumann (February 1946):

I am … hoping to embark on a computing machine section here, having got very interested in electronic devices of this kind during the last two or three years. By about eighteen months ago I had decided to try my hand at starting up a machine unit when I got out. … I am of course in close touch with Turing.

Turing's Automatic Computing Engine

Turing and Newman were thinking along similar lines. In 1945 Turing joined the National Physical Laboratory (NPL) in London, his brief to design and develop an electronic stored-program digital computer for scientific work. (Artificial Intelligence was not far from Turing's thoughts: he described himself as ‘building a brain’ and remarked in a letter that he was ‘more interested in the possibility of producing models of the action of the brain than in the practical applications to computing’.) John Womersley, Turing's immediate superior at NPL, christened Turing's proposed machine the Automatic Computing Engine, or ACE, in homage to Babbage's Difference Engine and Analytical Engine.

Turing's 1945 report ‘Proposed Electronic Calculator’ gave the first relatively complete specification of an electronic stored-program general-purpose digital computer. The report is reprinted in full in Copeland 2005.

The first electronic stored-program digital computer to be proposed in the U.S. was the EDVAC (see below). The ‘First Draft of a Report on the EDVAC’ (May 1945), composed by von Neumann, contained little engineering detail, in particular concerning electronic hardware (owing to restrictions in the U.S.). Turing's ‘Proposed Electronic Calculator’, on the other hand, supplied detailed circuit designs and specifications of hardware units, specimen programs in machine code, and even an estimate of the cost of building the machine (£11,200). ACE and EDVAC differed fundamentally from one another; for example, ACE employed distributed processing, while EDVAC had a centralised structure.

Turing saw that speed and memory were the keys to computing. Turing's colleague at NPL, Jim Wilkinson, observed that Turing ‘was obsessed with the idea of speed on the machine’ [Copeland 2005, p. 2]. Turing's design had much in common with today's RISC architectures and it called for a high-speed memory of roughly the same capacity as an early Macintosh computer (enormous by the standards of his day). Had Turing's ACE been built as planned it would have been in a different league from the other early computers. However, progress on Turing's Automatic Computing Engine ran slowly, due to organisational difficulties at NPL, and in 1948 a ‘very fed up’ Turing (Robin Gandy's description, in interview with Copeland, 1995) left NPL for Newman's Computing Machine Laboratory at Manchester University. It was not until May 1950 that a small pilot model of the Automatic Computing Engine, built by Wilkinson, Edward Newman, Mike Woodger, and others, first executed a program. With an operating speed of 1 MHz, the Pilot Model ACE was for some time the fastest computer in the world.

Sales of DEUCE, the production version of the Pilot Model ACE, were buoyant — confounding the suggestion, made in 1946 by the Director of the NPL, Sir Charles Darwin, that ‘it is very possible that … one machine would suffice to solve all the problems that are demanded of it from the whole country’ [Copeland 2005, p. 4]. The fundamentals of Turing's ACE design were employed by Harry Huskey (at Wayne State University, Detroit) in the Bendix G15 computer (Huskey in interview with Copeland, 1998). The G15 was arguably the first personal computer; over 400 were sold worldwide. DEUCE and the G15 remained in use until about 1970. Another computer deriving from Turing's ACE design, the MOSAIC, played a role in Britain's air defences during the Cold War period; other derivatives include the Packard-Bell PB250 (1961). (More information about these early computers is given in [Copeland 2005].)

The Manchester machine

The earliest general-purpose stored-program electronic digital computer to work was built in Newman's Computing Machine Laboratory at Manchester University. The Manchester ‘Baby’, as it became known, was constructed by the engineers F.C. Williams and Tom Kilburn, and performed its first calculation on 21 June 1948. The tiny program, stored on the face of a cathode ray tube, was just seventeen instructions long. A much enlarged version of the machine, with a programming system designed by Turing, became the world's first commercially available computer, the Ferranti Mark I. The first to be completed was installed at Manchester University in February 1951; in all about ten were sold, in Britain, Canada, Holland and Italy.

The fundamental logico-mathematical contributions by Turing and Newman to the triumph at Manchester have been neglected, and the Manchester machine is nowadays remembered as the work of Williams and Kilburn. Indeed, Newman's role in the development of computers has never been sufficiently emphasised (due perhaps to his thoroughly self-effacing way of relating the relevant events).

It was Newman who, in a lecture in Cambridge in 1935, introduced Turing to the concept that led directly to the Turing machine: Newman defined a constructive process as one that a machine can carry out (Newman in interview with Evans, op. cit.). As a result of his knowledge of Turing's work, Newman became interested in the possibilities of computing machinery in, as he put it, ‘a rather theoretical way’. It was not until Newman joined GC&CS in 1942 that his interest in computing machinery suddenly became practical, with his realisation that the attack on Tunny could be mechanised. During the building of Colossus, Newman tried to interest Flowers in Turing's 1936 paper — birthplace of the stored-program concept — but Flowers did not make much of Turing's arcane notation. There is no doubt that by 1943, Newman had firmly in mind the idea of using electronic technology in order to construct a stored-program general-purpose digital computing machine.

In July of 1946 (the month in which the Royal Society approved Newman's application for funds to found the Computing Machine Laboratory), Freddie Williams, working at the Telecommunications Research Establishment, Malvern, began the series of experiments on cathode ray tube storage that was to lead to the Williams tube memory. Williams, until then a radar engineer, explains how it was that he came to be working on the problem of computer memory:

[O]nce [the German Armies] collapsed … nobody was going to care a toss about radar, and people like me … were going to be in the soup unless we found something else to do. And computers were in the air. Knowing absolutely nothing about them I latched onto the problem of storage and tackled that. (Quoted in Bennett 1976.)

Newman learned of Williams' work, and with the able help of Patrick Blackett, Langworthy Professor of Physics at Manchester and one of the most powerful figures in the University, was instrumental in the appointment of the 35 year old Williams to the recently vacated Chair of Electro-Technics at Manchester. (Both were members of the appointing committee (Kilburn in interview with Copeland, 1997).) Williams immediately had Kilburn, his assistant at Malvern, seconded to Manchester. To take up the story in Williams' own words:

[N]either Tom Kilburn nor I knew the first thing about computers when we arrived in Manchester University. We'd had enough explained to us to understand what the problem of storage was and what we wanted to store, and that we'd achieved, so the point now had been reached when we'd got to find out about computers … Newman explained the whole business of how a computer works to us. (F.C. Williams in interview with Evans [1976])

Elsewhere Williams is explicit concerning Turing's role and gives something of the flavour of the explanation that he and Kilburn received:

Tom Kilburn and I knew nothing about computers, but a lot about circuits. Professor Newman and Mr A.M. Turing … knew a lot about computers and substantially nothing about electronics. They took us by the hand and explained how numbers could live in houses with addresses and how if they did they could be kept track of during a calculation. (Williams [1975], p. 328)

It seems that Newman must have used much the same words with Williams and Kilburn as he did in an address to the Royal Society on 4th March 1948:

Professor Hartree … has recalled that all the essential ideas of the general-purpose calculating machines now being made are to be found in Babbage's plans for his analytical engine. In modern times the idea of a universal calculating machine was independently introduced by Turing … [T]he machines now being made in America and in this country … [are] in certain general respects … all similar. There is provision for storing numbers, say in the scale of 2, so that each number appears as a row of, say, forty 0's and 1's in certain places or "houses" in the machine. … Certain of these numbers, or "words" are read, one after another, as orders. In one possible type of machine an order consists of four numbers, for example 11, 13, 27, 4. The number 4 signifies "add", and when control shifts to this word the "houses" H11 and H13 will be connected to the adder as inputs, and H27 as output. The numbers stored in H11 and H13 pass through the adder, are added, and the sum is passed on to H27. The control then shifts to the next order. In most real machines the process just described would be done by three separate orders, the first bringing [H11] (=content of H11) to a central accumulator, the second adding [H13] into the accumulator, and the third sending the result to H27; thus only one address would be required in each order. … A machine with storage, with this automatic-telephone-exchange arrangement and with the necessary adders, subtractors and so on, is, in a sense, already a universal machine. (Newman [1948], pp. 271–272)

Following this explanation of Turing's three-address concept (source 1, source 2, destination, function) Newman went on to describe program storage (‘the orders shall be in a series of houses X1, X2, …’) and conditional branching. He then summed up:

From this highly simplified account it emerges that the essential internal parts of the machine are, first, a storage for numbers (which may also be orders). … Secondly, adders, multipliers, etc. Thirdly, an "automatic telephone exchange" for selecting "houses", connecting them to the arithmetic organ, and writing the answers in other prescribed houses. Finally, means of moving control at any stage to any chosen order, if a certain condition is satisfied, otherwise passing to the next order in the normal sequence. Besides these there must be ways of setting up the machine at the outset, and extracting the final answer in useable form. (Newman [1948], pp. 273–4)
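Newman's example order "11, 13, 27, 4" can be re-expressed as a short sketch. The three-address format and the opcode 4 for "add" come from the quotation above; the Python representation and the sample values are purely illustrative:

```python
# Toy interpreter for the three-address order format in Newman's
# example: an order (a, b, c, f) connects houses Ha and Hb to the
# arithmetic organ and writes the result into house Hc.
ADD = 4  # Newman: "The number 4 signifies 'add'"

def run(store, orders):
    for a, b, c, f in orders:
        if f == ADD:
            store[c] = store[a] + store[b]  # [Ha] + [Hb] -> Hc
    return store

store = {11: 20, 13: 22, 27: 0}
run(store, [(11, 13, 27, ADD)])   # Newman's order "11, 13, 27, 4"
print(store[27])                  # 42
```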

In a letter written in 1972 Williams described in some detail what he and Kilburn were told by Newman:

About the middle of the year [1946] the possibility of an appointment at Manchester University arose and I had a talk with Professor Newman who was already interested in the possibility of developing computers and had acquired a grant from the Royal Society of £30,000 for this purpose. Since he understood computers and I understood electronics the possibilities of fruitful collaboration were obvious. I remember Newman giving us a few lectures in which he outlined the organisation of a computer in terms of numbers being identified by the address of the house in which they were placed and in terms of numbers being transferred from this address, one at a time, to an accumulator where each entering number was added to what was already there. At any time the number in the accumulator could be transferred back to an assigned address in the store and the accumulator cleared for further use. The transfers were to be effected by a stored program in which a list of instructions was obeyed sequentially. Ordered progress through the list could be interrupted by a test instruction which examined the sign of the number in the accumulator. Thereafter operation started from a new point in the list of instructions. This was the first information I received about the organisation of computers. … Our first computer was the simplest embodiment of these principles, with the sole difference that it used a subtracting rather than an adding accumulator. (Letter from Williams to Randell, 1972; in Randell [1972], p. 9)
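The single-address scheme Williams recalls — an accumulator, transfers to and from addressed houses, and a sign test that redirects control — can be sketched as a toy interpreter. The mnemonics are invented for illustration and are not the machine's own order code; only the subtracting accumulator and the sign test are taken from Williams's account:

```python
# Toy single-address machine in the style Williams describes: a
# subtracting accumulator, transfers to and from addressed "houses",
# and a sign test.  Mnemonics are illustrative, not historical.
def run(program, store):
    acc, pc = 0, 0
    while pc < len(program):
        op, addr = program[pc]
        pc += 1
        if op == "LDN":                  # load store[addr], negated
            acc = -store[addr]
        elif op == "SUB":                # subtracting accumulator
            acc -= store[addr]
        elif op == "STO":                # accumulator back to the store
            store[addr] = acc
        elif op == "SKN" and acc < 0:    # test sign, skip next order
            pc += 1
        elif op == "JMP":                # resume from a new point
            pc = addr
        elif op == "STOP":
            break
    return store

# Addition done with nothing but subtraction: -(-a - b) = a + b.
store = {0: 20, 1: 22, 2: 0}
run([("LDN", 0), ("SUB", 1), ("STO", 2),
     ("LDN", 2), ("STO", 2), ("STOP", 0)], store)
print(store[2])  # 42
```

The last two orders simply negate the accumulated -(a + b), which is how a machine with only a subtracting accumulator can still add.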

Turing's early input to the developments at Manchester, hinted at by Williams in his above-quoted reference to Turing, may have been via the lectures on computer design that Turing and Wilkinson gave in London during the period December 1946 to February 1947 (Turing and Wilkinson [1946–7]). The lectures were attended by representatives of various organisations planning to use or build an electronic computer. Kilburn was in the audience (Bowker and Giordano [1993]). (Kilburn usually said, when asked from where he obtained his basic knowledge of the computer, that he could not remember (letter from Brian Napper to Copeland, 2002); for example, in a 1992 interview he said: ‘Between early 1945 and early 1947, in that period, somehow or other I knew what a digital computer was … Where I got this knowledge from I've no idea’ (Bowker and Giordano [1993], p. 19).)

Whatever role Turing's lectures may have played in informing Kilburn, there is little doubt that credit for the Manchester computer — called the ‘Newman-Williams machine’ in a contemporary document (Huskey [1947]) — belongs not only to Williams and Kilburn but also to Newman, and that the influence on Newman of Turing's 1936 paper was crucial, as was the influence of Flowers' Colossus.

The first working AI program, a draughts (checkers) player written by Christopher Strachey, ran on the Ferranti Mark I in the Manchester Computing Machine Laboratory. Strachey (at the time a teacher at Harrow School and an amateur programmer) wrote the program with Turing's encouragement and utilising the latter's recently completed Programmers' Handbook for the Ferranti. (Strachey later became Director of the Programming Research Group at Oxford University.) By the summer of 1952, the program could, Strachey reported, ‘play a complete game of draughts at a reasonable speed’. (Strachey's program formed the basis for Arthur Samuel's well-known checkers program.) The first chess-playing program, also, was written for the Manchester Ferranti, by Dietrich Prinz; the program first ran in November 1951. Designed for solving simple problems of the mate-in-two variety, the program would examine every possible move until a solution was found. Turing started to program his ‘Turochamp’ chess-player on the Ferranti Mark I, but never completed the task. Unlike Prinz's program, the Turochamp could play a complete game (when hand-simulated) and operated not by exhaustive search but under the guidance of heuristics.

The first fully functioning electronic digital computer to be built in the U.S. was ENIAC, constructed at the Moore School of Electrical Engineering, University of Pennsylvania, for the Army Ordnance Department, by J. Presper Eckert and John Mauchly. Completed in 1945, ENIAC was somewhat similar to the earlier Colossus, but considerably larger and more flexible (although far from general-purpose). The primary function for which ENIAC was designed was the calculation of tables used in aiming artillery. ENIAC was not a stored-program computer, and setting it up for a new job involved reconfiguring the machine by means of plugs and switches. For many years, ENIAC was believed to have been the first functioning electronic digital computer, Colossus being unknown to all but a few.

In 1944, John von Neumann joined the ENIAC group. He had become ‘intrigued’ (Goldstine's word, [1972], p. 275) with Turing's universal machine while Turing was at Princeton University during 1936–1938. At the Moore School, von Neumann emphasised the importance of the stored-program concept for electronic computing, including the possibility of allowing the machine to modify its own program in useful ways while running (for example, in order to control loops and branching). Turing's paper of 1936 (‘On Computable Numbers, with an Application to the Entscheidungsproblem’) was required reading for members of von Neumann's post-war computer project at the Institute for Advanced Study, Princeton University (letter from Julian Bigelow to Copeland, 2002; see also Copeland [2004], p. 23). Eckert appears to have realised independently, and prior to von Neumann's joining the ENIAC group, that the way to take full advantage of the speed at which data is processed by electronic circuits is to place suitably encoded instructions for controlling the processing in the same high-speed storage devices that hold the data itself (documented in Copeland [2004], pp. 26–7). In 1945, while ENIAC was still under construction, von Neumann produced a draft report, mentioned previously, setting out the ENIAC group's ideas for an electronic stored-program general-purpose digital computer, the EDVAC (von Neumann [1945]). The EDVAC was completed six years later, but not by its originators, who left the Moore School to build computers elsewhere. Lectures held at the Moore School in 1946 on the proposed EDVAC were widely attended and contributed greatly to the dissemination of the new ideas.

Von Neumann was a prestigious figure and he made the concept of a high-speed stored-program digital computer widely known through his writings and public addresses. As a result of his high profile in the field, it became customary, although historically inappropriate, to refer to electronic stored-program digital computers as ‘von Neumann machines’.

The Los Alamos physicist Stanley Frankel, responsible with von Neumann and others for mechanising the large-scale calculations involved in the design of the atomic bomb, has described von Neumann's view of the importance of Turing's 1936 paper, in a letter:

I know that in or about 1943 or ‘44 von Neumann was well aware of the fundamental importance of Turing's paper of 1936 … Von Neumann introduced me to that paper and at his urging I studied it with care. Many people have acclaimed von Neumann as the "father of the computer" (in a modern sense of the term) but I am sure that he would never have made that mistake himself. He might well be called the midwife, perhaps, but he firmly emphasized to me, and to others I am sure, that the fundamental conception is owing to Turing, in so far as not anticipated by Babbage … Both Turing and von Neumann, of course, also made substantial contributions to the "reduction to practice" of these concepts but I would not regard these as comparable in importance with the introduction and explication of the concept of a computer able to store in its memory its program of activities and of modifying that program in the course of these activities. (Quoted in Randell [1972], p. 10)

Other notable early stored-program electronic digital computers were:

  • EDSAC, 1949, built at Cambridge University by Maurice Wilkes
  • BINAC, 1949, built by Eckert's and Mauchly's Electronic Control Co., Philadelphia (opinions differ over whether BINAC ever actually worked)
  • Whirlwind I, 1949, Digital Computer Laboratory, Massachusetts Institute of Technology, Jay Forrester
  • SEAC, 1950, US Bureau of Standards Eastern Division, Washington D.C., Samuel Alexander, Ralph Slutz
  • SWAC, 1950, US Bureau of Standards Western Division, Institute for Numerical Analysis, University of California at Los Angeles, Harry Huskey
  • UNIVAC, 1951, Eckert-Mauchly Computer Corporation, Philadelphia (the first computer to be available commercially in the U.S.)
  • the IAS computer, 1952, Institute for Advanced Study, Princeton University, Julian Bigelow, Arthur Burks, Herman Goldstine, von Neumann, and others (thanks to von Neumann's publishing the specifications of the IAS machine, it became the model for a group of computers known as the Princeton Class machines; the IAS computer was also a strong influence on the IBM 701)
  • IBM 701, 1952, International Business Machines' first mass-produced electronic stored-program computer.

The EDVAC and ACE proposals both advocated the use of mercury-filled tubes, called ‘delay lines’, for high-speed internal memory. This form of memory is known as acoustic memory. Delay lines had initially been developed for echo cancellation in radar; the idea of using them as memory devices originated with Eckert at the Moore School. Here is Turing's description:

It is proposed to build "delay line" units consisting of mercury … tubes about 5′ long and 1″ in diameter in contact with a quartz crystal at each end. The velocity of sound in … mercury … is such that the delay will be 1.024 ms. The information to be stored may be considered to be a sequence of 1024 ‘digits’ (0 or 1) … These digits will be represented by a corresponding sequence of pulses. The digit 0 … will be represented by the absence of a pulse at the appropriate time, the digit 1 … by its presence. This series of pulses is impressed on the end of the line by one piezo-crystal, it is transmitted down the line in the form of supersonic waves, and is reconverted into a varying voltage by the crystal at the far end. This voltage is amplified sufficiently to give an output of the order of 10 volts peak to peak and is used to gate a standard pulse generated by the clock. This pulse may be again fed into the line by means of the transmitting crystal, or we may feed in some altogether different signal. We also have the possibility of leading the gated pulse to some other part of the calculator, if we have need of that information at the time. Making use of the information does not of course preclude keeping it also. (Turing [1945], p. 375)

Mercury delay line memory was used in EDSAC, BINAC, SEAC, Pilot Model ACE, EDVAC, DEUCE, and full-scale ACE (1958). The chief advantage of the delay line as a memory medium was, as Turing put it, that delay lines were "already a going concern" (Turing [1947], p. 380). The fundamental disadvantages of the delay line were that random access is impossible and, moreover, the time taken for an instruction, or number, to emerge from a delay line depends on where in the line it happens to be.
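Turing's description above amounts to a circulating shift register: one digit emerges per clock period and is either regenerated or overwritten. A minimal model (the 1024-digit capacity is taken from the quotation; everything else is illustrative):

```python
from collections import deque

# Toy model of a mercury delay line: a fixed train of digits
# circulates, one digit emerging per clock period; the emerging
# digit is normally regenerated (re-fed into the line) but can be
# replaced by a new one.
class DelayLine:
    def __init__(self, bits):
        self.line = deque(bits)

    def tick(self, write=None):
        out = self.line.popleft()                          # digit at the far end
        self.line.append(out if write is None else write)  # regenerate or replace
        return out

line = DelayLine([0] * 1024)      # Turing's 1024-digit line
line.tick(write=1)                # impress a pulse at the near end
for _ in range(1023):
    line.tick()                   # no random access: wait a full circulation
print(line.tick())                # the stored 1 re-emerges: prints 1
```

The waiting loop makes the delay line's fundamental disadvantage concrete: a stored digit can only be read when its turn comes round again.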

In order to minimize waiting-time, Turing arranged for instructions to be stored not in consecutive positions in the delay line, but in relative positions selected by the programmer in such a way that each instruction would emerge at exactly the time it was required, in so far as this was possible. Each instruction contained a specification of the location of the next. This system subsequently became known as ‘optimum coding’. It was an integral feature of every version of the ACE design. Optimum coding made for difficult and untidy programming, but the advantage in terms of speed was considerable. Thanks to optimum coding, the Pilot Model ACE was able to do a floating point multiplication in 3 milliseconds (Wilkes's EDSAC required 4.5 milliseconds to perform a single fixed point multiplication).
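The cost that optimum coding avoided can be shown with a little arithmetic. Assuming, purely for illustration, a line holding 32 words and an instruction that takes 4 word-times to execute:

```python
# Illustrative figures only: a delay line holding WORDS words, and an
# instruction taking EXEC word-times to execute before the next one
# is wanted.
WORDS = 32
EXEC = 4

def wait(now, target):
    """Word-times until the word at position `target` emerges."""
    return (target - now) % WORDS

p = 5  # position of the current instruction in the line
# Consecutive placement: position p + 1 has already gone past by the
# time execution finishes, so control waits almost a full circulation.
print(wait(p + EXEC, p + 1))               # 29 word-times wasted
# Optimum coding: the programmer places the next instruction at the
# position that emerges just as execution finishes.
print(wait(p + EXEC, (p + EXEC) % WORDS))  # 0
```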

In the Williams tube or electrostatic memory, previously mentioned, a two-dimensional rectangular array of binary digits was stored on the face of a commercially-available cathode ray tube. Access to data was immediate. Williams tube memories were employed in the Manchester series of machines, SWAC, the IAS computer, and the IBM 701, and a modified form of Williams tube in Whirlwind I (until replacement by magnetic core in 1953).

Drum memories, in which data was stored magnetically on the surface of a metal cylinder, were developed on both sides of the Atlantic. The initial idea appears to have been Eckert's. The drum provided reasonably large quantities of medium-speed memory and was used to supplement a high-speed acoustic or electrostatic memory. In 1949, the Manchester computer was successfully equipped with a drum memory; this was constructed by the Manchester engineers on the model of a drum developed by Andrew Booth at Birkbeck College, London.

The final major event in the early history of electronic computation was the development of magnetic core memory. Jay Forrester realised that the hysteresis properties of magnetic core (normally used in transformers) lent themselves to the implementation of a three-dimensional solid array of randomly accessible storage points. In 1949, at the Massachusetts Institute of Technology, he began to investigate this idea empirically. Forrester's early experiments with metallic core soon led him to develop the superior ferrite core memory. MIT's Digital Computer Laboratory undertook to build a computer similar to the Whirlwind I as a test vehicle for a ferrite core memory. The Memory Test Computer was completed in 1953. (This computer was used in 1954 for the first simulations of neural networks, by Belmont Farley and Wesley Clark of MIT's Lincoln Laboratory; see Copeland and Proudfoot [1996].)

Once the absolute reliability, relative cheapness, high capacity and permanent life of ferrite core memory became apparent, core soon replaced other forms of high-speed memory. The IBM 704 and 705 computers (announced in May and October 1954, respectively) brought core memory into wide use.

Works Cited

  • Babbage, C. (ed. by Campbell-Kelly, M.), 1994, Passages from the Life of a Philosopher, New Brunswick: Rutgers University Press
  • Bennett, S., 1976, ‘F.C. Williams: his contribution to the development of automatic control’, National Archive for the History of Computing, University of Manchester, England. (This is a typescript based on interviews with Williams in 1976.)
  • Bowker, G., and Giordano, R., 1993, ‘Interview with Tom Kilburn’, Annals of the History of Computing, 15: 17–32
  • Copeland, B.J. (ed.), 2004, The Essential Turing, Oxford: Oxford University Press
  • Copeland, B.J. (ed.), 2005, Alan Turing's Automatic Computing Engine: The Master Codebreaker's Struggle to Build the Modern Computer, Oxford: Oxford University Press
  • Copeland, B.J. and others, 2006, Colossus: The Secrets of Bletchley Park's Codebreaking Computers, Oxford: Oxford University Press
  • Copeland, B.J., and Proudfoot, D., 1996, ‘On Alan Turing's Anticipation of Connectionism’, Synthese, 108: 361–377
  • Evans, C., 197?, interview with M.H.A. Newman in ‘The Pioneers of Computing: an Oral History of Computing’, London: Science Museum
  • Fifer, S., 1961, Analog Computation: Theory, Techniques, Applications, New York: McGraw-Hill
  • Ford, H., 1919, ‘Mechanical Movement’, Official Gazette of the United States Patent Office, October 7, 1919: 48
  • Goldstine, H., 1972, The Computer from Pascal to von Neumann, Princeton: Princeton University Press
  • Huskey, H.D., 1947, ‘The State of the Art in Electronic Digital Computing in Britain and the United States’, in Copeland [2005]
  • Newman, M.H.A., 1948, ‘General Principles of the Design of All-Purpose Computing Machines’, Proceedings of the Royal Society of London, series A, 195: 271–274
  • Randell, B., 1972, ‘On Alan Turing and the Origins of Digital Computers’, in Meltzer, B., and Michie, D. (eds), Machine Intelligence 7, Edinburgh: Edinburgh University Press
  • Smith, B.C., 1991, ‘The Owl and the Electric Encyclopaedia’, Artificial Intelligence, 47: 251–288
  • Thomson, J., 1876, ‘On an Integrating Machine Having a New Kinematic Principle’, Proceedings of the Royal Society of London, 24: 262–265
  • Turing, A.M., 1936, ‘On Computable Numbers, with an Application to the Entscheidungsproblem’, Proceedings of the London Mathematical Society, Series 2, 42 (1936–37): 230–265; reprinted in The Essential Turing (Copeland [2004])
  • Turing, A.M., 1945, ‘Proposed Electronic Calculator’, in Alan Turing's Automatic Computing Engine (Copeland [2005])
  • Turing, A.M., 1947, ‘Lecture on the Automatic Computing Engine’, in The Essential Turing (Copeland [2004])
  • Turing, A.M., and Wilkinson, J.H., 1946–7, ‘The Turing-Wilkinson Lecture Series (1946–7)’, in Alan Turing's Automatic Computing Engine (Copeland [2005])
  • von Neumann, J., 1945, ‘First Draft of a Report on the EDVAC’, in Stern, N., From ENIAC to UNIVAC: An Appraisal of the Eckert-Mauchly Computers, Bedford, Mass.: Digital Press (1981), pp. 181–246
  • Williams, F.C., 1975, ‘Early Computers at Manchester University’, The Radio and Electronic Engineer, 45: 327–331
  • Wynn-Williams, C.E., 1932, ‘A Thyratron "Scale of Two" Automatic Counter’, Proceedings of the Royal Society of London, series A, 136: 312–324

Further Reading

  • Copeland, B.J., 2004, ‘Colossus — Its Origins and Originators’, Annals of the History of Computing, 26: 38–45
  • Metropolis, N., Howlett, J., and Rota, G.C. (eds), 1980, A History of Computing in the Twentieth Century, New York: Academic Press
  • Randell, B. (ed.), 1982, The Origins of Digital Computers: Selected Papers, Berlin: Springer-Verlag
  • Williams, M.R., 1997, A History of Computing Technology, Los Alamitos: IEEE Computer Society Press
Other Internet Resources

  • The Turing Archive for the History of Computing
  • The Alan Turing Home Page
  • Australian Computer Museum Society
  • The Bletchley Park Home Page
  • Charles Babbage Institute
  • Computational Logic Group at St. Andrews
  • The Computer Conservation Society (UK)
  • CSIRAC (a.k.a. CSIR MARK I) Home Page
  • Frode Weierud's CryptoCellar
  • Logic and Computation Group at Penn
  • National Archive for the History of Computing
  • National Cryptologic Museum

Related Entries

computability and complexity | recursive functions | Turing, Alan | Turing machines

Copyright © 2006 by B. Jack Copeland <jack.copeland@canterbury.ac.nz>


The Stanford Encyclopedia of Philosophy is copyright © 2023 by The Metaphysics Research Lab , Department of Philosophy, Stanford University

Library of Congress Catalog Data: ISSN 1095-5054


Age Of Computers

The age of computers refers to the period of time when computer technology became widely available and played a significant role in society. This era began in the mid-20th century with the development of the first electronic digital computers and has continued to the present day with the rise of artificial intelligence, cloud computing, and the internet. The age of computers has transformed the way we live, work, and communicate, leading to greater efficiency, connectivity, and technological advancements. It has also raised concerns about privacy, security, and the impact of technology on human society.




Essay on Computer and its Uses for School Students and Children


In this essay on computers, we are going to discuss some useful things about them. The modern-day computer has become an important part of our daily life, and its usage has increased manifold during the last decade. Nowadays, computers are used in every office, whether private or government. Mankind has been using computers for many decades now, in fields such as agriculture, design, machinery manufacture, defence and many more. Above all, they have revolutionized the whole world.


History of Computers

It is very difficult to find the exact origin of computers, but according to some experts computers existed at the time of World War II. At that time they were used for keeping data, though only for government use and not for public use. In the beginning, the computer was a very large and heavy machine.

Working of a Computer 

The computer runs on a three-step cycle: input, process, and output. It follows this cycle in every task it is asked to do. In simple words, the data we feed into the computer is the input, the work the CPU does is the process, and the result the computer gives is the output.

Components and Types of Computer

A simple computer basically consists of a CPU, monitor, mouse, and keyboard. There are also hundreds of other parts that can be attached to it, such as printers, laser pens and scanners.

Computers are categorized into many different types, such as supercomputers, mainframes, personal computers (desktops), PDAs and laptops. The mobile phone is also a type of computer, because it fulfils all the criteria of being one.


Uses of Computer in Various Fields

As computer usage increased, it became a necessity for almost every field to use computers for their operations, and they have made working and sorting things easier. Below we mention some of the important fields that use computers in their daily operations.

Medical Field

Doctors use computers to diagnose diseases, run tests and search for cures for deadly diseases. They have been able to find cures for many diseases because of computers.

Research

Whether it is scientific research, space research or any social research, computers help in all of them. Thanks to them, we are able to keep a check on the environment, space, and society. Space research has helped us explore the galaxies, while scientific research has helped us locate resources and various other useful materials on the earth.

Defence

For any country, its defence is most important for the safety and security of its people. Computers in this field help the country's security agencies to detect threats which could be harmful in the future. Above all, the defence industry uses them to keep surveillance on the enemy.

Threats from a Computer

Computers have become a necessity, but they have also become a threat. This is due to hackers who steal private data and leak it onto the internet, where anyone can access it. Apart from that, there are other threats like viruses, spam, bugs and many other problems.


The computer is a very important machine that has become a useful part of our life. Computers have two faces: on one side a boon, and on the other a bane. How they are used depends completely on the user. A day may come when human civilization won't be able to survive without computers, as we depend on them so much. To date, the computer is a great invention of mankind that has helped save thousands and millions of lives.

Frequently Asked Questions on Computer

Q.1  What is a computer?

A.1 A computer is an electronic device or machine that makes our work easier. Also, they help us in many ways.

Q.2 In which fields are computers used?

A.2  Computers are majorly used in defense, medicine, and for research purposes.


How Computers Affect Our Lives Essay


Table of contents

  • Introduction
  • History of Computers
  • Positive Effects of Computers on Human Life
  • Computers Replacing Man
  • Negative Computer Influences
  • Conflict with Religious Beliefs
  • Conclusion: How Computers Influence Our Life
  • Works Cited

Computers are a common phenomenon in the lives of people in today’s world. Computers are very vital especially to those people who run businesses, industries and other organizations. Today, almost everything that people engage in makes use of a computer. Take for instance, the transport sector: vehicles, trains, airplanes, and even traffic lights on our roads are controlled by computers.

In hospitals, most of the equipment is run by computers. Look at space exploration: it was made possible only with the advent of computer technology. In the job sector, many jobs require knowledge of computers because they mostly involve using computers.

In short, these machines have become so important, so embedded in the lives of humans, and have so hugely impacted the whole of society, that it would now be very hard to survive without them. This article discusses the influence of computers on the everyday life of human beings.

One can guess what would happen if the world had no computers. Many of the cures found with the help of computer technology would not have been developed, meaning that many people would have died from diseases that are now curable. In the entertainment industry, many movies and even songs would not exist without computers, because most of the graphics and animations we see are only possible with the help of a computer (Saimo 1).

In the field of medicine, pharmacies would find it hard to determine the type of medication to give to their many patients. Computers have also played a role in the development of democracy in the world. Today votes are counted using computers, and this has greatly reduced incidences of vote rigging and consequently the conflicts that would otherwise arise from them.

And, as we have already seen, no one would have known anything about space, because space exploration became possible only with the help of computer technology. However, the use of computers has generated public discourse, with some people supporting their use and others criticizing it (Saimo 1).

To better understand how computers influence people's lives, we have to start with their history, from their invention to the present day. Early computers did not involve the complex technologies used today, nor did they employ the monitors or chips that are now common.

Early computers were not as small as those used today, and they were commonly used to work out complex mathematical calculations that were too tedious to do manually. This is why the first machine was called by some a calculator and by others a computer: it was used for making calculations.

Blaise Pascal is credited with the first digital machine that could add and subtract, and many later calculators and computers borrowed from his ideas. As time went by, needs grew, which led to modifications and, in turn, to new and more efficient computers (Edwards 4).

The computer's influence on human life became widely felt during World War II, when computers were used to calculate and track movements and to plan military attacks (Edwards 4). It is therefore clear that computers, and their influence on man, have a long history.

Their invention involved hard work, dedication and determination, and in the end it paid off. The world was, and is still being, changed by computers. Man has been able to see into the future and plan ahead because of computers. Life today has been made easier with their help; some people may disagree with this, but I am sure many will agree with me.

Those who disagree say that computers have taken away the role of man, which is not wrong at all, but we must also acknowledge that what initially seemed impossible became possible because of computers (Turkle 22).

As mentioned in the introduction, computers are useful in running the affairs of many companies today. Companies now use large amounts of data that can be securely stored only with the help of computers, and this data is then used in computer-run operations. Without computers, companies would find it difficult to store the thousands of records generated every day.

Take, for instance, a customer checking his or her balance, or one who simply wants information on transactions made. Without computers, it would take a long time to go through all the transactions to find a particular one.

The invention of computers made this easier: bank employees today give customers their balances, transaction information and other services just by tapping the keyboard. This would not be possible without computers (Saimo 1).

In personal life

Today, individuals can store all their information, be it personal or business-related, on a computer, and can make frequent updates and modifications to it. The same information can easily be retrieved whenever it is needed, sent via email or printed.

All this has been made possible by computers. Life is easier and more enjoyable: individuals can comfortably entertain themselves at home, watching TV with their families, or work from the comfort of their home, thanks to computer technology.

Computers feature in people's everyday lives. Today one can use a computer without even being aware of it: people use their credit cards when buying items from stores, a practice so common that few realize each transaction is processed through computer technology.

It is a computer that processes the customer information fed to it through the credit card, records the transaction, and settles the bill by subtracting the amount from the card's balance. Getting cash has also been made easier and faster: an individual simply walks to an ATM to withdraw whatever amount of cash he requires, and ATMs operate using computer technology (Saimo 1).
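The bookkeeping described above can be sketched in a few lines of code. This is only a toy illustration of the principle (verify the account, check the funds, subtract the amount); the `Account` class and `charge` method are invented for the example and do not represent any real banking system.

```python
# Toy sketch of the bookkeeping a card terminal or ATM performs.
# All names here are illustrative, not a real payment API.

class Account:
    def __init__(self, holder: str, balance: float):
        self.holder = holder
        self.balance = balance

    def charge(self, amount: float) -> bool:
        """Subtract `amount` if funds suffice; report whether it succeeded."""
        if amount <= 0 or amount > self.balance:
            return False  # transaction declined
        self.balance -= amount
        return True       # transaction approved

acct = Account("J. Doe", 500.00)
print(acct.charge(120.00))   # True: approved, balance reduced
print(acct.balance)          # 380.0
print(acct.charge(1000.00))  # False: insufficient funds
```

A real system adds authentication, audit logs and concurrency control, but the core arithmetic the essay describes is exactly this subtraction.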

I mentioned the use of credit cards as one of the practical benefits of computers. Today, individuals do not need to physically visit shopping stores to buy items: all one needs is a computer connected to the internet, and one can pay for items using a credit card.

The goods can then be delivered to the doorstep. The era of queuing in crowded stores to buy items, or wasting time in line waiting to buy tickets, is over. Today, travelers can buy tickets and make travel arrangements via the internet at any time, thanks to the advent of computer technology (Saimo 1).

In communication

Through the computer, man now has his most effective means of communication. The internet has made the world a global village. Today people carry phones, which are basically small computers, while others carry laptops; all of these have made the internet the most effective and affordable medium for contacting friends, family and business associates from anywhere in the world.

Businesses use computer technology to keep records and to track their accounts and the flow of money (Lee 1). The entertainment industry has not been left behind either.

Action and science-fiction movies use computers to incorporate visual effects that make them look real. Computer games, a common form of entertainment especially among teenagers, have been made more engaging through advanced computer technology (Frisicaro et al. 1).

In Education

The education sector has also been greatly influenced by computer technology. Much schoolwork is done with the aid of a computer: when students are given assignments, all they have to do is search for the solution on the internet using Google, and the assignments can then be neatly presented thanks to software made specifically for such purposes.

Today most high schools require students to type out their work before presenting it for marking, which is made possible by computers. Teachers have also found computer technology very useful: they can use it to track student performance and to give out instructions.

Computers have also made online learning possible. Teachers and students no longer need to be physically present in class for teaching to take place; online teaching allows students to attend class from any place, at any time, without inconvenience (Computers 1).

In the medical sector

Another crucial area of human life that computers have greatly influenced, and continue to influence, is the health sector. As mentioned in the introduction, hospitals and pharmacies use computers in serving people.

Computers are used in pharmacies to help pharmacists determine what type and amount of medication patients should receive. In many hospitals, patient data and health progress are recorded on computers, and the status and placement of equipment are recorded and tracked the same way.

Research by scientists, doctors and many others searching for cures for diseases and medical complications is facilitated by computer technology. Many diseases once considered dangerous, such as malaria, are now treatable thanks in part to computer-aided interventions (Parkin 615).

Many opponents of computer technology argue against the use of computers on the grounds that computers are replacing man in carrying out activities that are naturally human.

However, it should be noted that some situations call for extraordinary interventions. In many industries, machines have replaced human labor, and using machines is usually much cheaper than human labor.

In addition, machines give consistent results in terms of quality. In other cases, the skills needed to perform a task are beyond an ordinary person, as in surgery, where man's intervention alone is not sufficient; computer-operated machines have made complex surgeries successful.

There are also tasks too dangerous for a human being to perform, as experienced during disasters such as miners being trapped underground. It is usually dangerous to send people into such situations, and even where people are used, the rescue is usually delayed.

Computer-operated robotic machines have helped in such situations, and lives have been saved. It is likewise not possible to send people everywhere during space exploration, but computer-controlled machines such as robots have been used effectively to explore beyond our world (Gupta 1).

Despite all the good that computers have done for humans, their opponents make vital points that should not simply be ignored. Computers do many things that leave people wondering whether they really help society, or whether they deprive man of his God-given ability to function according to societal ethics.

Take, for instance, the workplace and even the home: computers have permeated every activity of the individual, compromising personal privacy. Computers have been used to expose people to unauthorized access to personal information, and some personal information, if exposed, can affect someone's life very negatively.

Today the world cares so little about ethics that it is very difficult to clearly distinguish what is authentic or trustworthy from what is not. Computers have taken over every aspect of human life, from household chores to practices carried out in the social sphere.

As a result, people have lost their human element to machines. Industries and organizations have replaced human labor with cheaper and more efficient machine labor, which means people have lost jobs to advances in computer technology. Children raised on computers grow up with difficulty differentiating between reality and fiction (Subrahmanyam et al. 139).

People now depend on computers for their tasks. Students generate solutions to assignments using computers; teachers, in turn, use computers to mark those assignments. Doctors in hospitals depend on machines to diagnose patients, perform surgeries and determine types of medication (Daley 56).

In the entertainment industry, computer technology has been used to modify sound so that people think a singer is great when, in truth, it is simply the computer. This has taken away the real role of the musician in the music sector.

In today's world of technology, we live as a worried lot. Hacking is very common, and statistics confirm that huge amounts of money are lost to it every year. So, as much as people pride themselves on being computer literate, they are deeply worried that they may be the next victims of practices such as hacking (Bynum 1).

There is also the problem of trying to imitate God. It is believed that within 20 years, man will come up with another form of life: a man-made being. This will affect not only how man's intelligence is viewed, but also the long-held belief that God is the sole provider of life.

Computers have made it possible to give machines artificial intelligence so that they can behave and act like man. Viewed from a religious standpoint, this creates conflicts in human belief.

It has long been held that man was created in the image of God. Creating a machine in the image of man will distort the way people conceive of God, and using artificial methods to create new forms of life with man-like intelligence will lead man to equate himself with God.

This carries the risk of changing beliefs that mankind has held for millennia. If this happens, the very same computer technology will help, through the mass media, to spread these ideas and convince people to change their beliefs and conceptions of God (Krasnogor 1).

We have seen that computers have influenced, and will continue to influence, our lives. The advent of the computer has changed man as much as it has changed the world he lives in.

It is true that many things that seemed impossible have been made possible by computer technology. Medical technologies have led to discoveries in medicine, which have in turn saved many lives. Communication is now easy and fast, and the world has been transformed into a virtual village.

Computers have made education accessible to all. In the entertainment sector, people are better satisfied. Crime surveillance is more effective. However, we should beware of trying to imitate God. As much as computers have positively influenced our lives, the technology is a live bomb waiting to explode.

We should tread carefully, so as not to be overwhelmed by its sophistication (Computers 1). Other technologies have grown with such intensity that they surpassed their productive limits and destroyed themselves in the process; this may well be another such technology.

Works Cited

Bynum, Terrell. Computer and Information Ethics. Plato, 2008. Web.

Computers. Institutional Impacts. Virtual Communities in a Capitalist World, n.d. Web.

Daley, Bill. Computers Are Your Future: Introductory. New York: Prentice, 2007. Print.

Edwards, Paul. From "Impact" to Social Process. Computers in Society and Culture, 1994. Web.

Frisicaro et al. So What's the Problem? The Impact of Computers, 2011. Web.

Gupta, Satyandra. We, Robot: What Real-Life Machines Can and Can't Do. Science News, 2011. Web.

Krasnogor, Ren. Advances in Artificial Life: Impacts on Human Life. n.d. Web.

Lee, Konsbruck. Impacts of Information Technology on Society in the New Century. Zurich. Web.

Parkin, Andrew. Computers in Clinical Practice: Applying Experience from Child Psychiatry. 2004. Web.

Saimo. The Impact of Computer Technology on Human Life. Impact of Computer, 2010. Web.

Subrahmanyam et al. The Impact of Home Computer Use on Children's Activities and Development. Princeton, 2004. Web.

Turkle, Sherry. The Second Self: Computers and the Human Spirit. 2005. Web.


IvyPanda. (2018, May 28). How Computers Affect Our Lives. https://ivypanda.com/essays/how-computers-influence-our-life/


Essay on Computer

500+ Words Essay on Computer

A computer is an electronic device that performs complex calculations. It is a wonderful product of modern technology. Nowadays, computers have become a significant part of our life. Whether it is in the sector of education or health, computers are used everywhere. Our progress is entirely dependent on computers powered by the latest technology. This ‘Essay on Computer’ also covers the history of computers as well as their uses in different sectors. By going through the ‘Computer’ Essay in English, students will get an idea of writing a good Essay on Computers. After practising this essay, they will be able to write essays on other topics related to computers, such as the ‘Uses of Computer’ Essay.

The invention of the computer has made our lives easier. The device is used for many purposes, such as securing information, messaging, data processing, software programming, calculations, etc. A desktop computer needs a CPU, a UPS, a monitor, a keyboard and a mouse to work, while a laptop is a modern form of computer in which all the components are built into a single device. Earlier computers were not very fast or powerful; modern-day computers emerged after thorough and meticulous research and work by various scientists.

History of Computers

The history of computer development is often described in terms of the different generations of computing devices. Each generation is characterised by a major technological development that fundamentally changed the way computers work. The major developments from the 1940s to the present day have resulted in increasingly smaller, more powerful, faster, cheaper and more efficient computing devices.

The evolution of computer technology is often divided into five generations, as follows:

  • First generation (1940-1956): vacuum-tube computers
  • Second generation (1957-1963): transistor computers
  • Third generation (1964-1971): integrated-circuit computers
  • Fourth generation (1972 onward): microprocessor computers
  • Fifth generation (present and beyond): computers based on artificial intelligence

Uses of Computers

Computers are used in various fields. Some of their applications are as follows.

1. Business

A computer can perform a high-speed calculation more efficiently and accurately, due to which it is used in all business organisations. In business, computers are used for:

  • Payroll calculations
  • Sales analysis
  • Maintenance of stocks
  • Managing employee databases
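To make the first item concrete, here is a minimal payroll-calculation sketch of the kind of routine arithmetic businesses delegate to computers. The flat tax rate and the employee records are made up purely for illustration.

```python
# Minimal payroll sketch: gross pay minus a flat tax.
# TAX_RATE and the employee data are hypothetical example values.

TAX_RATE = 0.20  # illustrative flat tax, not any real tax rule

def net_pay(hours: float, hourly_rate: float) -> float:
    """Compute take-home pay: gross wages minus the flat tax, in cents."""
    gross = hours * hourly_rate
    return round(gross * (1 - TAX_RATE), 2)

# (name, hours worked, hourly rate)
employees = [("Asha", 160, 25.0), ("Ben", 140, 30.0)]
for name, hours, rate in employees:
    print(name, net_pay(hours, rate))  # Asha 3200.0 / Ben 3360.0
```

Real payroll software layers tax brackets, deductions and reporting on top, but the repetitive per-employee arithmetic is exactly what made this an early business use of computers.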

2. Education

Computers are very useful in the education system. Especially now, during the COVID period, online education has become the need of the hour. There are many ways in which an institution can use computers to educate students.

3. Health Care

Computers have become an important part of hospitals, labs and dispensaries. They are used in the scanning and diagnosis of diseases: computerised machines perform ECG, EEG, ultrasound and CT scans, among others. They are also used in hospitals to keep records of patients and medicines.

4. Defence

Computers are widely used in defence. The military employs computerised control systems in modern tanks, missiles, weapons, etc., and uses computers for communication, operation and planning, smart weapons, and more.

5. Government

Computers play an important role in government services. Some major fields are:

  • Computation of male/female ratio
  • Computerisation of PAN card
  • Income Tax Department
  • Weather forecasting
  • Computerisation of voters’ lists
  • Sales Tax Department

6. Communication

Communication is a way to convey an idea, a message, a picture, a speech or any form of text, audio or video clip. Computers are capable of doing so. Through computers, we can send an email, chat with each other, do video conferencing, etc.

7. Banking

Nowadays, banking depends on computers to a large extent. Banks provide online accounting facilities, which include checking current balances, making deposits and overdrafts, and checking interest charges, shares, trustee records, etc. ATMs, which are fully automated, use computers, making it easier for customers to carry out banking transactions.

8. Marketing

In marketing, computers are mainly used for advertising and home shopping.

Similarly, there are various other applications of computers in other fields, such as insurance, engineering, design, etc.

Students can practise more essays on different topics to improve their writing skills.

Frequently Asked Questions on Computer Essay

How has the invention of the computer been useful to students?

The invention of the computer, and with it the internet, has given students easy and ready access to information.

How to start writing an essay on a computer?

Before writing an essay, first plan the topics, sub-topics and main points which are going to be included in the body of the essay. Then, structure the content accordingly and check for information and examples.

How to use the computer to browse for information on essays?

Various search engines are available, like Google, where plenty of information can be obtained regarding essays and essay structures.



Fifth Generation of Computers

Since Charles Babbage's early designs, computer technology has advanced enormously. This development is commonly grouped into generations, each marked by a significant change in capability and a clear advantage over the one before; a generation, in this sense, denotes a major step forward in the technology. There are five generations of computers, listed below, and they differ from one another in architecture, physical size, programming language, specifications and the operations they can perform:

1. First Generation of Computers (1940-1956): The period from 1940 to 1956 was the era of first-generation computers. They were based on vacuum tubes, which served as the basic components for memory and for the CPU's circuitry. For example, UNIVAC I and ENIAC.

2. Second Generation of Computers (1957-1963): This generation used transistors and magnetic-core memory in its systems. For example, IBM 1401 and IBM 1620.

3. Third Generation of Computers (1964-1971): Integrated circuits replaced transistors in the third generation. An integrated circuit packs many transistors, capacitors and resistors onto a single chip, which made third-generation computers smaller, more efficient and more reliable. For example, CDC 1700 and the IBM System/360 series.

4. Fourth Generation of Computers (1972 onward): This generation uses VLSI (Very Large Scale Integration) circuits, better known as microprocessors. A microprocessor packs thousands of integrated-circuit elements onto a single silicon chip. The use of personal computers (PCs) expanded in this generation, and the first widely adopted PC was developed by IBM. For example, the Apple II and the CRAY-1.

5. Fifth Generation of Computers (Present and Future): This generation is based on artificial intelligence (AI) software. Artificial intelligence is concerned with making computers think and act the way humans do; it is an emerging field with ample scope for research. For example, PARAM 10000 and IBM notebooks.

Fifth Generation Computers

Fifth-generation computers came after the invention of fourth-generation computers. Also known as modern computers, they are still in the development stage and are based on artificial intelligence. In 1982, Japan launched the FGCS (Fifth Generation Computer System) project. Computers of this generation are based on microelectronic technology with high computing power and parallel processing.

This is the most recent and technologically advanced computer generation. Modern high-level languages such as Python, R, C#, and Java are used as input methods. These computers are highly dependable and use Ultra Large Scale Integration (ULSI) technology, along with parallel-processing hardware and artificial-intelligence software.

These computers are at the cutting edge of modern scientific computation and are being used to develop artificial intelligence (AI) software. Artificial intelligence is a popular discipline of computer science that examines the meaning and methods of programming computers to behave like humans. It is still in its infancy.

All high-level languages are employed in the fifth generation of computers. The primary goal of this generation is to create machines that can learn and organize themselves. Artificial intelligence and parallel-processing hardware are at its heart, and artificial intelligence encompasses fields such as robotics and neural networks.

The fundamental goal of this generation is to advance artificial intelligence and incorporate it into a new class of extremely powerful computers that the average person can use. AI-based systems are employed in a variety of real-world applications and offer a variety of benefits: given a specific set of knowledge and skills and proper training, such systems can perform well in scenarios a human might face. They do not, however, fit situations that require tacit knowledge, which a human acquires by conversing in natural language, or that depend on form and speech recognition.

The use of AI, which helps make computers more powerful, is one of the primary elements of fifth-generation computers. AI applications can be found everywhere, from navigation to browsing, and AI is also used for video analysis, image processing and other tasks. Artificial intelligence is projected to automate practically every element of computing.

Even though they are still in development, fifth-generation computers are more powerful, more functional and faster, with many of their benefits deriving from ULSI (Ultra Large Scale Integration) technology. They employ AI technology, which includes expert-system development, gameplay and more; thanks to AI, these machines can interpret human language and recognize graphs and photos. Fifth-generation computers are being developed to address extremely difficult tasks, such as working with natural language. They will, hopefully, be able to use more than one CPU and will be less expensive than the current generation, and they are relatively simple to move from one location to another. Some fifth-generation computers are PARAM 10000, IBM notebooks, the Intel P4 and laptops.

Features of Fifth-generation Computers

Following are some features of fifth-generation computers:

  • ULSI (Ultra Large Scale Integration) technology is used in this generation of computers.
  • Natural language processing has advanced to its fifth phase of development.
  • Artificial intelligence has progressed in this generation's computers.
  • Parallel processing has advanced on these computers.
  • Fifth-generation computers include more user-friendly interfaces and multimedia functions.
  • These PCs can be purchased at a lower price.
  • These computers are more portable and more powerful.
  • They are dependable and less expensive.
  • They are easier to manufacture commercially.
  • Desktop computers are straightforward to operate.
  • Mainframe computers are extremely efficient.

Advantages of Fifth Generation of Computer

Following are some advantages of fifth-generation computers:

  • These computers are far quicker than previous generations.
  • These computers are simpler to repair.
  • These computers are substantially smaller in size than other generation computers.
  • They are lightweight and easy to move.
  • True artificial intelligence is being developed.
  • Parallel Processing has progressed.
  • Superconductor technology has progressed.

Disadvantages of Fifth Generation of Computer

Following are some disadvantages of fifth-generation computers:

  • They are usually sophisticated but can be difficult to use.
  • They can give businesses additional power to monitor your activities and can potentially infect your machine.

Sample Questions

Question 1: What was the counting machine developed by Charles Babbage, known as the father of the computer, called?

Charles Babbage developed a counting machine called the Difference Engine.

Question 2: Which generation of computers uses integrated circuits?

Third-generation computers were the enhanced version of second-generation computers; they used integrated circuits.

Question 3: What are the key technologies used in the fifth generation of computers?

VLSI architecture, parallel processing (such as dataflow control), logic programming, knowledge bases built on relational databases, and applied artificial intelligence and pattern processing appear to be the key features of fifth-generation computers.

Question 4: Which generation supports AI?

Fifth-generation computers support AI (artificial intelligence).

Question 5: Which generation of computers supports the operating system and other application software?

Third-generation computers supported operating systems and other application software.


Your Article Library

Computers: Essay on the Importance of the Computer in Modern Society


Read this comprehensive essay on the Importance of Computer in the Modern Society !

As the world progresses in its never-ending chase for time and wealth, it is undeniable that science has made astounding developments.

Computers (image courtesy: upload.wikimedia.org/wikipedia/commons/thumb/f/fb//05.jpg)

As the 21st century looms ahead, it is clear that it holds advancements humanity may never have dreamed of, and one of these shining developments is the well-recognized computer. Deriving from the Latin for 'computing' or 'reckoning', the computer is an invention that was once named 'Machine of the Year' by an international magazine.

The computer system is not a simple machine; it is like a very modern and highly complex calculator. It can perform all its functions at a speedy rate and helps us search for information and make progress in our homes and businesses. A computer can therefore be called a calculator with a twist: not only does it perform fast calculations, it also has other special characteristics. The computer has thoroughly changed the way we see and do things, with special auto-correcting tools that work with all languages, all logic and all subjects.

There was a time when computers were only heard of as a luxury. Today, however, they are an unavoidable part of success and development. No longer are they owned only by the very rich; in fact, computers are, and in the coming days and months will be, used to accomplish brilliant goals of success and unparalleled development. In India, for example, accurate knowledge and use of computers will bring change in a big and astonishing way: it will help demolish illiteracy and promote optimism, efficiency, productivity and high quality.

Even now in our day-to-day lives, computers have been allotted an integral role to play. They can be seen being used not only at the office or at home, but in all kinds of sectors and businesses: at airports, restaurants, railway stations, banks and so on. Slowly and gradually, as computers penetrate modern society, people are getting more and more optimistic about the promises their invention made. They are also used in government, business and industry, and by witnessing the rapid progress of the computer, mankind slowly sees the light it has brought along.

One of the best things about the computer is that it can save so much manual power, cost and time. With a computer, tasks can be done automatically, saving the countless hours that might otherwise have been spent doing the job manually.

Computers also ensure more accuracy. Examples of such cases include ticket booking, payment of bills, insurance and shopping. Interestingly, the automatic operation of vehicles such as trains also helps to ensure further safety and reliability of the journey. Computers can be used to observe and predict traffic patterns, which would be a grand benefit to all and would save the hassle of getting stuck for hours in roadblocks and traffic jams.

Computers can also drastically change the way agricultural tasks and businesses are carried out all over the world. In agriculture, computers are being used to identify the best possible kinds of soil and plants, and to check which combination of these would result in the perfect crop. Their use in this sector, along with better agricultural practices and products in several countries like India, could help the agricultural industry reach soaring heights, directly assuring the welfare of the economy.

It is also wonderful to see that the invention of this unbelievable machine has brought a ray of hope into the dark world of the sick. Computers are very capable of bringing about a medical revolution. In the health sector they are being used for research regarding blood groups, medical histories and more, helping to improve medicine in a big way. The knowledge that computers provide in this field may lead to better use and purchase of medicinal drugs and ensure better health. It also leads to better diagnosis and makes health care faster and more efficient.

Although computers are driving the evolution of technology and changing the way lives are lived, it cannot be denied that there are areas where the impact of the computer system is not fully realised yet. For instance, in the education sector, literacy rates have not been improved by computers the way other sectors seem to have improved overnight.

The fact remains that 64% of our population remains illiterate to date, and it would be a revolutionary act if computers were put to full use to spread educational awareness in all areas, especially the underprivileged sector. They can be used to plan lessons, and lessons can be taught on computers too; the benefit of this prospect lies in the fact that computers excel at many different things at once, which means they can be used not only to teach a limited set of subjects but to spread education of all kinds, including text, numbers and graphics.

Perhaps one may entertain the horrendous thought that computers may take the teacher’s place in the classroom, but we must look at the prospect from the brighter side. No longer will the teacher remain a person who merely fits data into a pupil’s mind; he or she will once again become that one supreme authority who inculcates both philosophical and spiritual education amongst his or her students, rising in esteem and importance.

The advantage of computers can also be seen in the fact that they might just be able to improve administration throughout the world. By providing daily, accurate information to administration departments, computers may change the way decisions are taken across the globe.

Keeping all the above-mentioned things in mind, we must accept that if used the right way, computers are a gift of science to mankind.

A Plus Topper


Importance of Computer Essay | Essay on Importance of Computer for Students and Children in English

February 14, 2024 by Prasanna

Importance of Computer Essay: Computers are an essential part of the modern era, and they are beneficial in various fields for various purposes. From education to the technical sector, from libraries to railways, all organisations use computers to serve essential needs and goals. However, as every coin has two faces, computers also have both benefits and disadvantages. But the importance of computers overpowers the disadvantages and makes them an essential part of modern life.

An essay on the importance of the computer is useful for students and adults preparing for studies, speeches and debates, essay writing, and other needs. This essay covers the various uses of computers in different fields such as defence, medicine, entertainment, business and communication.


Long and Short Essays on Importance of Computer for Students and Kids in English

We have formulated the essays for different word ranges to serve various purposes. Here are both short and long essays on the Importance of Computer.

Short Essay on Importance of Computer 150 Words in English

Short Essay on Importance of Computer is usually given to classes 1, 2, 3, 4, 5, and 6.

Computers are an essential element in modern life, as everything around us depends highly on them. From prominent business owners and entrepreneurs to working professionals, students and adults, everyone uses computers for various purposes. Owing to their expanded role, computers are taught to students as a compulsory subject in school and then as an optional one in higher education. Computers are useful in enhancing the efficiency of any work and are also crucial for achieving the best results. Thus, computers are an essential part of modern life globally, and it is impossible to work in many fields without them.

Computers are used for different purposes to serve various needs, and thus it is impossible to deny their importance in the modern era. Just as the corporate world depends on computers for accomplishing multiple tasks, students can use them for learning and development, completing their assignments, accessing notes and study materials, and playing games as a form of recreation. The banking sector also makes use of computers for handling the accounts of its clients and regular customers. Thus, it is evident how computers are used in various fields and how they made their way into, and became an essential part of, modern organisations.

Computers emerged in industry as a fantastic technology that not only reduced the manual burden of performing various tasks, including calculations and data handling, but also improved the overall quality of the work done. The demand for computers is rising at a tremendous rate in the modern world. These machines are utilised to the fullest across various sectors and industries. Computers are also a more secure and reliable option for collecting and storing data, and have thus become the world’s preferred option. Furthermore, once connected with another fantastic technological advancement, the internet, the usage and utility of computers increase even more. Several organisations and offices across the globe work only on computers. Thus we can say that our lives are entirely surrounded and assisted by computers in one way or another.

Computers are an essential part of the modern era and of modern life. In today’s world of science and technology, computers are used in almost all fields and sectors, and by people of all age groups, to serve various needs and purposes. Computers play a crucial role in people’s day-to-day activities, whether studying, gathering information, doing research or completing any task. For entertainment purposes too, computers are utilised widely across the globe. From school children to office-goers and adults working from home, computers serve an extensive range of purposes and help their users in fascinating ways.

Computers are used in one way or another for completing various tasks efficiently and for achieving the best results. The usage of computers has grown rapidly, and they have become an essential part of human life, without which most modern-day tasks are impossible to complete.

A computer is a complex, modern machine capable of performing various tasks in a fraction of a second. It also provides more efficient results than the human brain for many tasks. In education, computers are essential for teaching and learning. They are used to prepare presentations for teaching and for seminars. Computer-based learning, also known as smart classes, is highly preferred in the modern era to ensure that students capture the gist of all topics, understand them well, and do not quickly forget them.

Across the globe, several prominent software tools run on computers to help students enhance their skills and become experts in any subject of interest. Additionally, teachers can use computers to connect with students quickly and seamlessly, and motivate them to browse various topics and understand them.

Computers are essential in the medical field too, as they help keep records and take X-rays, CT scans, MRI scans and other tests. They are also used to monitor patients admitted to hospital, keeping a check on them quickly and efficiently. Computers help experts run space-exploration programmes and design the satellites that let the world track various outer-space activities. To build spacecraft, control missions or gather records of various bodies in outer space, computers are widely preferred.

Furthermore, computers also serve various purposes in the defence sector. They help maintain weapons such as missiles and control them while attacking a target, lowering the chances of misses and wrong actions. Computers are thus very crucial in saving human lives. Apart from this, computers are useful in financial matters, keeping records of various transactions and making them more secure and reliable. They serve purposes that humans alone cannot, and thus automate multiple tasks of modern life.

Currently, imagining a world without computers and accomplishing any task without their assistance is impossible since they have marked their territories in almost all the fields and sectors to serve a wide range of purposes and satisfy an extensive array of needs.

Long Essay on Importance of Computer 500 Words in English

Long Essay on Importance of Computer is usually given to classes 7, 8, 9, and 10.

Computers are an essential part of modern life and also serve as a useful information source for managing various organisations and accomplishing multiple tasks. This is a significant reason why the computer is a tool in high demand in almost all businesses, including banking, entertainment, industry, education and administration. Even in governmental organisations and bodies, computers are beneficial for various tasks. The market for computer production and delivery is thriving rapidly across the globe, and large computer systems are present in all global organisations. There is no business or organisation that can function entirely independently, without any support from computers.

Computers are widely used in various sectors, and their usage can be subdivided according to field and sector.

Usage of computers in business:

Computers are widely used machines in businesses of both small and large scale. Smaller businesses often use computers in the form of cheaper microcomputers. Since companies have a great deal of data to handle every day, computers ease the task for them more securely and reliably. They also speed up tasks such as salary calculation and database management, and automate work that assists the staff with their daily duties. Not only local businesses but also the top MNCs across the globe make extensive use of computers; in fact, a few of them have no task that can be achieved without them.

Usage of computers in finance and banking:

In the banking and finance sector, computers are used for handling and processing data for customers’ savings accounts, loan-related matters, investments, interest collection and various other functions. The banking sector uses computer systems effectively to operate on budgets. Finance-related matters are accomplished in less time and with greater effectiveness using computers. Furthermore, transactions and other financial operations are handled more securely and reliably with computers.

Usage of computers in education:

In the current era, with science and technology becoming more advanced with each passing day, computers are even capable of replacing books for gathering knowledge and learning. Smart classes are already a preferred mode of education in various institutions, and they are possible only because of computers. Computers can completely change the way we study and are taught in institutes and schools. They also help educational institutions handle matters such as students’ fees and staff payments, and even calculate students’ results quickly and efficiently. Furthermore, managing book lists in libraries and other institutional records also gets easier with computers.

Usage of computers in the medical sector:

Various medical organisations and institutes use computers to handle patients’ data and records, schedule doctors and nurses, and maintain patients’ personal histories. Computers are also used for monitoring the heart rate and blood pressure of hospitalised patients, and for X-rays, CT scans, MRI scans and other tests. The use of computers in the medical field also helps in diagnosing some complex diseases and disorders. Most medical machines make use of computers in one way or another.

Usage of computers in the legal sector:

Computers serve various purposes in legislative processes. Their most prominent uses include preparing legal documents and sending emails about court hearings and court notices for any case. Furthermore, computers in the legal sector also hold previous records of criminals, helping the staff access them whenever required, in no more than a second.

Usage of computers in government:

Computers are used broadly in the government sector for practising and implementing various administrative matters. Data collection and retrieval become easier with computers in this field. Computers also help the government with taxation and ease the entire task, and keeping records of various things becomes simpler too. In various governmental organisations, the role of the computer is growing rapidly for multiple purposes.

Usage of computers for entertainment:

Computers have become a fantastic source of recreation and entertainment in the modern day. They are used to watch movies, play games, listen to music, or learn about any matter of interest in an efficient manner. Computer components are specially designed to support all these purposes and serve users best. Computers can now also analyse images and help with drawing, artwork and other pursuits of interest.

Computer usage in daily lives:

Computers are used in daily life to serve various needs. They make people’s lives more modern and fascinating and ease their day-to-day activities.

Importance of Computer Essay Conclusion

The rapid development of technology and various advancements in modern life have made computers an essential part of daily life. No sector across the globe can work and function seamlessly and effectively without computers, and computers are among the most vital tools for various industries.

This essay on the importance of computers contains a brief description of the facts, covering computers in various fields. It helps people prepare speeches, debates and other materials, and helps students compose essays for academic purposes.



Essay on Importance of Computer

Students are often asked to write an essay on Importance of Computer in their schools and colleges. And if you’re also looking for the same, we have created 100-word, 250-word, and 500-word essays on the topic.

Let’s take a look…

100 Words Essay on Importance of Computer

Introduction to Computers

Computers are important in our lives. They help in various tasks like learning, communication, and entertainment.

Role in Education

Computers make learning fun. They offer educational games and online classes.

Communication

Computers help us communicate with friends and family through emails and social media.

Entertainment

Computers provide entertainment like movies, music, and games.

In conclusion, computers have a significant role in our lives. They make tasks easier and more enjoyable.


250 Words Essay on Importance of Computer

The Emergence of Computers

The advent of computers has revolutionized the world, dramatically transforming human life and societal structures. Computers, initially designed for complex computations, now permeate every aspect of our daily lives, from education and business to entertainment and communication.

Computers in Education

The importance of computers in education is undeniable. They have transformed the way we learn, making education more interactive and engaging. With the help of computers, vast amounts of information can be accessed within seconds, facilitating research and broadening the scope of knowledge. Moreover, online learning platforms have made education accessible to everyone, irrespective of geographical boundaries.

Role in Business

In the business world, computers have become indispensable. They assist in managing large databases, conducting financial transactions, and executing marketing strategies. The advent of e-commerce, largely facilitated by computers, has reshaped the global economy, enabling businesses to reach customers worldwide.

Impact on Communication

Entertainment and Leisure

In the realm of entertainment and leisure, computers have introduced new dimensions. From digital art and music to online gaming and streaming services, computers have enriched our recreational experiences.

In conclusion, the importance of computers is vast and multifaceted. They have become an integral part of our lives, continually shaping our world. As we move forward, the influence of computers will only continue to grow, making them an undeniable necessity in our modern existence.

500 Words Essay on Importance of Computer

Introduction

The computer, a revolutionary invention of the twentieth century, has become a fundamental part of our daily lives. Its importance cannot be overstated as it has revolutionized various sectors including business, education, healthcare, and entertainment. This essay explores the significance of computers in our contemporary world.

The role of computers in education is transformative. They serve as an interactive medium where students can learn and explore new concepts. Online learning platforms, digital libraries, and educational software have made learning more accessible, engaging, and personalized. Furthermore, computers have also simplified research, data analysis, and presentation of academic work, enhancing the overall educational experience.

Impact on Business and Economy

Computers have reshaped the business landscape. They have facilitated automation, leading to increased productivity and efficiency. Businesses are now able to manage large volumes of data, aiding in informed decision-making and strategic planning. E-commerce, digital marketing, and online banking are other significant contributions of computers, driving economic growth and globalization.

Healthcare Advancements

Entertainment and Communication

The entertainment industry has been revolutionized by computers. They have given birth to digital media, video games, and computer-generated imagery (CGI) in films. Moreover, computers have redefined communication, making it instant and borderless. Social media, email, and video conferencing are now integral parts of our social and professional lives.

Challenges and Future Prospects

Despite the numerous benefits, the use of computers also brings challenges such as cybersecurity threats and the digital divide. Addressing these issues is crucial for a safe and inclusive digital future. On the brighter side, the future of computers is promising, with advancements like quantum computing, artificial intelligence, and virtual reality. These technologies are expected to further enhance our lives, solve complex problems, and open new avenues of exploration.

That’s it! I hope the essay helped you.


Happy studying!


How AI could change computing, culture and the course of history

Expect changes in the way people access knowledge, relate to knowledge and think about themselves.


Among the more sombre gifts brought by the Enlightenment was the realisation that humans might one day become extinct. The astronomical revolution of the 17th century had shown that the solar system both operated according to the highest principles of reason and contained comets which might conceivably hit the Earth. The geological record, as interpreted by the Comte de Buffon, showed massive extinctions in which species vanished for ever. That set the scene for Charles Darwin to recognise such extinctions as the motor of evolution, and thus as both the force which had fashioned humans and, by implication, their possible destiny. The nascent science of thermodynamics added a cosmic dimension to the certainty of an ending; Sun, Earth and the whole shebang would eventually run down into a lifeless “heat death”.

The 20th century added the idea that extinction might not come about naturally, but through artifice. The spur for this was the discovery, and later exploitation, of the power locked up in atomic nuclei. Celebrated by some of its discoverers as a way of indefinitely deferring heat death, nuclear energy was soon developed into a far more proximate danger. And the tangible threat of imminent catastrophe which it posed rubbed off on other technologies.

None was more tainted than the computer. It may have been guilt by association: the computer played a vital role in the development of the nuclear arsenal. It may have been foreordained: the Enlightenment belief in rationality as humankind’s highest achievement, combined with Darwin’s theory of evolution, made the promise of superhuman rationality look like the possibility of evolutionary progress at humankind’s expense.

Artificial intelligence has come to loom large in the thought of the small but fascinating, and much written about, coterie of academics which has devoted itself to the consideration of existential risk over the past couple of decades. Indeed, it often appeared to be at the core of their concerns. A world which contained entities which think better and act quicker than humans and their institutions, and which had interests that were not aligned with those of humankind, would be a dangerous place.

It became common for people within and around the field to say that there was a “non-zero” chance of the development of superhuman AIs leading to human extinction. The remarkable boom in the capabilities of large language models (LLMs), “foundational” models and related forms of “generative” AI has propelled these discussions of existential risk into the public imagination and the inboxes of ministers.

As the special Science section in this issue makes clear, the field’s progress is precipitate and its promise immense. That brings clear and present dangers which need addressing. But in the specific context of GPT-4, the LLM du jour, and its generative ilk, talk of existential risks seems rather absurd. They produce prose, poetry and code; they generate images, sound and video; they make predictions based on patterns. It is easy to see that those capabilities bring with them a huge capacity for mischief. It is hard to imagine them underpinning “the power to control civilisation”, or to “replace us”, as hyperbolic critics warn.

But the lack of any “Minds that are to our minds as ours are to those of the beasts that perish, intellects vast and cool and unsympathetic [drawing] their plans against us”, to quote H.G. Wells, does not mean that the scale of the changes that AI may bring with it can be ignored or should be minimised. There is much more to life than the avoidance of extinction. A technology need not be world-ending to be world-changing.

The transition into a world filled with computer programs capable of human levels of conversation and language comprehension and superhuman powers of data assimilation and pattern recognition has just begun. The coming of ubiquitous pseudocognition along these lines could be a turning point in history even if the current pace of AI progress slackens (which it might) or fundamental developments have been tapped out (which feels unlikely). It can be expected to have implications not just for how people earn their livings and organise their lives, but also for how they think about their humanity.

For a sense of what may be on the way, consider three possible analogues, or precursors: the browser, the printing press and the practice of psychoanalysis. One changed computers and the economy, one changed how people gained access to knowledge and related to it, and one changed how people understood themselves.

The humble web browser, introduced in the early 1990s as a way to share files across networks, changed the ways in which computers are used, the way in which the computer industry works and the way information is organised. Combined with the ability to link computers into networks, the browser became a window through which first files and then applications could be accessed wherever they might be located. The interface through which a user interacted with an application was separated from the application itself.

The power of the browser was immediately obvious. Fights over how hard users could be pushed towards a particular browser became a matter of high commercial drama. Almost any business with a web address could get funding, no matter what absurdity it promised. When boom turned to bust at the turn of the century there was a predictable backlash. But the fundamental separation of interface and application continued. Amazon, Meta (née Facebook) and Alphabet (née Google) rose to giddy heights by making the browser a conduit for goods, information and human connections. Who made the browsers became incidental; their role as a platform became fundamental.

The months since the release of OpenAI’s ChatGPT, a conversational interface now powered by GPT-4, have seen an entrepreneurial explosion that makes the dotcom boom look sedate. For users, apps based on LLMs and similar software can be ludicrously easy to use; type a prompt and see a result. For developers it is not that much harder. “You can just open your laptop and write a few lines of code that interact with the model,” explains Ben Tossell, a British entrepreneur who publishes a newsletter about AI services.

And the LLMs are increasingly capable of helping with that coding, too. Having been “trained” not just on reams of text, but on lots of code, they contain the building blocks of many possible programs; that lets them act as “co-pilots” for coders. Programmers on GitHub, an open-source coding site, are now using a GPT-4-based co-pilot to produce nearly half their code.

There is no reason why this ability should not eventually allow LLMs to put code together on the fly, explains Kevin Scott, Microsoft’s chief technology officer. The capacity to translate from one language to another includes, in principle and increasingly in practice, the ability to translate from language to code. A prompt written in English can in principle spur the production of a program that fulfils its requirements. Where browsers detached the user interface from the software application, LLMs are likely to dissolve both categories. This could mark a fundamental shift both in the way people use computers and in the business models within which they do so.

Every day I write the book

Code-as-a-service sounds like a game-changing plus. A similarly creative approach to accounts of the world is a minus. While browsers mainly provided a window on content and code produced by humans, LLMs generate their content themselves. When doing so they “hallucinate” (or, as some prefer, “confabulate”) in various ways. Some hallucinations are simply nonsense. Some, such as the incorporation of fictitious misdeeds into biographical sketches of living people, are both plausible and harmful. The hallucinations can be generated by contradictions in training sets and by LLMs being designed to produce coherence rather than truth. They create things which look like things in their training sets; they have no sense of a world beyond the texts and images on which they are trained.

In many applications a tendency to spout plausible lies is a bug. For some it may prove a feature. Deep fakes and fabricated videos which traduce politicians are only the beginning. Expect the models to be used to set up malicious influence networks on demand, complete with fake websites, Twitter bots, Facebook pages, TikTok feeds and much more. The supply of disinformation, Renée DiResta of the Stanford Internet Observatory has warned, “will soon be infinite”.


This threat to the very possibility of public debate may not be an existential one; but it is deeply troubling. It brings to mind the “Library of Babel”, a short story by Jorge Luis Borges. The library contains all the books that have ever been written, but also all the books which were never written, books that are wrong, books that are nonsense. Everything that matters is there, but it cannot be found because of everything else; the librarians are driven to madness and despair.

This fantasy has an obvious technological substrate. It takes the printing press’s ability to recombine a fixed set of symbols in an unlimited number of ways to its ultimate limit. And that provides another way of thinking about LLMs.

Dreams never end

The degree to which the modern world is unimaginable without printing makes any guidance its history might provide for speculation about LLMs at best partial, at worst misleading. Johannes Gutenberg’s development of movable type has been awarded responsibility, at some time or other, for almost every facet of life that grew up in the centuries which followed. It changed relations between God and man, man and woman, past and present. It allowed the mass distribution of opinions, the systematisation of bureaucracy, the accumulation of knowledge. It brought into being the notion of intellectual property and the possibility of its piracy. But that very breadth makes comparison almost unavoidable. As Bradford DeLong, an economic historian at the University of California, Berkeley, puts it, “It’s the one real thing we have in which the price of creating information falls by an order of magnitude.”

Printed books made it possible for scholars to roam larger fields of knowledge than had ever before been possible. In that there is an obvious analogy for LLMs, which, trained on a given corpus of knowledge, can derive all manner of things from it. But there was more to the acquisition of books than mere knowledge.

Just over a century after Gutenberg’s press began its clattering, Michel de Montaigne, a French aristocrat, had been able to amass a personal library of some 1,500 books—something unimaginable for an individual of any earlier European generation. The library gave him more than knowledge. It gave him friends. “When I am attacked by gloomy thoughts,” he wrote, “nothing helps me so much as running to my books. They quickly absorb me and banish the clouds from my mind.”

And the idea of the book gave him a way of being himself no one had previously explored: to put himself between covers. “Reader,” he warned in the preface to his Essays, “I myself am the matter of my book.” The mass production of books allowed them to become peculiarly personal; it was possible to write a book about nothing more, or less, than yourself, and the person that your reading of other books had made you. Books produced authors.

As a way of presenting knowledge, LLMs promise to take both the practical and personal side of books further, in some cases abolishing them altogether. An obvious application of the technology is to turn bodies of knowledge into subject matter for chatbots. Rather than reading a corpus of text, you will question an entity trained on it and get responses based on what the text says. Why turn pages when you can interrogate a work as a whole?

Everyone and everything now seems to be pursuing such fine-tuned models as ways of providing access to knowledge. Bloomberg, a media company, is working on BloombergGPT, a model for financial information. There are early versions of a Quran GPT and a Bible GPT; can a puffer-jacketed Pontiff GPT be far behind? Meanwhile several startups are offering services that turn all the documents on a user’s hard disk, or in their bit of the cloud, into a resource for conversational consultation. Many early adopters are already using chatbots as sounding boards. “It’s like a knowledgeable colleague you can always talk to,” explains Jack Clark of Anthropic, an LLM-making startup.

It is easy to imagine such intermediaries having what would seem like personalities—not just generic ones, such as “avuncular tutor”, but specific ones which grow with time. They might come to be like their users: an externalised version of their inner voice. Or they might be like any other person whose online output is sufficient for a model to train on (intellectual-property concerns permitting). Researchers at the Australian Institute for Machine Learning have built an early version of such an assistant for Laurie Anderson, a composer and musician. It is trained in part on her work, and in part on that of her late husband Lou Reed.

Without you

Ms Anderson says she does not consider using the system as a way of collaborating with her dead partner. Others might succumb more readily to such an illusion. If some chatbots do become, to some extent, their user’s inner voice, then that voice will persist after death, should others wish to converse with it. That some people will leave chatbots of themselves behind when they die seems all but certain.

Such applications and implications call to mind Sigmund Freud’s classic essay on the Unheimliche, or uncanny. Freud takes as his starting point the idea that uncanniness stems from “doubts [as to] whether an apparently animate being is really alive; or conversely, whether a lifeless object might not be in fact animate”. They are the sort of doubts that those thinking about LLMs are hard put to avoid.

Though AI researchers can explain the mechanics of their creations, they are persistently unable to say what actually happens within them. “There’s no ‘ultimate theoretical reason’ why anything like this should work,” Stephen Wolfram, a computer scientist and the creator of Wolfram Alpha, a mathematical search engine, recently concluded in a remarkable (and lengthy) blog post trying to explain the models’ inner workings.

This raises two linked but mutually exclusive concerns: that AIs have some sort of internal working which scientists cannot yet perceive; or that it is possible to pass as human in the social world without any sort of inner understanding.

“These models are just representations of the distributions of words in texts that can be used to produce more words,” says Emily Bender, a professor at the University of Washington in Seattle. She is one of the authors of “On the Dangers of Stochastic Parrots: Can Language Models Be Too Big?”, a critique of LLM triumphalism. The models, she argues, have no real understanding. With no experience of real life or human communication, they offer nothing more than the ability to parrot things they have heard in training, an ability which huge amounts of number-crunching makes frequently appropriate and sometimes surprising, but which is nothing like thought. It is a view often pronounced among those who have come into the field through linguistics, as Dr Bender has.

For some in the LLM-building trade things are not that simple. Their models are hard to dismiss as “mere babblers”, in the words of Blaise Agüera y Arcas, the leader of a group at Alphabet which works on AI-powered products. He thinks the models have attributes which cannot really be distinguished from an ability to know what things actually mean. It can be seen, he suggests, in their ability reliably to choose the right meaning when translating phrases which are grammatically ambiguous, or to explain jokes.

If Dr Bender is right, then it can be argued that a broad range of behaviour that humans have come to think of as essentially human is not necessarily so. Uncanny “doubts [as to] whether an apparently animate being is really alive” are fully justified.

To accept that human-seeming LLMs are calculation, statistics and nothing more could influence how people think about themselves. Freud portrayed himself as continuing the trend begun by Copernicus—who removed humans from the centre of the universe—and Darwin—who removed them from a special and God-given status among the animals. Psychology’s contribution, as Freud saw it, lay in “endeavouring to prove to the ‘ego’ of each one of us that he is not even master in his own house”. LLMs could be argued to take the idea further still. At least one wing of Freud’s house becomes an unoccupied “smart home”; the lights go on and off automatically, the smart thermostat opens windows and lowers blinds, the roomba roombas around. No master needed at all.


Uncanny as that may all be, though, it would be wrong to think that many people will take this latest decentring to heart. As far as everyday life is concerned, humankind has proved pretty resilient to Copernicus, Darwin and Freud. People still believe in gods and souls and specialness with little obvious concern for countervailing science. They could well adapt quite easily to the pseudocognitive world, at least as far as philosophical qualms are concerned.

You do not have to buy Freud’s explanation of the unsettling effect of the uncanny in terms of the effort the mind expends on repressing childish animism to think that not worrying and going with the animistic flow will make a world populated with communicative pseudo-people a surprisingly comfortable one. People may simultaneously recognise that something is not alive and treat it as if it were. Some will take this too far, forming problematic attachments that Freud would have dubbed fetishistic. But only a few sensitive souls will find themselves left behind staring into an existential—but personal—abyss opened up by the possibility that their seeming thought is all for naught.

New gold dream

What if Mr Agüera y Arcas is right, though, and that which science deems lifeless is, in some cryptic, partial and emergent way, effectively animate? Then it will be time to do for AI some of what Freud thought he was doing for humans. Having realised that the conscious mind was not the whole show, Freud looked elsewhere for sources of desire that for good or ill drove behaviour. Very few people now subscribe to the specific Freudian explanations of human behaviour which followed. But the idea that there are reasons why people do things of which they are not conscious is part of the world’s mental furniture. The unconscious is probably not a great model for whatever it is that provides LLMs with an apparent sense of meaning or an approximation of agency. But the sense that there might be something below the AI surface which needs understanding may prove powerful.

Dr Bender and those who agree with her may take issue with such notions. But they might find that they lead to useful actions in the field of “AI ethics”. Winkling out non-conscious biases acquired in the pre-verbal infancy of training; dealing with the contradictions behind hallucinations; regularising rogue desires: ideas from psychotherapy might be seen as helpful analogies for dealing with the pseudocognitive AI transition even by those who reject all notion of an AI mind. A concentration on the relationship between parents, or programmers, and their children could be welcome, too. What is it to bring up an AI well? What sort of upbringing should be forbidden? To what extent should the creators of AIs be held responsible for the harms done by their creation?

And human desires may need some inspection, too. Why are so many people eager for the sort of intimacy an LLM might provide? Why do many influential humans seem to think that, because evolution shows species can go extinct, theirs is quite likely to do so at its own hand, or that of its successor? And where is the determination to turn a superhuman rationality into something which does not merely stir up the economy, but changes history for the better? ■


This article appeared in the Essay section of the print edition under the headline “THE AGE OF PSEUDOCOGNITION”


From the April 22nd 2023 edition
