Friday, July 10, 2009

The History Of Computers


A computer is a machine that manipulates data according to a set of instructions.

Although mechanical examples of computers have existed through much of recorded human history, the first electronic computers were developed in the mid-20th century (1940–1945).

These were the size of a large room, consuming as much power as several hundred modern personal computers (PCs). Modern computers are millions to billions of times more capable than the early machines. Simple computers are small enough to fit into a wristwatch and can be powered by a watch battery. Personal computers in their various forms are icons of the Information Age and are what most people think of as "computers", but the embedded computers found in many devices, from MP3 players to industrial robots, are the most numerous.

The ability to store and execute lists of instructions called programs makes computers extremely versatile, distinguishing them from calculators. The Church–Turing thesis is a mathematical statement of this versatility: any computer with a certain minimum capability is, in principle, capable of performing the same tasks that any other computer can perform.



The Jacquard loom was one of the first programmable devices.

The first use of the word "computer" was recorded in 1613, referring to a person who carried out calculations, or computations, and the word continued to be used in that sense until the middle of the 20th century. From the end of the 19th century onwards, though, the word began to take on its more familiar meaning, describing a machine that carries out computations.

The history of the modern computer begins with two separate technologies, automated calculation and programmability, but no single device can be identified as the earliest computer, partly because of the inconsistent application of that term.

Examples of early mechanical calculating devices include:

  1. the abacus
  2. the slide rule
  3. the astrolabe (arguably)
  4. the Antikythera mechanism (arguably)

The Renaissance saw a re-invigoration of European mathematics and engineering. Wilhelm Schickard's 1623 device was the first of a number of mechanical calculators constructed by European engineers, but none fit the modern definition of a computer, because they could not be programmed.

In 1801, Joseph Marie Jacquard made an improvement to the textile loom by introducing a series of punched paper cards as a template, which allowed his loom to weave intricate patterns automatically. The resulting Jacquard loom was an important step in the development of computers because the use of punched cards to define woven patterns can be viewed as an early, albeit limited, form of programmability. It was the fusion of automatic calculation with programmability that produced the first recognizable computers.


A modern model of Babbage's analytical engine, built in 1992, in the Science Museum (London)

In 1837, Charles Babbage was the first to conceptualize and design a fully programmable mechanical computer, his analytical engine. Limited finances and Babbage's inability to resist tinkering with the design meant that the device was never completed.

Charles Babbage's first attempt at a mechanical computing device was the difference engine, a special-purpose calculator designed to tabulate logarithms and trigonometric functions by evaluating approximate polynomials. As this project faltered for personal and political reasons, he realized that a much more general design was possible and started work designing the analytical engine.

The input (programs and data) was to be provided to the machine via punched cards, a method being used at the time to direct mechanical looms such as the Jacquard loom. For output, the machine would have a printer, a curve plotter and a bell. The machine would also be able to punch numbers onto cards to be read in later. It employed ordinary base-10 fixed-point arithmetic.

There was to be a store (i.e., a memory) capable of holding 1,000 numbers of 50 decimal digits each (ca. 20.7 kB). An arithmetical unit (the "mill") would be able to perform all four arithmetic operations, plus comparisons and optionally square roots. Initially it was conceived as a difference engine curved back upon itself, in a generally circular layout, with the long store exiting off to one side. Like the central processing unit (CPU) in a modern computer, the mill would rely upon its own internal procedures, to be stored in the form of pegs inserted into rotating drums called "barrels", in order to carry out some of the more complex instructions the user's program might specify.

The programming language to be employed by users was akin to modern-day assembly languages. Loops and conditional branching were possible, and so the language as conceived would have been Turing-complete long before Alan Turing's concept. Three different types of punch cards were used: one for arithmetical operations, one for numerical constants, and one for load and store operations, transferring numbers from the store to the arithmetical unit or back. There were three separate readers for the three types of cards.
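
As a rough check of that storage figure, here is a back-of-the-envelope sketch in Python (bytes and kilobytes are, of course, a modern convenience applied after the fact, not anything Babbage used):

import math

# Each 50-digit decimal number carries about 50 * log2(10) ~ 166 bits
# of information; the store was to hold 1,000 such numbers.
bits_per_number = 50 * math.log2(10)
total_bytes = 1000 * bits_per_number / 8

print(f"{total_bytes / 1000:.2f} kB")  # prints 20.76 kB, i.e. the "ca. 20.7 kB" above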

In 1842, the Italian mathematician Luigi Menabrea, whom Babbage had met while travelling in Italy, wrote a description of the engine in French. In 1843, the description was translated into English and extensively annotated by Ada King, Countess of Lovelace, who had become interested in the engine ten years earlier. In recognition of her additions to Menabrea's paper, which included a way to calculate Bernoulli numbers using the machine, she has been described as the first computer programmer. The modern computer programming language Ada is named in her honour.


Hollerith's keyboard punch, used for the 1890 census

A standard blank IBM punched card of the type used to store data

In the late 1880s Herman Hollerith invented the recording of data on a machine-readable medium. Prior uses of machine-readable media had been for control, not data. After some initial trials with paper tape, he settled on punched cards. To process these punched cards he invented the tabulator and the keypunch machine. These three inventions were the foundation of the modern information processing industry. Large-scale automated data processing of punched cards was performed for the 1890 United States Census by Hollerith's company, which later became the core of IBM.


By the end of the 19th century a number of technologies that would later prove useful in the realization of practical computers had begun to appear: the punched card, Boolean algebra, the vacuum tube (thermionic valve) and the teleprinter.



Polish analog computer AKAT-1

During the first half of the 20th century, many scientific computing needs were met by increasingly sophisticated analog computers, which used a direct mechanical or electrical model of the problem as a basis for computation. However, these were not programmable and generally lacked the versatility and accuracy of modern digital computers.

Alan Turing is widely regarded as the father of modern computer science. In 1936 Turing provided an influential formalisation of the concept of the algorithm.

EDSAC was one of the first computers to implement the stored-program architecture.

A succession of steadily more powerful and flexible computing devices were constructed in the 1930s and 1940s, gradually adding the key features that are seen in modern computers. The use of digital electronics (largely invented by Claude Shannon in 1937) and more flexible programmability were vitally important steps, but defining one point along this road as "the first digital electronic computer" is difficult (Shannon 1940).

Computers using vacuum tubes as their electronic elements were in use throughout the 1950s, but by the 1960s had been largely replaced by transistor-based machines, which were smaller, faster, cheaper to produce, required less power, and were more reliable. The first transistorised computer was demonstrated at the University of Manchester in 1953.

In the 1970s, integrated circuit technology and the subsequent creation of microprocessors, such as the Intel 4004, further decreased the size and cost of computers and further increased their speed and reliability. By the late 1970s, many products such as video recorders contained dedicated computers called microcontrollers, which started to appear as a replacement for mechanical controls in domestic appliances such as washing machines. The 1980s witnessed home computers and the now ubiquitous personal computer. With the evolution of the Internet, personal computers are becoming as common as the television and the telephone in the household.

Modern smartphones are fully programmable computers in their own right and, as of 2009, may well be the most common form of such computers in existence.

Computer Programs and Computer Programming

A program's instructions are read from the computer's memory and are generally carried out (executed) in the order they were given. However, there are usually specialized instructions to tell the computer to jump ahead or backwards to some other place in the program and to carry on executing from there. These are called "jump" instructions. Furthermore, jump instructions may be made to happen conditionally, so that different sequences of instructions may be used depending on the result of some previous calculation or some external event. Many computers directly support subroutines by providing a type of jump that "remembers" the location it jumped from and another instruction to return to the instruction following that jump instruction.

Program execution might be likened to reading a book. While a person will normally read each word and line in sequence, they may at times jump back to an earlier place in the text or skip sections that are not of interest. Similarly, a computer may sometimes go back and repeat the instructions in some section of the program over and over again until some internal condition is met. This is called the flow of control within the program and it is what allows the computer to perform tasks repeatedly without human intervention.
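
To make the idea concrete, here is a minimal toy sketch in Python of such a flow of control: a program counter steps through a list of made-up instructions, and a conditional jump moves it backwards. The instruction names are invented for illustration and mirror the assembly-style example further below, not any real machine:

# A toy "machine": pc (the program counter) normally advances one
# step at a time, but a conditional jump can move it somewhere else.
program = [
    ("set", "sum", 0),        # sum = 0
    ("set", "num", 1),        # num = 1
    ("add", "sum", "num"),    # sum = sum + num   (the top of the loop)
    ("inc", "num"),           # num = num + 1
    ("jle", "num", 1000, 2),  # if num <= 1000, jump back to index 2
    ("halt",),
]

regs = {}  # the machine's named storage locations
pc = 0
while True:
    op = program[pc]
    if op[0] == "set":
        regs[op[1]] = op[2]
    elif op[0] == "add":
        regs[op[1]] += regs[op[2]]
    elif op[0] == "inc":
        regs[op[1]] += 1
    elif op[0] == "jle":
        if regs[op[1]] <= op[2]:
            pc = op[3]        # the jump: control moves backwards
            continue
    elif op[0] == "halt":
        break
    pc += 1                   # normal flow: on to the next instruction

print(regs["sum"])            # prints 500500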

An old mechanical calculator.

Comparatively, a person using a calculator can perform a basic arithmetic operation such as adding two numbers with just a few button presses. But to add together all of the numbers from 1 to 1,000 would take thousands of button presses and a lot of time, with a near certainty of making a mistake. On the other hand, a computer may be programmed to do this with just a few simple instructions. For example:

      mov #0,sum      ; set sum to 0
      mov #1,num      ; set num to 1
loop: add num,sum     ; add num to sum
      add #1,num      ; add 1 to num
      cmp num,#1000   ; compare num to 1000
      ble loop        ; if num <= 1000, go back to 'loop'
      halt            ; end of program. stop running


Once told to run this program, the computer will perform the repetitive addition task without further human intervention. It will almost never make a mistake and a modern PC can complete the task in about a millionth of a second.

However, computers cannot "think" for themselves, in the sense that they only solve problems in exactly the way they are programmed to. An intelligent human faced with the above addition task might soon realize that, instead of actually adding up all the numbers, one can simply use the equation

1 + 2 + 3 + ... + 1,000 = (1,000 × 1,001) / 2

and arrive at the correct answer (500,500) with little work. In other words, a computer programmed to add up the numbers one by one, as in the example above, would do exactly that, without regard to efficiency or alternative solutions.
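
A quick sanity check of both approaches, sketched in Python (not part of the original example):

# The brute-force loop (what the assembly program above does) and
# Gauss's closed-form shortcut give the same answer.
total = 0
for num in range(1, 1001):
    total += num             # one addition per number, 1,000 times

shortcut = 1000 * 1001 // 2  # n(n+1)/2 with n = 1,000

print(total, shortcut)       # both print 500500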



References:

Kempf, Karl (1961). Historical Monograph: Electronic Computers Within the Ordnance Corps. http://ed-thelen.org/comp-hist/U-S-Ord-61.html.


Phillips, Tony (2000). "The Antikythera Mechanism I". American Mathematical Society. http://www.math.sunysb.edu/~tony/whatsnew/column/antikytheraI-0400/kyth1.html.

Shannon, Claude Elwood (1940). A symbolic analysis of relay and switching circuits. Massachusetts Institute of Technology.

Digital Equipment Corporation (1972). PDP-11/40 Processor Handbook. http://bitsavers.vt100.net/dec/www.computer.museum.uq.edu.au_mirror/D-09-30_PDP11-40_Processor_Handbook.pdf


Lavington, Simon (1998), A History of Manchester Computers (2 ed.), Swindon: The British Computer Society.

Stokes, Jon (2007). Inside the Machine: An Illustrated Introduction to Microprocessors and Computer Architecture. San Francisco: No Starch Press.
