A Short History of Computers and Computing
One of the earliest machines designed to assist people in calculations was the abacus, which is still being used some 5000 years after its invention.
In 1642 Blaise Pascal (a famous French mathematician) invented an adding machine based on mechanical gears in which numbers were represented by the cogs on the wheels.
In the 1830's the Englishman Charles Babbage invented a "Difference Engine" made out of brass and pewter rods and gears, and also designed a further device which he called an "Analytical Engine". His design contained the five key characteristics of modern computers: an input device, a store for numbers waiting to be processed, a processor to perform the calculations, a control unit to direct the sequence of operations, and an output device.
Augusta Ada Byron (later Countess of Lovelace), an associate of Babbage, has become known as the first computer programmer.
An American, Herman Hollerith, developed (around 1890) the first electrically driven tabulating machine. It utilised punched cards and metal rods which passed through the holes to close an electrical circuit and thus cause a counter to advance. This machine was able to complete the calculation of the 1890 U.S. census in 6 weeks, compared with the 7 1/2 years it took to count the 1880 census by hand.
In 1936 Howard Aiken of Harvard University convinced Thomas Watson of IBM to invest $1 million in the development of an electromechanical version of Babbage's analytical engine. The Harvard Mark 1 was completed in 1944 and was 8 feet high and 55 feet long.
At about the same time (the late 1930's) John Atanasoff of Iowa State University and his assistant Clifford Berry built the first digital computer that worked electronically, the ABC (Atanasoff-Berry Computer). This machine was basically a small calculator.
In 1943, as part of the British war effort, a series of vacuum tube based computers (named Colossus) were developed to crack German secret codes. The Colossus Mark 2 series consisted of 2400 vacuum tubes.
John Mauchly and J. Presper Eckert of the University of Pennsylvania developed these ideas further by proposing a huge machine consisting of 18,000 vacuum tubes. ENIAC (Electronic Numerical Integrator And Computer) was born in 1946. It was a huge machine with a huge power requirement and two major disadvantages. Maintenance was extremely difficult as the tubes broke down regularly and had to be replaced, and there was also a big problem with overheating. The most important limitation, however, was that every time a new task needed to be performed the machine had to be rewired. In other words, programming was carried out with a soldering iron.
In the late 1940's John von Neumann (at the time a special consultant to the ENIAC team) developed the EDVAC (Electronic Discrete Variable Automatic Computer), which pioneered the stored-program concept: instructions could be held in the machine's memory alongside the data, so a new task no longer required rewiring the machine.
The Generations of Computers
It used to be quite popular to refer to computers as belonging to one of several "generations" of computer. These generations are:
The First Generation (1943-1958): This generation is often described as starting with the delivery of the first commercial computer to a business client. This happened in 1951 with the delivery of the UNIVAC to the US Bureau of the Census. This generation lasted until about the end of the 1950's (although some stayed in operation much longer than that). The main defining feature of the first generation of computers was that vacuum tubes were used as internal computer components. Vacuum tubes are generally about 5-10 centimeters in length and the large numbers of them required in computers resulted in huge and extremely expensive machines that often broke down (as tubes failed).
The Second Generation (1959-1964): In the mid-1950's Bell Labs developed the transistor. Transistors were capable of performing many of the same tasks as vacuum tubes but were only a fraction of the size. The first transistor-based computer was produced in 1959. Transistors were not only smaller, enabling computer size to be reduced, but they were faster, more reliable and consumed less electricity.
The other main improvement of this period was the development of computer languages. Assembler languages, or symbolic languages, allowed programmers to specify instructions in words (albeit very cryptic words) which were then translated into a form that the machines could understand (typically a series of 0's and 1's: binary code). Higher level languages also came into being during this period. Whereas assembler languages had a one-to-one correspondence between their symbols and actual machine functions, higher level language commands often represent complex sequences of machine codes. Two higher-level languages developed during this period (Fortran and Cobol) are still in use today, though in a much more developed form.
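That one-to-many correspondence can be made concrete with a minimal sketch in Python, whose standard dis module shows the lower-level instructions behind a line of source code (Python is used here purely as a modern stand-in; no specific 1950's language is implied):

    import dis

    def to_celsius(fahrenheit):
        # One high-level statement...
        return (fahrenheit - 32) * 5 / 9

    # ...is translated into a whole sequence of lower-level
    # instructions, each much closer to what the machine executes.
    dis.dis(to_celsius)

Running this prints the bytecode behind the single return statement, echoing how one Fortran or Cobol command expanded into many machine codes.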
The Third Generation (1965-1970): In 1965 the first integrated circuit (IC) was developed, in which a complete circuit of hundreds of components could be placed on a single silicon chip 2 or 3 mm square. Computers using these IC's soon replaced transistor based machines. Again, one of the major advantages was size, with computers becoming more powerful and at the same time much smaller and cheaper. Computers thus became accessible to a much larger audience. An added advantage of smaller size is that electrical signals have much shorter distances to travel and so the speed of computers increased.
Another feature of this period is that computer software became much more powerful and flexible, and for the first time more than one program could share the computer's resources at the same time (multi-tasking). The majority of programming languages used today are often referred to as 3GL's (3rd generation languages), even though some of them originated during the 2nd generation.
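A minimal sketch of the multi-tasking idea in Python (the task function and its timings are invented for illustration): two "programs" share one machine, each making progress while the other waits.

    import threading
    import time

    def program(name):
        # Each simulated program does a little work, then waits,
        # freeing the processor for the other program.
        for step in range(3):
            print(name, "running step", step)
            time.sleep(0.1)  # stands in for waiting on I/O

    a = threading.Thread(target=program, args=("program A",))
    b = threading.Thread(target=program, args=("program B",))
    a.start(); b.start()
    a.join(); b.join()

The printed steps interleave, showing both programs advancing on a single machine.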
The Fourth Generation (1971-present): The boundary between the third and fourth generations is not very clear-cut at all. Most of the developments since the mid 1960's can be seen as part of a continuum of gradual miniaturisation. In 1970 large-scale integration was achieved, where the equivalent of thousands of integrated circuits were crammed onto a single silicon chip. This development again increased computer performance (especially reliability and speed) whilst reducing computer size and cost. Around this time the first complete general-purpose microprocessor became available on a single chip (Intel's 4004, released in 1971).
During this period Fourth Generation Languages (4GL's) have come into existence. Such languages are a step further removed from the computer hardware in that they use language much like natural language. Many database languages can be described as 4GL's. They are generally much easier to learn than 3GL's.
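The contrast can be sketched in a few lines of Python using its built-in sqlite3 module (the staff table and its rows are invented for the example). The explicit loop spells out how to find the answer, step by step, in typical 3GL fashion; the SQL query, a classic 4GL, states what is wanted and leaves the how to the database.

    import sqlite3

    staff = [("Ada", 52000), ("Herman", 48000), ("Grace", 61000)]  # invented data

    # 3GL style: spell out HOW to find well-paid staff, step by step.
    well_paid = []
    for name, salary in staff:
        if salary > 50000:
            well_paid.append(name)

    # 4GL style: a declarative query states WHAT is wanted and leaves
    # the how to the database engine.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE staff (name TEXT, salary INTEGER)")
    conn.executemany("INSERT INTO staff VALUES (?, ?)", staff)
    rows = conn.execute("SELECT name FROM staff WHERE salary > 50000")
    print(well_paid, [row[0] for row in rows])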
The Fifth Generation (the future): The "fifth generation" of computers was defined by the Japanese government in 1980, when it unveiled an optimistic ten-year plan to produce the next generation of computers. This was an interesting plan for two reasons. Firstly, it is not at all clear what the fourth generation is, or even whether the third generation has finished yet. Secondly, it was an attempt to define a generation of computers before they had come into existence. The main requirement of the 5G machines was that they incorporate the features of Artificial Intelligence, Expert Systems, and Natural Language. The goal was to produce machines that are capable of performing tasks in similar ways to humans, are capable of learning, and are capable of interacting with humans in natural language, preferably using both speech input (speech recognition) and speech output (speech synthesis). Such goals are obviously of interest to linguists and speech scientists, as natural language and speech processing are key components of the definition. As you may have guessed, this goal has not yet been fully realised, although significant progress has been made towards various aspects of it.
Parallel Computing
Up until recently most computers were serial computers. Such computers had a single processor chip containing a single processor. Parallel computing is based on the idea that if more than one task can be processed simultaneously on multiple processors, then a program will run more rapidly than it could on a single processor. The supercomputers of the 1990s, such as the Cray computers, were extremely expensive to purchase (usually over $1,000,000) and often required liquid cooling, so they were also very expensive to run. Clusters of networked computers (e.g. a Beowulf cluster of PCs running Linux) have been, since 1994, a much cheaper solution to the problem of fast processing of complex computing tasks. By 2008, most new desktop and laptop computers contained more than one processor on a single chip (e.g. the Intel "Core 2 Duo" released in 2006 or the Intel "Core 2 Quad" released in 2007). Having multiple processors does not necessarily mean that parallel computing will work automatically: the operating system must be able to distribute programs between the processors.
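A minimal sketch of this in Python (the work function and task sizes are invented for illustration): a pool of worker processes shares a list of independent tasks, and the operating system schedules those processes across the available cores.

    from multiprocessing import Pool, cpu_count

    def heavy_task(n):
        # Stand-in for an expensive, independent unit of work.
        return sum(i * i for i in range(n))

    if __name__ == "__main__":
        tasks = [2_000_000] * 8  # eight independent tasks
        with Pool(processes=cpu_count()) as pool:
            # map() hands the tasks to the worker processes; the
            # operating system distributes the workers across cores.
            results = pool.map(heavy_task, tasks)
        print(len(results), "tasks completed in parallel")

On a multi-core machine the eight tasks finish in roughly the time a single core would need for two or three of them.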
Development of computers and technology
Computers in some form are in almost everything these days. From toasters to televisions, just about all electronic things have some form of processor in them. This is a very large change from the way it used to be, when a computer that took up an entire room and weighed thousands of pounds had the same amount of power as a scientific calculator. The changes that computers have undergone in the last 40 years have been colossal.
Changes in computer hardware and software have taken great leaps and jumps since the first video games and word processors. Video games started out with a game called Pong: it was monochrome (2 colors, typically amber and black, or green and black), you had 2 controller paddles, and the game resembled a slow version of air hockey. The first word processors had their roots in MS-DOS; these were not very sophisticated, nor much better than a good typewriter at the time. About the only benefits were the editing tools available with the word processors. But since these first two dinosaurs of software, both have gone through some major changes. Video games are now placed in fully 3-D environments, and word processors can now check your grammar and spelling.
Hardware has also undergone some fairly major changes. When computers entered their 4th generation, with the 8088 processor, a machine was just a base computer: a physically large processor with little computing power, running at only a few megahertz, and with no sound to speak of other than blips and bleeps from an internal speaker. Graphics cards were limited to two colors (monochrome), and RAM was limited to 640K or less. By this time, though, computers had already undergone massive changes. The first computers were massive beasts that weighed thousands of pounds. One of the first, the ENIAC, was the size of a room, used punched cards as input and didn't have much more power than a calculator. The reason for it being so large is that it used vacuum tubes to process data. It also broke down very often, to the tune of once every fifteen minutes, and it would then take 15 minutes to locate the problem and fix it. This beast also used massive amounts of power, and people used to joke that the lights would dim across its home city whenever the computer was used.