Friday, 8 April 2016
Trying to trace the history of computers, how far back should we go? Computer Science is the youngest discipline in the human repository of knowledge. We know the Internet became popular among the general public in the 1990s, and Mark I, the first computer, was built in 1944 – so surely computers go no further back than the 20th century? That is true, but computers are, at heart, elegant devices for calculation, and if we skip the development of calculating devices we will only puzzle over how computers came into existence! And to understand how those devices emerged, you should know about the need to count and calculate. Oh, 'counting devices'? How did people start to count?
It is strange at first to see that every book on the history of computers points to ancient times – the time when mankind had just emerged from the Stone Age. That is the very beginning of mankind! Does the history of computers go that far back? Yes.
The First Problem – Counting
Life in the jungle must have been enjoyable. Though people had no material luxuries, their minds must have been free from all the tensions of today's world. All they cared about was food and shelter. I wish Eve[1] had never eaten that apple!
The quest for easy food drove mankind into the agricultural age. People started to possess property, and when you possess something, you need to remember its quantity, type and so on. How many sheep do I have? How could a man express a quantity? Numbers had not been invented yet!
Knots in vines and ropes, notches in sticks, and scratches on rocks must have been the first counting aids for people at that time. Later on, those scratches on rocks and drawings on the ground must have given way to the development of numbers.
Guess what came after numbers? Obviously, the need to add and subtract them!
Mechanical Devices – Counting and Calculating
In this section we will learn about the counting and calculating devices that paved the way for the development of modern-day computers.
Abacus – The first known calculating device

The Abacus[2] is a simple wooden frame with beads strung on rods; the beads are moved towards the mid-bar[3] to perform calculations. You bring beads up to the bar and count them to obtain the result. Obtaining the result is a manual process, so the Abacus is essentially a memory aid rather than a true calculating device. It is generally agreed that the Abacus was invented in China around 2500 BC.
An Abacus is divided into two parts – heaven, the upper deck, and earth, the lower deck – separated by the mid-bar. On each rod there are two beads in heaven and five beads in earth. Each heaven bead has a value of 5 and each earth bead a value of 1. So if you pull one heaven bead and three earth beads towards the mid-bar, the rod represents the number 8.
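As a minimal sketch (my own illustration, not from the original text), the value shown on a single rod is five for each heaven bead plus one for each earth bead moved to the mid-bar:

```python
def rod_value(heaven_beads: int, earth_beads: int) -> int:
    """Value shown on one abacus rod.
    heaven_beads: beads (0-2) pulled towards the mid-bar, worth 5 each.
    earth_beads:  beads (0-5) pushed towards the mid-bar, worth 1 each."""
    assert 0 <= heaven_beads <= 2 and 0 <= earth_beads <= 5
    return 5 * heaven_beads + 1 * earth_beads

print(rod_value(1, 3))  # -> 8, the example from the text
```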
Napier's bones
The Abacus belongs to the ancient past. Moving to modern history, it was 1614 when John Napier invented logarithms[4] – a mathematical technique for multiplying and dividing extremely large or small numbers. This is considered Napier's principal invention.
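To illustrate the principle (a sketch of my own, not from the article): logarithms turn multiplication into addition, since log(a·b) = log(a) + log(b), so a hard multiplication becomes a table lookup plus a sum:

```python
import math

a, b = 4567.0, 89.1
# Multiply by adding logarithms, then convert back with the antilogarithm.
log_sum = math.log10(a) + math.log10(b)
product = 10 ** log_sum

print(product)  # ~406919.7
print(a * b)    # 406919.7 (direct multiplication, for comparison)
```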
In Computer Science, what interests us more about Napier's work, in addition to logarithms, is Napier's bones. It is a set of rods (10 rods in a set) with numbers carved on each rod; by laying the rods side by side and reading off partial products, they can be used to perform multiplication and division. The rods were made of bone, which must be the reason for the name.
Calculation is done by aligning the proper rods against each other and reading the result by inspection.
Slide Rule

The Slide Rule was invented by William Oughtred around 1620. The device consists of logarithmic scales, one of which slides along the other. The sliding scale is aligned against the fixed scale and the reading is taken through the indicator (the cursor).
The slide rule could be used to perform multiplication and division efficiently.
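A minimal sketch (my own illustration) of the principle: the scales are marked at distances proportional to logarithms, so sliding one scale adds log-distances, and the position read off corresponds to the product:

```python
import math

def slide_rule_multiply(a: float, b: float) -> float:
    """Simulate a slide rule: adding the log-scaled distances of a and b
    lands on the log-scaled position of a*b (with limited precision,
    just as on a physical rule)."""
    position = math.log10(a) + math.log10(b)   # total offset along the rule
    return round(10 ** position, 3)            # the reading is approximate

print(slide_rule_multiply(2, 8))  # -> 16.0
```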
Pascal's Adding Machine – the Pascaline

The 17th century was the most fertile period for devising calculating equipment. Blaise Pascal, at the age of 19, designed an adding machine to find the sum of numbers. The machine was the result of his effort to help his father, who worked in a tax office and every evening had to calculate the sum of the collections made during the day.
It was in 1642 that Pascal developed the Pascaline, which could be used to add, subtract, multiply and divide numbers by dialing wheels.
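As a hedged illustration (a sketch of the idea, not documentation of the actual mechanism), the dialing wheels behave like decimal counters that carry into the next wheel, much like a car odometer:

```python
def dial_add(wheels: list[int], digit: int, position: int) -> list[int]:
    """Add `digit` to the wheel at `position` (0 = units wheel), propagating
    carries to higher wheels, the way the Pascaline carried tens."""
    wheels = wheels.copy()
    wheels[position] += digit
    while position < len(wheels) and wheels[position] > 9:
        wheels[position] -= 10
        if position + 1 < len(wheels):
            wheels[position + 1] += 1   # carry into the next wheel
        position += 1
    return wheels

# Wheels hold the least-significant digit first: 0278 + 5 -> 0283
print(dial_add([8, 7, 2, 0], 5, 0))  # [3, 8, 2, 0]
```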
Leibnitz's Calculator – The Stepped Reckoner
The German philosopher Gottfried Wilhelm von Leibnitz improved on Pascal's adding machine and built the Stepped Reckoner, which could even find square roots. It was the first digital mechanical calculator that could perform all four basic arithmetic operations – addition, subtraction, multiplication and division.
Charles Babbage and his engines

Charles Babbage is considered the father of modern computers. It is his ideas – input, the mill (processing), output and storage – that modern computers have followed to become the miraculous devices they are today. Though he could not complete his Analytical Engine (conceived in the 1830s) due to insufficient funding and the limited technology of the day, it proved to be a foundation for the birth of computers.
Babbage did, however, complete a working model of his first machine, the Difference Engine, and was honoured by the Royal Society for it. The Difference Engine implemented a mechanical memory to store results. It was based on difference tables (for example, the differences between successive squares), and thus the name – Difference Engine.
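To see the idea behind the engine (my own sketch of the standard method of finite differences): squares can be tabulated with additions alone, because their second difference is the constant 2:

```python
def squares_by_differences(n: int) -> list[int]:
    """Generate the first n squares using only additions,
    the way a difference engine tabulates a polynomial."""
    value, first_diff, second_diff = 0, 1, 2   # 0^2, (1^2 - 0^2), constant
    out = []
    for _ in range(n):
        out.append(value)
        value += first_diff          # next square
        first_diff += second_diff    # next first difference: 1, 3, 5, 7, ...
    return out

print(squares_by_differences(6))  # [0, 1, 4, 9, 16, 25]
```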
INTERESTING FACT: Babbage conceived of a computer a full century early. Howard Aiken built the first computer, Mark I, based on Babbage's ideas in 1944.
Lady Augusta Ada Lovelace – The first programmer
Lady Augusta Ada Lovelace was a great supporter of Charles Babbage, and she urged him to use the binary system in his engines. Because she devised a way to program Babbage's engines, she is considered the first programmer.
Ada was the daughter of Lord Byron, the famous English poet.
The US Department of Defense developed a programming language and named it Ada to honor her contribution.
Dr. Herman Hollerith & his Tabulating Machines
Herman Hollerith invented a tabulating machine for the US census of 1890. He used punched cards to encode the numbers and feed them into the machine, which is why he is considered the first person to use punched cards in practice. Though Charles Babbage had planned to use punched cards for his Analytical Engine, that machine was never built, and it was Hollerith who succeeded in designing a machine that could accept input through punched cards.
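As a rough sketch (my own toy layout, not Hollerith's exact card format), a number can be recorded by punching, in each column, the row that corresponds to that column's digit:

```python
def punch_card(number: str, rows: int = 10, columns: int = 8) -> list[str]:
    """Render a toy punched card: one column per digit, a hole ('O') punched
    in the row matching the digit, '.' everywhere else."""
    digits = number.rjust(columns, "0")
    return ["".join("O" if int(d) == row else "." for d in digits)
            for row in range(rows)]

for line in punch_card("1890"):
    print(line)
```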
Hollerith founded the Tabulating Machine Company to build and sell his machines, and it later merged with other companies to form the International Business Machines (IBM) Company. IBM remains one of the largest computer companies even today.
INTERESTING FACT: Punched cards were originally invented by Joseph Jacquard, a textile manufacturer, who used them to automate the weaving loom. The cards were later adopted by Charles Babbage in his design of the Analytical Engine, and Herman Hollerith put them to practical use for the first time in his Tabulating Machine.
EXTRA DOSE: Calculating devices such as the Abacus, the Slide Rule and Napier's bones were very simple machines that could add, subtract, and perform multiplication and division by repeated operations. Though they appear trivial today, they were great inventions of their time.
By a mechanical part, we mean one that works by moving wheels and bars. Electronic components have no moving parts; they work with the flow of electricity in their circuitry. Because they have no moving parts, electronic devices have a very low failure rate.
Electro-Mechanical Computers
In 1944 the first electro-mechanical computer, Mark I, was built by Howard Aiken with the help of IBM. Mark I, Mark II, and Zuse's computers (Z2, Z3) are examples of electro-mechanical computers. Let's look at the Mark I and the Z3 here.
Mark I
Mark I, originally known as the IBM Automatic Sequence Controlled Calculator (ASCC), is the first computer of the world and is described as the beginning of the era of the modern computer. It was built at Harvard University by Howard H. Aiken.
Mark I was a gigantic computer: 51 feet long, 8 feet tall and 2 feet wide, weighing about 4,500 kg. It could do three additions or subtractions in a second; a multiplication took six seconds, a division took 15.3 seconds, and a logarithm or a trigonometric function took over one minute.
Device | Inventor | Date | Specialty
Mark I | Howard Aiken | 1944 | First Computer
Z3 Computer
Apart from the Mark I and Mark II, there are other contemporary computers in this category, such as the Z2 and Z3 (designed by Konrad Zuse).
Zuse's contribution was ignored for a long time for political reasons. He was a German engineer and computer pioneer who completed his work entirely independently of the other leading computer scientists and mathematicians of his day. Between 1936 and 1945 he worked in near-total intellectual isolation.
Improving on his basic Z2 machine, Zuse built the Z3 in 1941. It was a binary, 22-bit floating-point calculator featuring programmability with loops (but without conditional jumps), with memory and a calculating unit based on telephone relays.
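As a rough sketch of what "22-bit floating point" means (the exact bit split below is my assumption for illustration, not Zuse's documented layout), the word's bits are divided between a sign, an exponent and a mantissa:

```python
def decode_toy_22bit(word: int) -> float:
    """Decode a toy 22-bit float (illustrative layout): bit 21 = sign,
    bits 14-20 = exponent with bias 64, bits 0-13 = fraction with an
    implicit leading 1, as in modern floating-point formats."""
    sign = -1.0 if (word >> 21) & 1 else 1.0
    exponent = ((word >> 14) & 0x7F) - 64
    fraction = 1.0 + (word & 0x3FFF) / 2**14
    return sign * fraction * 2.0**exponent

# Encode 1.0: sign 0, biased exponent 64 (i.e. 2^0), fraction bits all zero.
print(decode_toy_22bit(64 << 14))  # -> 1.0
```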
Zuse’s company (with
the Z1, Z2 and Z3) was destroyed in 1945 by an Allied air attack.
Electronic Computers
In 1946 John Mauchly and J. Presper Eckert developed the first general-purpose electronic computer, ENIAC, and a new era in computing history began. Apart from ENIAC, the ABC, EDVAC, EDSAC and UNIVAC are some of the early electronic computers. We will study these computers in this section.
Calculating devices were fairly simple aids for the human mind. Electromechanical computers were moderately complex: they had wheels, drums and bars that rotated and moved to produce results. Because they retained mechanical parts, those devices are called electro-mechanical computers.
Electronic computers, on the other hand, work with the flow of electrons in their components. Because electronic components are faster and more reliable, electronic computers are far more reliable than the earlier machines.
ABC
The ABC, the first electronic digital computer, was invented by John V. Atanasoff and his assistant Clifford Berry – hence the name Atanasoff-Berry Computer (ABC).
Earlier, ENIAC was considered the first electronic computer, until a U.S. District Court invalidated the ENIAC patent in 1973. Thus the ABC is the first electronic digital computer. However, because the ABC was a special-purpose computer and not programmable, ENIAC is still the first general-purpose electronic computer.
It was the ABC that first implemented the three critical features of modern computers:
- Using binary digits to represent all numbers and data (a small sketch follows this list)
- Performing all calculations using electronics rather than wheels, ratchets, or mechanical switches
- Organizing a system in which computation and memory are separated.
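For example (my own illustration of the first point above), the quantities a decimal machine would hold on ten-position wheels are instead held as patterns of on/off states:

```python
# Binary representation of a few small numbers, as a binary machine would
# store them with two-state elements instead of ten-position wheels.
for n in (5, 13, 255):
    print(n, "->", format(n, "b"))
# 5 -> 101, 13 -> 1101, 255 -> 11111111
```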
Device | Inventor | Date | Specialty
ABC | John V. Atanasoff & Clifford Berry | 1942 | First Electronic Digital Computer
ENIAC

ENIAC stands for Electronic Numerical Integrator and Computer. It was developed in 1946 by John Mauchly and John Presper Eckert.
ENIAC is the first general-purpose electronic digital computer. It was considered the first electronic computer until 1973, when a U.S. District Court invalidated the ENIAC patent and concluded that the ENIAC inventors had derived the subject matter of the electronic digital computer from Atanasoff. Even so, it is still the first general-purpose electronic computer.
ENIAC used decimal
numbering system for its operation and contained 17,468 vacuum tubes, along
with 70,000 resistors, 10,000 capacitors, 1,500 relays, 6,000 manual switches and
5 million soldered joints. It covered 1800 square feet (167 square meters) of
floor space, weighed 30 tons, and consumed 160 kilowatts of electrical power.
Note: In these abbreviations the character 'C' may stand for 'Computer' or 'Calculator'; consider both correct.
Device | Inventor | Date | Specialty
ENIAC | J. P. Eckert & John Mauchly | 1946 | First General-Purpose Electronic Digital Computer
EDVAC – Electronic Discrete Variable Automatic Computer

Notice that the name itself includes the word 'automatic', just as ENIAC emphasised the word 'electronic' – ENIAC's builders, proud that their machine did not depend on mechanical components, called it ELECTRONIC!
EDVAC was developed by
John Mauchly and John Presper Eckert in 1949 with the help of John von Neumann.
Device | Inventor | Date | Specialty
EDVAC | J. P. Eckert & John Mauchly | 1949 | Stored-Program Computer
EDSAC – Electronic Delay Storage Automatic Computer

EDSAC (Electronic Delay Storage Automatic Calculator) was an early British computer – one of the first computers ever created. Inspired by John von Neumann's seminal EDVAC report, the machine was constructed by Professor Sir Maurice Wilkes and his team at the University of Cambridge Mathematical Laboratory in England.
EDSAC was the world's first practical stored-program electronic computer, although not the first stored-program computer (that honour goes to the Manchester Small-Scale Experimental Machine).
Device | Inventor | Date | Specialty
EDSAC | Sir Maurice Wilkes | 1949 | First Practical Stored-Program Computer
UNIVAC – Universal Automatic Computer
After the successful development of ENIAC and EDVAC, John Mauchly and J. P. Eckert founded their own company in 1946 and began to work on the Universal Automatic Computer.
UNIVAC was the first
general purpose commercial computer.
Device | Inventor | Date | Specialty
UNIVAC | J. P. Eckert & John Mauchly | 1951 | First General-Purpose Commercial Computer
Thursday, 7 April 2016
Devices have been used to aid computation for thousands of years, mostly using one-to-one correspondence with fingers. The earliest counting device was probably a form of tally stick. Later record keeping aids throughout the Fertile Crescent included calculi (clay spheres, cones, etc.) which represented counts of items, probably livestock or grains, sealed in hollow unbaked clay containers.[4][5] The use of counting rods is one example.
The abacus was initially used for arithmetic tasks. The Roman abacus was developed from devices used in Babylonia as early as 2400 BC. Since then, many other forms of reckoning boards or tables have been invented. In a medieval European counting house, a checkered cloth would be placed on a table, and markers moved around on it according to certain rules, as an aid to calculating sums of money.

The ancient Greek-designed Antikythera mechanism, dating between 150 and 100 BC, is the world's oldest analog computer.
The Antikythera mechanism is believed to be the earliest mechanical analog "computer", according to Derek J. de Solla Price.[6] It was designed to calculate astronomical positions. It was discovered in 1901 in the Antikythera wreck off the Greek island of Antikythera, between Kythera and Crete, and has been dated to circa 100 BC. Devices of a level of complexity comparable to that of the Antikythera mechanism would not reappear until a thousand years later.
Many mechanical aids to calculation and measurement were constructed for astronomical and navigation use. The planisphere was a star chart invented by Abū Rayhān al-Bīrūnī in the early 11th century.[7] The astrolabe was invented in the Hellenistic world in either the 1st or 2nd centuries BC and is often attributed to Hipparchus. A combination of the planisphere and dioptra, the astrolabe was effectively an analog computer capable of working out several different kinds of problems in spherical astronomy. An astrolabe incorporating a mechanical calendar computer[8][9] and gear-wheels was invented by Abi Bakr of Isfahan, Persia in 1235.[10] Abū Rayhān al-Bīrūnī invented the first mechanical geared lunisolar calendar astrolabe,[11] an early fixed-wired knowledge processing machine[12] with a gear train and gear-wheels,[13] circa 1000 AD.
The sector, a calculating instrument used for solving problems in proportion, trigonometry, multiplication and division, and for various functions, such as squares and cube roots, was developed in the late 16th century and found application in gunnery, surveying and navigation.
The planimeter was a manual instrument to calculate the area of a closed figure by tracing over it with a mechanical linkage.
The slide rule was invented around 1620–1630, shortly after the publication of the concept of the logarithm. It is a hand-operated analog computer for doing multiplication and division. As slide rule development progressed, added scales provided reciprocals, squares and square roots, cubes and cube roots, as well as transcendental functions such as logarithms and exponentials, circular and hyperbolic trigonometry and other functions. Aviation is one of the few fields where slide rules are still in widespread use, particularly for solving time–distance problems in light aircraft. To save space and for ease of reading, these are typically circular devices rather than the classic linear slide rule shape. A popular example is the E6B.
In the 1770s Pierre Jaquet-Droz, a Swiss watchmaker, built a mechanical doll (automata) that could write holding a quill pen. By switching the number and order of its internal wheels different letters, and hence different messages, could be produced. In effect, it could be mechanically "programmed" to read instructions. Along with two other complex machines, the doll is at the Musée d'Art et d'Histoire of Neuchâtel, Switzerland, and still operates.[14]
The tide-predicting machine invented by Sir William Thomson in 1872 was of great utility to navigation in shallow waters. It used a system of pulleys and wires to automatically calculate predicted tide levels for a set period at a particular location.
The differential analyser, a mechanical analog computer designed to solve differential equations by integration, used wheel-and-disc mechanisms to perform the integration. In 1876 Lord Kelvin had already discussed the possible construction of such calculators, but he had been stymied by the limited output torque of the ball-and-disk integrators.[15] In a differential analyzer, the output of one integrator drove the input of the next integrator, or a graphing output. The torque amplifier was the advance that allowed these machines to work. Starting in the 1920s, Vannevar Bush and others developed mechanical differential analyzers.
First general-purpose computing device
Charles Babbage, an English mechanical engineer and polymath, originated the concept of a programmable computer. Considered the "father of the computer",[16] he conceptualized and invented the first mechanical computer in the early 19th century. After working on his revolutionary difference engine, designed to aid in navigational calculations, in 1833 he realized that a much more general design, an Analytical Engine, was possible. The input of programs and data was to be provided to the machine via punched cards, a method being used at the time to direct mechanical looms such as the Jacquard loom. For output, the machine would have a printer, a curve plotter and a bell. The machine would also be able to punch numbers onto cards to be read in later. The Engine incorporated an arithmetic logic unit, control flow in the form of conditional branching and loops, and integrated memory, making it the first design for a general-purpose computer that could be described in modern terms as Turing-complete.[17][18]
The machine was about a century ahead of its time. All the parts for his machine had to be made by hand — this was a major problem for a device with thousands of parts. Eventually, the project was dissolved with the decision of the British Government to cease funding. Babbage's failure to complete the analytical engine can be chiefly attributed to difficulties not only of politics and financing, but also to his desire to develop an increasingly sophisticated computer and to move ahead faster than anyone else could follow. Nevertheless, his son, Henry Babbage, completed a simplified version of the analytical engine's computing unit (the mill) in 1888. He gave a successful demonstration of its use in computing tables in 1906.
Later analog computers
During the first half of the 20th century, many scientific computing needs were met by increasingly sophisticated analog computers, which used a direct mechanical or electrical model of the problem as a basis for computation. However, these were not programmable and generally lacked the versatility and accuracy of modern digital computers.[19]
The first modern analog computer was a tide-predicting machine, invented by Sir William Thomson in 1872. The differential analyser, a mechanical analog computer designed to solve differential equations by integration using wheel-and-disc mechanisms, was conceptualized in 1876 by James Thomson, the brother of the more famous Lord Kelvin.[15]
The art of mechanical analog computing reached its zenith with the differential analyzer, built by H. L. Hazen and Vannevar Bush at MIT starting in 1927. This built on the mechanical integrators of James Thomson and the torque amplifiers invented by H. W. Nieman. A dozen of these devices were built before their obsolescence became obvious.
By the 1950s the success of digital electronic computers had spelled the end for most analog computing machines, but analog computers remain in use in some specialized applications such as education (control systems) and aircraft (slide rule).
Digital computer development
The principle of the modern computer was first described by mathematician and pioneering computer scientist Alan Turing, who set out the idea in his seminal 1936 paper,[20] On Computable Numbers. Turing reformulated Kurt Gödel's 1931 results on the limits of proof and computation, replacing Gödel's universal arithmetic-based formal language with the formal and simple hypothetical devices that became known as Turing machines. He proved that some such machine would be capable of performing any conceivable mathematical computation if it were representable as an algorithm. He went on to prove that there was no solution to the Entscheidungsproblem by first showing that the halting problem for Turing machines is undecidable: in general, it is not possible to decide algorithmically whether a given Turing machine will ever halt.
He also introduced the notion of a 'Universal Machine' (now known as a Universal Turing machine), with the idea that such a machine could perform the tasks of any other machine, or in other words, it is provably capable of computing anything that is computable by executing a program stored on tape, allowing the machine to be programmable. Von Neumann acknowledged that the central concept of the modern computer was due to this paper.[21] Turing machines are to this day a central object of study in theory of computation. Except for the limitations imposed by their finite memory stores, modern computers are said to be Turing-complete, which is to say, they have algorithm execution capability equivalent to a universal Turing machine.
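A minimal sketch (my own illustration, not Turing's original notation) of a machine of this kind: a table of rules reads the symbol under the head, writes a new symbol, moves, and changes state until it halts:

```python
# A tiny Turing machine simulator. The example machine flips every bit
# on the tape and halts at the first blank cell.
def run_turing_machine(tape, rules, state="start", head=0, max_steps=1000):
    """rules maps (state, symbol) -> (new_symbol, move, new_state);
    move is +1 (right) or -1 (left); the machine stops in state 'halt'."""
    cells = dict(enumerate(tape))            # sparse tape, blank is '_'
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, "_")
        new_symbol, move, state = rules[(state, symbol)]
        cells[head] = new_symbol
        head += move
    return "".join(cells[i] for i in sorted(cells))

flip_bits = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
    ("start", "_"): ("_", +1, "halt"),       # blank cell: stop
}
print(run_turing_machine("10110", flip_bits))  # -> 01001_
```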
Electromechanical
By 1938 the United States Navy had developed an electromechanical analog computer small enough to use aboard a submarine. This was the Torpedo Data Computer, which used trigonometry to solve the problem of firing a torpedo at a moving target. During World War II similar devices were developed in other countries as well.
Early digital computers were electromechanical; electric switches drove mechanical relays to perform the calculation. These devices had a low operating speed and were eventually superseded by much faster all-electric computers, originally using vacuum tubes. The Z2, created by German engineer Konrad Zuse in 1939, was one of the earliest examples of an electromechanical relay computer.[22]
In 1941, Zuse followed his earlier machine up with the Z3, the world's first working electromechanical programmable, fully automatic digital computer.[23][24] The Z3 was built with 2000 relays, implementing a 22 bit word length that operated at a clock frequency of about 5–10 Hz.[25] Program code was supplied on punched film while data could be stored in 64 words of memory or supplied from the keyboard. It was quite similar to modern machines in some respects, pioneering numerous advances such as floating point numbers. Replacement of the hard-to-implement decimal system (used in Charles Babbage's earlier design) by the simpler binary system meant that Zuse's machines were easier to build and potentially more reliable, given the technologies available at that time.[26] The Z3 was Turing complete.[27][28]
Vacuum tubes and digital electronic circuits
Purely electronic circuit elements soon replaced their mechanical and electromechanical equivalents, at the same time that digital calculation replaced analog. The engineer Tommy Flowers, working at the Post Office Research Station in London in the 1930s, began to explore the possible use of electronics for the telephone exchange. Experimental equipment that he built in 1934 went into operation 5 years later, converting a portion of the telephone exchange network into an electronic data processing system, using thousands of vacuum tubes.[19] In the US, John Vincent Atanasoff and Clifford E. Berry of Iowa State University developed and tested the Atanasoff–Berry Computer (ABC) in 1942,[29] the first "automatic electronic digital computer".[30] This design was also all-electronic and used about 300 vacuum tubes, with capacitors fixed in a mechanically rotating drum for memory.[31]

Colossus was the first electronic digital programmable computing device, and was used to break German ciphers during World War II.
During World War II, the British at Bletchley Park achieved a number of successes at breaking encrypted German military communications. The German encryption machine, Enigma, was first attacked with the help of the electro-mechanical bombes. To crack the more sophisticated German Lorenz SZ 40/42 machine, used for high-level Army communications, Max Newman and his colleagues commissioned Flowers to build the Colossus.[31] He spent eleven months from early February 1943 designing and building the first Colossus.[32] After a functional test in December 1943, Colossus was shipped to Bletchley Park, where it was delivered on 18 January 1944[33] and attacked its first message on 5 February.[31]
Colossus was the world's first electronic digital programmable computer.[19] It used a large number of valves (vacuum tubes). It had paper-tape input and was capable of being configured to perform a variety of boolean logical operations on its data, but it was not Turing-complete. Nine Mk II Colossi were built (The Mk I was converted to a Mk II making ten machines in total). Colossus Mark I contained 1500 thermionic valves (tubes), but Mark II with 2400 valves, was both 5 times faster and simpler to operate than Mark 1, greatly speeding the decoding process.[34][35]

ENIAC was the first Turing-complete device, and performed ballistics trajectory calculations for the United States Army.
The US-built ENIAC[36] (Electronic Numerical Integrator and Computer) was the first electronic programmable computer built in the US. Although the ENIAC was similar to the Colossus it was much faster and more flexible. It was unambiguously a Turing-complete device and could compute any problem that would fit into its memory. Like the Colossus, a "program" on the ENIAC was defined by the states of its patch cables and switches, a far cry from the stored program electronic machines that came later. Once a program was written, it had to be mechanically set into the machine with manual resetting of plugs and switches.
It combined the high speed of electronics with the ability to be programmed for many complex problems. It could add or subtract 5000 times a second, a thousand times faster than any other machine. It also had modules to multiply, divide, and square root. High speed memory was limited to 20 words (about 80 bytes). Built under the direction of John Mauchly and J. Presper Eckert at the University of Pennsylvania, ENIAC's development and construction lasted from 1943 to full operation at the end of 1945. The machine was huge, weighing 30 tons, using 200 kilowatts of electric power and contained over 18,000 vacuum tubes, 1,500 relays, and hundreds of thousands of resistors, capacitors, and inductors.[37]
Stored programs
Early computing machines had fixed programs. Changing its function required the re-wiring and re-structuring of the machine.[31] With the proposal of the stored-program computer this changed. A stored-program computer includes by design an instruction set and can store in memory a set of instructions (a program) that details the computation. The theoretical basis for the stored-program computer was laid by Alan Turing in his 1936 paper. In 1945 Turing joined the National Physical Laboratory and began work on developing an electronic stored-program digital computer. His 1945 report ‘Proposed Electronic Calculator’ was the first specification for such a device. John von Neumann at the University of Pennsylvania also circulated his First Draft of a Report on the EDVAC in 1945.[19]
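As a sketch of the stored-program idea (my own toy machine, not EDVAC's actual instruction set), program and data share the same memory and a fetch-decode-execute loop interprets the instructions:

```python
def run(memory):
    """Toy stored-program machine: instructions and data share one memory."""
    acc, pc = 0, 0                        # accumulator and program counter
    while True:
        op, arg = memory[pc]              # fetch and decode
        pc += 1
        if op == "LOAD":
            acc = memory[arg]             # read data from the same memory
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Addresses 0-3 hold the program, addresses 4-6 hold the data.
memory = {
    0: ("LOAD", 4), 1: ("ADD", 5), 2: ("STORE", 6), 3: ("HALT", None),
    4: 2, 5: 3, 6: 0,
}
print(run(memory)[6])  # -> 5
```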
The Manchester Small-Scale Experimental Machine, nicknamed Baby, was the world's first stored-program computer. It was built at the Victoria University of Manchester by Frederic C. Williams, Tom Kilburn and Geoff Tootill, and ran its first program on 21 June 1948.[38] It was designed as a testbed for the Williams tube, the first random-access digital storage device.[39] Although the computer was considered "small and primitive" by the standards of its time, it was the first working machine to contain all of the elements essential to a modern electronic computer.[40] As soon as the SSEM had demonstrated the feasibility of its design, a project was initiated at the university to develop it into a more usable computer, the Manchester Mark 1.
The Mark 1 in turn quickly became the prototype for the Ferranti Mark 1, the world's first commercially available general-purpose computer.[41] Built by Ferranti, it was delivered to the University of Manchester in February 1951. At least seven of these later machines were delivered between 1953 and 1957, one of them to Shell labs in Amsterdam.[42] In October 1947, the directors of British catering company J. Lyons & Company decided to take an active role in promoting the commercial development of computers. The LEO I computer became operational in April 1951[43] and ran the world's first regular routine office computer job.
Transistors
The bipolar transistor was invented in 1947. From 1955 onwards transistors replaced vacuum tubes in computer designs, giving rise to the "second generation" of computers. Compared to vacuum tubes, transistors have many advantages: they are smaller, and require less power than vacuum tubes, so give off less heat. Silicon junction transistors were much more reliable than vacuum tubes and had longer, indefinite, service life. Transistorized computers could contain tens of thousands of binary logic circuits in a relatively compact space.
At the University of Manchester, a team under the leadership of Tom Kilburn designed and built a machine using the newly developed transistors instead of valves.[44] Their first transistorised computer, and the first in the world, was operational by 1953, and a second version was completed there in April 1955. However, the machine did make use of valves to generate its 125 kHz clock waveforms and in the circuitry to read and write on its magnetic drum memory, so it was not the first completely transistorized computer. That distinction goes to the Harwell CADET of 1955,[45] built by the electronics division of the Atomic Energy Research Establishment at Harwell.[46][47]
Integrated circuits
The next great advance in computing power came with the advent of the integrated circuit. The idea of the integrated circuit was first conceived by a radar scientist working for the Royal Radar Establishment of the Ministry of Defence, Geoffrey W.A. Dummer. Dummer presented the first public description of an integrated circuit at the Symposium on Progress in Quality Electronic Components in Washington, D.C. on 7 May 1952.[48]
The first practical ICs were invented by Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor.[49] Kilby recorded his initial ideas concerning the integrated circuit in July 1958, successfully demonstrating the first working integrated example on 12 September 1958.[50] In his patent application of 6 February 1959, Kilby described his new device as "a body of semiconductor material ... wherein all the components of the electronic circuit are completely integrated".[51][52] Noyce also came up with his own idea of an integrated circuit half a year later than Kilby.[53] His chip solved many practical problems that Kilby's had not. Produced at Fairchild Semiconductor, it was made of silicon, whereas Kilby's chip was made of germanium.
This new development heralded an explosion in the commercial and personal use of computers and led to the invention of the microprocessor. While the subject of exactly which device was the first microprocessor is contentious, partly due to lack of agreement on the exact definition of the term "microprocessor", it is largely undisputed that the first single-chip microprocessor was the Intel 4004,[54] designed and realized by Ted Hoff, Federico Faggin, and Stanley Mazor at Intel.[55]