
HOWARD H. AIKEN AND THE COMPUTER

Howard Aiken’s contributions to the development of the computer - notably the Harvard Mark I (IBM ASCC) machine and its successor, the Mark II - are often excluded from the mainstream history of computers on two technicalities. The first is that the Mark I and Mark II were electromechanical rather than electronic; the second is that Aiken was never convinced that computer programs should be treated as data in what has come to be known as the von Neumann, or stored-program, concept.

It is not proposed to discuss here the origins and significance of the stored program. Nor do I wish to deal with the related problem of whether the machines that preceded the stored program were or were not “computers”. This subject is complicated by the confusion in the actual names given to machines. For example, the ENIAC, which did not incorporate a stored program, was officially named a computer: Electronic Numerical Integrator And Computer. Yet the first stored-program machine to be put into regular operation was Maurice Wilkes’s EDSAC: Electronic Delay Storage Automatic Calculator. It seems rather senseless to deny the many truly significant innovations (by H. H. Aiken and by Eckert and Mauchly) that played an important role in the history of computers on the arbitrary ground that they did not incorporate the stored-program concept. Additionally, in the case of Aiken, it is significant that there is a current computer technology that does not incorporate the stored program and that is designated (at least by Texas Instruments) as “Harvard architecture”, though it should more properly be called “Aiken architecture”. In this technology the program is fixed and not subject to alteration except by deliberate intent, as in some computers used for telephone switching and in read-only memory (ROM).
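To make the distinction concrete, here is a minimal C sketch of Harvard-style fixed storage, written for an AVR microcontroller (a common modern example of separate program and data memories; the table and function names are invented purely for illustration). In a von Neumann machine the program shares one writable store with its data; here the constants are compiled into program memory and cannot be altered at run time.

    /* Illustrative sketch only: on AVR, program memory (flash) and data
     * memory (SRAM) are separate address spaces - a Harvard organization. */
    #include <avr/pgmspace.h>
    #include <stdint.h>

    /* PROGMEM places this table in flash, alongside the code itself;
     * like Aiken's fixed program, it cannot be modified while running. */
    static const uint8_t squares[] PROGMEM = { 0, 1, 4, 9, 16, 25 };

    uint8_t square_of(uint8_t n)
    {
        /* Flash cannot be read with an ordinary data load; avr-libc's
         * pgm_read_byte() wraps the dedicated LPM instruction. */
        return pgm_read_byte(&squares[n]);
    }

The design rests on exactly the property Aiken valued: because the program store is read-only in normal operation, a running program cannot corrupt itself, which is why the scheme survives in telephone-switching computers and ROM-based devices.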

AIKEN’S PREDICTIONS

Aiken was a visionary, a man ahead of his time. Grace Hopper and others remember his prediction in the late 1940s, even before the vacuum tube had been replaced by the transistor, that the time would come when a machine even more powerful than the giant machines of those days could be fitted into a space as small as a shoe box.

Some weeks before his death Aiken made another prediction. He pointed out that hardware considerations alone did not give a true picture of computer costs. As hardware has become cheaper, software has been apt to become more expensive. And then he gave us his final prediction: “The time will come”, he said, “when manufacturers will give away hardware in order to sell software”. Time alone will tell whether this last look ahead into the future was right.

THE DEVELOPMENT OF COMPUTERS IN THE USA

In the early 1960s, when computers were hulking mainframes that took up entire rooms, engineers were already toying with the then-extravagant notion of building a computer intended for the sole use of one person. By the early 1970s, researchers at Xerox’s Palo Alto Research Center (Xerox PARC) had realized that the pace of improvement in the technology of semiconductors - the chips of silicon that are the building blocks of present-day electronics - meant that sooner or later the PC would be extravagant no longer. They foresaw that computing power would someday be so cheap that engineers would be able to afford to devote a great deal of it simply to making non-technical people more comfortable with these new information-handling tools. In their labs, they developed or refined much of what constitutes PCs today, from “mouse” pointing devices to software “windows”.

Although the work at Xerox PARC was crucial, it was not the spark that took PCs out of the hands of experts and into the popular imagination. That happened inauspiciously in January 1975, when the magazine Popular Electronics put a new kit for hobbyists, called the Altair, on its cover. For the first time, anybody with $400 and a soldering iron could buy and assemble his own computer. The Altair inspired Steve Wozniak and Steve Jobs to build the first Apple computer, and a young college dropout named Bill Gates to write software for it. Meanwhile, the person who deserves the credit for inventing the Altair, an engineer named Ed Roberts, left the industry he had spawned to go to medical school. Now he is a doctor in a small town in central Georgia.

To this day, researchers at Xerox and elsewhere pooh-pooh the Altair as too primitive to have made use of the technology they felt was needed to bring PCs to the masses. In a sense, they are right. The Altair incorporated one of the first single-chip microprocessors - a semiconductor chip that contained all the basic circuits needed to do calculations - called the Intel 8080. Although the 8080 was advanced for its time, it was far too slow to support the mouse, windows, and elaborate software Xerox had developed. Indeed, it wasn’t until 1984, when Apple Computer’s Macintosh burst onto the scene, that PCs were powerful enough to fulfill the original vision of the researchers. “The kind of computing that people are trying to do today is just what we made at PARC in the early 1970s,” says Alan Kay, a former Xerox researcher who jumped to Apple in the early 1980s.

Researchers today are proceeding in the same spirit that motivated Kay and his Xerox PARC colleagues in the 1970s: to make information more accessible to ordinary people. But a look into today’s research labs reveals very little that resembles what we think of now as a PC. For one thing, researchers seem eager to abandon the keyboard and monitor that are the PC’s trademarks. Instead they are trying to devise PCs with interpretive powers that are more humanlike - PCs that can hear you and see you, can tell when you’re in a bad mood and know to ask questions when they don’t understand something.

It is impossible to predict the invention that, like the Altair, will crystallize new approaches in a way that captures people’s imagination.