Recent History of Computers: Machines for Mass Communication

Waseda University, SILS, Science, Technology and Society (LE202)

The communication revolution

• In the first period of the history of computers, we see that almost all development is driven by the needs and the financial backing of large organizations: government, military, space R&D, large corporations.

• In the second period, we will notice that the focus shifts to small companies, individual programmers, hobbyists and mass consumers.

• The focus in the first period was on computation and control. In the second period, it is on usability and communication.

• A mass market for computers was created through the development of a user-friendly personal computer.

Four generations of computers

Generation (period): technology; size; software

1st (1940s–1955): vacuum tubes; full room (huge); machine language
2nd (1956–1963): transistors; large machine; assembly language
3rd (1964–1967): integrated circuits; desk sized; operating systems
4th (1971–present): microprocessors; desk-top, hand-held; GUI interface
5th: ?

The microprocessor

Schematic: The Intel 4004

• In 1968, Robert Noyce and Gordon Moore, two of the “traitorous eight” who had earlier founded Fairchild Semiconductor, left Fairchild to found Intel.

• In 1969, Busicom, a Japanese firm, commissioned Intel to make a microprocessor for a handheld calculator.

• This led to the Intel 4004. Intel bought the rights to sell the chip to other companies.

• Intel immediately began the process of designing more and more powerful microchips.

• This has led to computers small enough to fit in our hands.

Consumer electronics

Sony Walkman, 1979

• The microprocessor made it possible to create more affordable consumer electronics.

• The Walkman came out in 1979. Through the 1980s, video players, recorders and stereos were marketed.

• In the 1980s–1990s, data storage moved from analog (paper tape, record) formats to digital (CD, DVD).

• We now use watches, TVs, automobiles, mobile phones, microwave ovens, etc., all of which contain microprocessors.

The Altair 8800

• In 1975, the Altair was marketed as a “computer kit” and advertised in Popular Electronics. (Over 4,000 orders in the first month.)

• No keyboard, monitor, etc.

• Its system software, a BASIC interpreter, was written by Microsoft.

• It had an open architecture, which encouraged users and start-ups to make their own peripherals.

• (The first microcomputer was marketed in France, but did not catch on; only 500 machines were sold.)

The Apple II

• The Apple I was built by Steve Wozniak, with the help of his friend Steve Jobs, in 1976.

• Jobs secured $300,000 in capital; they founded a company and brought out the Apple II.

• The Apple II came with a keyboard and could connect to a TV. It was the first personal computer for the ‘general’ market.

• In 1979, Dan Bricklin, a Harvard MBA student, and the programmer Bob Frankston wrote a spreadsheet program, VisiCalc, which became the first “killer app,” driving up sales.

The Apple I, 1976

The IBM PC

IBM PC Advertisement, 1981

• In 1981, “Big Blue” came out with its first personal computer (PC).

• In order to get into the market quickly, IBM decided not to do everything in-house: they used existing chipsets, adopted an open architecture, and wrote complete documentation.

• They contracted with Microsoft to write the operating system (PC-DOS, MS-DOS).

• This made it acceptable for businesses to buy a PC.

• “No one ever got fired for buying IBM.”

Xerox PARC, mouse, GUI and Ethernet

• Up to this point, computers relied on command-line interfaces.

• At the Xerox Palo Alto Research Center (PARC), however, a group of talented computer experts had put together a system which was far ahead of anything else.

• They had created a computer, the Alto, with a graphical user interface (GUI), a mouse and programs using the principle of “what you see is what you get” (WYSIWYG). Multiple computers were linked together over an Ethernet network, sharing laser printers, etc.

• They had created an object-oriented language, Smalltalk, based on an earlier language (Simula) developed in Norway.

• Xerox management, however, failed to exploit these innovations. (The Star sold for $40,000.)

The Xerox Alto, with mouse and windowing GUI

The Macintosh

• In 1979, Steve Jobs and some other Apple employees visited Xerox PARC and came away with a new vision of the future of computers.

• Apple’s first attempt at a GUI machine, the Lisa, was a commercial failure. (Too expensive and incompatible with anything else.)

• In 1984, the Mac was announced with a US Super Bowl commercial, directed by Ridley Scott. (It won many top commercial awards.)

• Aldus PageMaker, released in 1985, became the killer app that drove Mac sales.

• Jobs was forced out of Apple in 1985. He came back in 1996, on the condition that Apple buy NeXT Computer.

The Macintosh, 1984

PC clones and the rise of Microsoft

• It was possible to clone PCs for three reasons:
1) Intel could sell their microchips to anyone.
2) Microsoft could sell their operating system to anyone.
3) The read-only memory basic input/output system (ROM BIOS) chips could be reverse engineered, using “clean room design.”

• This led to a host of knock-off companies like Compaq, Dell, Gateway, Toshiba, etc.

• Since Microsoft’s operating system, and eventually Windows, shipped on all of these machines, this contributed to the rise of Microsoft.

• This led to IBM losing its market dominance, and the new standard became “Wintel” (the Windows OS running on an Intel chipset).

ARPAnet

• The Advanced Research Projects Agency (ARPA) was started in the US in 1958, in response to the launch of Sputnik 1 by the Soviet Union (USSR).

• The ARPAnet was designed to solve the technical issues (packet switching, timesharing, terminal usage, etc.) involved in getting one computer to talk to multiple computers.

• It was meant to be a research tool, allowing multiple users to share computer time and facilitating collaborative work.

• Faculty and graduate students at host universities wrote the protocols and software to enable the different computers to understand each other. (Email became the “killer app,” though it was only used by small groups of researchers in the sciences.)

• The network was originally set up between 4 computers. Over the next decade, it rapidly expanded.

• There were similar networks developing in other countries.

ARPAnet, 1969, diagram

ARPAnet, 1969

ARPAnet, 1971

ARPAnet, 1977

The Internet

• In the 1970s, there were a number of large networks all over the world, but they could not be linked together. In 1974, Kahn and Cerf proposed the Transmission Control Protocol and Internet Protocol (TCP/IP) as a way to link these various networks together.

• TCP/IP was an open protocol, available to anyone, and anyone could contribute through “requests for comments” (RFCs).

• ARPAnet itself switched over to TCP/IP (the transition was completed in 1983). In 1981, the National Science Foundation (NSF) created a network for US universities that were not on ARPAnet, also using TCP/IP.

• TCP/IP functions using a unique address for each machine, such as 192.168.34.2. In 1983, a domain name system (DNS) was established so that human-readable names could stand in for these numerical addresses. (A short sketch follows at the end of this list.)

• The networks were linked internationally using TCP/IP. In 1990 and 1995 respectively, the ARPAnet and the NSF net were dismantled. The Internet was now maintained by thousands of nodes.
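
To make the addressing bullet above concrete, here is a minimal Python sketch (standard library only) of the name-to-address lookup that DNS performs: a human-readable hostname is resolved to the numeric IP address that TCP/IP actually routes on. The hostname is only an illustrative example, not one from the lecture.

```python
# Minimal sketch: resolve a hostname to an IP address via the system's
# DNS resolver. "example.com" is only an illustrative hostname.
import socket

hostname = "example.com"
ip_address = socket.gethostbyname(hostname)  # DNS lookup: name -> numeric address
print(f"{hostname} -> {ip_address}")         # prints something like "example.com -> 93.184.216.34"
```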

The World Wide Web (WWW)

• The early Internet organized information through electronic bulletin board systems (BBSs), listservs, and Usenet. (All text based.)

• The WWW was developed by Tim Berners-Lee (1955–) and Robert Cailliau (1947–) at the Conseil Européen pour la Recherche Nucléaire (CERN).

• Using a NeXT personal computer, they created the hypertext transfer protocol (HTTP), so that documents could be delivered over a network. (A short sketch follows at the end of this list.)

• They made a human-readable display language that they called the hypertext markup language (HTML).

• They developed a system so that every document would be uniquely identified, the uniform resource locator (URL).

• In 1991, Berners-Lee gave copies of these WWW programs to his colleagues and they gave them to their colleagues. The tools spread through academia.

• The first graphical web browser, Mosaic, was developed at the University of Illinois at Urbana-Champaign in 1993.
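
As a concrete illustration of the HTTP and URL bullets above, the sketch below (Python standard library only, with an invented example URL) does roughly what an early browser did: open a TCP connection to the host named in a URL and request the document at a given path.

```python
# Minimal sketch of an HTTP/1.0 request: connect to the web server named
# in a URL and ask for a document by its path. Host and path are only
# illustrative.
import socket

host, path = "example.com", "/"          # taken from the URL http://example.com/
request = (
    f"GET {path} HTTP/1.0\r\n"
    f"Host: {host}\r\n"
    "\r\n"
)

with socket.create_connection((host, 80)) as conn:
    conn.sendall(request.encode("ascii"))
    response = b""
    while chunk := conn.recv(4096):      # read until the server closes the connection
        response += chunk

print(response.decode("utf-8", errors="replace")[:300])  # status line, headers, start of the HTML
```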

Search engines

• The WWW became useful to the average user with the development of search engines.

• The first were automated web-crawling programs, such as WebCrawler and AltaVista, that prowled the internet trying to figure out the purpose of a webpage by analyzing its content. They often returned very strange results.

• Another approach was to have human readers categorize pages. Yahoo!, started by two Stanford graduate students, used a combination of both approaches.

• Larry Page (1971–) and Sergey Brin (1973–) developed an algorithm for ranking webpages based on how many other pages linked to them. This became the core of Google, which has become synonymous with web search in the English language and the model for almost all modern search engines.
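
The ranking idea in the last bullet can be sketched in a few lines of Python. This is only a toy version of a PageRank-style calculation, not Google's actual algorithm or data; the three-page link graph is invented purely for illustration.

```python
# Toy link-based ranking: a page is important if important pages link to it.
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}              # start all pages equal
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            for target in outgoing:                          # each page shares its rank
                new_rank[target] += damping * rank[page] / len(outgoing)
        rank = new_rank
    return rank

# An invented three-page "web": A links to B and C, B links to C, C links to A.
toy_web = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
print(pagerank(toy_web))  # C comes out on top, since both A and B link to it
```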

The digital divide

• In the mid-1990s, people began to talk about a digital divide between the information rich and poor.

• In 2000, over 50% of all computers were in the US.

• Over 41% of US homes had internet access. Western Europe was also well connected, though unevenly: 61% of Swedish homes were connected, while 30% of Spanish homes were. In Asia as a whole, 30% of homes had access, but the rate was much higher in Japan and S. Korea.

• In the US, 46% of White Americans had access, while 23% of African Americans and Hispanics had access.

• 86% of homes with income over $75,000 had access, while less than 12% of homes with income less than $15,000 had access.

• 64% of homes with a university graduate had access, while less than 11% of homes in which no one went to university had access.

• Since internet access is a vital part of contemporary life, many people have argued that governments need to do more to bridge these divides.

Internet use vs. GDP, 2011

Broadband internet per capita vs. GDP, 2010

Personal computers per capita vs. GDP, 2006

The open source movement

• Open source software is published with its source code, so that it can be studied and modified.

• Software companies do not actually sell their software; they license it. This allows them to sell buggy software without being sued.

• Open source became a major programming paradigm in the 1990s.

• GNU, Linux, TeX, Apache, Mozilla, OpenOffice, etc.

• Open source is made possible through the internet, with hundreds of developers contributing code and usually one person maintaining the primary distribution.

• Developers: amateurs, professionals donating their time (often with the approval of their company), and companies.

• Open source project development acts as a form of natural selection, since interesting projects attract a lot of attention and unmaintained projects are dropped.

• The attention of so many developers produces robust code.

Hacking and information security

• In the 1960s, a hacker was someone who could write code or modify hardware. (Phreakers could manipulate the phone system.) The term was used as a kind of praise.

• The term hacker later came to mean someone who breaks into computer systems, a sense made popular by WarGames (1983) and Neuromancer (1984). Hackers are motivated by a desire for knowledge, notoriety, wealth, thrill-seeking, revenge, political expression, etc.

• There is a lot of hacking going on. Most systems are “attacked” at least a couple of times a week.

• In 1996, the US National Security Agency authorized a team of hackers to try to break into military systems. They were successful.

• In 2000, a disgruntled employee broke into the sewage control system in Maroochy Shire, Australia, and dumped 1,000,000 liters of sewage into local streams.

• In 2010, a huge number of US military and diplomatic files were leaked on WikiLeaks.

Cyberwar

• Cyberwar is a broad term that includes any kind of use of computers in offensive or defensive warfare.

• In the US military, each soldier is a node in the network, able to communicate digitally with his or her commander.

• The global positioning system (GPS) was developed because the military needs to be able to accurately locate its personnel.

• The US National Security Agency (NSA) is a cryptologic intelligence agency responsible for the collection and analysis of foreign communications. It is said to have more supercomputers than any other organization.

• The NSA established a secret system, Echelon, to intercept and decrypt email, fax, telex and telephone communications from all over the world. In 2013, details about a successor program, PRISM, were leaked.

• In 2010, a worm called Stuxnet attacked an Iranian nuclear facility. In 2012, Flame was discovered spying on hundreds of computers in Iran and Israel.

Artificial intelligence (AI) and robots

• The first attempts at AI were not very successful. They used an approach called expert systems, which are based on a series of rules and an inference engine. (A short sketch follows at the end of this list.)

• In the 1990s, AI research, cognitive science and neuroscience began to develop and become interlinked. This led to the “neural networks” approach. (Fuzzy logic, machine learning, etc.)

• Deep Blue defeated Kasparov at chess in 1997; Watson defeated Rutter and Jennings at Jeopardy! in 2011; the chatbot Eugene Goostman was claimed to have passed a Turing test in 2014.

• There are now many limited AI systems, but still no general AI, that is, an AI that determines its own goals.

• We have many special-purpose robots: manufacturing, space exploration, military, etc. There are some robots used in research projects.

• Current research in AI focuses on machine learning. An example is IBM’s Watson.
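
Returning to the first bullet of this list: an expert system pairs IF-THEN rules with an inference engine. Below is a minimal, purely illustrative forward-chaining sketch in Python; the rules and facts are invented for the example and are not from the lecture.

```python
# Minimal forward-chaining inference engine: keep firing any rule whose
# conditions are all satisfied until no new conclusions appear.
# The rules and facts below are invented for illustration.
rules = [
    ({"has_fever", "has_cough"}, "possible_flu"),
    ({"possible_flu", "short_of_breath"}, "see_doctor"),
]

def infer(facts, rules):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)    # rule fires, adding a new fact
                changed = True
    return facts

print(infer({"has_fever", "has_cough", "short_of_breath"}, rules))
# -> the inferred facts include "possible_flu" and "see_doctor"
```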

“The singularity”

• S. Ulam, writing about a conversation with von Neumann, 1958:

• “One conversation centered on the ever accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue.”

• Vernor Vinge popularized this idea in 1983:

• “We will soon create intelligences greater than our own. When this happens, human history will have reached a kind of singularity, an intellectual transition as impenetrable as the knotted space-time at the center of a black hole, and the world will pass far beyond our understanding. This singularity, I believe, already haunts a number of science-fiction writers. It makes realistic extrapolation to an interstellar future impossible. To write a story set more than a century hence, one needs a nuclear war in between ... so that the world remains intelligible.”

Final Remarks

• We have looked at the 4th generation of computers and the rise of personal computing. We can speculate about the 5th generation of computers.

• We have seen how personal computing can turn the computer into a tool for self-expression and personal freedom.

• We have looked at the rise of large computing networks, such as the internet and WWW, which allow people all over the world to communicate instantly.

• We have seen, again, how the spread and use of computer technology is both social and political.