Transcript of History of computers
February 18, 2015
The History of Computers
1200 A.D.: The Chinese invent the abacus
The oldest surviving counting board is the Salamis tablet, used by the Babylonians as early as 300 B.C. and discovered on the Greek island of Salamis.
The Chinese used the abacus, a hand-held wooden device with rows of beads, to add, subtract, multiply, and divide.
1834: Charles Babbage designed his Analytical Engine, a large mechanical device that combined memory, programming, and data stored by way of punched cards to calculate numbers.
1920's: Expanding on many of Babbage's concepts, Herman Hollerith built a tabulating machine that introduced the key idea that data could be coded numerically; his machine was also the first to use electricity. Hollerith's company merged with three others in 1911, and the combined firm was renamed International Business Machines (IBM) in 1924.
The Hollerith card punch encoded data as punched holes and was used by the U.S. Census Bureau starting in 1890. Punched-card machines were later used by the Nazis to track prisoners in and out of concentration camps.
1930's: A wave of research produced electronic computers that could store data digitally as 0s and 1s, a vast improvement over mechanical machines that operated on gears.
Konrad Zuse's Z3 used floating-point numbers, which improved the accuracy of calculations.
George Boole: The work of English mathematician George Boole was a key to further development. By showing that logical statements can be evaluated as either true or false, Boole defined the binary logic used by all future computers.
Binary system - uses 0 and 1
0 = 0
1 = 1
2 = 10
3 = 11
4 = 100
5 = 101
6 = 110
7 = 111
8 = 1000
9 = 1001
10 = 1010
11 = 1011
12 = 1100
13 = 1101
14 = 1110
15 = 1111
16 = 10000
17 = 10001
18 = 10010
19 = 10011
20 = 10100
1011 = (1 * 2^3) + (0 * 2^2) + (1 * 2^1) + (1 * 2^0) = 8 + 0 + 2 + 1 = 11
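The place-value expansion above can be sketched in a few lines of Python (not part of the original slides), using the same digit-by-digit method:

```python
# Expand a binary string digit by digit, exactly as in the worked example:
# each place is a power of 2, counted from the rightmost digit.
def binary_to_decimal(bits: str) -> int:
    total = 0
    for position, digit in enumerate(reversed(bits)):
        total += int(digit) * 2 ** position
    return total

print(binary_to_decimal("1011"))   # 8 + 0 + 2 + 1 = 11
print(bin(11))                     # Python's built-in gives '0b1011'
```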
Boolean logic consists of three logical operators:
• OR
• AND
• NOT
Boolean logic "Gates"
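These three operators behave like hardware logic gates. A minimal Python sketch of their truth tables, using the built-in `and`, `or`, and `not`:

```python
# Print the truth table for each Boolean operator.
for a in (False, True):
    print(f"NOT {a} = {not a}")

for a in (False, True):
    for b in (False, True):
        print(f"{a} AND {b} = {a and b}    {a} OR {b} = {a or b}")
```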
Bit and Bytes
The word bit is a shortening of the words "Binary digIT."
Bits are rarely seen alone in computers. They are almost always bundled together into 8-bit collections, and these collections are called bytes. Why are there 8 bits in a byte? A similar question is, "Why are there 12 eggs in a dozen?" The 8-bit byte is something that people settled on through trial and error over the past 50 years.
With 8 bits in a byte, you can represent 256 values (0 through 255), for example:
0 = 00000000
1 = 00000001
2 = 00000010
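The same 8-bit patterns can be generated in Python with the `08b` format specifier (zero-padded 8-digit binary):

```python
# A byte holds 8 bits, so it can represent 2**8 = 256 distinct values.
print(2 ** 8)   # 256 values: 0 through 255

for n in (0, 1, 2, 255):
    print(n, "=", format(n, "08b"))   # zero-padded 8-bit binary
```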
Bits and Bytes: What is a "byte"?
A byte is the unit most computers use to represent a character such as a letter, number, or typographic symbol (for example, "g", "5", or "?"). A
byte is abbreviated with a "B". Computer storage is usually measured in byte multiples.
For example, an 820 MB hard drive holds a nominal 820 million bytes, or 820 megabytes, of data.
What is a "kilobyte"? 1 KB = 1,024 bytes
What is a "megabyte"? 1 MB = 1,024 KB
What is a "gigabyte"? 1 GB = 1,024 MB (roughly one billion bytes)
What is a "terabyte"? 1 TB = 1,024 GB (roughly one trillion bytes)
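The byte multiples chain together by factors of 1,024 (2^10), which is slightly more than the round decimal figures drive makers advertise. A quick Python sketch of both conventions:

```python
# Binary byte multiples: each step up is a factor of 1,024 (2**10).
KB = 1024
MB = 1024 * KB
GB = 1024 * MB
TB = 1024 * GB
print(KB, MB, GB, TB)

# Drive makers count in powers of 1,000, so a "1 TB" drive holds
# fewer bytes than the binary terabyte computed above.
print(TB - 10 ** 12)   # difference between binary and decimal terabytes
```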
1 TB USB portable hard drive = $457.69
8 GB USB flash drive = $4.99
256 GB USB flash drive = $169.09
What does "USB" stand for? Acronym for Universal Serial Bus.
It is a type of connector that allows the user to attach peripheral devices to a computer, and it is the most widely used connection point for data transfer in the world. It was created in the mid-1990s. A standard USB connector is a simple socket with 4 pins: one for power, one for ground, and two for data transfer.
Vacuum Tube Computers
Colossus - developed to decrypt secret German coded messages during World War II. It used vacuum tubes and paper tape and could perform Boolean logic (yes/no, true/false).
The working rebuilt bombe at the Bletchley Park museum. Each of the rotating drums simulates the action of an Enigma rotor. There are 36 Enigma-equivalents and, on the right-hand end of the middle row, three indicator drums. John Harper led the "Phoenix" team that built this. It was officially switched on by the Duke of Kent, patron of the British Computer Society, on 17 July 2008.
Alan Turing & Bombe - also helped decipher Enigma
1940's: Second-generation computers operated via transistors (invented at Bell Labs in 1947), which were much smaller and more energy-efficient than vacuum tubes.
1950's: ASCII, the American Standard Code for Information Interchange, was established as the first industry-standard character code based upon the English alphabet (the first edition of the standard was published in 1963).
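ASCII assigns each character a number from 0 to 127, which fits in a single byte. The mapping can be explored directly in Python with the built-ins `ord()` and `chr()`:

```python
# ord() gives a character's ASCII code; chr() goes the other way.
for ch in "Ab5?":
    print(ch, "=", ord(ch), "=", format(ord(ch), "08b"))

print(chr(65))   # 65 is 'A'
```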
1960's
The integrated circuit became the standard computer technology, using many transistors and electronic circuits on a single semiconducting chip. This led to smaller, faster, and more powerful computers, as well as the first network and initial Internet concepts. In 1965, Lawrence Roberts of MIT connected a computer in Massachusetts to a computer in California using a dial-up telephone connection.
1970's - Present
The modern era of the microprocessor: a phenomenal breakthrough that powered early personal computers such as the Apple II and the Commodore machines. Further advancements in the microprocessor (CPU) and the Internet have brought about a world powered and connected by computer technology.
A microprocessor incorporates the functions of a computer's central processing unit (CPU) on a single integrated circuit.
The Intel 4004: the first commercial microprocessor
Designed by Ted Hoff and his team at Intel in 1971, it led to the development of the microcomputer industry. Microprocessors operate on numbers and symbols represented in the binary numeral system. The advent of low-cost computers on integrated circuits has transformed modern society: general-purpose microprocessors in personal computers are used for computation, text editing, multimedia display, and communication over the Internet.
Steve Jobs
Steve Jobs (February 24, 1955 – October 5, 2011) was an American entrepreneur and inventor, best known as the co-founder (with Steve Wozniak), chairman, and CEO of Apple Inc. Through Apple, he was widely recognized as a charismatic pioneer of the personal computer revolution and for his influential career in the computer and consumer electronics fields, transforming "one industry after another, from computers and smartphones to music and movies..." Jobs also co-founded and served as chief executive of Pixar Animation Studios; he became a member of the board of directors of The Walt Disney Company in 2006, when Disney acquired Pixar.
Bill Gates
William Henry "Bill" Gates III (born October 28, 1955) is an American business magnate, investor, programmer, inventor and philanthropist. Gates is the former chief executive and current chairman of Microsoft, the world's largest personal-computer software company, which he co-founded with Paul Allen. He remains at Microsoft as non-executive chairman.
The World Wide Web (abbreviated as WWW or W3, commonly known as the web), is a system of interlinked hypertext documents accessed via the Internet. With a web browser, one can view web pages that may contain text, images, videos, and other multimedia, and navigate between them via hyperlinks.
WWW: British engineer and computer scientist Sir Tim Berners-Lee, then an employee of CERN and now Director of the World Wide Web Consortium (W3C), wrote a proposal in March 1989 for what would eventually become the World Wide Web.
The Future: Three areas of new technology are artificial intelligence (machines that imitate human thinking and behavior), nanotechnology (cell-sized computer processors), and membrane technology (data processed within organic, living tissue cells).
Nanotechnology
Membrane Technology: Membrane separation processes are influencing -
1. Heating systems
2. Water filtration
3. Food processing
4. Pharmaceuticals
5. Medicine (artificial kidneys, artificial lungs)
UCSF bioengineering Professor Shuvo Roy holds a prototype model of an implantable artificial kidney that he and his research team are developing at the university's Mission Bay campus.
Read more: http://www.sfchronicle.com/health/article/Kidney-designers-take-cues-from-nature-4458059.php#ixzz2RmiN6XBQ
A device that achieves carbon dioxide/oxygen gas exchange could allow patients more freedom when awaiting a lung transplant
Forbes Magazine article, 3/12/13: 5 Trends that will Drive the Power of Technology
1. No-Touch Interface
Microsoft HoloLens
Google's Project Glass
Apple Siri
2. Native Content
The next digital battle will be fought in the living room, with Netflix, Amazon, Microsoft, Google, Apple, and the cable companies all vying to produce the dominant model for delivering consumer entertainment.
3. Massively Online
In the last decade, massively multiplayer online games such as World of Warcraft became all the rage. Rather than simply playing against the computer, you could play with thousands of others in real time.
Now other facets of life are going massively online. Khan Academy offers thousands of modules for school-age kids, Code Academy can teach a variety of programming languages to just about anybody, and the latest iteration is Massive Open Online Courses (MOOCs) that offer university-level instruction.
The massively online trend has even invaded politics, with President Obama recently reaching out to ordinary voters through Ask Me Anything on Reddit and Google Hangouts.
4. The Web of Things
Smartphones
Smartcars
QR Codes - Quick Response Code
QR Code is the trademark for a type of matrix barcode (or two-dimensional bar code), first designed for the automotive industry in Japan in 1994. Bar codes are optical, machine-readable labels attached to items that record information related to the item.
A QR code is read by an imaging device, such as a camera, and decoded by underlying software that applies Reed-Solomon error correction until the image can be appropriately interpreted.
5. Computer-Driven Supercomputing
Companies ranging from IBM to Google to Microsoft are racing to combine natural language processing with huge Big Data systems in the cloud that we can access from anywhere.