General Information about CERN


General Information about CERN (European Organization for Nuclear Research)

CERN in a nutshell

CERN, the European Organization for Nuclear Research, is one of the world’s largest and

most respected centres for scientific research. Its business is fundamental physics,

finding out what the Universe is made of and how it works. At CERN, the world’s largest

and most complex scientific instruments are used to study the basic constituents of

matter - the fundamental particles. By studying what happens when these particles

collide, physicists learn about the laws of Nature.

The instruments used at CERN are particle accelerators and detectors. Accelerators

boost beams of particles to high energies before they are made to collide with each other

or with stationary targets. Detectors observe and record the results of these collisions.

Founded in 1954, the CERN Laboratory sits astride the Franco–Swiss border near Geneva.

It was one of Europe’s first joint ventures and now has 20 Member States.

CERN’s mission

Research, technology, collaboration, education

The convention that established CERN in 1954 clearly laid down the main missions for the

Organization.

Primarily, the Convention states:

“The Organization shall provide for collaboration among European States in

nuclear research of a pure scientific and fundamental character (...). The

Organization shall have no concern with work for military requirements and the

results of its experimental and theoretical work shall be published or otherwise

made generally available”.

Today it is the contents of the nucleus – the basic building blocks of the Universe – that

provide the key to unlock the frontier of fundamental research, but CERN’s main mission

remains essentially the same.

The Convention also states that CERN shall organize and sponsor international co-

operation in research, promoting contacts between scientists and interchange with other

laboratories and institutes. This includes dissemination of information, and the provision

of advanced training for research workers, which continue to be reflected in the current

programmes for technology transfer and education and training at many levels.

Research: Seeking and finding answers to questions about the Universe


Technology: Advancing the frontiers of technology

Collaborating: Bringing nations together through science

Education: Training the scientists of tomorrow

CERN's structure

The CERN Council is the highest authority of the Organization and has responsibility for

all-important decisions. It controls CERN’s activities in scientific, technical and

administrative matters. The Council approves programmes of activity, adopts the budgets

and reviews expenditure. The Council is assisted by the Scientific Policy Committee and

the Finance Committee.

The Director-General, appointed by the Council, manages the CERN Laboratory. He is

assisted by a Directorate and runs the Laboratory through a structure of Departments.

Council

CERN is run by 20 European Member States, each of which has two official delegates to

the CERN Council. One represents his or her government’s administration; the other

represents national scientific interests. Each Member State has a single vote and most

decisions require a simple majority, although in practice the Council aims for a consensus

as close as possible to unanimity.

Scientific Policy Committee

The Scientific Policy Committee evaluates the scientific merit of activities proposed by

physicists and makes recommendations on CERN’s scientific programme. Its members are

scientists elected by their colleagues on the Committee and appointed by Council on the

basis of scientific eminence without reference to nationality. Some members are also

elected from non-Member States.

Finance Committee

The Finance Committee is composed of representatives from national administrations and

deals with all issues relating to financial contributions by the Member States and to the

Organization’s budget and expenditure.

Director-General

Appointed by Council, usually for five years, the Director-General manages CERN. The

Director-General is assisted by a Directorate, whose members he proposes to Council. The

Director-General reports directly to the Council. He can also propose to Council any

adjustment he deems necessary to meet the evolving needs of the research programme.

Directorate

Director-General: Rolf Heuer

Director for Research and Computing: Sergio Bertolucci

Director for Accelerators and Technology: Stephen Myers

Director for Administration and General Infrastructure: Sigurd Lettow

International Relations


Co-ordinator for International Relations: Felicitas Pauss

Heads of Departments

PH - Physics: Philippe Bloch

IT - Information Technology: Frederic Hemmer

BE - Beams: Paul Collier

TE - Technology: Frédérick Bordry

EN - Engineering: Roberto Saban

HR - Human Resources: Anne-Sylvie Catherin

FP - Finance, Procurement and Knowledge Transfer: Thierry Lagrange

GS - General Infrastructure Services: Thomas Pettersson

Directorate Office

Isabel Bejar-Alonso

Ewa Rondio

Emmanuel Tsesmelis

The name CERN

CERN is the European Organization for Nuclear Research. The name is derived from the

acronym for the French Conseil Européen pour la Recherche Nucléaire, or European

Council for Nuclear Research, a provisional body founded in 1952 with the mandate of

establishing a world-class fundamental physics research organization in Europe. At that

time, pure physics research concentrated on understanding the inside of the atom, hence

the word ‘nuclear’.

When the Organization officially came into being in 1954, the Council was dissolved, and

the new organization was given the title European Organization for Nuclear Research,

although the name CERN was retained.

Today, our understanding of matter goes much deeper than the nucleus, and CERN’s main

area of research is particle physics — the study of the fundamental constituents of matter

and the forces acting between them. Because of this, the laboratory operated by CERN is

commonly referred to as the European Laboratory for Particle Physics.

A global endeavour

CERN is run by 20 European Member States, but many non-European countries are also

involved in different ways. Scientists come from around the world to use CERN’s facilities.

The current Member States are: Austria, Belgium, Bulgaria, the Czech Republic, Denmark,

Finland, France, Germany, Greece, Hungary, Italy, the Netherlands, Norway, Poland,

Portugal, the Slovak Republic, Spain, Sweden, Switzerland and the United Kingdom.

Romania, Israel and Serbia are candidates to become Member States of CERN.


Member States have special duties and privileges. They make a contribution to the capital

and operating costs of CERN’s programmes, and are represented in the Council,

responsible for all important decisions about the Organization and its activities.

Some states (or international organizations) for which membership is either not possible

or not yet feasible are Observers. ‘Observer’ status allows non-Member States to attend

Council meetings and to receive Council documents, without taking part in the decision-

making procedures of the Organization.

Scientists from some 608 institutes and universities around the world use CERN’s

facilities.

Physicists and their funding agencies from both Member and non-Member States are

responsible for the financing, construction and operation of the experiments on which

they collaborate. CERN spends much of its budget on building new machines (such as the

Large Hadron Collider), and it only partially contributes to the cost of the experiments.

Observer States and Organizations currently involved in CERN programmes are: the

European Commission, India, Japan, the Russian Federation, Turkey, UNESCO and the USA.

Non-Member States with co-operation agreements with CERN are: Algeria, Argentina,

Armenia, Australia, Azerbaijan, Belarus, Bolivia, Brazil, Canada, Chile, China, Colombia,

Croatia, Cyprus, Ecuador, Egypt, Estonia, Former Yugoslav Republic of Macedonia

(FYROM), Georgia, Iceland, Iran, Jordan, Korea, Lithuania, Malta, Mexico, Montenegro,

Morocco, New Zealand, Pakistan, Peru, Saudi Arabia, Slovenia, South Africa, Ukraine,

United Arab Emirates and Vietnam.

CERN also has scientific contacts with: China (Taipei), Cuba, Ghana, Ireland, Latvia,

Lebanon, Madagascar, Malaysia, Mozambique, Palestinian Authority, Philippines, Qatar,

Rwanda, Singapore, Sri Lanka, Thailand, Tunisia, Uzbekistan and Venezuela.

For further information about CERN's international relations please refer to:

http://cern.ch/international-relations

Half the world’s particle physicists

CERN employs just under 2400 people. The Laboratory’s scientific and technical staff

designs and builds the particle accelerators and ensures their smooth operation. They

also help prepare, run, analyse and interpret the data from complex scientific

experiments.

Some 10,000 visiting scientists, half of the world’s particle physicists, come to CERN for

their research. They represent 608 universities and 113 nationalities.


History highlights

First excavation work on the Meyrin site

From a simple green field to the largest particle physics laboratory in the world… Some

people may think this requires a leap of imagination. Actually, that’s exactly the point.

Many imaginative leaps and jumps weave their way through the story of CERN to make it

what it is today. But a stroll through this collection of highlights doesn’t just tell the story

of a laboratory, it also reflects the different challenges that grip particle physicists

through the decades.

As CERN continues to evolve through changing times, its goal of pure research continues

to contribute to science and technology. From Nobel Prize winning physics to the World

Wide Web. From 1954 to the here and now…

Nobel prizes

J. Steinberger, F. Bloch, S. Ting, G. Charpak, C. Rubbia, S. van der Meer

One dream of CERN’s founders, to achieve European eminence in ‘big’ science, was

realised in 1984, when Carlo Rubbia and Simon van der Meer received the Nobel Prize in

physics for “their decisive contributions to the large project which led to the discovery of

the field particles W and Z, communicators of the weak interaction.” The project was a

magnificently executed scheme to collide protons and antiprotons in the existing Super

Proton Synchrotron. The experimental results confirmed the unification of weak and

electromagnetic forces, the electroweak theory of the Standard Model.

Less than a decade later, Georges Charpak, a CERN physicist since 1959, received the 1992

physics Nobel for “his invention and development of particle detectors, in particular the

multiwire proportional chamber, a breakthrough in the technique for exploring the

innermost parts of matter.” Charpak’s multiwire proportional chamber, invented in 1968,

and his subsequent developments launched the era of fully electronic particle detection.

Charpak’s detectors are also used for biological research and could eventually replace

photographic recording in applied radio-biology. The increased recording speeds translate


into faster scanning and lower body doses in medical diagnostic tools based on radiation

or particle beams.

The Laboratory not only attracts Nobel Prizes but also Nobel Laureates. Indeed the first

Director-General, Felix Bloch, was awarded the 1952 Nobel prize with Edward Mills Purcell,

“for their development of new methods for nuclear magnetic precision measurements and

discoveries in connection therewith.” The 1976 physics Prize was awarded to the Large

Electron–Positron Collider (LEP) experiment L3 spokesman Sam Ting, with Burt Richter,

“for their pioneering work in the discovery of a heavy elementary particle of a new kind.”

Discovered in 1974, the particle called J/ψ is a charm quark-antiquark composite.

In 1988, Jack Steinberger, a CERN physicist since the late 1960s and head of the LEP

ALEPH experiment at the time, was awarded the physics Prize with Leon Lederman and

Mel Schwartz, “for the neutrino beam method and the demonstration of the doublet

structure of the leptons through the discovery of the muon neutrino.” The discovery,

made in 1962 at the US Brookhaven National Laboratory, showed that there was more

than one type of neutrino.

Why Fundamental Science?

From theory to experiment

Some areas of scientific research, such as particle physics and cosmology, seem remote

from everyday life and unlikely to bring immediate practical applications. Are they

worth the effort in human and material resources?

This research may take us far away from the conditions of everyday life, but because it

continually pushes at boundaries in thinking and in technology it is a springboard for

many new developments.

Fundamental science is where new ideas and methods begin that later become

commonplace - from the electric light, which originated in 19th-century curiosity about

electricity, to the World Wide Web, invented at CERN to allow international teams of

particle physicists to communicate more easily. No amount of applied research on the

candle would have brought us the electric light; no amount of R&D on the telephone

would have brought about the Web. Science needs the space for curiosity and

imagination.

Basic science in a competitive world


Large electron positron experiment

by Robert Aymar,

former Director General of CERN

first published in Symmetry magazine, August 2006

We are constantly being told that we live in a competitive world in which innovation is the

main driver towards growth and prosperity. What is the place in such a world for

fundamental science, whose short-term contribution to society is knowledge without any

immediate application? Is it an unnecessary luxury? Should the world be deploying its

resources in pursuit of more pressing needs: public health, clean energy, safe water? Of

course it should, and I believe that investment in fundamental science serves these

goals. It is a long-term investment, laying the foundations for future innovation and

prosperity.

History teaches us that big jumps in human innovation come about mainly as a result

of pure curiosity. Innovation is key to meeting many of today’s development challenges,

and the primary force for innovation is fundamental research. Without it, there would be

no science to apply. Faraday's experiments on electricity, for example, were driven by

curiosity but eventually brought us electric light. No amount of R&D on the candle could

ever have done that. Electric light came from innovation driven by fundamental science.

The long-term role of fundamental science is well understood by the European Investment

Bank, the financial arm of the European Union. In 2003, the EIB gave a strong endorsement

of fundamental science when it lent €300 million to CERN to help finance the construction

of the Large Hadron Collider (LHC). Why should the EIB consider the world’s largest

fundamental physics project to be a worthy investment? I believe the reason is that

fundamental science paves the way to future innovation.

Fundamental research has the power to make people dream, and it attracts the innovators

of the future into science. Without the excitement provided by research and discovery at

the frontiers of knowledge, the pool of scientists would undoubtedly be smaller.

The scientists who work on the LHC are driven by a desire to learn about the Universe,

but that has not stopped them from developing particle acceleration and detection

techniques that have found applications in medicine, for example. Scientists at CERN

invented the World Wide Web, which has revolutionized the way we share information and

do business. Today the LHC community worldwide is working on computing grids, the next

frontier in information technology, which already have applications in fields such as Earth

observation, climate prediction, petroleum exploration, and drug discovery.


LHC experiments will observe particle collisions at the rate of up to 600 million per

second. This equates to about one petabyte per second, roughly the equivalent of

150 000 DVD movies. Clearly, storing such quantities would be impossible, so we have

to develop very clever electronics to sift out the interesting data. Even after

draconian data reduction, however, we will be storing around 15 petabytes per year.

Organizing access to this data for thousands of scientists from around the world is

the reason particle physics is at the forefront of grid computing, which will make

access to computing resources as simple as tapping into the electricity grid by

plugging in an electric light.
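
As a purely illustrative back-of-envelope check, the short Python sketch below combines the figures quoted above (600 million collisions per second, about one petabyte per second of raw data, roughly 15 petabytes stored per year); the assumption of a full year of continuous running is mine and serves only to show the order of magnitude of the data reduction involved.

    # Back-of-envelope check of the LHC data figures quoted above.
    # The rates come from the text; a full year of continuous running
    # is an assumption used only to illustrate the order of magnitude.
    collisions_per_second = 600e6      # up to 600 million collisions per second
    raw_rate_bytes = 1e15              # about one petabyte of raw data per second
    stored_per_year_bytes = 15e15      # about 15 petabytes kept per year
    seconds_per_year = 3.15e7          # roughly one calendar year (assumed running time)

    bytes_per_collision = raw_rate_bytes / collisions_per_second
    raw_bytes_per_year = raw_rate_bytes * seconds_per_year
    reduction_factor = raw_bytes_per_year / stored_per_year_bytes

    print(f"raw data per collision : ~{bytes_per_collision / 1e6:.1f} MB")
    print(f"overall data reduction : roughly one part in {reduction_factor:.1e}")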

Fundamental science has a vital role to play in today’s competitive world. It is fundamental

science that lays the long-term foundations for innovation and prosperity. Abdus Salam,

the Nobel prize-winning physicist from Pakistan, said, "In the final analysis, creation,

mastery, and utilization of modern science and technology [are] basically what

distinguishes the South from North. On science and technology depend the standards of

living of a nation." This is the challenge for fundamental science in today’s world of

competition. Fundamental science has a vital role to play in the process of innovation. In

today’s competitive world, it is as important as it has ever been.

The use of basic science

The tracking chamber of the ALICE muon spectrometer

by C.H. Llewellyn Smith,

former Director-General of CERN

Over 200 years ago, at the beginning of 1782, the German physicist and philosopher

Georg Christoph Lichtenberg wrote in his diary:

"To invent an infallible remedy against toothache, which would take it away in a moment,

might be as valuable and more than to discover a new planet... but I do not know how to

start the diary of this year with a more important topic than the news of the new planet".

He was referring to the planet Uranus, discovered in 1781. The question Lichtenberg

implicitly raised, of the relative importance of looking for technical solutions to specific

problems, and of searching for new fundamental knowledge, is even more pertinent today

than it was 200 years ago.

In this paper I shall argue that the search for fundamental knowledge, motivated by

curiosity, is as useful as the search for solutions to specific problems. The reason we

have practical computers now, and did not have them 100 years ago, is not that meanwhile

we have discovered the need for computers. It is because of discoveries in fundamental


physics which underwrite modern electronics, developments in mathematical logic, and

the need of nuclear physicists in the 1930s to develop ways of counting particles.

I shall cite many examples that demonstrate the practical and economic importance of

fundamental research. But if fundamental, curiosity-driven, research is economically

important, why should it be supported from public, rather than private, funds? The reason

is that there are kinds of science that yield benefits that are general, rather than specific

to individual products, and hence generate economic returns which cannot be captured by

any single company or entrepreneur. Most pure research is consequently funded by

people or organizations who have no commercial interest in the results and the

continuation of this kind of funding is essential for further advance.

It would certainly be naive, even wrong, to equate the pure uniquely with the general, and

the applied with the specific, but it is far more likely that a substantial proportion of the

benefits of applied research will accrue to those who undertake it. Furthermore, once

definite economic returns can clearly be anticipated, the private sector, motivated by

profit, is generally better placed to undertake the necessary research and development. It

follows that a policy of diverting public support from pure to applied scientific research

would also divert funds from investment which only the public sector can make, to areas

where the private sector is generally likely to do better.

Section 2 of this paper contains some general remarks on the difference between basic

and applied science. Section 3 then describes the benefits of basic science. In Section 4,

the above well-known argument that governments have a special responsibility to support

basic science as a "public good" is elaborated. This argument, which is relatively easy to

make, leads to two much harder questions, which are dealt with in Sections 5 and 6

respectively:

If companies can leave funding of basic science to governments, why can some

governments not opt out – leaving it to others – as it is sometimes argued Japan has done

very successfully? How should governments choose what to support, and at what level?

Physics for health

Positron emission tomography (PET)

by Rolf Heuer, Director General of CERN

First published in the CERN Bulletin, Feb 2010

Ever since pioneers like Rolf Wideröe and Ernest Lawrence built the first particle

accelerators in the 1920s and 30s, particle physics has contributed to advances in

medicine.


Today, over half of the world’s particle accelerators are used in medicine, and more and

varied uses are being found for them all the time. The same is true for particle detector

technology. In the 1970s, CERN played an important role in the emerging technology of

positron emission tomography (PET), building prototype scanners in a collaboration with

Geneva’s hospital. That tradition continues to this day, with crystal technology developed

for LEP, coupled to electronics developed for the LHC, pointing the way to combined

PET/MRI scanners.

It’s a proud track record by any standards, but we can do better. In the past, the transfer of

knowledge and technology between the biomedical professions and physics has been

sporadic: based on chance rather than strategy. That’s why CERN hosted a workshop on

physics for health on 2-4 February 2010, and charged its participants with drafting a

strategy that will ensure that the two communities work more closely together in the

future.

That workshop was a great success, bringing together some 400 physicists, biologists and

healthcare professionals from around the world. These included some of the early

pioneers, such as David Townsend, who was a key player in the early days of PET, as well as

people at the cutting edge of developments today.

The workshop set itself the goal of reviewing progress in the domain of physics

applications in life sciences, stimulating exchanges between the different communities

and indicating the subjects most suitable for further studies in diagnosis and therapy. The

workshop explored synergies between physics and physics spin-offs to fight disease with a

focus on radiobiology, accelerators, radioisotope production, detectors and use of IT. The

strategy paper is still being deliberated, but I feel sure it will provide a sound blueprint for

an ever closer partnership between physics and health.

Where the web was born

Tim Berners-Lee: World-Wide Web inventor

Tim Berners-Lee, a scientist at CERN, invented the World Wide Web (WWW) in 1989. The

Web was originally conceived and developed to meet the demand for automatic

information sharing between scientists working in different universities and institutes all

over the world. CERN is not an isolated laboratory, but rather a focus for an extensive

community that now includes about 60 countries and about 8000 scientists. Although

these scientists typically spend some time on the CERN site, they usually work at

universities and national laboratories in their home countries. Good contact is clearly

essential. The basic idea of the WWW was to merge the technologies of personal

computers, computer networking and hypertext into a powerful and easy to use global

information system.


How the web began

Commemorative plaque for the invention of the Web

The first proposal for the World Wide Web (WWW) was made at CERN by Tim

Berners-Lee in 1989, and further refined by him and Robert Cailliau in 1990.

By the end of that year, prototype software for a basic system was already being

demonstrated. To encourage its adoption, an interface to the CERN Computer

Centre's documentation, to the ‘help service’ and also to the familiar Usenet

newsgroups was provided.

The first web servers were all located in European physics laboratories and only a

few users had access to the NeXT platform on which the first browser ran. CERN

soon provided a much simpler browser, which could be run on any system. In 1991,

an early WWW system was released to the high energy physics community via the

CERN program library. It included the simple browser, web server software and a

library, implementing the essential functions for developers to build their own

software. A wide range of universities and research laboratories started to use it. A

little later it was made generally available via the Internet, especially to the

community of people working on hypertext systems.

Going global

The first web server in the United States came on-line in December 1991, once

again in a pure research institute: the Stanford Linear Accelerator Center (SLAC) in

California.

At this stage, there were essentially only two kinds of browser. One was the

original development version, very sophisticated but only available on the NeXT

machines. The other was the ‘line-mode’ browser, which was easy to install and run

on any platform but limited in power and user-friendliness. It was clear that the

small team at CERN could not do all the work needed to develop the system

further, so Berners-Lee launched a plea via the Internet for other developers to

join in.

Several individuals wrote browsers, mostly for the X-window system. The most

notable from this era were MIDAS by Tony Johnson from SLAC, Viola by Pei Wei from

O'Reilly, and Erwise by a group of students at the Helsinki University of Technology.


Early in 1993, the National Center for Supercomputing Applications (NCSA) at the

University of Illinois released a first version of their Mosaic browser. This software

ran in the X Window System environment, popular in the research community, and

offered friendly window-based interaction. Shortly afterwards the NCSA released

versions also for the PC and Macintosh environments. The existence of reliable

user-friendly browsers on these popular computers had an immediate impact on

the spread of the WWW. The European Commission approved its first web project

(WISE) at the end of the same year, with CERN as one of the partners. By late 1993

there were over 500 known web servers, and the WWW accounted for 1% of

Internet traffic, which seemed a lot in those days! (The rest was remote access, e-

mail and file transfer.) 1994 really was the ‘Year of the Web’. The world’s First

International World Wide Web conference was held at CERN in May. It was

attended by 400 users and developers, and was hailed as the ‘Woodstock of the

Web’. As 1994 progressed, the Web stories got into all the media. A second

conference, attended by 1300 people, was held in the US in October, organised by

the NCSA and the newly created International WWW Conference Committee

(IW3C2).

By the end of 1994, the Web had 10,000 servers, of which 2,000 were commercial,

and 10 million users. Traffic was equivalent to shipping the entire collected works

of Shakespeare every second. The technology was continually extended to cater

for new needs. Security and tools for e-commerce were the most important

features soon to be added.

Open standards

An essential point was that the Web should remain an open standard for all to use

and that no-one should lock it up into a proprietary system.

In this spirit, CERN submitted a proposal to the Commission of the European Union

under the ESPRIT programme: ‘WebCore’. The goal of the project was the creation of an

international consortium, in collaboration with the US Massachusetts Institute of

Technology (MIT). Berners-Lee officially left CERN at the end of 1994 to work on

the Consortium from the MIT base. But with approval of the LHC project clearly in

sight, it was decided that further Web development was an activity beyond the

Laboratory’s primary mission. A new home for basic Web work was needed.

The European Commission turned to the French National Institute for Research in

Computer Science and Control (INRIA) to take over the role of CERN.

In January 1995, the International World Wide Web Consortium (W3C) was

founded ‘to lead the World Wide Web to its full potential by developing common

protocols that promote its evolution and ensure its interoperability’.

By 2007 W3C, run jointly by MIT/LCS in the US, INRIA in France, and Keio University

in Japan, had more than 430 member organizations from around the world.


How the web works

The Web is a world of information available at the click of a mouse. To use it, you

need a computer, a connection to the Internet, and a browser.

When you run your browser, it finds and displays pages of information. The

function of a Web browser is to interpret the programming language of the web

pages (HTML, …) and transform it into the words and graphics that you see on your

screen. If you need more information, all you have to do is click on a hyperlink. On

each page, certain words, phrases, or even images are highlighted, and clicking on

them causes the browser to go off and find another page, which probably contains

more highlighted items, and so on.
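
As a small illustration of what interpreting the language of web pages means in practice, here is a minimal Python sketch, not any real browser, that reads an invented fragment of HTML and lists the hyperlinks a browser would make clickable.

    # Minimal sketch: find the hyperlinks in a (made-up) HTML page,
    # much as a browser must do before it can highlight them.
    from html.parser import HTMLParser

    SAMPLE_PAGE = """
    <html><body>
      <p>Welcome to the <a href="http://info.cern.ch/">first website</a>.</p>
      <p>More about <a href="/hypertext/WWW/TheProject.html">the WWW project</a>.</p>
    </body></html>
    """

    class LinkCollector(HTMLParser):
        # Record the target of every <a href="..."> tag encountered.
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href":
                        self.links.append(value)

    collector = LinkCollector()
    collector.feed(SAMPLE_PAGE)
    print(collector.links)  # ['http://info.cern.ch/', '/hypertext/WWW/TheProject.html']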

All Web documents are stored on so-called server computers, represented in the image

by a factory. Users can inspect these documents by requesting them from their local

(personal) computers, represented by the house, and called clients. All computers

involved in the Web are connected by the Internet, represented by the roads. When you

click on a hyperlink, your computer asks a server computer to return a document to you.

For example, starting from the CERN ‘Welcome page’ in

Switzerland your next click might fetch a document from a

physics lab at the other side of the world. All the


information seems to be in the little box in front of you, though in reality it is

spread over the globe.
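
The request-and-return exchange described above can be sketched in a few lines of Python; the address used here, http://info.cern.ch/, is the historic CERN server mentioned later in this text, but any reachable web address would serve.

    # Minimal client: ask a server for a document and show the start of it.
    from urllib.request import urlopen

    with urlopen("http://info.cern.ch/") as response:
        document = response.read().decode("utf-8", errors="replace")

    print(document[:300])  # the beginning of the page the server returned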

The web is also friendly to the network: when you click on a piece of highlighted

text your browser ‘orders a document’ from another computer, receives it by

‘return mail’ and displays it. You are then free to read the new page at leisure,

without further consumption of network resources.

The Web may be used to initiate processes on either the client or the server. A

request can start a database search on a server, returning a synthesised document.

A document returned in an unfamiliar format can cause the browser to start a

process on the client machine in order to interpret it.

The Web's ability to negotiate formats between client and server makes it possible

to ship any type of document from a server to a client, provided the client has the

appropriate software to handle that format. This makes video, sound and anything

else accessible without the need for a single application to be able to interpret

everything.
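
In today's Web this negotiation happens through HTTP headers: the client lists the formats it can handle, and the server labels its reply. The Python sketch below is only an illustration of that exchange, again using the same example address.

    # Sketch of format negotiation: the client says what it can handle
    # (Accept) and the server labels what it actually sends (Content-Type).
    from urllib.request import Request, urlopen

    request = Request(
        "http://info.cern.ch/",
        headers={"Accept": "text/html, text/plain"},  # formats this client accepts
    )

    with urlopen(request) as response:
        print("Server sent:", response.headers.get("Content-Type"))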

The Web and the Internet

The Web is not identical to the Internet; it is only one of the many Internet-based

communication services. The relation between them may be understood by using

the analogy with the global road system. On the Internet, as in the road system,

three elements are essential: the physical connections (roads and cables), the

common behaviour (circulation rules and Internet protocol) and the services (mail

delivery and the WWW).

The physical connections: cables and roads

Cables are a passive infrastructure, laid down locally by governments and telecoms

companies. Cables have different capacities: a single telephone line like the one

leading from your home can handle about 7 kilobytes per second, the equivalent of

a page of text per second. Optical fibres handle many thousands of millions of

bytes per second. Although the cables may be of different types and the junctions

may be very complicated, they are all interconnected.
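
To make the difference in capacity concrete, the little arithmetic sketch below compares how long the same document would take over the two kinds of link; the 700-megabyte file size and the one-billion-bytes-per-second fibre rate are assumptions chosen only for illustration.

    # How long would the same 700 MB document take over each kind of cable?
    # (The file size and the fibre rate are illustrative assumptions.)
    document_size = 700e6        # 700 megabytes

    phone_line = 7e3             # ~7 kilobytes per second, as quoted above
    optical_fibre = 1e9          # ~1 billion bytes per second (assumed)

    print(f"telephone line: {document_size / phone_line / 86400:.1f} days")
    print(f"optical fibre : {document_size / optical_fibre:.1f} seconds")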

On the roads it is possible for you to drive from home out to a far away place,

perhaps in another country, passing from highways to country roads. Similarly, you

can find a continuous connection through several interchange nodes between your

computer at home and the one of a friend in Australia.

The common behaviour: the Internet


Connecting computers to the cables is not enough: to be able to talk to each other

they have to agree on a common way of behaving, just like we do when we drive

our cars on the roads. The Internet is like the traffic rules: computers must use the

cables in an agreed fashion.

Thousands of cars can use the same roads even if they all have different

destinations; no problems arise as long as everybody drives on the same side of the

road, stops at red traffic lights and so on.

The Internet transfers data in little packets between computers. To use the cables

between them profitably, computers must obey rules too: they have to use the

same communication protocol.

A communication protocol is something you are familiar with if you have ever

talked to someone: in a conversation, people know when to start speaking, when

to stop, which sounds to make to encourage the other person to continue, and so

on. This is an implicit ‘protocol’ for humans. Computers exchanging data over

cables need a similar set of rules for behaviour.

To be ‘connected to the Internet’, a computer must respect the Internet protocols.

It can do so if a compatible layer of software has been installed on it. The common

protocol for the Internet is called the Transmission Control Protocol/Internet

Protocol or TCP/IP.
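
To show what speaking a common protocol looks like in practice, here is a minimal Python sketch that opens a TCP/IP connection to a web server and sends a bare HTTP request by hand; the host and the exact request line are illustrative choices.

    # Open a TCP/IP connection to a web server and speak HTTP over it by hand.
    import socket

    with socket.create_connection(("info.cern.ch", 80), timeout=10) as conn:
        conn.sendall(b"GET / HTTP/1.0\r\nHost: info.cern.ch\r\n\r\n")
        reply = conn.recv(1024)          # first chunk of the server's answer

    # The first line of the reply is the server's status, e.g. 'HTTP/1.1 200 OK'.
    print(reply.decode("ascii", errors="replace").splitlines()[0])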

Services for everyone

Once you have the cables and the protocol to use them, your computer can

communicate with all the others. But what can they say to each other?

You can use the roads to drive on as an individual, you can run scheduled bus lines,

transport heavy goods, you can even run a pizza delivery service. Similarly, on the

Internet, you can run data services: electronic mail, file transfer, remote log-in,

bulletin boards, …

The World Wide Web is just one of them, a bit like a ‘parcel delivery service’: at

your request, the WWW will deliver you the required document.

The website of the world's first-ever web server

1990 was a momentous year in world events. In February, Nelson Mandela was

freed after 27 years in prison. In April, the space shuttle Discovery carried the

Hubble Space Telescope into orbit. And in October, Germany was reunified.

Then at the end of 1990, a revolution took place that changed the way we live

today.


CERN, the European Organization for Nuclear

Research, is where it all began in March 1989. A

physicist, Tim Berners-Lee, wrote a proposal for

information management showing how information

could be transferred easily over the Internet by using

hypertext, the now familiar point-and-click system of

navigating through information. The following year,

Robert Cailliau, a systems engineer, joined in and

soon became its number one advocate.

The idea was to connect hypertext with the Internet

and personal computers, thereby having a single

information network to help CERN physicists share all

the computer-stored information at the laboratory.

Hypertext would enable users to browse easily between texts on web pages using

links. The first examples were developed on NeXT computers.

Berners-Lee created a browser-editor with the goal of developing a tool to make

the Web a creative space to share and edit information and build a common

hypertext. What should they call this new browser: The Mine of Information? The

Information Mesh? When they settled on a name in May 1990, it was the

WorldWideWeb.

Info.cern.ch was the address of the world's first-ever web site and web server,

running on a NeXT computer at CERN. The first web page address was

http://info.cern.ch/hypertext/WWW/TheProject.html, which centred on

information regarding the WWW project. Visitors could learn more about

hypertext, find technical details for creating their own web pages, and even read an

explanation of how to search the Web for information. There are no screenshots

of this original page and, in any case, changes were made daily to the information

available on the page as the WWW project developed. You may find a later copy

(1992) on the World Wide Web Consortium website.

However, a website is like a telephone; if there's just one it's not much use.

Berners-Lee's team needed to send out server and browser software. The NeXT

systems, however, were far more advanced than the computers people generally had at

their disposal: a far less sophisticated piece of software was needed for

distribution.

By spring of 1991, testing was underway on a universal line mode browser, which

would be able to run on any computer or terminal. It was designed to work simply

by typing commands. There was no mouse, no graphics, just plain text, but it

allowed anyone with an Internet connection access to the information on the Web.


The historic NeXT computer used by Tim

Berners-Lee in 1990, on display in the

Microcosm exhibition at CERN. It was the first

web server, hypermedia browser and web

editor.

During 1991 servers appeared in other

institutions in Europe and in December 1991,

the first server outside the continent was

installed in the US at SLAC (Stanford Linear Accelerator Center). By November

1992, there were 26 servers in the world, and by October 1993 the figure had

increased to over 200 known web servers. In February 1993, the National Center

for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-

Champaign released the first version of Mosaic, which was to make the Web

available to people using PCs and Apple Macintoshes.

... and the rest is Web history.

Although the Web was conceived as a tool to help physicists answer tough

questions about the Universe, today it touches almost every aspect of the global

community and affects our daily lives.

Today there are upwards of 80 million websites, with many more computers

connected to the Internet, and hundreds of millions of users. If households

nowadays want a computer, it is not to compute, but to go on the Web.