ATLAS DCS
JCOP Workshop, September 8th 1999, H.J. Burckhart



ATLAS DCS

• Organization of Detector and Controls Architecture

• Connection to DAQ

• Front-end System

• Practical Points

• Conclusions


Scope of DCS

• Detector: sub-detectors, experiment's infrastructure

• "External Services": services of the CERN infrastructure, LHC accelerator, DAQ


ATLAS picture here


Detector Organisation


Detector organisation

• Hierarchical organisation of quasi-independent units ("objects")

• Separation for various reasons (organisational, operational, geometrical, etc.)

• Units have to operate both stand-alone and integrated

• Data flow mainly vertical

• Common infrastructure handled like a subdetector
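
As a rough illustration of this organisation (not ATLAS software), the sketch below models the detector as a tree of quasi-independent control units; all class and method names are invented.

```python
# Minimal sketch of the hierarchical detector organisation: each unit can run
# stand-alone or as part of its parent, and data flows mainly vertically
# (children report upwards). All names below are illustrative only.

class ControlUnit:
    def __init__(self, name):
        self.name = name
        self.children = []

    def add(self, child):
        self.children.append(child)
        return child

    def read_status(self):
        """Collect status bottom-up: a unit summarises its children."""
        if not self.children:
            return {self.name: "OK"}          # leaf unit, e.g. a chamber
        summary = {}
        for child in self.children:
            summary.update(child.read_status())
        return {self.name: summary}


atlas = ControlUnit("ATLAS")
muon = atlas.add(ControlUnit("Muon"))
muon.add(ControlUnit("Barrel_Sector_01"))
atlas.add(ControlUnit("Common_Infrastructure"))  # handled like a subdetector

print(atlas.read_status())   # integrated operation: full detector view
print(muon.read_status())    # stand-alone operation: the Muon unit on its own
```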



Hierarchical levels of DCS

Supervisor level (SCADA)
• operator console: shift operator, sub-system expert
• server: data base, connection to DAQ, External Systems

Subsystem control level (SCADA)
• Local Control Station: gas, HV, endcap

Device control level (front-end I/O)
• Fieldbus node: chamber, sector
• PLC: cooling
• VME: HV
• sensors
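
The sketch below restates this level structure as plain data; the grouping of the two upper levels under SCADA and of the device level under front-end I/O follows the slide layout above.

```python
# Descriptive sketch of the three DCS levels and the equipment attached to
# each one, following the slide above. No real interfaces are involved.

DCS_LEVELS = {
    "Supervisor level (SCADA)": {
        "operator console": ["shift operator", "sub-system expert"],
        "server": ["data base", "connection to DAQ", "External Systems"],
    },
    "Subsystem control level (SCADA)": {
        "Local Control Station": ["gas", "HV", "endcap"],
    },
    "Device control level (front-end I/O)": {
        "Fieldbus node": ["chamber", "sector"],
        "PLC": ["cooling"],
        "VME": ["HV"],
        "sensors": [],
    },
}

for level, equipment in DCS_LEVELS.items():
    print(level)
    for device, examples in equipment.items():
        print(f"  {device}: {', '.join(examples) if examples else '-'}")
```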


Consequences of Detector organisation on DCS

• Subdetector groups choose the mapping Unit <==> DCS level

• Interface definition SCADA <==> Front-end I/O: HW (industry) standards, standard SW protocols

• High numbers of channels (>>10k) multiplexed in the front-end

• Separation of Tools <==> Applications: work done at different geographical places, guarantees homogeneity
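
To illustrate the multiplexing point, the sketch below aggregates a large number of raw front-end channels into a small summary record before it would be handed to SCADA; the channel count, value range and threshold are invented.

```python
import random

# Sketch: a front-end unit reads many raw channels (>>10k in the real system)
# and forwards only a compact summary to the supervisory (SCADA) layer.
# Channel count, value range and alarm threshold are invented for illustration.

N_CHANNELS = 20_000
ALARM_THRESHOLD = 4.9          # e.g. volts; purely illustrative

def read_channels(n):
    """Stand-in for reading n analogue inputs in the front-end."""
    return [random.uniform(0.0, 5.0) for _ in range(n)]

def summarise(values, threshold):
    """Reduce the raw channels to the few quantities the back-end needs."""
    return {
        "n_channels": len(values),
        "mean": sum(values) / len(values),
        "max": max(values),
        "n_over_threshold": sum(v > threshold for v in values),
    }

summary = summarise(read_channels(N_CHANNELS), ALARM_THRESHOLD)
print(summary)   # only this small record would travel up to SCADA
```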


External systems


Interaction with External Systems

• Exchange of (result) data; some imported data are fully treated in SCADA

• Possibility to send commands and receive feedback

• Exchange of (dynamic) status information

==> Use a common mechanism with all External Systems (LDIWG)
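
The common mechanism itself was being defined by the LDIWG at the time; the sketch below only illustrates the idea of one shared publish/subscribe interface reused for every External System, with invented names throughout.

```python
# Sketch of a single, common data-interchange mechanism reused for every
# External System (CERN services, LHC accelerator, DAQ). The real mechanism
# was under definition by the LDIWG; everything here is invented to show the
# idea of one shared publish/subscribe interface.

from collections import defaultdict

class DataInterchange:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, item, callback):
        """Register interest in a named data item, e.g. 'LHC.beam_energy'."""
        self._subscribers[item].append(callback)

    def publish(self, item, value):
        """Push a new value to every subscriber of the item."""
        for callback in self._subscribers[item]:
            callback(item, value)


bus = DataInterchange()
bus.subscribe("LHC.beam_energy",
              lambda item, v: print(f"DCS imports {item} = {v}"))
bus.publish("LHC.beam_energy", 7000.0)   # imported value treated inside SCADA
```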


Connection DCS - DAQ

Complete operational independence:
• DCS needed already during production, assembly and installation
• DCS needed 365/365, 24/24; DAQ only during data-taking periods
• Data paths of DAQ and DCS should be separated

Independence MUST NOT result in restrictions of functionality:
• Seamless information exchange
• Sending commands and getting feedback in both directions
• Common data base


Boundary DCS - DAQ

DAQ treats all aspects of physics event data (‘event #’): data flow, quality monitoring, storage, etc.

DCS treats all other data ('time stamp')

Interaction DAQ <==> LHC goes via DCS
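
The boundary can be stated compactly as two record types, as in the sketch below; the field names are invented for illustration.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Sketch of the boundary: DAQ data are keyed by event number, DCS data by
# time stamp. Field names are invented for illustration.

@dataclass
class PhysicsEvent:          # handled by DAQ
    event_number: int
    payload: bytes

@dataclass
class DcsReading:            # handled by DCS
    timestamp: datetime
    channel: str
    value: float

reading = DcsReading(datetime.now(timezone.utc), "TileCal.HV.sector3", 830.5)
event = PhysicsEvent(event_number=123456, payload=b"...")
print(reading, event.event_number)
```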


Interaction with DAQ

DAQ control and configuration services are organised in "Back-end DAQ components" (SCADA has similar components).

Information Service: exchange of dynamic information (e.g. status)
==> bi-directional interface with SCADA (filtered)

Message Reporting System: reports events asynchronously (e.g. error messages, state changes)
==> SCADA has to inject its 'events' (e.g. actions, state changes)
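
A minimal sketch of the DCS side of these two exchanges is given below; the InformationService and MessageReportingSystem classes are placeholders, not the real back-end DAQ interfaces.

```python
# Sketch of the DCS side of the two exchanges described above. The classes
# below are placeholders standing in for the back-end DAQ Information Service
# and Message Reporting System; they are not the real DAQ APIs.

class InformationService:
    """Placeholder: bi-directional exchange of dynamic information."""
    def __init__(self):
        self._values = {}
    def update(self, name, value):            # SCADA -> DAQ (filtered)
        self._values[name] = value
    def read(self, name):                     # DAQ -> SCADA
        return self._values.get(name)

class MessageReportingSystem:
    """Placeholder: asynchronous reporting of events."""
    def report(self, severity, source, text):
        print(f"[{severity}] {source}: {text}")

info = InformationService()
mrs = MessageReportingSystem()

# DCS publishes a (filtered) status value and injects one of its own 'events'.
info.update("DCS.Cooling.state", "RUNNING")
mrs.report("WARNING", "DCS.Cooling", "temperature above nominal")
```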


Interaction with DAQ (cont.)

Run Control: co-ordinates the DAQ sub-systems
==> sending/receiving commands and getting feedback about success/failure

Graphical User Interface: operator interaction, viewing of the DAQ state
==> the top level of DCS should be accessible
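
A minimal sketch of this command/feedback pattern, with invented command names and an illustrative DcsSupervisor class:

```python
# Sketch of the command/feedback pattern between Run Control and DCS.
# Command names and the DcsSupervisor class are illustrative only.

class DcsSupervisor:
    def __init__(self):
        self.state = "READY"

    def execute(self, command):
        """Execute a command and report success or failure to the caller."""
        transitions = {"CONFIGURE": "CONFIGURED", "START": "RUNNING", "STOP": "READY"}
        if command not in transitions:
            return {"ok": False, "error": f"unknown command {command!r}"}
        self.state = transitions[command]
        return {"ok": True, "state": self.state}


dcs = DcsSupervisor()
for cmd in ("CONFIGURE", "START", "PAUSE"):   # sent by Run Control
    feedback = dcs.execute(cmd)               # returned to Run Control
    print(cmd, "->", feedback)
```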


Interaction with DAQ (cont.)

Data base:
• configuration (HW, SW, parameters, calibration constants, etc.)
• command logging (operator, "automatic", etc.)
• incident logging (alarms, state changes, etc.)
• storage of measurement results (currents, temperatures, etc.)

ATLAS-wide DB for DAQ, DCS and Offline

Naming conventions according to HW (PBS, ABS)

==> This DB is the master DB for DCS
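
A small sketch of hardware-oriented (PBS-style) naming and of records keyed by such names; the name segments and record fields are invented examples.

```python
# Sketch of hardware-oriented naming: a device name is built from its place
# in the Product Breakdown Structure, and the same name keys its entries in
# the common data base. Segments and record fields are invented examples.

PBS_SEPARATOR = "/"

def pbs_name(*segments):
    """Build a hierarchical name such as ATLAS/TileCal/Barrel/Module17/HV."""
    return PBS_SEPARATOR.join(segments)

# One configuration record and one logged measurement, both keyed by the name.
name = pbs_name("ATLAS", "TileCal", "Barrel", "Module17", "HV")
configuration = {name: {"nominal_voltage": 830.0, "ramp_rate": 10.0}}
measurement = {"name": name, "timestamp": "1999-09-08T10:00:00Z", "value": 829.6}

print(name)
print(configuration[name], measurement["value"])
```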


Product Breakdown Structure


Examples of operations ('Use case')

• Calibration: DCS driven, interleaved during data taking, DAQ driven

• Start of run (load, configure, start)

• Partitioning

• Loading parameters
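
A sketch of a start-of-run sequence for one partition, following the load/configure/start steps named above; the partition and parameter contents are invented.

```python
# Sketch of a start-of-run sequence for one detector partition, following the
# load / configure / start steps named above. The partition and parameter
# contents are invented for illustration.

def start_of_run(partition, parameters):
    print(f"loading software for partition {partition} ...")
    print(f"configuring {len(parameters)} parameters ...")
    for name, value in parameters.items():
        print(f"  {name} = {value}")
    print("starting run")
    return "RUNNING"

state = start_of_run(
    partition=["TileCal", "LAr"],                        # subset of subdetectors
    parameters={"HV.nominal": 830.0, "gas.flow": 12.5},  # loaded from the DB
)
print("partition state:", state)
```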


Why separate DAQ and DCS?

• Time scale: DAQ in constant evolution

• Avoid additional requirements on SCADA (platform, real time)

• 4 LHC experiments, 4 DAQ systems (?)

• ATLAS DAQ design already advanced

• Past experience


Front-end I/O

• Fieldbus: general purpose node (LMB), dedicated Fieldbus node, commercial Fieldbus device

• PLC, SoftPLC

• VME with RT Unix: dedicated processor (e.g. image processing), DSP; real-time response (e.g. radiation monitor), triggered (e.g. beam dump); high number of channels, high-volume I/O
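
As an illustration of the simplest option, the sketch below shows the kind of periodic readout loop a general-purpose fieldbus node (such as the LMB) might run; the ADC range and scaling are invented.

```python
import time
import random

# Sketch of the periodic readout loop a general-purpose fieldbus node might
# run: read a handful of local channels, convert to physical units and hand
# the values to the fieldbus master. ADC range and scaling are invented.

ADC_FULL_SCALE = 4095          # 12-bit ADC, illustrative
VOLTS_PER_COUNT = 5.0 / ADC_FULL_SCALE

def read_adc(channel):
    """Stand-in for reading one ADC channel on the node."""
    return random.randint(0, ADC_FULL_SCALE)

def readout_cycle(n_channels=8):
    """One readout cycle: raw counts -> volts, tagged with the channel id."""
    return {ch: read_adc(ch) * VOLTS_PER_COUNT for ch in range(n_channels)}

for _ in range(3):             # a real node would loop continuously
    print(readout_cycle())
    time.sleep(0.1)            # cycle period, illustrative
```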


Important SCADA features

• support of hierarchical objects

• partitioning (stations loosely coupled)

• good connection to external DB

• powerful API

• driver tool kit

• capable of supporting the ATLAS naming convention

• multi-platform


Subdetector time scale

• SCT
  – production 1/01 =>
  – assembly 4/03 =>
  – installation 3/04 =>

• Pixel
  – mounting 4/01
  – assembly 4/03

• TileCal
  – production => 12/01
  – assembly + calibration 2002
  – installation 2003

• TRT
  – construction => mid 2002
  – assembly + test 2002
  – installation 3/04

• LAr
  – EC assembly 1/01 => 10/01
  – calibration 12/01 => 6/02
  – installation 1/03 =>

• Muon
  – decision about SCADA begin 2001


SCADA time scale

• SCADA is the nucleus of DCS (ATLAS, JCOP)

• subdetector groups want practical experience in their environment

• start bottom up (front-end I/O, simple devices, composite systems)
  – avoid stand-alone solutions
  – re-use developments


SCADA time scale (cont.)

• Detector time scale
  – final elements ready end 2001
  – calibration on surface 2002
  – installation 2003 ==>

• Use of SCADA
  – learning (3 months)
  – prototype application (6 months)
  – real application (12 months)

==> some subdetector groups need to start essentially now


Organisation of work

SCADA vendor: maintains the product (platform, OS, front-end drivers)

CERN Controls Group: connection to the CERN infrastructure and the LHC, drivers, generic applications (e.g. gas, HT)

LHC Detector Controls Groups: integration of subdetectors (rules), general purpose I/O, operations, subdetector "experiment infrastructure"


Organisation of work (cont.)

Sub-detector Controls expert(s): integration of units (states, information exchange), control procedures

Sub-system expert: control algorithms, HW functions


To do now ...

• Start field work with the subdetectors

• Make an organisational plan (courses, licences, etc.)

• Make a prioritised list of common developments and applications

• Start engineering the final system


Conclusions

• ATLAS wants to continue the common approach

• The SCADA system looks promising (many thanks to IT/CO for the evaluation)

• Implement real subdetector application(s) now with the most promising (and affordable) product