Control and Data Acquisition for Fusion experiments

Bernardo Brotas Carvalho [email protected]

Instituto de Plasmas e Fusão Nuclear Instituto Superior Técnico Lisbon, Portugal http://www.ipfn.ist.utl.pt

B. Carvalho | EIROforum School on Instrumentation, ESI 2011 | Grenoble

Film by Jean-Luc Godard (1967), “2 ou 3 choses que je sais d'elle”

Fusion – a Global Challenge

“The stakes are considerable, not to say vital for our planet.“
José Manuel Barroso, President of the European Commission

Fusion powers the sun and the stars

On Earth, fusion could provide:
• essentially limitless fuel, available all over the world
• no greenhouse gases
• intrinsic safety
• no long-lived radioactive waste
• large-scale energy production

The Fusion Reaction on Earth “... is not the same as in the Sun“

4 1H + 2 e- --> 4He + 2 ν + 6 γ + 26.7 MeV   (solar process)

2D + 3T --> 4He (+3.5 MeV) + n (+14.1 MeV), i.e. 17.6 MeV per reaction (fusion)

Why D-T? Cross section!

Fusion Fuel

For comparison: CH4 + 2 O2 --> CO2 + 2 H2O + 5.5 eV   (chemical)

Raw fuel of a fusion reactor is water and lithium*

Lithium from one laptop battery + half a bath-full of ordinary water (yielding one egg cup full of heavy water) = 200,000 kW-hours (the current UK average electricity consumption for 30 years)

* Deuterium/hydrogen = 1/6700; tritium is bred in the reactor: lithium + neutron (from fusion) → tritium + helium

Basic Principle of Stable Motion of Ions in Magnetically Confined Plasmas

Progress in fusion performance

ITER: reactor conditions

Control for Fusion Performance

The Fusion Control System is a tool to achieve and maintain plasma conditions with the best performance for
• plasma physics investigations
• energy confinement and stability
• and, in the end, yield

Benchmark for Fusion Performance

• The aim is to generate power: Pfusion/Pheat ↑
  – Pfusion ~ (nT)²: power expelled (lost) with fusion
  – Pheat: power needed to sustain the plasma
    • from external heating
    • from α heating (dominating in a reactor)

• For present-day experiments α heating can be neglected:

Pheat = Wplasma/τE   and   Wplasma ~ n·T

  – Wplasma: thermal energy
  – τE: energy confinement time (thermal insulation)

• So: Pfusion/Pheat ~ n·T·τE (the fusion product)

Strategies to Improve the Fusion Product

Optimise the fusion product n·T·τE by
• n ↑ : increasing density
• n·T ↑ : increasing pressure
• τE ↑ : increasing confinement
• Ip ↑ : increasing current

• Simply increasing each individual factor does not work: complex limits restrict the operational space.
• The limits depend on the spatial distribution of the quantities (profiles).
• Each actuator affects multiple factors.
• We need to find transition paths to plasmas with suitable combinations of n, T and τE.

Tip: PLAY with the virtual tokamak at http://w3.pppl.gov/~dstotler/SSFD

Applications of Performance Control in Fusion

Presently, Performance Control is not a monolithic application but a composition of various tools.

Simple:
• Electron/Neutral Density Control
• Radiation Control
• H/D (Isotopes) Control
• beta control

Advanced:
• Gap/Shape Control
• VS Control
• Profile Control (current, density, temperature)
• MHD Control

Protection:
• Disruption Prediction, Avoidance and Mitigation
• Hot-Spot Detection
• Radiation Peaking

Rationale for Fusion Performance Control

Performance Control is a tool

• to guide the plasma state to a desired domain (scenario, regime) on prescribed paths
• to simplify the plant operation scheme
  – replacing actuator inputs by higher-level control variables
  – linearizing and decoupling the system behaviour
• to increase the safety margin to critical limits
• to counteract external disturbances
• to compensate for incomplete system knowledge

For this, feedback from measured quantities is essential.

Feedback Control System Basics: LTI Systems

Open-loop transfer function of the system

reference r → Controller C(s) → command u → Plant P(s) → output y

Transfer functions are represented in the frequency (Laplace) domain rather than in the time domain:

F(s) = L{f(t)} = ∫0∞ e^(−st) f(t) dt,   s = σ + iω

Open-loop transfer function:
U(s) = C(s)·R(s)
Y(s) = P(s)·U(s) = P(s)·C(s)·R(s) = Ho(s)·R(s)
Ho(s) = P(s)·C(s)

Closing the loop with feedback F(s):
E(s) = R(s) − F(s)·Y(s)
Y(s) = P(s)·C(s)·E(s)  ⇒  Y(s) = [P(s)·C(s) / (1 + F(s)·P(s)·C(s))]·R(s) = Hcl(s)·R(s)

Closed-loop transfer function:
Hcl(s) = P(s)·C(s) / (1 + F(s)·P(s)·C(s))
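The closed-loop algebra above can be checked numerically with plain polynomial arithmetic. A minimal sketch (plain Python, unity feedback F(s) = 1; the coefficient-list representation and the example numbers are illustrative, not from the slides):

```python
def polymul(p, q):
    # Multiply two polynomials given as coefficient lists, highest degree first.
    r = [0.0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] += a * b
    return r

def closed_loop(num_p, den_p, num_c, den_c):
    # Hcl(s) = P(s)C(s) / (1 + P(s)C(s)) for unity feedback F(s) = 1:
    #   numerator   = num_p * num_c
    #   denominator = den_p * den_c + num_p * num_c
    num_ol = polymul(num_p, num_c)
    den_ol = polymul(den_p, den_c)
    n = max(len(num_ol), len(den_ol))
    pad = lambda p: [0.0] * (n - len(p)) + p
    num_cl = pad(num_ol)
    den_cl = [a + b for a, b in zip(pad(den_ol), num_cl)]
    return num_cl, den_cl

# Example: P(s) = 4/((s+10)(s+40)) = 4/(s^2 + 50 s + 400), C(s) = 100
num, den = closed_loop([4.0], [1.0, 50.0, 400.0], [100.0], [1.0])
# Hcl(s) = 400 / (s^2 + 50 s + 800): feedback kept the zeros, moved the poles
```

Note how the denominator of Hcl is the open-loop denominator plus the open-loop numerator, which is exactly the "feedback moves the poles" statement in algebraic form.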

Control Loop with Disturbance

reference r → error e → Controller C(s) → command u (+ disturbance d) → Plant P(s) → output y

Feedback F(s)

Y(s) = P(s)·D(s) + P(s)·U(s) = P(s)·D(s) + P(s)·C(s)·(R(s) − F(s)·Y(s))

Y(s) + P(s)·C(s)·F(s)·Y(s) = P(s)·D(s) + P(s)·C(s)·R(s)

Y(s) = [P(s) / (1 + P(s)·C(s)·F(s))]·D(s) + [P(s)·C(s) / (1 + P(s)·C(s)·F(s))]·R(s)

In terms of the closed-loop transfer function:
Y(s) = [Hcl(s)/C(s)]·D(s) + Hcl(s)·R(s)

Feed-Forward Control

reference r → error e → Controller C(s) → command u (+ feedforward ff + disturbance d) → Plant P(s) → output y

Feedback F(s)

Feed-forward properties:
• same entry point in the loop as the standard disturbance input
• difference: synchronized with the reference
• prediction of required actuator command values

GOAL:
• test control scenarios without stability concerns
• provide adequate initial values when switching on a controller
• shortcut and speed up the control reaction

Transfer Function: Poles and Zeros

H(s) = (b0 + b1·s + … + bm·s^m) / (a0 + a1·s + … + an·s^n) = (bm/an) · (s + q1)·…·(s + qm) / ((s + p1)·…·(s + pn))

Zeros q1, …, qm: m complex roots of the transfer function numerator
Poles p1, …, pn: n complex roots of the transfer function denominator

n ≥ m (CAUSALITY CONSTRAINT)

Examples:
Single real pole:  Y(s) = A/(s + a)  ⇒  y(t) = A·e^(−at)   (a > 0; pole p = −a)
Pair of complex poles:  Y(s) = (A·s + B)/(s² + 2·a·s + (a² + w²))  ⇒  y(t) = C·e^(−at)·sin(wt + φ)   (a > 0; p = −a ± i·w)
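The complex-pole example can be verified directly: the quadratic s² + 2as + (a² + w²) really does have its roots at s = −a ± iw. A small sketch (plain Python; the values a = 3, w = 4 are chosen for illustration):

```python
import cmath

def quadratic_roots(b, c):
    # Roots of s^2 + b*s + c via the quadratic formula (complex-safe).
    disc = cmath.sqrt(b * b - 4.0 * c)
    return (-b + disc) / 2.0, (-b - disc) / 2.0

# Pair of complex poles: s^2 + 2*a*s + (a^2 + w^2) with a = 3, w = 4
a, w = 3.0, 4.0
r1, r2 = quadratic_roots(2.0 * a, a * a + w * w)
# r1, r2 = -3 ± 4i  ->  y(t) = C e^{-3t} sin(4t + phi): a decaying oscillation
```

The real part (−a) sets the decay rate, the imaginary part (±w) the oscillation frequency, exactly as in the time-domain expression above.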

Pole Positioning

The pole positions can be roughly characterized as follows:

[Pole map: poles in the left half-plane (LHP) are OK; far from the imaginary axis is good/fast, close to the origin is too slow, a large imaginary part is too oscillatory, close to the imaginary axis is debatable. Poles in the right half-plane (RHP) are not allowed.]

Control Stability: Effects of Closing the Loop

• Feedback preserves the zeros
• but moves the poles (alters the denominator):
  Ho(s) = P(s)·C(s)
  Hcl(s) = P(s)·C(s) / (1 + F(s)·P(s)·C(s))
• it can stabilize but also destabilize!

A controller changes the dynamic behavior of the closed-loop system. But how? There is no simple analytical formula to translate controller parameters into closed-loop poles and zeros.

• Ideal method: pole placement
  – requires full feedback of all state variables, or their reconstruction by observers
  – potentially complex
  – can be compromised by parasitic delays
• Pragmatic method: frequency-response shaping
  – infer characteristic properties from the open loop to the closed loop
  – live with approximations and incomplete models
  – more robust, less performing

Impact of sensor dynamics in the feedback loop

Sensors are in the feedback branch of the loop:

F(s) = Qf(s) / Pf(s)

Gain inversion: y(t → ∞) = (1/Kf) · r(t → ∞)

Hc(s) = P(s)·C(s) / (1 + P(s)·C(s)·F(s)) = P(s)·C(s) / (1 + P(s)·C(s)·Qf(s)/Pf(s)) = P(s)·C(s)·Pf(s) / (Pf(s) + P(s)·C(s)·Qf(s))

Poles of F(s) become Zeros of Hc(s)

Impact of delays in the control loop

Transfer function of a pure time delay Td:

F(s) = e^(−s·Td)

• constant gain, no damping, at all frequencies: |F(iω)| = 1
• but a continuously increasing phase delay, ∠F(iω) = −Td·ω, which limits the achievable bandwidth of the closed loop
• a transcendental function (not representable by poles and zeros)

Originators of delays:
• digital control systems
• digital data processors (real-time diagnostics)
• event-counting sensors
• switching power supplies (e.g. thyristor converters)

TIP: keep measurement delays short (e.g. filtering, computer network communication latencies).
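The phase cost of a delay is easy to quantify from ∠F(iω) = −Td·ω. A sketch (plain Python; the 50 µs delay and the 1 kHz crossover frequency are illustrative numbers, not from the slides):

```python
import math

def delay_phase_deg(Td, f_hz):
    # Phase of e^{-s Td} at s = i*2*pi*f: angle = -omega*Td = -2*pi*f*Td radians
    return math.degrees(-2.0 * math.pi * f_hz * Td)

# A 50 us measurement delay evaluated at a 1 kHz gain-crossover frequency
phase = delay_phase_deg(50e-6, 1000.0)   # -18 degrees of extra phase lag
```

Eighteen degrees eaten out of the phase margin at crossover is significant, which is why keeping measurement and network latencies short matters so much.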

Example: Plasma Density Control

1) Describe behaviour:
Command: gas flux F; disturbance d: wall influx. The gas flux feeds the density build-up n0, transport carries particles to the core, pumping removes them; output: ne(core).

2) Formulate model:
F → Kdens/(1 + sT1) → Ktransp/(1 + sTtransp) → ne, with the pumping gain Kpump closing the internal loop.

3) Identify parameters and simplify:
K1 = 0.01, T1 = 0.1 s; Ktransp = 1, Ttransp = 0.025 s

P(s) = 0.01 / ((1 + 0.1s)·(1 + 0.025s)) = 4 / ((s + 10)·(s + 40))

DC gain (s = 0): K = 0.01
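The conversion from time-constant form to pole form can be verified in a few lines (plain Python, using the slide's parameter values):

```python
# P(s) = K / ((1 + T1 s)(1 + T2 s))  ->  gain / ((s + 1/T1)(s + 1/T2))
K, T1, T2 = 0.01, 0.1, 0.025

gain = K / (T1 * T2)           # numerator constant in pole form: 4.0
p1, p2 = 1.0 / T1, 1.0 / T2    # poles at s = -10 and s = -40
dc_gain = gain / (p1 * p2)     # P(0): back to 0.01, as on the slide
```

Dividing out the time constants moves the gain into the numerator, which is why 0.01/((1+0.1s)(1+0.025s)) and 4/((s+10)(s+40)) are the same plant.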

Controlling the example: Proportional Control

Add a P-controller (proportional), KP = 100, and unity feedback (K = 1):

Hcl(s) = KP·P(s) / (1 + KP·P(s)) = [KP · 4/((s+10)·(s+40))] / [1 + KP · 4/((s+10)·(s+40))] = 4·KP / ((s+10)·(s+40) + 4·KP)

Simulate! Better still: if you have a Tokamak nearby, TRY IT!!

Result: a Steady-State Error (SSE).

Let's increase KP to 500?
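The steady-state error follows directly from the DC gain of Hcl. A sketch (plain Python, using the slide's plant and unity feedback):

```python
def sse(Kp):
    # Hcl(s) = 4*Kp / ((s+10)(s+40) + 4*Kp); DC gain is Hcl(0)
    dc = 4.0 * Kp / (10.0 * 40.0 + 4.0 * Kp)
    return 1.0 - dc   # steady-state error for a unit step reference

e100 = sse(100.0)   # 0.5: proportional control alone leaves a 50% error
e500 = sse(500.0)   # ~0.167: a larger Kp shrinks the error but never removes it
```

A pure P-controller can only reduce the SSE, never eliminate it; removing it entirely needs an integrator, as the later slide shows.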

Increasing K by Trial & Error

K = 500

Now we have Overshoot

Hcl(s) = 4·KP / ((s + 10)·(s + 40) + 4·KP)
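The overshoot comes from the damping of the second-order closed loop: raising KP raises the natural frequency of s² + 50s + (400 + 4·KP) while the damping term stays fixed at 50, so the damping ratio drops. A sketch (plain Python, using the slide's characteristic polynomial):

```python
import math

def damping_ratio(Kp):
    # s^2 + 50 s + (400 + 4*Kp) = s^2 + 2*zeta*wn*s + wn^2
    wn = math.sqrt(400.0 + 4.0 * Kp)   # natural frequency grows with Kp
    return 50.0 / (2.0 * wn)           # zeta shrinks as wn grows

z100 = damping_ratio(100.0)   # ~0.88: well damped, barely any overshoot
z500 = damping_ratio(500.0)   # ~0.51: underdamped -> visible overshoot
```

This is the trade-off of tuning by trial and error: the same gain increase that reduces the steady-state error also reduces the damping.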

Improving the Density control: Integration Controller

Replace the P-controller by an I-controller (integral), C(s) = KI/s with KI = 100, and unity feedback (K = 1).

Hcl(s) = [KI/s · 4/((s+10)·(s+40))] / [1 + KI/s · 4/((s+10)·(s+40))] = 4·KI / (s·(s+10)·(s+40) + 4·KI)

Attention: the controlled loop could become unstable!
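The warning can be made quantitative with the Routh-Hurwitz criterion: the characteristic polynomial s³ + 50s² + 400s + 4·KI is stable iff all coefficients are positive and 50·400 > 4·KI, i.e. KI < 5000. A sketch (plain Python; the criterion is textbook-standard, the bound follows from the slide's polynomial):

```python
def i_loop_stable(KI):
    # Closed loop: s^3 + 50 s^2 + 400 s + 4*KI = s^3 + a2 s^2 + a1 s + a0
    # Routh-Hurwitz, 3rd order: stable iff a2, a1, a0 > 0 and a2*a1 > a0
    a2, a1, a0 = 50.0, 400.0, 4.0 * KI
    return a2 > 0 and a1 > 0 and a0 > 0 and a2 * a1 > a0

ok = i_loop_stable(100.0)     # True: the slide's KI = 100 is safely stable
bad = i_loop_stable(6000.0)   # False: too much integral gain destabilizes the loop
```

Unlike the P-controlled second-order loop, which stayed stable at any gain, the extra pole at the origin makes the I-controlled loop conditionally stable.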

Robust Density control

[Diagram: reference ne* → error e → Controller C(s) → command u → gas flux F → Plant P(s) → output ne, with a feedforward input ff and feedback through a reconstructor.]

Reconstructor: compute the best density estimate from real-time diagnostics (DCN interferometer, Bremsstrahlung, Cotton-Mouton effect).

Requirement:
• DCN signals can be compromised by fringe jumps.
• A density measurement from a single central DCN line-of-sight (LOS) is not secure.
• Density from Bremsstrahlung has drifts.
• A valid density value is required for control and monitoring (NBI interlocks).

Realisation: compute a validated density from several diagnostic sources
• detect sensor failures
• replace them with other inputs

ITER CODAC is the primary tool for operation

ITER CODAC is a challenging endeavour

• ITER will generate a huge quantity of experimental data

– 150 plant systems
– 1,000,000 diagnostic channels
– 300,000 slow control channels
– 5,000 fast control channels
– 40 CODAC systems
– 5 Gb/s of data
– 3 PB/year of data (e.g. 12 IR cameras in a 10-minute discharge: 1.728 TB)
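The IR-camera figure implies a substantial sustained per-camera rate; a quick sanity check (plain Python, arithmetic only):

```python
# 12 IR cameras producing 1.728 TB during a 10-minute discharge
total_bytes = 1.728e12
cameras, seconds = 12, 10 * 60

per_camera_rate = total_bytes / (cameras * seconds)   # bytes/s per camera
# = 2.4e8 bytes/s, i.e. 240 MB/s from every camera, sustained for the whole discharge
```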

In addition, ITER will require a far higher level of availability and reliability than previous and existing experiments.

International ITER Agreement

Procurement “IN KIND”

The IO team is in charge of the on-site integration and of operating the 140 slices.

Need for Standards in HW & SW Architecture

ITER Instrumentation & Control System physical architecture

An ITER subsystem is a set of related plant system I&C.

Estimate of ITER CODAC system size

ITER subsystem                 # of PS I&C   # of PSH+controllers   # of servers+terminals
Tokamak                              6              55                      6
Cryo and cooling water               5              40                      3
Magnets and coil power supply        8              30                      3
Building and power                  37              66                      3
Fuelling and vacuum                  6              45                      3
Heating                              8              55                      4
Remote handling                      2              15                      2
Hot cell and environment             3              20                      2
Test blanket                         6              24                      7
Diagnostics                         89             400                     20
Central                              0               0                    170
TOTAL                              167             750                    220

~1000 computers connected to CODAC

ITER Instrumentation & Control System physical architecture

Plant System I&C

is a deliverable by an ITER member state: a set of standard components selected from the catalogue, with one and only one plant system host.

CODAC Servers and Terminals

are servers running Red Hat Enterprise Linux (RHEL) and EPICS/CSS/???. These servers implement supervision, monitoring, coordination, configuration, automation, data handling, archiving, visualization, HMI…

The Plant Operation Network is the workhorse general-purpose flat network, utilizing industrial managed switches and mainstream IT technology.

High Performance Networks are physically dedicated networks implementing functions not achievable with the conventional Plant Operation Network: distributed real-time feedback control, high-accuracy time synchronization and bulk video distribution.

Slow Controller

is a Siemens Simatic S7 industrial automation Programmable Logic Controller (PLC). A Slow Controller runs software and plant-specific logic programmed in STEP 7. A Slow Controller normally has I/O and supports a set of standard I/O modules.

Fast Controller

is a dedicated industrial controller implemented in a PCI-family form factor. There may be zero, one or many Fast Controllers in a Plant System I&C. A Fast Controller runs Linux (RHEL) and an EPICS IOC. A Fast Controller normally has I/O and supports a set of standard I/O modules with associated EPICS drivers. A Fast Controller may have an interface to the High Performance Networks (HPN).


High Performance Computer

are dedicated computers (multi-core, GPU) running plasma control algorithms.

Fast Controllers for Fusion Devices

[Diagram: the supporting infrastructure (simulation environment, scheduler, real-time signal servers, analysis codes) surrounds the plasma. Sensors (diagnostics, magnetic coils) feed the controllers (plasma shaping, current control, machine protection, profiles control, instabilities control) over high-performance communication networks, which drive the actuators (heating systems, fueling systems, corrective systems).]

Vertical Stabilization | an example

Growth Rate

Elongated plasmas are vertically unstable. MIMO systems are designed to make the plasma vertically stable, while other controllers control the plasma position and shape.

ITER Vertical Position Control

How important are control systems?

• Loss of vertical plasma position control in ITER will cause thermal loads on the Plasma Facing Components of 30-60 MJ/m² for ~0.1 s.
  – PFCs cannot be designed to sustain such (repetitive) thermal loads

• Vertical Displacement Events also generate the highest electromagnetic loads
  – A phenomenological extrapolation of horizontal forces estimates loads of ~45 MN on the ITER vacuum vessel.
  – MHD simulations predict ~20 MN
  – Vertical loads: ~90 MN

Plasma vertical position control in ITER must be robust and reliable, to ensure that a loss of vertical position control is a very unlikely event.

Example: JET Vertical Stabilization system

192 input signals

• 192 signals acquired by ADCs and transferred at each cycle

• 50 µs control loop cycle time with jitter < 1 µs, achieved by MARTe.

• Always in real-time (24 hours per day)

• 1.728 x 109 50 µs cycles/day

Front view

• Crucial for ITER's very long pulses

ATCA @ JET Vertical Stabilisation Controller

• x86-based ATCA controller
• Up to 12 DGP cards (PCIe links through the ATCA full-mesh backplane)
• 32 × 18-bit ADC channels per board, separately isolated (1 kV)
• Parallel execution on FPGAs for MIMO signal processing (control loop delay < 50 µs, aim < 10 µs)
• Linux real-time operating system (RTAI)
• Aurora and PCI Express communication protocols allow data transport between modules; expected latencies below 2 µs.

IPFN's ATCA-MIMO-ISOL I/O Processing Boards

RTM ADC module

ATCA JET Gamma-ray Spectroscopy

• 19 lines of sight: 10 horizontal + 9 vertical channels
• 2 FPGA (Virtex-II Pro) ATCA boards digitizing at 200 MSPS, 13-bit, 8 channels

Gamma and X-ray Diagnostics Real-Time Processing

[Board diagram: analog inputs (4 × 12 bits) feed analog-to-digital converter blocks into two Xilinx Virtex-II Pro FPGAs (XC2VP30FF1152-6) with 4× RocketIO links; each FPGA has a 1 GB DDR SODIMM; clock synthesis with sync and reference clock; System ACE CompactFlash; PEX 8516 PCI Express switch with x4 links.]

• Parallel DPP in the FPGA
• Real-time PHA at 1 MHz average pulse rate
• 20 ns resolution timestamp
• Data reduction rate of at least 80% attainable; 95% of total pulses resolved

Why ATCA?

The ATCA platform is gaining traction in the physics community because of:
• advanced communication bus architecture (serial gigabit links replacing parallel buses)
• very high data-throughput options and suitability for real-time applications
• scalable shelf capacity up to 2.5 Tb/s
• scalable system availability up to 99.999%
• robust power infrastructure (distributed 48 V power system) and large cooling capacity (cooling for 200 W per board)
• ease of integration of multiple functions and new features
• the ability to host large pools of DSPs, NPs, processors and storage
• full redundancy support
• reliable mechanics (serviceability, shock and vibration)
• hardware management interface (IPMI bus)

Who else is using ATCA?

The group of experimenters includes several major laboratories representing different fields of use and a range of applications.

• Active programs are showing up most notably at
  – DESY for XFEL, and JET
• Other laboratories
  – ILC, IHEP, KEK, SLAC, FNAL, ANL, BNL, FAIR, ATLAS at CERN, AGATA, large telescopes, Ocean Observatories
• Investigating ATCA solutions for future upgrades
  – both the CMS and ATLAS detectors
• Setting up prototype experiments to test its potential
  – ILC and ITER

ATCA is being adopted, without significant change, as a platform for generic data acquisition processors requiring high throughput and bandwidth.

Most of these programmes put the emphasis on High Availability

SOFTWARE TOOLS FOR CONTROL: EPICS and ITER

In February 2009 the ITER Organization decided to use EPICS for the control system. This decision was based on three independent studies. In February 2010 ITER-IO released the first version (V1.0) of the CODAC Core System, which is basically a package of selected EPICS products.

What is EPICS?

EPICS is an abbreviation for: Experimental Physics and Industrial Control System

EPICS is: • A collaboration • A tool kit • A control system architecture

The History

– In 1989 a collaboration started between Los Alamos National Laboratory (GTA) and Argonne National Laboratory (APS)
  GTA: Ground Test Accelerator (Bob Dalesio & Marty Kraimer)
  APS: Advanced Photon Source
– More than 150 license agreements were signed before EPICS became Open Source in 2004
– Team work on problems, for example over the “Tech Talk” mailing list
– Database and network protocol (CA) basically unchanged since 1990
– Collaborative efforts vary
  • assistance in finding bugs
  • sharing tools, schemes, and advice

http://www.aps.anl.gov/epics

EPICS – who is using it?

Some members of the collaboration (a very short list!):
– ANL (APS Accelerator, APS Beamlines, IPNS) in Chicago, USA
– LANL in Los Alamos, USA
– ORNL (SNS) in Oak Ridge, USA
– SLAC (SSRL, LCLS) in Stanford, USA
– DESY in Hamburg, Germany
– BESSY in Berlin, Germany
– PSI (SLS) in Villigen, Switzerland
– KEK in Tsukuba, Japan
– DIAMOND Light Source (Rutherford Appleton Laboratory) in Oxfordshire, England
– In FUSION: NSTX, KSTAR, ITER and ISTTOK

Parts of EPICS

[Diagram: client software (MEDM, OAG Apps, ALH, TCL/TK, StripTool, Perl scripts, many, many others) talks Channel Access to IOCs and CA servers. The EPICS database consists of process variables (records); IOCs run sequence programs and connect, through commercial and custom hardware and custom programs, to technical equipment and real-time control.]

How does it do it?

[Diagram: several Channel Access clients (e.g. an operator console) communicate over the network (Channel Access protocol) with a Channel Access server, the IOC of the machine. The IOC exposes process variables such as S1A:H1:CurrentAO (power supply, via a computer interface), S1:P1:x and S1:P1:y (beam position monitor), and S1:G1:vacuum (vacuum gauge).]

What is an IOC?

IOC means Input Output Controller.
• A special CA server and CA client
• A computer running “IOC Core”. This computer may be:
  – VME-based, operating system vxWorks or RTEMS
  – PC, operating system Windows, Linux or RTEMS
  – Apple, operating system OS X
  – UNIX workstation, operating system Solaris
• An IOC is normally connected to input and/or output hardware
• An EPICS control system is based on at least one Channel Access server (normally an IOC)
• An IOC runs a record database, which defines what the IOC does
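The record database an IOC runs is written in EPICS database (.db) files. A minimal hedged example of what such a record can look like (the record name, scan rate and field values here are invented for illustration, not taken from ITER or the slides):

```
# A soft (no-hardware) analog input record, processed once per second
record(ai, "DEMO:TEMP:READBACK") {
    field(DESC, "Example temperature readback")
    field(SCAN, "1 second")
    field(DTYP, "Soft Channel")
    field(EGU,  "degC")
}
```

Once the IOC loads this file, "DEMO:TEMP:READBACK" becomes a process variable reachable by any Channel Access client on the network.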

Inside an IOC

[The major software components of an IOC (IOC Core), from the LAN downwards: Channel Access, Database, Sequencer, Device Support, I/O Hardware.]

Control and Data Acquisition for Next Generation Fusion Experiments

Challenges → Implications

• Increasing number of interdependent parameters to be controlled → massive processing power (parallel, multi-processing support)

• Increasingly faster real-time loop-cycle response → high bandwidth for data transfer

• Stricter operating safety margins → real-time multi-input-multi-output (MIMO) control

• Continuous operation generating huge data quantities → advanced, intelligent, flexible timing & synchronization


Concluding remarks

• High fusion performance depends on real-time MIMO control systems
• Control systems are critical for the safe operation and reliability of fusion devices
• ITER is a big challenge because of its higher complexity and stricter safety margins
• There will likely be a greater convergence between neutron/high-energy physics and fusion in hardware technologies (ATCA) and software (EPICS)
