
The National Grid Service
Guy Warner, [email protected]

Policy for re-use

• This presentation can be re-used for academic purposes.

• However, if you do so, please let training-[email protected] know: we need to gather statistics on re-use (number of events, number of people trained). Thank you!

Acknowledgements

• Some NGS slides are taken from talks by Stephen Pickles and Andy Richards
• Also slides from Malcolm Atkinson on the UK e-Science programme

Overview

• e-Infrastructure in the UK

• The National Grid Service

e-Science Centres in the UK

[Map of e-Science Centres in the UK: the National e-Science Centre (Edinburgh), the Digital Curation Centre, Centres of Excellence, other centres, the Access Grid Support Centre, the National Centre for Text Mining, the National Centre for e-Social Science, the National Institute for Environmental e-Science, the Open Middleware Infrastructure Institute and the National Grid Service, with sites including Lancaster, York, Newcastle, Belfast, Manchester, Leicester, Daresbury, Cambridge, Birmingham, Oxford, Cardiff, Bristol, RAL, Reading, Southampton, London and Edinburgh.]

OMII-UK nodes

• EPCC & National e-Science Centre, Edinburgh
• School of Computer Science, University of Manchester
• School of Electronics and Computer Science, University of Southampton

UK e-Infrastructure

Users get common access, tools and information through nationally supported services (NGS, HPCx + HECToR), regional and campus grids, and community grids, integrated internationally (e.g. LHC, VRE/VLE/IE, ISIS TS2).

The National Grid Service

• The core UK grid, resulting from the UK's e-Science programme.
  – Grid: virtual computing across administrative domains

• Production use of computational and data grid resources.

• Supported by JISC
  – Entering its second phase of funding

[Diagram: GOSC coordinating the NGS core nodes (Manchester, Leeds, Oxford, RAL), national HPC services (HPCx, CSAR) and partner/affiliated sites, including universities, commercial providers and PSREs.]

• NGS Core Nodes: host core services; coordinate integration, deployment and support; resources free to access for all VOs. Monitored interfaces and services.
• NGS Partner Sites: integrated with the NGS; some services/resources available for all VOs. Monitored interfaces and services.
• NGS Affiliated Sites: integrated with the NGS; support for some VOs. Monitored interfaces (plus security etc.).


National Grid Service and partners

[Map of NGS core and partner sites: Edinburgh, Lancaster, York, Manchester, CCLRC Rutherford Appleton Laboratory (Didcot), Cardiff, Westminster, Bristol.]


NGS Use

[Charts: usage statistics (total CPU hours across the four core nodes), growth in registered NGS users (~400 users, rising roughly linearly from January 2004 to December 2005), CPU time and files stored per user, users by institution (most UK research universities plus some overseas sites) and users by discipline and research council (biology, environmental science, particle physics and astronomy, large facilities, medicine, engineering and physical sciences, sociology, humanities; BBSRC, CCLRC, EPSRC, NERC, PPARC, AHRC, MRC, ESRC).]

Supporting Services

• UK Grid Services
  – National Services
    • Authentication, authorisation, certificate management, VO registration, security, network monitoring, help desk and support centre.
  – NGS Services and interfaces
    • Job submission, simple registry, data transfer (see the sketch after this slide's bullets), data access and integration, resource brokering, monitoring and accounting, grid management services, workflow, notification, operations centre.
  – NGS core-node Services
    • CPU, (meta-)data storage, key software
  – Services coordinated with others (e.g. OMII, NeSC, EGEE, LCG):
    • Integration testing, compatibility and validation tests, user management, training

• Administration:
  – Policies and acceptable use
  – Service Level Agreements and Definitions
  – Coordinate deployment and operations
  – Operational security
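The data-transfer service mentioned above is commonly exercised with the GridFTP client shipped with Globus Toolkit 2. The following is a minimal sketch, not an official NGS procedure: it assumes the GT2 client tools are on the PATH and that a valid grid proxy already exists, and the hostname and paths are hypothetical placeholders rather than real NGS endpoints.

```python
import subprocess

# Hypothetical endpoints; real NGS hostnames and collection paths will differ.
SOURCE = "file:///home/alice/results.dat"
DEST = "gsiftp://data-node.example.ac.uk/ngsdata/alice/results.dat"


def transfer(src: str, dst: str) -> None:
    """Copy a file with the GT2 GridFTP client (globus-url-copy).

    Assumes a valid proxy certificate (see grid-proxy-init) and that the
    remote GridFTP server authorises the proxy's distinguished name.
    """
    subprocess.run(["globus-url-copy", src, dst], check=True)


if __name__ == "__main__":
    transfer(SOURCE, DEST)
```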

Applications

• Systems biology
• Econometric analysis
• Neutron scattering (example: La2-xSrxNiO4; H. Woo et al., Phys. Rev. B 72, 064437 (2005))
• Climate modelling

Membership options

Two levels of membership (for sharing resources):
1. Affiliates
   – run a compatible middleware stack and integrate support arrangements
   – adopt NGS security policies
   – all access to an affiliate's resources is up to the affiliate
     • except allowing the NGS to insert probes for monitoring purposes
2. Partners also
   – make "significant resources" available to NGS users
   – enforce NGS acceptable use policies
   – provide accounting information
   – define commitments through formal Service Level Descriptions
   – influence NGS direction through representation on the NGS Technical Board

Membership pipeline

September 2006 (not a complete list)
• Partners
  – GridPP sites, initially Imperial and Glasgow
  – Condor/Windows at Cardiff
  – Belfast e-Science Centre (service hosting, GridSAM, ...)
• Affiliates
  – NW-Grid/Manchester SGI Prism
  – SunGrid
• Data partners (early discussions)
  – MIMAS and EDINA
• Others in discussion

New partners

• Over the last year, several new full partners have joined the NGS: Bristol, Cardiff, Lancaster and Westminster.
  – Further details of their resources can be found on the NGS web site: www.ngs.ac.uk

• Resources committed to the NGS for a period of at least 12 months.

• These new services introduce heterogeneity into the NGS's pool of resources.

NGS Facilities

• Leeds and Oxford (core compute nodes)
  – 64 dual-CPU Intel 3.06 GHz nodes (1 MB cache); each node has 2 GB memory, 2 x 120 GB disk, Red Hat ES 3.0 and a Gigabit Myrinet connection; 2 TB data server.
• Manchester and Rutherford Appleton Laboratory (core data nodes)
  – 20 dual-CPU nodes (as above); 18 TB SAN.
• Bristol
  – Initially 20 2.3 GHz Athlon processors in 10 dual-CPU nodes.
• Cardiff
  – 1000 hrs/week on an SGI Origin system comprising 4 dual-CPU Origin 300 servers with a Myrinet interconnect.
• Lancaster
  – 8 Sun Blade 1000 execution nodes, each with dual UltraSPARC IIICu processors, connected via a Dell 1750 head node.
• Westminster
  – 32 Sun V60 compute nodes.
• HPCx
  – …
For more details: http://www.ngs.ac.uk/resources.html

NGS software

• Computation services based on Globus Toolkit version 2
  – Use compute nodes for sequential or parallel jobs, primarily from batch queues
  – Can run multiple jobs concurrently (be reasonable!)
• Data services:
  – Storage Resource Broker:
    • Primarily for file storage and access
    • Virtual filesystem with replicated files
  – "OGSA-DAI": Data Access and Integration
    • Primarily for grid-enabling databases (relational, XML)
  – NGS Oracle service
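As a concrete illustration of the GT2-based computation service, the sketch below wraps the standard GT2 client commands (grid-proxy-init, globus-job-submit, globus-job-status, globus-job-get-output) in Python. It is a minimal sketch under stated assumptions: the GT2 client tools are installed, a UK e-Science certificate is in place, and the gatekeeper hostname is a hypothetical placeholder, not a real NGS node.

```python
import subprocess
import time

# Hypothetical gatekeeper; consult the NGS web site for real node names.
GATEKEEPER = "compute-node.example.ac.uk"


def submit_and_wait(executable: str, *args: str) -> str:
    """Submit a batch job via GT2 GRAM and return its standard output."""
    # Create a short-lived proxy certificate from the user's X.509 credential
    # (prompts for the certificate passphrase).
    subprocess.run(["grid-proxy-init"], check=True)

    # globus-job-submit prints a job contact URL on stdout.
    contact = subprocess.run(
        ["globus-job-submit", GATEKEEPER, executable, *args],
        check=True, capture_output=True, text=True,
    ).stdout.strip()

    # Poll the GRAM job state until the job finishes or fails.
    while True:
        status = subprocess.run(
            ["globus-job-status", contact],
            check=True, capture_output=True, text=True,
        ).stdout.strip()
        if status in ("DONE", "FAILED"):
            break
        time.sleep(30)

    # Retrieve whatever the job wrote to standard output.
    return subprocess.run(
        ["globus-job-get-output", contact],
        check=True, capture_output=True, text=True,
    ).stdout


if __name__ == "__main__":
    print(submit_and_wait("/bin/hostname"))
```

The Storage Resource Broker is normally driven through its "Scommands" client. The sketch below wraps a simple put-and-list cycle in Python; it assumes the SRB client tools are installed and that an SRB account and ~/.srb configuration already exist, and the collection path shown is a hypothetical placeholder.

```python
import subprocess

# Hypothetical SRB collection; real NGS collection names will differ.
COLLECTION = "/ngs/home/alice.ngs"


def store_and_list(local_file: str) -> str:
    """Upload a file into an SRB collection and return a listing of it."""
    # Start an SRB session using the credentials in ~/.srb/.MdasEnv.
    subprocess.run(["Sinit"], check=True)
    try:
        # Copy the local file into the replicated virtual filesystem.
        subprocess.run(["Sput", local_file, COLLECTION], check=True)
        # List the collection to confirm the file arrived.
        return subprocess.run(
            ["Sls", COLLECTION], check=True, capture_output=True, text=True
        ).stdout
    finally:
        # Always close the session.
        subprocess.run(["Sexit"], check=True)


if __name__ == "__main__":
    print(store_and_list("results.dat"))
```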

Managing middleware evolution

• Important to coordinate and integrate this with deployment and operations work in EGEE, LCG and similar projects.

• Focus on deployment and operations, NOT development.

[Diagram: EGEE, other software sources and UK campus/other grids feed prototypes and specifications to the Engineering Task Force (deployment, testing, advice); software with proven capability and realistic deployment experience passes into NGS operations and 'gold' services; feedback and future requirements flow back to the sources.]

Gaining Access

Free (at point of use) access to core and partner NGS nodes:
1. Obtain a digital X.509 certificate
   – from the UK e-Science CA, or a recognized peer
2. Apply for access to the NGS

National HPC services (HPCx):
• Must apply separately to the research councils
• Digital certificate and conventional (username/password) access supported
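Access decisions hinge on the distinguished name (DN) carried by the X.509 certificate obtained in step 1. As a small, hedged example of inspecting that DN before registering, the snippet below shells out to OpenSSL; it assumes OpenSSL is installed and that the certificate sits at the conventional Globus location ~/.globus/usercert.pem, which may differ on your system.

```python
import subprocess
from pathlib import Path

# Conventional Globus location for the user certificate; adjust if yours differs.
CERT = Path.home() / ".globus" / "usercert.pem"


def subject_dn(cert_path: Path) -> str:
    """Return the subject DN of a PEM-encoded X.509 certificate via OpenSSL."""
    out = subprocess.run(
        ["openssl", "x509", "-noout", "-subject", "-in", str(cert_path)],
        check=True, capture_output=True, text=True,
    ).stdout.strip()
    # Output looks roughly like "subject= /C=UK/O=eScience/OU=.../CN=...".
    return out.split("subject=", 1)[1].strip()


if __name__ == "__main__":
    print(subject_dn(CERT))
```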

Key facts

• Production: deploying middleware after selection and testing
  – major developments via the Engineering Task Force
• Evolving:
  – Middleware
  – Number of sites
  – Organisation:
    • VO management
    • Policy negotiation: sites, VOs
    • International commitment
  – Gathering users' requirements

Web Sites

• NGS
  – http://www.ngs.ac.uk
  – To see what's happening: http://ganglia.ngs.rl.ac.uk/
  – New wiki service: http://wiki.ngs.ac.uk
  – Training events: http://www.nesc.ac.uk/training

• HPCx
  – http://www.hpcx.ac.uk

Summary

• NGS is a production service
  – Therefore it cannot include the latest research prototypes!
  – Formalised commitments: service level agreements

• Core sites provide computation and data services

• NGS is evolving
  – OMII, EGEE and the Globus Alliance all have middleware under assessment for the NGS
    • Selected, deployed middleware currently provides "low-level" tools
  – New deployments will follow
  – New sites and resources are being added
