Integration of national and European e-Infrastructure
Norbert Meyer
On behalf of PIONIER Consortium
Grids and e-Science workshop, June 17th, 2009, Santander

15 years of Polish e-Infrastructure for Science
• 1993 – starting academic MANs (FDDI)
• 1995 – MANs' transition to ATM
• 1997 – POL-34, 155, 622
• 2000 – PIONIER take-off
• 2001 – dark fiber deployment
• 2003 – 10GE
• 2004 – multi-lambda
• 2006 – PIONIER2 strategy
• 2008 – PLATON, PL-GRID

21 ACADEMIC METROPOLITAN AREA NETWORKS
• Area: 320k sq km
• Population: 38M
• Main academic centers: 21
• State universities: 120+
• Students: 2M+
• R&D institutions and universities interconnected via the PIONIER network: 700+
[Map: 21 academic MANs; sites marked as MAN or MAN & HPC Center]

What PIONIER is all about
• developing a country-wide optical infrastructure based on the academic community ownership model
• setting up national optical networks interconnecting, on separate ASes: 21 MANs and 5 HPCCs
• extending the optical reach to expensive and unique national labs
• Grid technology as the main tool for integrating distributed R&D resources
• a portal as the main service delivery platform
• development of broadband IP-based services
• enabling and stimulating international co-operation
This was not as obvious in 1999 as it is today…

How PIONIER is organized
• PIONIER is a consortium of 22 academic MANs and HPCCs
• PIONIER is supervised by the PIONIER Board, consisting of 22 representatives
• PIONIER is managed by the PIONIER Executive, consisting of 4 people
• The PIONIER network is financed from member fees
• The member fee is based on a cost-sharing model
• Each year the PIONIER Board takes a decision on the framework and parameters of the cost-sharing model
• One member is selected to play the role of the PIONIER network operator (PSNC)

[Map: PIONIER network topology, 2Q2009; 2×10 Gb/s links among the 21 MANs (Warszawa, Kraków, Poznań, Gdańsk, Wrocław, Łódź, Katowice, Szczecin, and others) and cross-border fibers to Belarus, Ukraine, Lithuania, Kaliningrad, and CESNET/SANET; legend: CBDF 10 Gb/s (2λ), SDH 2.5 Gb/s, ETH 1 Gb/s]
[Map: PIONIER international connectivity, including GÉANT2 at 10+10 Gb/s]

PIONIER infrastructure is an integral part of the ERA Network Infrastructure
Phosphorus
Porta Optica
FEDERICA
GÉANT/ GÉANT2/ GÉANT3
EMANICS
MUPBED
6NET
SEQUIN
Atrium
PIONIER Cross Border Fiber development directions

Grid Infrastructure
DORII
Chemomentum
RINgrid
GridLab
CrossGrid
EUFORIA
PRACE
HPC Europa I/II
EGEE I/II/III
BalticGrid
int.eu.grid
InteliGrid
OMII_Europe
PROGRESS, g-Eclipse
SGI GRID, CLUSTERIX, QosCos Grid
ACGT
KMD, PL-GRID
ViroLab
PLATON
EuroGrid
(FP5, FP6, FP7 projects)

Scientific Data Layer Infrastructure
[Diagram: sustained, managed, protected, trusted repositories; infrastructure concern for quality, discoverable and selected contents, organisational context]
EuropeanaLocal
ENRICH
IMPACT, NMDB, METAFOR, CACAO
EuroVO-AIDA
DRIVER / DRIVER2
Digital Libraries Federation
GENESI-DR
Virtual Library of Science

PIONIER2
[Layered model, bottom-up: Networks → Grids, Security, Data, HPC → Common Services → Added Services]

Intelligent Infrastructure for e-Science: A Concept of Platform Services

Current national activities in Poland
PLATON – Platform of Integrated Services
PL-GRID – the Polish Grid
Correlated projects
KMD – National Data Storage R&D project
PKI – Public Key Infrastructure (preparatory study, certificate service)
VoIP – VoIP service

PL-Grid Foundations – Summary
Polish Infrastructure for Supporting Computational Science in the European Research Space
A response to the needs of Polish scientists and to ongoing Grid activities in Poland, in other European countries, and worldwide
• Motivation
– E-Science approach to research
– EGI initiative ongoing in collaboration with NGIs
• Creation of the Polish Grid (PL-Grid) Consortium: http://plgrid.pl
– Consortium Agreement signed in January 2007
• PL-Grid Project (2009–2012)
– Application in Operational Programme Innovative Economy, Activity 2.3 (Sept. 2008)
– Funded on March 2, 2009 (via European Structural Funds)
• Consortium made up of five Polish supercomputing and networking centres (founders)
– ACK CYFRONET AGH (Cracow) – Coordinator

PL-Grid Consortium Founders
• Academic Computer Center Cyfronet AGH (ACK CYFRONET AGH) – Coordinator
• Poznań Supercomputing and Networking Center (PCSS)
• Wrocław Centre for Networking and Supercomputing (WCSS)
• Academic Computer Center in Gdańsk (TASK)
• Interdisciplinary Centre for Mathematical and Computational Modelling, University of Warsaw (ICM)

PL-Grid Infrastructure
The Polish Grid is going to have a common base infrastructure, similar to solutions adopted in other countries. Specialized domain Grid systems – including services and tools focused on specific types of applications – will be built upon this infrastructure. These domain Grid systems can then be further developed and maintained in the framework of separate projects. Such an approach should enable efficient use of the available financial resources. The goal is the creation of a Grid infrastructure fully compatible and interoperable with European and worldwide Grids, thanks to cooperation with teams involved in the development of European Grid systems (EGEE, DEISA, OMII, C-OMEGA, ESFRI).

PL-Grid Architecture
[Architecture layers, top to bottom:]
• Users
• Grid Application Programming Interface; Grid portals, development tools
• Virtual organizations and security systems
• Other Grids (EGEE); LCG/gLite services; UNICORE systems
• Basic Grid services
• Distributed computational resources; distributed data repositories; national computer network

The PL-Grid Project is split into several workpackages:
• P1 – Project Management (structure, coordination, dissemination, operation rules)
• P2 – Planning and Development of Infrastructure
• P3 – Operations Center
• P4 – Grid Software and Users' Tools Development (gLite, UNICORE, …)
• P5 – Support for Various Domain Grids (training)
• P6 – Security Center
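The middleware named above (gLite, UNICORE) is driven by declarative job descriptions. As an illustrative sketch only – the file name and executable are made up, and submission details depend on the local WMS setup – a minimal gLite job description might look like:

```
# hello.jdl -- minimal gLite/EGEE job description (illustrative)
Executable    = "/bin/hostname";
Arguments     = "-f";
StdOutput     = "std.out";
StdError      = "std.err";
OutputSandbox = {"std.out", "std.err"};
```

Such a file would typically be submitted with `glite-wms-job-submit -a hello.jdl` and its output retrieved with `glite-wms-job-output`; UNICORE offers an analogous workflow through its own client tools.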
Main Project Indicators:
• Peak performance: 215 Tflops
• Disk storage: 2500 TB

Challenges
• Short term – to start
– Establishing the PL-Grid VO using Partners' local and EGEE resources
– Providing resources to cover operational costs
– Selecting computational/storage resources for the PL-Grid infrastructure
• Long term – continuously
– Providing the necessary infrastructure (!!): computer rooms, electrical power, many organizational issues
– Being prepared for, and working on, approaching paradigms and integration development: clouds (internal/external, computing clouds, data clouds), the SOA paradigm, knowledge usage, …

PLATON
Integrated services platform for science:
• U1 – HD videoconferencing services
• U2 – EDUROAM service
• U3 – Campus services
• U4 – Archive service
• U5 – Interactive research HDTV service
• U6 – VoIP service
• U7 – PKI – global certificates service
• Management platform

Archive service
• 12.5 PB of tape storage
• 1 PB of disk arrays

Campus service in the PLATON Project
• Global management layer (open-source)
• Local resource managers
• On-demand virtual machines
• Microsoft environment (Windows HPC Server 2008)
• Graphical applications
• Other jobs (batch, etc.)

Videoconferencing
[Diagram: portal with video resources reservation]

EDUROAM service
[Diagram: access point; Institution A's RADIUS server and users' database; Institution B's RADIUS server and users' database]
[Diagram: a guest authenticating as id@inst-b.pl is proxied through an intermediate RADIUS server to the home institution and placed on the Guests VLAN; separate VLANs exist for Staff, Students, and Guests]

Interactive research HDTV service
Production studios & camera teams
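The eduroam roaming flow above boils down to realm-based RADIUS proxying: a visitor's identity `id@inst-b.pl` is forwarded to the home institution for authentication. As a minimal FreeRADIUS-style sketch – host names and the shared secret are hypothetical, and real eduroam deployments route through national and European top-level proxies rather than directly – the rule might look like:

```
# proxy.conf -- forward authentication for visitors from inst-b.pl
# (illustrative FreeRADIUS-style configuration; names and secret are made up)
home_server inst_b_radius {
        type = auth
        ipaddr = radius.inst-b.pl
        port = 1812
        secret = example_shared_secret
}
home_server_pool inst_b_pool {
        type = fail-over
        home_server = inst_b_radius
}
realm inst-b.pl {
        auth_pool = inst_b_pool
}
# Requests whose User-Name ends in @inst-b.pl match the realm and are
# proxied to the user's home institution; local users fall through to
# local authentication, and the visited site only assigns the guest VLAN.
```

The design point is that the visited institution never sees the guest's credentials; it only learns the authentication result and places the user on the appropriate VLAN.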
PLATON timetable
• Take-off: 3Q2009
• EduROAM: 4Q2009
• Videoconferencing: 1Q2010
• Archive: 2Q2010
• Campus services: 3Q2010
• HDTV: 1Q2011
• Harmonization / integration: 2Q2012

Summary
• PIONIER integrates services on different levels, up to the application level (the proposition of domain platforms)
• PL-GRID and PLATON enable the deployment of national computing and data infrastructures
• … but e-Science requires additional services, such as videoconferencing and HDTV

Thank you for your kind attention.
May I answer your questions?