Innovative Services and Tools for NorduGrid (NGIn)

Innovative services and tools for NorduGrid (NGIn)
(rehearsal of talk at) Nordunet 2008, 11.04.08
Jon Kerr Nilsen, e-mail: [email protected]
University of Oslo, Department of Physics; PhD student, NGIn

The NorduGrid Collaboration
A research collaboration that:
- Develops and supports the ARC middleware
- Provides middleware to research groups and national Grids
- Coordinates the various inputs to the KnowARC code
Related: Nordic Grid Neighbourhood

ARC middleware is deployed on 50+ sites in 15+ countries all around the world: http://www.nordugrid.org/monitor

Nordic Data Grid Facility (NDGF)
- NDGF pilot project started in 2003; goal: create a Nordic Grid infrastructure
- Builds on the history and competence of the Nordic Grid collaboration; NorduGrid/ARC middleware chosen as basis
- Funded (2 M.EUR/year) by the National Research Councils of the Nordic countries (NOS-N)
- Coordinates and hosts major Grid projects (e.g. the Nordic LHC Tier-1)
- Operates a Nordic storage facility for major projects
- Develops Grid middleware and services
NDGF Facility - 2007Q3 (facility overview slide)

EU KnowARC project: 10 partners from 7 countries
NorduGrid members and research teams in medicine, bioinformatics, physics, engineering, automotive industry applications and IT:
- Univ. of Oslo, Norway
- Univ. of Lund, Sweden
- Univ. of Copenhagen, Denmark
- Univ. of Uppsala, Sweden
- NIIF, Hungary
- Geneva Univ. Hospitals, Switzerland
- Univ. of Kosice, Slovakia
- Univ. of Lübeck, Germany
- science+computing ag, Germany
- SUN Microsystems, Hungary

Goals of KnowARC
The mission of KnowARC (bringing together academia, industry, existing middleware developers and experienced application developers) is:
- to create a novel, powerful Next Generation Grid middleware: extend and re-design ARC; ARC in major Linux distributions; new platforms: Windows, Solaris, Mac OS X
- to promote Grid standardization and ARC interoperability: interoperability with other Grid solutions (EGEE and OGF compliant Grids); possibilities to include (or to be included in) other infrastructures
- to prove usage in health care, industry and science: bioinformatics (autoimmune diseases), medical informatics (lung diseases), automotive industry

Nordic Grid Neighbourhood (NGN): 9 countries, 24 institutions
- DENMARK: University of Copenhagen (NBI, DIKU); Southern Danish University, Dep. of Computer Science; Aalborg University, Dep. of Computer Science; ...
- RUSSIA: St. Petersburg State University, Dep. of Physics; Petrodvorets Telecommunication Center; Dubna?
- SWEDEN: Lund University, Department of Physics; Linköping University, National Supercomputer Center; Umeå University, Department of Computing Science and HPC2N; Uppsala University, Department of Radiation Sciences; Royal Institute of Technology, NADA/PDC, KTH; Stockholm University and KTH, Department of Physics
- ESTONIA: National Institute of Chemical Physics and Biophysics (NICPB); Estonian Educational and Research Network (EENet); Tartu University, Institute of Technology
- FINLAND: University of Helsinki, Helsinki Institute of Physics
- LITHUANIA: Vilnius University, Institute of Materials Science and Applied Research
- NORWAY: University of Oslo, Physics institute; University of Bergen, Department of Physics; University of Tromsø, Department of Computer Science; NTNU, Trondheim, Department of Computer and Information Science; NTNU, Trondheim, IT division (ITEA)
- LATVIA; ICELAND; ...
From the approval of the NGN final project report (Nabo 05-06, Nordic Grid Neighbourhood, project no. 172724/V11): "We have read and approved your final academic and financial project report for the funding period 2005-2006. We are pleased that most of the planned goals of the network have been reached through, i.a., workshops, conferences and tutorials. We are impressed by the number of partners and participants in the various activities."
NGIn for Research and Education
- Innovative services and tools for NorduGrid. PI: Farid Ould-Saada, University of Oslo
- Nordunet3-funded Internet Research Program, 4-year grant
- Partners: Oslo, Lund, Uppsala, Copenhagen, Helsinki, Reykjavik
- Mainly: data management (storage, indexing and movement); education and expert training in Grid computing
- 5 PhD grants starting January 2007
- Grid school
- Call for collaboration with the Baltic states and Russia ... ??
- http://www.nordugrid.org/ngin, http://www.nordunet3.org/

NGIn management
- Project leader: Prof. Farid Ould-Saada, Univ. of Oslo, Norway
- Technical coordinator: Prof. Alex Read, Univ. of Oslo, Norway
- Steering Group: Prof. Farid Ould-Saada, Univ. of Oslo, Norway; Prof. Paula Eerola, Univ. of Lund, Sweden; Prof. Tord Ekelöf, Univ. of Uppsala, Sweden; Dr. Miika Tuisku, Helsinki Institute of Physics, Finland; Dr. Anders Wäänänen, Niels Bohr Institute, Denmark
- Participating institutions: Physics Department, Univ. of Oslo, Norway; Physics Department, Univ. of Lund, Sweden; Physics Department, Univ. of Uppsala, Sweden; Department of Radiation Sciences, Univ. of Uppsala, Sweden; Niels Bohr Institute, Univ. of Copenhagen and Univ. of Aalborg, Denmark; Helsinki Institute of Physics (HIP), Univ. of Helsinki, Finland; CSC, Finnish IT Centre for Science, Helsinki, Finland; Univ. of Iceland, Reykjavik, Iceland

NGIn - training new Grid experts
From the original application: "The goal of this project is twofold: to extend the existing middleware and to train new Grid experts..."
NGIn obtained 30% of the funding applied for, so the primary goal of the NGIn project is now to train new Grid experts. The training programme includes:
- Grid PhD students
- A Nordic Grid school (first held in Copenhagen in September 2007)
The NGIn activities will go in line with the other NorduGrid-related projects.
Budget 2008: PhD students 160 k€; researcher training 15 k€; other expenses 3 k€; total 178 k€

NGIn PhD students
Four PhD students hired, a fifth student soon to be hired. In chronological order:
- PhD 1: Jon Kerr Nilsen, Univ. of Oslo, Norway
- PhD 2: Mika Silander, Helsinki Institute of Physics, Finland
- PhD 3: Salman Toor, Univ. of Uppsala, Sweden
- PhD 4: Phillippe Gros, Univ. of Lund, Sweden
- PhD 5: to be announced soon, Univ. of Oslo, Norway
The students are funded 50% by Nordunet3 and 50% by local sources. All the PhD projects involve data management.

Jon Kerr Nilsen, Oslo
Thesis title: Data Management in Grid Middleware
- Address limitations and implement solutions in data management and data handling in the ARC middleware
- Use the solutions in ATLAS data analysis
- Work in collaboration with both NGIn and the Scientific Computing Group at Univ. of Oslo
Current status:
- Got familiar with ARC and data management (and Grid acronyms)
- Survey on data management in conjunction with Grid technology, presented at NorduGrid 2007
- Set up a test cluster with dCache and an ARC Storage Element (in collaboration with Salman Toor)
Near future:
- Storage in conjunction with virtualization: virtualization makes it possible to run Linux applications on Windows; the challenge is to get data out of the virtual machine and store them securely and efficiently
- Client side of the next-generation ARC storage solution: will work with KnowARC on implementing client-side tools for accessing data in the next-generation ARC
- Look at data management of local versus external dCache instances in the current ARC: is there any gain in preferring dCache instances in the same network as the computing resource?

Mika Silander, Helsinki
Thesis title: Fine-grained rights delegation in Grids
Problem: delegation is too coarse-grained in current middleware; users either delegate all rights (impersonation) or use limited proxy certificates (which merely exclude shell access). Delegating the least set of rights is desirable (a small illustrative sketch follows at the end of this section).
Work so far:
- Analysing ARC components for the types of operations and objects needed for rights modelling
- Analysing SAML with a view to extending it to support fine-grained rights expressions
- Evaluation of existing solutions with applicable features, e.g. SecPAL
Potential solution candidates:
- SAML + XACML. Pros: other Grid middlewares are committed to these standards. Cons: complexity in policy definition and evaluation; fine-grained rights are not modelled.
- Security Policy Assertion Language (SecPAL). Pros: a broad set of security functionality including rights restrictions; policies are expressed in an easily understandable syntax. Cons: restrictive license and implementation languages.
Near-future plans:
- Explore the possibility of combining the pros of the two candidates for ARC: an easily understandable syntax similar to SecPAL, but an implementation relying on C/C++ and adhering to pure SAML & XACML
- Ensure this is compatible with the security architecture of ARC v1.x being developed
- Evaluate the SAML + XACML solution of gLite: is it applicable, and are parts of it reusable?
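To make the difference between full impersonation and least-rights delegation concrete, here is a minimal, purely illustrative Python sketch. It is not ARC code and not SAML/XACML or SecPAL syntax; the class, the identities and the URLs are all hypothetical. A delegated credential carries an explicit list of (operation, resource-pattern) rights, and the authorization check grants only requests covered by that list.

```python
from dataclasses import dataclass, field
from fnmatch import fnmatch

@dataclass
class DelegatedCredential:
    """Hypothetical credential: the delegator grants only the rights listed."""
    issuer: str    # identity of the delegating user
    subject: str   # service or job acting on the user's behalf
    rights: list = field(default_factory=list)  # (operation, resource pattern) pairs

    def permits(self, operation: str, resource: str) -> bool:
        """Grant the request only if some delegated right covers it."""
        return any((op == "*" or op == operation) and fnmatch(resource, pattern)
                   for op, pattern in self.rights)

# Full impersonation: every operation on every resource (the coarse-grained case).
impersonation = DelegatedCredential("C=NO/O=UiO/CN=Alice", "job-42",
                                    rights=[("*", "*")])

# Least-rights delegation: the job may only read its input and write its own output.
least_rights = DelegatedCredential(
    "C=NO/O=UiO/CN=Alice", "job-42",
    rights=[("read",  "srm://se.example.org/atlas/input/*"),
            ("write", "srm://se.example.org/atlas/output/job-42/*")])

print(impersonation.permits("delete", "srm://se.example.org/atlas/input/events.root"))  # True
print(least_rights.permits("read",   "srm://se.example.org/atlas/input/events.root"))   # True
print(least_rights.permits("delete", "srm://se.example.org/atlas/input/events.root"))   # False
```

In the planned approach the rights would instead be expressed in a SecPAL-like syntax with an implementation adhering to SAML and XACML; the sketch only illustrates the least-privilege idea that such policies would encode.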
Salman Toor, Uppsala
Thesis title: Data indexing and data movement in Grid environments
Topic: management of scientific data in Grid systems
- Making a Grid Storage Resource Manager: a layer on top of various storage managers (dCache, DPM, Castor, GPFS, ...)
- A complete cost-based estimation system to find the best storage manager amongst those registered (a toy sketch of the idea follows below)
Progress:
- Deployed a dCache test site in Uppsala (with a pool node in Oslo)
- Started studying the SRM specifications in conjunction with different storage managers
- Deployed the full Globus Toolkit and implemented test services and clients
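The talk does not spell out how the cost-based estimation works, so the following is only a hedged Python sketch of the general idea; the class, the metrics, the weights and the host names are hypothetical, not the actual Uppsala implementation. Each registered storage manager advertises a few metrics, a weighted cost is computed per candidate, and the cheapest one that can hold the file is chosen. The same kind of cost term also captures the question raised above about preferring dCache instances on the same network as the computing resource.

```python
from dataclasses import dataclass

@dataclass
class StorageManager:
    """Hypothetical record for a registered storage manager (dCache, DPM, Castor, ...)."""
    name: str
    free_space_gb: float   # remaining capacity
    load: float            # 0.0 (idle) .. 1.0 (saturated)
    rtt_ms: float          # network round-trip time from the computing resource
    same_network: bool     # co-located with the compute cluster?

def transfer_cost(sm: StorageManager, file_size_gb: float) -> float:
    """Toy cost model: infinite cost if the file does not fit, otherwise a
    weighted sum of load and latency, discounted for local instances."""
    if sm.free_space_gb < file_size_gb:
        return float("inf")
    cost = 10.0 * sm.load + 0.5 * sm.rtt_ms
    if sm.same_network:
        cost *= 0.5  # prefer storage on the same network as the compute resource
    return cost

def pick_storage_manager(registered, file_size_gb):
    """Return the registered storage manager with the lowest estimated cost."""
    return min(registered, key=lambda sm: transfer_cost(sm, file_size_gb))

registered = [
    StorageManager("dcache.uppsala.example.org", free_space_gb=500, load=0.7, rtt_ms=1.0,  same_network=True),
    StorageManager("dcache.oslo.example.org",    free_space_gb=800, load=0.2, rtt_ms=15.0, same_network=False),
    StorageManager("dpm.remote.example.org",     free_space_gb=50,  load=0.1, rtt_ms=40.0, same_network=False),
]
best = pick_storage_manager(registered, file_size_gb=120.0)
print(best.name)  # with these toy numbers, the local dCache wins (cost 3.75 vs 9.5)
```

In a real system the cost could fold in SRM-reported pool state or monitoring data; the point is simply that reducing each back-end to one comparable number lets the resource manager rank heterogeneous storage systems.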
Recommended publications
  • Usage Statistics and Usage Patterns on the NorduGrid: Analyzing the Logging Information Collected on One of the Largest Production Grids of the World
    CHEP04 Contribution ID: 102 Type: poster Usage statistics and usage patterns on the NorduGrid: Analyzing the logging information collected on one of the largest production Grids of the world. Wednesday, 29 September 2004 10:00 (0 minutes) The Nordic Grid facility (NorduGrid) came into production operation during the summer of 2002 when the Scandinavian ATLAS HEP group started to use the Grid for the ATLAS Data Challenges and was thus the first Grid ever contributing to an ATLAS production. Since then, the Grid facility has been in continuous 24/7 operation, offering an increasing number of resources to a growing set of active users coming from various scientific areas including chemistry, biology and informatics. As of today the Grid has grown into one of the largest production Grids of the world, continuously running Grid jobs on more than 30 Grid-connected sites which offer over 2000 CPUs. This article will start with a short overview of the design and implementation of the Advanced Resource Connector (ARC), the NorduGrid middleware, which delivers reliable Grid services to the NorduGrid production facility. This will be followed by a presentation of the logging facility of NorduGrid, describing the logging service and the collected information. The main part of the talk will focus on the analysis of the collected logging information: usage statistics and usage patterns (what does a typical Grid job on the NorduGrid look like?). Use cases from different application domains will also be discussed. References: - NorduGrid live: www.nordugrid.org -> Grid Monitor - ATLAS Data-Challenge 1 on NorduGrid: http://arxiv.org/abs/physics/0306013 Primary authors: KONSTANTINOV, A.
  • Performance of the NorduGrid ARC and the Dulcinea Executor in ATLAS Data Challenge 2
    Performance of the NorduGrid ARC and the Dulcinea Executor in ATLAS Data Challenge 2. Sturrock, R.; Eerola, Paula; Konya, Balazs; Smirnova, Oxana; Lindemann, Jonas; et al. Published in: CERN-2005-002, 2005. Citation for published version (APA): Sturrock, R., Eerola, P., Konya, B., Smirnova, O., Lindemann, J., et al. (2005). Performance of the NorduGrid ARC and the Dulcinea Executor in ATLAS Data Challenge 2. In A. Aimar, J. Harvey, & N. Knoors (Eds.), CERN-2005-002 (Vol. 2, pp. 1095-1098). CERN. Total number of authors: 6.
  • Status Report
    Nordic Testbed for Wide Area Computing and Data Handling, 9/01/01. NORDIC TESTBED FOR WIDE AREA COMPUTING AND DATA HANDLING (NORDUGRID) Status Report. Introduction: The purpose of the project is to create a Grid computing infrastructure in the Nordic countries. Project participants include universities and research centers in Denmark, Sweden, Finland and Norway. The active phase of the project started in May 2001, and involves the Niels Bohr Institute (Denmark), Lund and Uppsala Universities (Sweden), the University of Oslo (Norway) and the Helsinki Institute of Physics (Finland). From the very beginning, the NorduGrid testbed became an integral part of the EU DataGrid project, initiated at CERN and aimed at creating the computing infrastructure for the future high-energy physics experiments. This report overviews the status of the NorduGrid project as of September 1, 2001. 1. Organization: Due to the considerable geographical spread of the project member sites, the organization and management rely heavily on the Internet and teleconferences. The project Web site (http://www.quark.lu.se/grid) opened in December 2000, and is regularly updated with the most detailed information. 1.1. Responsibilities. 1.1.1. Steering Panel: Prior to the project's start, the steering panel was appointed to provide adequate management. It consists of the coordinator, John Renner Hansen, representing the Niels Bohr Institute, Denmark; Tord Ekelöf, Paula Eerola and Sten Hellman, representing Uppsala, Lund and Stockholm Universities of Sweden; Farid Ould-Saada from the University of Oslo, Norway; and Matti Heikkurinen, representing the Helsinki Institute of Physics, Finland. The group holds regular phone meetings. 1.1.2. New positions: Three new postdoctoral positions, funded by the project, were created.
  • The NorduGrid Architecture and Middleware for Scientific Applications
    The NorduGrid Architecture And Middleware for Scientific Applications. O. Smirnova(1), P. Eerola(1), T. Ekelöf(2), M. Ellert(2), J. R. Hansen(3), A. Konstantinov(4), B. Kónya(1), J. L. Nielsen(3), F. Ould-Saada(5), and A. Wäänänen(3). (1) Particle Physics, Institute of Physics, Lund University, Box 118, 22100 Lund, Sweden; (2) Department of Radiation Sciences, Uppsala University, Box 535, 75121 Uppsala, Sweden; (3) Niels Bohr Institutet for Astronomi, Fysik og Geofysik, Blegdamsvej 17, DK-2100, Copenhagen Ø, Denmark; (4) Vilnius University, Institute of Material Science and Applied Research, Saulėtekio al. 9, Vilnius 2040, Lithuania; (5) University of Oslo, Department of Physics, P. O. Box 1048, Blindern, 0316 Oslo, Norway. Abstract: The NorduGrid project operates a production Grid infrastructure in Scandinavia and Finland using its own innovative middleware solutions. The resources range from small test clusters at academic institutions to large farms at several supercomputer centers, and are used for various scientific applications. This talk reviews the architecture and describes the Grid services, implemented via the NorduGrid middleware. 1 Introduction: The NorduGrid project [1] started in May 2001, initiated by academic institutes in Scandinavia and Finland, aiming to build a Nordic testbed for wide area computing and data handling. After evaluating the existing Grid solutions (such as the Globus Toolkit [2] and the EU DataGrid (EDG) [3]), the NorduGrid developers came up with an original architecture and implementation proposal [4, 5]. The NorduGrid testbed was set up accordingly in May 2002, and has been in continuous operation and development since August 2002. By now it has grown into one of the largest operational Grids in the world, including not only test sites, but also several large production clusters.
  • NorduGrid ARC Tutorial
    ARC Tutorial. Aleksandr Konstantinov, Vilnius University/Lithuania and University of Oslo/Norway. Contents: What is Grid? (What changes? Common misunderstandings); Applications; Middleware (Legion, Unicore, Globus); Grid users (Authentication, Authorization); NorduGrid; ARC (How it works, Job's definition, Preparation, Simple job, Sophisticated job). What is Grid? A relatively new term whose definition is fuzzy: many users, many definitions; common misunderstandings; standardization has just started (http://www.globalgridforum.org/). An old idea: uniform and safe access to geographically remote and inhomogeneous computing resources; a dynamic pool of users and resources; distributed management; resources belonging to different institutions are linked into a system; collaboration and social networking are as important as the technology. The Grid environment is usually formed as a layer over the operating system installed on the participating resources and other services; the term used is "middleware". (I. Foster and C. Kesselman, The Grid: Blueprint for a New Computing Infrastructure. Morgan Kaufmann Publishers, 1998.) What changes? The physical location of computing and data resources is not important; unification is: usernames and passwords are not used, and resources are unified through virtualization or unified interfaces. There are still a lot of unsolved problems, standardization has only just started, maybe virtual computers can help, and it is an open question what should adapt: the user's application or the Grid environment. Common misunderstandings: "Grid increases resources" - one computing resource won't be able to compute more if put on a Grid, but resources can be used more effectively. "Grid is a black box and everything inside it happens automatically" - so far this is only in future plans; in reality users still have to learn about the peculiarities of the implementation.
  • The Advanced Resource Connector User Guide
    NORDUGRID, NORDUGRID-MANUAL-6, 7/12/2010. The NorduGrid/ARC User Guide: Advanced Resource Connector (ARC) usage manual. Contents: 1 Preface; 2 Roadmap; 3 Client Installation (3.1 Beginner Installation: 3.1.1 Download, 3.1.2 Unpack, 3.1.3 Configure; 3.2 System-wide installation: 3.2.1 Installation via repositories, 3.2.2 Globus Toolkit, 3.2.3 Download the client, 3.2.4 Build, 3.2.5 Install the client; 3.3 Client configuration files: 3.3.1 client.conf, 3.3.2 srms.conf, 3.3.3 Deprecated configuration files); 4 Grid Certificates (4.1 Quick start with certificates: 4.1.1 Certificates in local standalone client, 4.1.2 Certificates for system-wide client installation, 4.1.3 Obtain a personal certificate; 4.2 Grid Authentication And Certificate Authorities: 4.2.1 Grid Login, Proxies, 4.2.2 Certificate Authorities, 4.2.3 Globus Grid certificates, 4.2.4 Associating yourself with a proper CA, 4.2.5 Friendly CAs, 4.2.6 Certificate Request, 4.2.7 Working with certificates: examples); 5 Getting Access to Grid Resources: Virtual Organisations (5.1 NorduGrid Virtual Organisation, 5.2 Other Virtual Organisations); 6 Grid Session (6.1 Logging Into The Grid, 6.1.1 Proxy Handling Tips, 6.2 First Grid test, 6.3 Logging Out ...)
  • PoS(ACAT08)046
    The Advanced Resource Connector for Distributed LHC Computing. PoS(ACAT08)046. David Cameron∗ab, Aleksandr Konstantinova, Farid Ould-Saadaa, Katarina Pajchela, Alexander Reada, Bjørn Samseta, Adrian Tagaa†. a Department of Physics, University of Oslo, P.b. 1048 Blindern, N-0316 Oslo, Norway; b Nordic Data Grid Facility, Kastruplundgade 22, DK-2770 Kastrup, Denmark. E-mail: [email protected] The NorduGrid collaboration and its middleware product, ARC (the Advanced Resource Connector), span institutions in Scandinavia and several other countries in Europe and the rest of the world. The innovative nature of the ARC design and flexible, lightweight distribution make it an ideal choice to connect heterogeneous distributed resources for use by HEP and non-HEP applications alike. ARC has been used by scientific projects for many years and through experience it has been hardened and refined to a reliable, efficient software product. In this paper we present the architecture and design of ARC and show how ARC's simplicity eases application integration and facilitates taking advantage of distributed resources. Example applications are shown along with some results from one particular application, simulation production and analysis for the ATLAS experiment, as an illustration of ARC's common usage today. These results demonstrate ARC's ability to handle significant fractions of the computing needs of the LHC experiments today and well into the future. XII Advanced Computing and Analysis Techniques in Physics Research, November 3-7, 2008, Erice, Italy. ∗ Speaker. † This work was partly supported by the Nordic Data Grid Facility and EU KnowARC (contract nr. 032691). © Copyright owned by the author(s) under the terms of the Creative Commons Attribution-NonCommercial-ShareAlike Licence.
  • The NorduGrid: Building a Production Grid in Scandinavia
    ATL-SOFT-2003-002, 29 March 2003. The NorduGrid: Building a Production Grid in Scandinavia. P. Eerola∗, T. Ekelöf†, M. Ellert†, J. R. Hansen‡, A. Konstantinov§, B. Kónya∗, J. L. Nielsen‡, F. Ould-Saada§, O. Smirnova∗, A. Wäänänen‡. ∗ Particle Physics, Institute of Physics, Lund University, Box 118, 22100 Lund, Sweden; † Department of Radiation Sciences, Uppsala University, Box 535, 75121 Uppsala, Sweden; ‡ Niels Bohr Institutet for Astronomi, Fysik og Geofysik, Blegdamsvej 17, DK-2100, Copenhagen Ø, Denmark; § University of Oslo, Department of Physics, P. O. Box 1048, Blindern, 0316 Oslo, Norway. Abstract: The aim of the NorduGrid project is to build and operate a production Grid infrastructure in Scandinavia and Finland. The innovative middleware solutions allowed setting up a working testbed, which connects a dynamic set of computing resources by providing uniform access to them. The resources range from small test clusters at academic institutions to large farms at several supercomputer centers. This paper reviews the architecture and describes the Grid services, implemented via the NorduGrid middleware. As an example of a demanding application, a real use case of a High Energy Physics study is described. A. Hardware resources: The NorduGrid environment consists of five core sites, mostly dedicated to development and testing of the middleware, and several production clusters, scattered across the Nordic countries. The core sites run small test clusters (a few CPUs) and comprised the initial testbed, put up in May 2002. Since this testbed was shown to be stable, reliable and capable of executing production-level tasks, more sites have joined with large production clusters.
  • Performance of the NorduGrid ARC and the Dulcinea Executor in ATLAS Data Challenge 2
    Performance of the NorduGrid ARC and the Dulcinea Executor in ATLAS Data Challenge 2. Robert Sturrock, University of Melbourne, Australia; Henrik Jensen, Josva Kleist, Aalborg University, Denmark; John Renner Hansen, Jakob Langgaard Nielsen, Anders Wäänänen, NBI, Copenhagen, Denmark; Daniel Kalici, University of Southern Denmark, Odense, Denmark; Arto Teräs, CSC, Helsinki, Finland; Helmut Heller, Leibniz-Rechenzentrum, Munich, Germany; John Kennedy, Günter Duckeck, Ludwig-Maximilians-Universität, Munich, Germany; Aleksandr Konstantinov, University of Oslo, Norway and Vilnius University, Lithuania; Jan-Frode Myklebust, University of Bergen, Norway; Farid Ould-Saada, Katarzyna Pajchel, Alex Read, Haakon Riiser, University of Oslo, Norway; Morten Hanshaugen, Sturle Sunde, USIT, University of Oslo, Norway; Andrej Filipčič, Matevž Tadel, Jožef Stefan Institute, Ljubljana, Slovenia; Leif Nixon, NSC, Linköping University, Sweden; Paula Eerola, Balázs Kónya, Oxana Smirnova, Lund University, Sweden; Jonas Lindemann, LUNARC, Lund University, Sweden; Lars Malinowsky, Nils Smeds, PDC, KTH, Stockholm, Sweden; Åke Sandgren, Mattias Wadenstein, HPC2N, Umeå University, Sweden; Tord Ekelöf, Uppsala University, Sweden; Christian Häberli, University of Bern, Switzerland; Mattias Ellert∗, CERN, Geneva, Switzerland. Abstract: This talk describes the various stages of ATLAS Data Challenge 2 (DC2) in what concerns usage of resources deployed via NorduGrid's Advanced Resource Connector (ARC). It also describes the integration of these resources with the ATLAS production system using the Dulcinea executor. [...] dedicated resources across the world in order to enable effective participation in ATLAS DC2. This was the first attempt to harness large amounts of strongly heterogeneous resources in various countries for a single collaborative exercise using Grid tools. This talk addresses various issues that arose during different stages of DC2 in this environment: preparation, such as ATLAS software installation;
  • NorduGrid and Related Projects - There Have Been 3 New Projects That Started This Summer
    NorduGrid and related Projects: Overview for IHEPCCC. Farid Ould-Saada, University of Oslo. NorduGrid and ATLAS: a research collaboration that develops and supports the Advanced Resource Connector middleware and coordinates the various inputs to the ARC code; provides middleware to research groups and national Grid projects: ARC on 50 sites in 14 countries. NorduGrid: yesterday, today, tomorrow: from EDG to ARC; from 4 Nordic countries to >14 countries; from a testbed to >50 sites; from 20 CPUs to >5000 CPUs; from HEP to research + bio, chemistry, ...; from a research project (2001) to a collaboration (>2003); from a Grid testbed to a major middleware provider (May 01, Oct 02, Apr 03, 06/06, ...). It is NOT an infrastructure: it does not operate or control resources. Related bodies and projects shown: the Nordic Council of Ministers, the Natural Science Research Councils of the Nordic countries (NOS-N), the collaboration of Nordic HEP institutes and others, NDGF, the NGN network (Nordunet 2 programme) -> Nordunet3; now several related projects, e.g. KnowARC. The big picture around NorduGrid (diagram): research projects, networks (N3, NGN), partner institutes and individual coders around the ARC middleware, together with KnowARC and NDGF. Why further develop ARC? Lightweight standalone client package, easy to install and use; reliable resource for scientific applications in many research fields; available on a wide range of Linux platforms; non-centralised architecture; needs no centralized operations infrastructure; non-intrusive, coexists with other software and configurations (see "Top 10 Reasons to Use ARC" for more details) -> suits the needs of heterogeneous distributed shared resources and is in many aspects interoperable with other middlewares. HEP vs non-HEP: example of applications run simultaneously on NorduGrid. Although HEP-initiated, non-HEP applications dominate. Applications "gridified" with NorduGrid include various non-HEP examples: biomedical sciences (e.g.
  • Grid Technology
    GRID TECHNOLOGY. Grid Technology: Research Assistant Dr Oxana Smirnova sings the praises of the distributed supercomputer… It was only a few years ago when the tendency of deploying bigger and bigger supercomputers for scientific research and industry started to give way to a new paradigm, the Grid. The idea of joining the existing variety of computing resources by using a layer of dedicated tools, the Grid middleware, became particularly appealing for large multinational collaborations, eager to merge their distributed computing power into a single, well managed facility. Applications from such areas as aeronautics, space sciences, interactive visualisation and teleimmersion were the first to try the emerging Globus Toolkit (www.globus.org), which became a de facto standard for the Grid middleware. In 2001, the huge community of high energy physics, famous for inventing the World Wide Web, made a bold step, committing resources to the EU-sponsored DataGrid project [www.eu-datagrid.org], aiming at developing a solution satisfying their needs of high-throughput, data-intensive distributed computing, to provide the researchers with a reliable, distributed facility to be used in their daily work. However, during the evaluation of existing and developing tools, it became clear that none satisfies these requirements. Meanwhile, the needs of the ATLAS collaboration to start computing simulation of the physics processes and the detector performance were pressing, and by no means ... For us, the high energy physicists in the Nordic countries, the
  • NorduGrid: Collaboration Status Update
    NorduGrid: Collaboration status update Oxana Smirnova (Lund, NDGF/KnowARC/NGIn/NorduGrid) OSG Consortium All Hands Meeting March 6, 2007 Outline . ARC middleware status . Overview of NorduGrid-spawned projects – EU KnowARC, NDGF . Interoperation and standardization 3/6/2007 www.nordugrid.org 2 Glossary . NorduGrid: a research collaboration – Based on MoU – Supports ARC m/w development via different projects . ARC: Advanced Resource Connector, the middleware – GPL software, coordinated by NorduGrid . KnowARC: an EU FP6 R&D project (STREP) – Core contributor to ARC development . NDGF: The Nordic DataGrid Facility, a Nordic infrastructure and R&D project – Builds the Nordic Tier-1 – Contributes to ARC development . NGIn: Innovative Services and Tools for NorduGrid, an educational and R&D project by Nordunet3 . NGN: Nordic Grid Neighborhood, a networking project – ARC community in Nordic countries, Baltic states and Russia 3/6/2007 www.nordugrid.org 3 December 2006: ARC 0.6 is out . A major milestone: first stable release since 2004 . New external software, new packaging and distribution – Globus 4 based (pre-WS components, e.g. GSI, gridftp libraries, LDAP backends) – Latest VOMS, GSOAP – Natively works on 64bit architectures – Streamlined configuration – Software repositories (e.g. apt, yum) . Major re-write of the client code base – Modularity – Multithreaded implementation – Extensibility . Client libraries, modules available for C++ (native), Python, Perl etc – via SWIG, and Java – Easy to use API for clients that need to interface to ARC, e.g. GUIs, portals, CLIs . Many new attributes in the information system – Better monitoring and decision-making – Improved job description 3/6/2007 www.nordugrid.org 4 More new ARC 0.6 features .