Inspired

ISSUE 11 APRIL 2013 news from the EGI community

TOP STORIES

Federated Cloud in theory & practice page 2

The EGI Platform Architecture page 3

Ten years of production activity page 5

The EGI-InSPIRE mini-projects page 7

MORE

01 The EGI Technical Forum in Madrid

04 The Community Software Repository

10 VT report: A new scientific classification for EGI

11 Partner spotlight: GÉANT

European Grid Infrastructure
www.egi.eu

This Issue

Welcome to Issue 11 of Inspired! What is new in EGI?

> Michel Drescher writes about the benefits of the EGI Federated Cloud for the WeNMR project
> Gergely Sipos explains how technology support will work from May 2013, after the end of EMI and IGE
> Kostas Koumantaros and Marios Chatziangelou write about the Community Software Repository
> Tiziana Ferrari and Małgorzata Krakowian look at the infrastructure's trends and performance
> Sy Holsinger introduces the new tree of scientific disciplines
Plus:
> What GÉANT is doing in Central Asia
> ...and an overview of the EGI-InSPIRE Mini-Projects

This issue's photo is a view of cumulus clouds over a corn field in Northampton County, Pennsylvania. (Illustration: Nicholas A. Tonelli / wikicommons)

Your thoughts and comments are always welcome! Sara Coelho, [email protected]

The EGI Technical Forum 2013 in Madrid
Providing an integrated sustainable Open Compute and Data Infrastructure for Open Science in the digital European Research Area

The EGI Technical Forum, hosted by EGI.eu and IBERGRID, will take place in Madrid, Spain from 16 to 20 September.

Over the last decade, EGI has established an open compute and data infrastructure built by federating national computing and storage resources across Europe and around the world. Through the support provided by national governments and the European Commission, the EGI Technical Forum in Madrid will reflect on the current achievements and the plans for the future around:
> Bringing leading research communities and researchers together to tackle societal challenges through the use of innovative technology
> Sustaining the existing infrastructure and bringing into production a federated infrastructure cloud that will enable the work of a diverse new generation of users
> Continuing to build and coordinate the community of experts that use, operate, manage, develop, support and provide outreach
> Refining the financial, technical and political governance of the EGI ecosystem so that it continues to thrive for the decades to come

The EGITF 2013 will be co-located with international workshops and meetings relating to the research challenges and technical interoperability needed to operate and exploit an open compute and data infrastructure. These include IBERGRID 2013, EU-Brazil OpenBIO, Open Grid Forum 39, GlobusEUROPE and the CloudPlugFest interoperability workshops.

More information
EGI Technical Forum website - http://tf2013.egi.eu

Inspired // Issue #11 April 2013 1

The EGI Federated Cloud, from theory to practice

Michel Drescher explains how the grid of clouds is helping the WeNMR project

Over the last year, EGI has been working to assemble a new type of infrastructure for the research community, built on top of clouds and virtualisation technologies.
If you are not a computer scientist, your first reaction might well be: "Yeah, so what?" So let me answer this with an example.

The WeNMR use case
Within WeNMR, structural biologists use computing resources to achieve two goals: (a) compute structural models of molecules, and (b) use nuclear magnetic resonance (NMR) data to validate protein simulation and modelling using advanced algorithms and techniques. This involves a number of applications that ingest raw NMR data to return a validated (or not) protein model. The applications are maintained by the CING community (Common Interface for NMR Structure Generation).
The mechanisms used to get all these applications deployed directly in the same e-infrastructure, shared with many other colleagues from other scientific domains, are complex. A key issue here is the difficulty to contain (mostly) unintentional misuse in such a shared infrastructure. Researchers aren't happy with this, and the infrastructure providers aren't really happy either.

The federated cloud solution
Together with EGI, the WeNMR project looked at how these issues could be addressed using cloud computing. Here is what they have done, and how it benefits their research.
The basic premise is the fact that NMR data analysis can be massively parallelised. The NMR database consists of thousands of different proteins, and for each protein many independent NMR data sets exist. So, while one data set is processed, a second one could be running at the same time, and that is also possible with many, many more in parallel.
The WeNMR project did exactly that: they packaged all necessary applications into one Virtual Machine (VM), and built a lightweight infrastructure around it so that many Virtual Servers could run from that one VM at maximum efficiency. The only required inputs are:
> Request a data set from the NMR database, and
> Upload the results to the NMR database.
This allows for a lightweight architecture, requiring only three key components:
1. The central NMR database
2. The VM image (the 'VirtualCING' image) at the cloud providers
3. A service that tells the Virtual Servers which dataset to process next (provided by SURFsara)
So here you have a virtual infrastructure deployed on a real, physical infrastructure; both are quite different yet neatly integrate with each other – that is the beauty of virtualisation!

The benefits for WeNMR
> Total control over the exact set of applications assembled into the Virtual Machine
> Flexible deployment and instantiation of the VirtualCING image across EGI
> Elastic resource consumption based on real need (think of a bubble that can expand and contract again as required)
> Finish the computational effort in less than 10% of the time compared to a conventional, local resource based approach.

EGI Federated Cloud Computing is available for you and your team's research needs.

More information
To learn more, or to try the EGI cloud resources, email: [email protected]
Or download the flyer at: http://go.egi.eu/fc-flyer
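The three-component architecture described above (a central database, one VM image, and a small dispatch service handing out datasets) boils down to a simple worker loop run identically by every Virtual Server. The sketch below is purely illustrative: all function and variable names are invented for this example and are not WeNMR's real interfaces.

```python
# Illustrative sketch of a WeNMR-style worker loop; names are invented
# for this example, not the project's real interfaces.
from queue import Queue

def make_dispatch_service(dataset_ids):
    """Stands in for the dispatch service that hands out the next dataset."""
    q = Queue()
    for ds in dataset_ids:
        q.put(ds)
    return q

def analyse(dataset_id):
    """Placeholder for the CING analysis run inside each VirtualCING server."""
    return f"validated:{dataset_id}"

def worker(dispatch, results):
    """Each Virtual Server runs the same loop: request, analyse, upload."""
    while not dispatch.empty():
        ds = dispatch.get()           # 1) request a data set
        results.append(analyse(ds))   # 2) process it inside the VM
                                      # 3) upload the result to the database

dispatch = make_dispatch_service(["protein-1", "protein-2", "protein-3"])
results = []
worker(dispatch, results)             # in production, many VMs run this loop in parallel
print(len(results))                   # → 3
```

Because every server boots from the same VirtualCING image and only pulls work from the dispatch service, adding capacity is just a matter of starting more instances.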

The EGI Platform Architecture

Gergely Sipos discusses how technology support will work from May 2013, after the end of the European Middleware Initiative and the Initiative for Globus in Europe

In April 2013, with the end of the EMI and IGE projects, the EGI community lost its two largest technology providers and contributors to the Unified Middleware Distribution. This is a milestone for EGI and for Europe: for the first time since 2001, we will have no dedicated project for grid middleware development.
But just because EMI and IGE are over, development is not finished. Most of the teams involved in EMI and in IGE are still active, fixing bugs for the communities who depend on their products. Some teams even implement new functionalities in their software, but the coordination and integration provided by EMI and IGE are gone.
For EGI, the answer is the EGI Platform Architecture - a new software integration and provisioning process.

A platform-based EGI
EGI started developing its platform architecture in early 2012 and refined it over the last 14 months. The platform-based EGI is capable of supporting a broad customer base with a very diverse set of requirements, and with a much smaller central coordination effort.
The platform-based EGI consists of:
(1) EGI Core Infrastructure Platform, to operate and manage a distributed infrastructure (e.g. accounting);
(2) EGI Cloud Infrastructure Platform, to operate a federated cloud based infrastructure;
(3) EGI Collaboration Platform, for information exchange and community coordination (e.g. AppDB), and
(4) Community Platforms, service portfolios customised for scientific communities.
The Platform Architecture allows any type and any number of community platforms to co-exist on the physical infrastructure. Some can provide a grid functionality – similar to EMI's and IGE's services. Others can include completely unique services that run, for example, in Virtual Machines.

A flexible system
EMI's and IGE's distributions included a large number of services that very few communities required in bulk. In practice, most user groups require a relatively simple, but variable set (e.g. a service for job execution and a file storage service, plus some services for security and access control). The flexibility of the new system allows the 'long tail of scientists' to be better represented in the platform-based EGI.

More choice for developers
EGI.eu will support the development of community platforms integrated and operated by scientific communities in collaboration with the NGIs. EGI.eu will facilitate the interaction of software developers, scientific communities and platform integrators within EGI. Software developers will be able to choose from three levels of involvement in this process.
- 'Integrated providers' have the strongest ties with EGI and scientific communities, via MoUs and SLAs, and support is provided by the EGI Helpdesk. They are represented in the Technical Coordination Board and can influence EGI's evolution.
- 'Community providers' simply make their software available via AppDB, now extended to incorporate a Community Software Repository.
- 'Contributing providers' are in between – they will offer guarantees on the quality of their software and user support, but not as regimented as for integrated providers.
EGI will move to the platform-based model gradually. The focus will now be on establishing new links between software developers, software integrators and operators across the community.
The conceptual change is big but what scientists will notice is the added freedom and options offered by the EGI Platform Architecture.

The Community Software Repository is now available through AppDB

Kostas Koumantaros and Marios Chatziangelou

The EGI Application Database (AppDB) has been extended with newly developed Community Software support features.
The extension will support the Community Technology Providers, grid/cloud software developers who will become part of the EGI ecosystem from May 2013. Community providers do not need a formal agreement to engage with EGI. They will simply upload their software into the EGI Community Software Repository through the AppDB service, where potential user communities can access the packages and can integrate the software into their own Community Platforms.
The Community Software Repository:
> offers the ability to manage and publish an unlimited series of releases per registered software item;
> supports a light-weight & community driven release management process;
> populates software metadata required by administrators as well as software developers;
> disseminates new releases through established communication channels.
Technically speaking, the new system offers, for example:
> support for generic tarballs, RPM & DEB binaries;
> support for multiple flavour / operating system combinations;
> simplified, web-based, process for uploading the binary artifacts;
> YUM & APT repositories for automatic deployment of the software;
> mechanisms for initiating, updating, removing, renaming, publishing, even cloning of releases and their associated repositories.
The extension to the AppDB service is part of the roadmap to a platform-based EGI and was developed in collaboration with the EGI Software Repository.

More information
Your feedback is valuable in the continuing development of the service.
EGI Applications Database: http://appdb.egi.eu/ | [email protected]
EGI Community Software Repository: http://repository.egi.eu/community | [email protected]

KK leads the EGI Software Repository and Support tools team
MC heads the EGI AppDB development team

EGI celebrates ten years of production activity

Tiziana Ferrari and Małgorzata Krakowian look at the infrastructure’s trends and performance

Over the past decade EGI has evolved from a testbed of a few thousand CPUs to an infrastructure that provides more than 373,000 CPU cores to support international research collaborations. Since 2004, EGI has delivered the equivalent of about 4.4 million years of compute time (38.8 billion CPU wall time hours) to scientists around the world. Compare and contrast with the state-of-the-art at the first EGEE conference in Cork, when the infrastructure was reported to include 1841 CPUs.
Today EGI, with its integrated resource infrastructure providers, is probably the largest multidisciplinary distributed e-infrastructure for science in the world, bringing together 27 national and 9 federated operations centres, 340 certified Resource Centres across 55 countries in Europe, Asia and North and South America, and CERN, a European Intergovernmental Research Organisation.

Compute capacity and utilisation
During EGI-InSPIRE's third project year the infrastructure has been running steadily, with an increase in capacity and use driven by High Energy Physics, Astronomy and Astrophysics, and Life Sciences. The relative yearly usage of EGI resources significantly increased for various disciplines. To date EGI provides more than 370,000 logical CPU cores (+33.6% yearly increase since May 2012) and 170 PB of disk space that are made accessible by software services that allow secure data access, transfer, replication and processing. During 2012, EGI engaged with one new partner infrastructure – the Ukrainian National Grid, but unfortunately the Irish NGI stopped its operations services.
From January 2004 to date EGI executed more than 1.6 billion compute jobs amounting to 38.8 billion CPU wall time hours (normalised elapsed time to a reference value of HEP-SPEC06), for an average daily job submission rate of 1.6 million grid compute jobs/day since March 2012.

Key numbers
> Operations Centres: 36, of which 9 are federated
> Resource Centres: 340 in 56 countries and CERN
> CPU cores: 373,800
> HEP-SPEC06: 3.86 million
> Disk capacity: 170 PB
> 212 VOs with 22,067 users
> Usage: 1.67 million (grid only) and 2.25 million (grid and local) jobs per day
> Availability: 97.3%
> Reliability: 99.0%
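The headline compute-time figure is easy to cross-check: converting the reported wall-time hours into CPU years (ignoring leap days) reproduces the "about 4.4 million years" quoted above.

```python
# Cross-check: 38.8 billion normalised CPU wall-time hours expressed in CPU years.
wall_time_hours = 38.8e9      # delivered since 2004, as reported above
hours_per_year = 24 * 365     # ignoring leap days

cpu_years = wall_time_hours / hours_per_year
print(f"{cpu_years / 1e6:.1f} million CPU years")  # → 4.4 million CPU years
```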

Users and usage
EGI has in total 22,067 users (+5.4%), distributed in 212 Virtual Organisations (VOs) (-6.2% from March 2012) listed in the EGI Operations Portal. The decrease reflects the campaign for decommissioning inactive VOs which started in 2013. Almost all disciplines registered an increase of users from May 2012: the Multidisciplinary VOs (+14.0%), Computer science and mathematics (+11.9%), Fusion (+4.2%), High-Energy Physics (+3.9%), Earth Sciences (+2.8%), Computational Chemistry (+1.7%) and Life Sciences (+0.9%).
High-Energy Physics (38.6% of the user community) is consuming the majority of EGI resources: 93.3% of the normalised CPU wall time available in the infrastructure (+37.6% since May 2012). However, the relative resource utilisation of other disciplines has been significantly increasing as well. The Astronomy and Astrophysics discipline is the second in terms of usage, consuming 2.9% of the overall EGI used CPU wall clock time, with a +74.8% yearly utilisation increase, followed by the Life Sciences community, who saw a +105.62% yearly increase to 1.4% of the overall resource consumption.

Middleware
A major upgrade campaign to replace unsupported gLite software was successfully completed in 2013 and involved more than 350 production service end-points. Administrators now have a policy framework, procedures and support in the operations tools to support their task of keeping the software up to date.
Various middleware stacks are in production in EGI. An indication of their distribution is given by the various Compute Elements deployed by Resource Centres. CREAM-CE is in production in 89.41% of the infrastructure; ARC-CE is second in deployment (0.11%), followed by GRAM (1.49%), Unicore6.TargetSystemFactory (1.49%) and QCG.Computing (1.12%).
In a large-scale distributed infrastructure, deployment of software updates requires coordination and needs to follow a well-defined process. EGI does this through Staged Rollout, i.e. by gradually installing updates that successfully passed internal verification, by a group of expert Resource Centres. From May 2012 the staged rollout community performed 178 tests in preparation for the releases of the Unified Middleware Distribution of EGI. Staged Rollout is a community effort to which 74 distributed teams currently contribute.

The EGI-InSPIRE mini-projects

Following the endorsement in June 2012 of the EGI strategy as a framework for the future, the EGI-InSPIRE project asked the consortium to suggest mini-projects to accelerate:
> Coordination and community building across EGI and its stakeholders
> Maintaining and expanding the operational infrastructure and its management tools to support other research infrastructures and cloud technologies
> Promoting the deployment of customisable Virtual Research Environments.
Over 30 suggestions were received and peer-reviewed by the Project Management and Activity Management Boards, of which eleven were prioritised for implementation. The mini-projects are in essence funded Virtual Teams working to promote EGI's strategic goals and reduce the barriers for researchers to access the resources that they need wherever they may be located.
Detailed information is available online: http://go.egi.eu/mini-projects

A new approach to computing Availability/Reliability reports for EGI
As it is, the Availability/Reliability (A/R) service is a closed-source solution built on top of a commercial database system which cannot be easily extended to meet the needs of the NGIs and EGI Operations to introduce new middleware services or regional calculations. The goal of the mini-project is to develop a new open source A/R reporting service, extended to include figures from VOs, NGIs, services or resource centres. NGIs, VO administrators and the broader operations community will benefit from an elastic, customisable A/R reporting service with metrics tailored to different requirements.
Consortium: GRNET (lead), IN2P3, CNRS, SRCE

GOCDB scoping extensions and management interface
The current scoping implementation in GOCDB provides only mutually exclusive 'local' and 'EGI' tags – in practice a site or service can only be tagged with a single scope. Additionally, GOCDB does not provide a management interface for daily operational tasks. The goals of the mini-project are to:
> enhance scoping by introducing multiple, non-exclusive scopes and create a tag cloud. Single objects will be able to be tagged by multiple projects, allowing projects to store topology data in a single GOCDB;
> introduce a management interface to simplify daily tasks, reducing the introduction of errors and operational costs.
The main beneficiaries are GOCDB users, the EGI operations community and other e-infrastructures such as PRACE and large distributed projects (e.g. EUDAT or wLCG). The work will facilitate more flexible use of a single GOCDB instance, which means operational costs can be distributed across more stakeholders and therefore reduced for each of them.
Consortium: STFC

Massive open online course development
The goal of this project is to develop massive open online courses (MOOCs) to grow the knowledge base of grid users across Europe and the world, increase the visibility of EGI with high-quality content and enhance knowledge transfer. MOOCs take the shape of virtual classes offered online for free. They are available for current and future users, and also provide a scalable approach to training new users. The mini-project's aims include the selection of an existing MOOC environment (e.g. Coursera) and the development of a number of courses on e-infrastructure topics covering, for example, EGI's existing production infrastructure, the capabilities of EGI's Federated Cloud infrastructure, and Hadoop with NoSQL.
Consortium: SURFsara

Tools for automating applying for and allocating federated resources
A coordinated resource allocation mechanism is essential to a federated infrastructure such as EGI. Efficient allocation systems ease the matching of resource demand and offer to support collaborations and ensure that the usage of EGI resources and services is properly acknowledged. This mini-project will develop a tool to automate

how applications negotiate the usage of resources offered by federated resource providers. The expected output is a web-based operations application, integrated with existing production operations tools, that will be able to: 1) trace demand and offer, 2) authenticate customers and providers, 3) process demand and offer programmatically and at scale, 4) reduce costs by automating the human processes around service offering, following ITIL best practices. User community testing will be provided by the Biomed VO.
Consortium: Cyfronet (lead), CNRS

VO administration and operational portal - VAPOR
Small to medium-size grid user communities have to perform administrative and operational tasks that can become time consuming for staff focused on research activities rather than system administration. The goal of the mini-project is to develop VAPOR, a generic VO administration and operations portal. VAPOR will help existing user communities to sustain their operational models by sharing the administrative and operational costs. It will also lower entry barriers for new communities by easing the administration and operations start up curve. This portal will be available to any community wishing to deploy it, and licensed as open source software. The VAPOR portal will also help to identify the users behind robot certificates, and to keep track of the articles produced by the users within the community.
Consortium: CNRS (several institutes), GRyCAP, IPHC

Evaluation of Liferay modules
Web portals, and particularly Liferay-based web portals, are the main choice to provide user-friendly access to EGI infrastructure services. In addition to the Liferay portal framework, there are a large number of community supported modules (e.g. Liferay Social Office and Sync modules). These modules could contribute to the adoption and use of portals by both current and new users to EGI, and could also replace or supplement some of the existing EGI portals, which would help their sustainability.
Consortium: EGI.eu (lead), INFN, MTA SZTAKI, CESNET

Providing OCCI support for arbitrary Cloud Management Frameworks
This project will maintain and extend the rOCCI framework as an interoperability layer, using the Ruby cloud services library to provide support for arbitrary cloud frameworks across compute, storage and network capabilities. The additional support will be achieved by incorporating an established open-source tool into the rOCCI implementation. The main beneficiaries of this work are resource providers unable to join the federation due to an unavailable or obsolete OCCI implementation, and researchers and user communities who will get a bigger pool of usable resources. The federation will benefit from better sustainability and opportunities for synergies in the technologies used.
Consortium: CESNET

CDMI Support in Cloud Management Frameworks
CDMI will be used to build a standards-based storage solution to allow different Cloud Management Frameworks within the EGI Federated Cloud to attach data files and block devices to the deployed VMs. The prototype will be integrated with EGI's Federated AAI model and will offer block storage to OCCI-compliant services. A browser-friendly interface will be developed on the basis of the CDMI HTTP based RESTful protocol to enable end-users to access and manage their files in an easy way. The beneficiaries will be the sites and communities willing to provide or consume resources using standards-based cloud APIs. An additional benefit for the communities is the ability to migrate between EGI and public cloud offerings more easily due to the existence of similar services.
Consortium: KTH

Dynamic Deployments for OCCI Compliant Clouds
This task will adapt the open-source SlipStream software to enable users to dynamically provision complex multi-VM applications, with elements of elastic behaviour and an automatic image factory. All results from this project will be contributed to the SlipStream Apache 2.0 open source repository.
The result will be an end-to-end solution that

will enable the on-demand provisioning of a scientific analysis environment on federated cloud resources to meet the needs of scientific end-users.
The work will benefit resource centres, who will be able to offer complex cloud resources to end-users, and platform integrators, who will be able to deploy complex scientific applications in the cloud, in a parameterised, repeatable and systematic way.
Consortium: CNRS

Automatic Deployment and Execution of Applications using Cloud Services
Setting up the required computing environment is a serious overhead for the everyday work of researchers. This mini-project will design and implement a new user service model that will allow researchers to use the EGI Federated Cloud to deploy the Virtual Research Environments they need for their resources. The service is oriented to advanced users and application managers within research communities. It will enhance the adoption of the infrastructure by large projects, which have complicated software stacks to deploy. From an operational point of view, it will reduce the overhead of the local site administrators when it comes to installing application software and supporting complicated software stacks for different user communities.
Consortium: CSIC (lead), FCTSG

Transforming Scientific Research Platforms to Exploit Cloud Capacity
The mini-project will create the means to make scientific applications cloud-ready. This will be done by analysing a set of use cases and compiling a set of best practices to ease the uptake of cloud technologies by research communities. The initial beneficiaries of this work will be the user communities whose use cases are selected for analysis. In addition, and in the long term, the lessons learnt from the exercise will benefit the entire community of end-users and application developers.
Consortium: FZJ (lead), CESNET
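One of the mini-projects above sets out to rebuild the Availability/Reliability reporting service. As a rough illustration of the figures such a service computes: availability is typically the fraction of monitored time a service passed its tests, while reliability discounts announced maintenance windows. The formulas and numbers below are an illustrative sketch, not the mini-project's actual algorithm.

```python
# Illustrative A/R calculation; the real EGI algorithms are more involved
# (per-service aggregation, unknown states). This only shows the core idea.
def availability(up_hours, total_hours):
    return up_hours / total_hours

def reliability(up_hours, total_hours, scheduled_downtime_hours):
    # scheduled maintenance is not counted against a site's reliability
    return up_hours / (total_hours - scheduled_downtime_hours)

total = 30 * 24        # one month of monitoring (hours)
up = 700               # hours in which tests passed
maintenance = 12       # announced downtime (hours)

print(f"availability {availability(up, total):.1%}")             # → 97.2%
print(f"reliability  {reliability(up, total, maintenance):.1%}") # → 98.9%
```

Numbers of this order match the infrastructure-wide 97.3% availability and 99.0% reliability reported earlier in this issue.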

The EGI Webinar series http://go.egi.eu/webinars

What?

Web-based presentations and workshops hosted by a guest lecturer specialised in a field relevant to the EGI community.

Each lecture will be followed by a Q&A session.

Why?

> EGI Webinars provide first hand information about established and/or emerging services that can help you to use EGI's grid and cloud services.

> EGI Webinars are an opportunity to interact with user support teams, developers, site managers, platform integrators, researchers and NGIs.

> EGI Webinars are free and open to anyone. Pre-registration is required, as participation is limited to a set number of connections.

Next event: Catania Science Gateway Framework - 15 May, 16:00-18:00 CET

VT Report: A new scientific classification for EGI

Sy Holsinger on the new tree of scientific disciplines

EGI is a multidisciplinary e-Infrastructure used by thousands of researchers working on a variety of scientific disciplines. Throughout the years, new technological advances and collaboration opportunities have continuously evolved the diversity of its users and the nature of EGI itself.
The evolution left us with a legacy - the way scientific disciplines were classified in EGI five years ago is no longer indicative of the current usage. Half of users now fall into the 'other', 'multidisciplinary' and 'infrastructure' categories, with the rest spread across only seven disciplines. This is clearly not very helpful when we report our usage activity.
Due to these issues, it has become essential to agree on a common, coherent classification that is consistent across all infrastructure tools and allows for smooth inclusion of both current and future user communities. An updated scientific classification would make it easier for new researchers to engage others in their field of interest, as well as in communicating externally with funding agencies, resource centres and potential research communities.
This was the task of the Scientific Discipline Classification VT. The virtual team investigated the current usage of scientific disciplines in EGI, looked into more than ten available classifications, and analysed how the classification impacts each EGI tool. Ultimately, the Frascati FOS Classification was selected as a foundation for EGI's scheme.
The result is a hierarchy of scientific disciplines encompassing all fields of research, verified by forty EGI Virtual Organisation Managers and open for public comments. The new classification will be implemented in all EGI services, from the website to the Operations Database, over the coming months.

More information
contact: [email protected]
url: http://go.egi.eu/SDC-VT

This list contains only the first two tiers of the new classification. Third level (e.g. high-energy physics in physics, or zoology in biological sciences), fourth level (e.g. b-physics in HEP, or entomology in zoology) and below are also considered but not listed here.

Natural Sciences
> Mathematics
> Computer sciences
> Information sciences
> Earth sciences
> Biological sciences
> Space science

Physical Sciences
> Physics
> Chemical sciences

Medical and Health Sciences
> Basic medicine
> Clinical medicine
> Health sciences
> Medical biotechnology

Social Sciences
> Psychology
> Economics, finance & business
> Educational sciences
> Sociology
> Law
> Political science
> Social & economic geography
> Media and communications

Engineering and Technology
> Civil engineering
> Electrical, electronic and information engineering
> Mechanical engineering
> Aerospace engineering
> Chemical engineering
> Materials engineering & science
> Biomedical engineering
> Environmental engineering
> Environmental biotechnology
> Industrial biotechnology
> Nano-technology

Humanities
> History and archaeology
> Languages and literature
> Philosophy, ethics and religion
> Arts

Agricultural Sciences
> Agriculture, forestry & fisheries
> Animal and dairy science
> Veterinary science
> Agricultural biotechnology

Interdisciplinary
> Analytical Facilities
> Training/Demonstrations
> Infrastructure development
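In machine-readable form, a two-tier hierarchy like the one above is simply a nested mapping. A minimal sketch using a small subset of the branches (illustrative code only, not an official EGI data format):

```python
# A subset of the new EGI scientific classification as a nested mapping.
# Only two tiers are shown, mirroring the list above; deeper tiers
# (e.g. high-energy physics under physics) would nest further.
CLASSIFICATION = {
    "Natural Sciences": ["Mathematics", "Earth sciences", "Biological sciences"],
    "Physical Sciences": ["Physics", "Chemical sciences"],
    "Humanities": ["History and archaeology", "Arts"],
}

def top_tier(discipline):
    """Return the first-tier category a second-tier discipline belongs to."""
    for tier1, tier2 in CLASSIFICATION.items():
        if discipline in tier2:
            return tier1
    return None

print(top_tier("Physics"))  # → Physical Sciences
```

A shared structure of this kind is what lets every EGI tool, from the website to the Operations Database, report usage against the same categories.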

Partner Spotlight: GÉANT

The European grid does not work in a vacuum - it needs a multitude of partners around the globe supporting the work it does. Within Europe one of the key organisations is GÉANT, the data network built for the research and education community. Without the GÉANT infrastructure it would not be possible, for example, to shift datasets around the grid. But GÉANT provides more than just transporting high volumes of data around EGI. Its work enables easy communication and data sharing, even from remote mountaintops.

The Merzbacher Station and Inylchek Glacier in central Tien Shan, Asia. Credit: Julia Neelmeijer

GÉANT's expertise is vital to monitor climate change and to provide early flood warning, particularly in Central Asia where glacier retreat is likely to be an issue. Last year meltwater from the Tien Shan glacier caused lake Tez-Tor in Kyrgyzstan to break its banks and flood the inhabited Ala Archa valley. This was not an isolated incident. Retreating glaciers are leading to large scale flooding, avalanches and mudslides around the region, often with disastrous results.
This is where the Gottfried Merzbacher Global Change Observatory comes in. Perched atop the Tien Shan glacier, the observatory is part of a project monitoring glaciers across Europe and Central Asia and collects meteorological and hydrological data, as well as ice thickness, glacier speed and seismic activities. Once put together with satellite observations, the data allows researchers to get a robust view of the changing conditions. But it's not that simple: the project is generating large amounts of data, in very remote locations, and time is of the essence.
GÉANT has been working with its Central Asia counterpart CAREN to build a high-speed network connecting all the participating institutes and monitoring stations. "Thanks to GÉANT and CAREN we can quickly share information with our European partners," says Bolot Moldobekov, co-director of the Central Asian Institute of Applied Geosciences, "speeding up the processing of monitoring data and enabling us to work together to predict the impact of climate change and protect our local environment."
Now the Central Asian Institute of Applied Geosciences (CAIAG) is putting all this data into its Geo Database of Central Asia (GDB). The GDB brings together digitised maps, satellite and aerial images along with geophysical data into a single, searchable and accessible location, making it the central reference source on climate change in Central Asia.
Projects such as this demonstrate that research is transformed by connecting regional resources with global infrastructures. This is an overlapping aim for both EGI and GÉANT and has led to an ongoing partnership, formalised with an agreement to strengthen the service offering to the global research community in January 2013.

More information
http://www.geant.net/
