Computing and Storage Resources


This section describes the Computing and Storage Resources provided to the D4Science Ecosystem (the e-Infrastructure operated by the D4Science.org initiative) by external grid and cloud e-Infrastructures.

EGI

EGI (European Grid Infrastructure) is a project that creates and maintains a pan-European Grid Infrastructure in collaboration with National Grid Initiatives (NGIs), in order to guarantee the long-term availability of a generic e-Infrastructure for all European research communities and their international collaborators. The European Grid Infrastructure (1) operates a secure, integrated production grid infrastructure that seamlessly federates resources from providers around Europe, (2) coordinates the support of the research communities using the European infrastructure, and (3) works with software providers within Europe and worldwide to provide high-quality, innovative software solutions that deliver the capabilities required by user communities. EGI provides storage and computing resources, distributed across hundreds of sites worldwide, and is based on different interoperable grid middleware solutions such as Globus, gLite, ARC, and UNICORE. The resources offered by the EGI infrastructure significantly extend the storage and computing capacity available under the iMarine Data e-Infrastructure. These resources are accessible from the iMarine Data e-Infrastructure under a Research Collaboration Model.

The EGI sites supporting the d4science.research-infrastructures.eu VO (Virtual Organization) provide the following services to the D4Science Ecosystem:


Site short name   Site Official Name                         CREAM CE   WN    SE
INFN-BARI         INFN-BARI, BARI, Italy                     Yes        Yes   -
INFN-TRIESTE      INFN-TRIESTE                               Yes        Yes   Yes
Taiwan-LCG2       Academia Sinica Grid Computing Center      Yes        Yes   Yes
csTCDie           CSTCDIE, Trinity College Dublin, Ireland   -          -     Yes


The following table summarizes, for each site, the contribution to the d4science.research-infrastructures.eu VO in terms of CPU and free storage capacity.


Site           CPUs    Available Storage
INFN-BARI      4000    -
INFN-TRIESTE   2353    1.1 TB
Taiwan-LCG2    240     50.7 GB
csTCDie        -       4.1 TB
Total          6593    5.2 TB
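
As an illustration of how these services are reached by members of the VO, the following minimal Python sketch submits a trivial test job to one of the CREAM Computing Elements listed above through the standard gLite command-line tools. The CE endpoint, queue name, and JDL content are hypothetical placeholders, and a valid VOMS proxy for the d4science.research-infrastructures.eu VO is assumed to exist.

#!/usr/bin/env python3
"""Minimal sketch: submit a trivial test job to a CREAM CE of the
d4science.research-infrastructures.eu VO via the gLite CLI tools.
The CE endpoint and queue are hypothetical placeholders."""
import subprocess
import tempfile

# Hypothetical CREAM CE endpoint: <host>:<port>/cream-<batch-system>-<queue>
CE_ENDPOINT = "cream-ce.example.org:8443/cream-pbs-d4science"

# A minimal JDL describing a job that runs on a Worker Node (WN).
JDL = '''\
Executable    = "/bin/hostname";
StdOutput     = "std.out";
StdError      = "std.err";
OutputSandbox = {"std.out", "std.err"};
'''

def submit_test_job() -> str:
    # A valid VOMS proxy is assumed, e.g. obtained beforehand with:
    #   voms-proxy-init --voms d4science.research-infrastructures.eu
    with tempfile.NamedTemporaryFile("w", suffix=".jdl", delete=False) as jdl:
        jdl.write(JDL)
        jdl_path = jdl.name
    # -a: automatic proxy delegation, -r: target CREAM CE endpoint
    result = subprocess.run(
        ["glite-ce-job-submit", "-a", "-r", CE_ENDPOINT, jdl_path],
        capture_output=True, text=True, check=True)
    return result.stdout.strip()  # CREAM job identifier returned by the CE

if __name__ == "__main__":
    print("Submitted job:", submit_test_job())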

VENUS-C

VENUS-C (Virtual multidisciplinary EnviroNments USing Cloud infrastructure) is focused on developing and deploying a Cloud Computing service for research and industry communities in Europe, with the aim of: (1) creating a platform that enables user applications to leverage cloud computing principles and benefits; (2) leveraging the state of the art to bring early adopters on board quickly, incrementally enabling interoperability with existing Distributed Computing Infrastructures (DCIs), and pushing the state of the art where needed to satisfy on-boarding and interoperability; and (3) creating a sustainable infrastructure that enables cloud computing paradigms for the user communities inside the project and for new communities. VENUS-C offers an industrial-quality, service-oriented platform based on virtualisation technologies, providing an open and generic Application Programming Interface (API) at platform level for scientific applications. The VENUS-C platform is based on both commercial and open source solutions underpinned by the Engineering data centre, by Microsoft through Windows Azure and its European data centres, and by two European High Performance Computing centres: the Royal Institute of Technology (KTH, Sweden) and the Barcelona Supercomputing Center (BSC, Spain).

The resources offered by VENUS-C significantly extend the storage, computing, and service hosting capacity available under the iMarine Data e-Infrastructure. The VENUS-C abstraction allows immediate access to a concrete set of resource providers (Engineering Data Centre, Microsoft Windows Azure, KTH, BSC). These resources are accessible from the D4Science Ecosystem initially under a Research Collaboration Model and afterwards under a commercial Cloud Model.

The collaboration between iMarine and VENUS-C grants access to Microsoft Azure computation and storage cloud resources, as reported in the following table:


Cloud Name   CPU Hours   Storage Space   End of collaboration
MS Azure     1.5 M       1.5 TB          April 2013
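
For completeness, the sketch below shows how a dataset could be stored on the Azure blob storage obtained through this collaboration. It uses the present-day azure-storage-blob Python SDK rather than the original VENUS-C platform API, and the connection string, container, and blob names are placeholders.

"""Minimal sketch: upload a dataset to Azure Blob Storage of the kind made
available through the VENUS-C collaboration. Uses the present-day
azure-storage-blob SDK (not the VENUS-C platform API); all names are placeholders."""
from azure.storage.blob import BlobServiceClient

CONNECTION_STRING = "<azure-storage-connection-string>"  # placeholder credential

def upload_dataset(local_path: str, container: str, blob_name: str) -> None:
    service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
    blob = service.get_blob_client(container=container, blob=blob_name)
    with open(local_path, "rb") as data:
        blob.upload_blob(data, overwrite=True)  # replace the blob if it exists

if __name__ == "__main__":
    upload_dataset("species_occurrences.csv", "imarine-data", "occurrences/2013.csv")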

Eulos Cloud

Eulos is an open source project which aims to join the forces of an IaaS Cloud (currently OpenNebula) and Java Enterprise Edition. It provides functionality that serves the needs of the MaDgIK lab at the University of Athens. In short, the cloud middleware (OpenNebula) is used as a management tool for virtual resources exploited in building higher-level custom services available through a JEE application container. An advanced VM scheduler called Nefeli and a web-based administration console are only a couple of such high-level components offered to users. The software is released under the EUPL licence.

iMarine exploits the Eulos Cloud resources provided by the NKUA project partner in order to dynamically allocate the gCube Hosting Nodes (GHNs) needed for a VRE (Virtual Research Environment) deployment. The process of VRE approval and deployment [1] in fact includes an elastic resource acquisition through Eulos.
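
The sketch below illustrates what such an elastic acquisition boils down to at the OpenNebula level: a single XML-RPC call that instantiates a VM from a template. The endpoint, credentials, and template values are hypothetical placeholders; in practice the call is issued by the VRE deployment workflow rather than by hand.

"""Minimal sketch: allocate a VM for a gCube Hosting Node (GHN) through the
OpenNebula XML-RPC API used by the Eulos cloud middleware. Endpoint,
credentials, image, and network names are hypothetical placeholders."""
import xmlrpc.client

ONE_ENDPOINT = "http://eulos-frontend.example.org:2633/RPC2"  # placeholder endpoint
SESSION = "oneadmin:password"                                  # placeholder credentials

# Illustrative OpenNebula VM template sized for a GHN.
GHN_TEMPLATE = """
NAME   = "ghn-node"
CPU    = 1
VCPU   = 2
MEMORY = 4096
DISK   = [ IMAGE = "ghn-base-image" ]
NIC    = [ NETWORK = "public" ]
"""

def allocate_ghn() -> int:
    server = xmlrpc.client.ServerProxy(ONE_ENDPOINT)
    # one.vm.allocate(session, template, start_on_hold) -> [success, id_or_error, ...]
    response = server.one.vm.allocate(SESSION, GHN_TEMPLATE, False)
    success, result = response[0], response[1]
    if not success:
        raise RuntimeError(f"VM allocation failed: {result}")
    return result  # numeric ID of the newly created VM

if __name__ == "__main__":
    print("Allocated GHN VM with ID", allocate_ghn())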

The Eulos resources currently available for dynamic GHN deployment are listed below:


CPUs   Storage Space   RAM
4      2 TB            32 GB