ICSC

National Centre for HPC, Big Data and Quantum Computing

The National Research Centre in HPC, Big Data and Quantum Computing is one of the five National Centres established under the PNRR. It is dedicated to several areas identified as strategic for the development of the country, centred on simulations, computing and high-performance data analysis. The main INAF objective is the development of innovative computing technologies for the A&A community, targeting the next generation of HPC and storage Exascale infrastructures. The activities support high-end astrophysical applications for numerical simulations and for the processing, analysis, visualisation and storage of the big and complex datasets coming from major observational projects and observatories: SKA, LOFAR2.0, MeerKAT+, Euclid, Gaia, CTA, ASTRI, SWGO, LSST, ELT and HPC Theory.

The ICSC Foundation (HUB)  

The ICSC Foundation (HUB) is responsible for the validation and management of the research program of the National Center, whose activities are elaborated and implemented by the Spokes and their affiliated institutions.

The National Center aims to create the national digital infrastructure for research and innovation, starting from the existing HPC, HTC and Big Data infrastructures and evolving towards a cloud datalake model accessible by the scientific and industrial communities through flexible and uniform cloud web interfaces, relying on a high-level support team. It also aims to form a globally attractive ecosystem based on strategic public-private partnerships, in order to fully exploit top-level digital infrastructure for scientific and technical computing and to promote the development of new computing technologies.

The National Center provides a pivotal opportunity for the national scientific, industrial and socio-economic system to address current and upcoming scientific and societal challenges, strengthening and expanding existing competences and infrastructural resources. The CN will be structured according to the hub-and-spoke model.

The National Center has two main goals:

1) to create a national, Datalake-like computing infrastructure by grouping together the existing High-Performance Computing (HPC), High-Throughput Computing (HTC), Big Data and network infrastructures and new targeted resources procured through CN funding, and by providing the scientific and industrial communities with a flexible and uniform Cloud interface serving the full spectrum of applications, spanning from HPC (needed by computation-intensive applications, typically created by Fundamental Research and large/specialized Corporates) to general-purpose cloud infrastructure (addressing the computational needs of Science Applications, other Corporates and SMEs); a minimal sketch of this uniform-interface idea follows the list of goals.

2) to create around the infrastructure a globally attractive ecosystem that supports academia and the industrial system and fosters the exploitation of computing resources and technologies, with the goal of encouraging processes of sustainable economic growth and human development.
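To make the uniform-interface idea of goal 1 concrete, the following minimal Python sketch routes a workload description either to an HPC batch backend or to a general-purpose cloud backend. All class, field and backend names are hypothetical, invented purely for illustration; they do not correspond to any actual ICSC API.

    # Minimal sketch of a uniform submission interface over heterogeneous
    # backends (HPC batch system vs. general-purpose cloud).
    # All names are hypothetical and only illustrate the datalake/cloud-interface
    # concept described in the text; they do not refer to any real ICSC API.
    from dataclasses import dataclass


    @dataclass
    class Workload:
        name: str
        cores: int
        needs_mpi: bool          # tightly coupled, computation-intensive job?
        input_dataset: str       # logical name resolved by the datalake


    def submit(workload: Workload) -> str:
        """Route the workload to the most suitable backend."""
        if workload.needs_mpi or workload.cores > 128:
            backend = "hpc-batch"      # e.g. a supercomputer partition
        else:
            backend = "cloud-vm"       # e.g. an elastic general-purpose pool
        # A real system would call the backend's scheduler/API here;
        # this sketch only reports the routing decision.
        return f"{workload.name}: {workload.cores} cores -> {backend}"


    if __name__ == "__main__":
        print(submit(Workload("nbody-simulation", cores=4096, needs_mpi=True,
                              input_dataset="cosmo/ics_512")))
        print(submit(Workload("catalog-crossmatch", cores=16, needs_mpi=False,
                              input_dataset="survey/dr1_tile_042")))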

At the Bologna Tecnopolo it is also planned to integrate the Leonardo supercomputer with a Tier-1 system, to support the participation of CNR and INAF in the SKA Regional Centre (SKA-RC).

INAF is the Leader of Spoke 3, Astrophysics and Cosmos Observations.

INAF is the co-Leader of Spoke 2, Fundamental Research and Space Economy.

INAF participates in Spoke 1, Future HPC & Big Data, and in Spoke 10, Quantum Computing.

Main INAF Research Lines in the National Center:

Computational astronomy; Computational methods; High Performance Computing; Big Data; Quantum Computing; Cloud computing; Distributed computing; GPU computing; AstroInformatics; Machine Learning; Artificial Intelligence; Astronomy data visualization; Scientific Visualization

Spoke 1

Future HPC & Big Data 

Led by the University of Bologna, co-led by the University of Torino.

The Spoke is a technological pillar of the National Center and deals with the development of highly innovative hardware and software technologies for the supercomputers of the future.
The research and development activities planned in Spoke 1 will lead to the creation of prototypes and demonstrators of the most promising technologies, facilitating their adoption and industrial development.

The objective of Spoke 1, “Future HPC & Big Data”, is the creation of new laboratories as an integral part of a national federated center of global standing, with skills aimed at hardware and software co-design, enhancing Italian leadership in the EuroHPC JU as well as in the ecosystem of data infrastructures for science and industry.

In particular, INAF provides prototypes and demonstrators of astrophysical applications (such as fluid-dynamics and cosmological N-body simulations, gas dynamics, data analysis, etc.) on the platforms developed by Spoke 1, using innovative technologies for workflow management and tools for fast I/O and Big Data, and participates in all the co-design and benchmarking activities.
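As a purely illustrative example of the kind of mini-application used for such demonstrators, the sketch below implements a direct-summation gravitational N-body force kernel in Python/NumPy; the particle number and softening length are arbitrary, and a production code would rely on tree or mesh algorithms and GPU/MPI parallelism rather than this O(N^2) pattern.

    # Illustrative N-body mini-application kernel (direct summation, NumPy).
    # A real demonstrator would use tree/PM algorithms and GPU/MPI parallelism;
    # this sketch only shows the computational pattern being benchmarked.
    import numpy as np


    def gravitational_acceleration(pos, mass, softening=1e-2, G=1.0):
        """Pairwise gravitational acceleration for N particles (O(N^2))."""
        # pos: (N, 3) positions, mass: (N,) masses
        dx = pos[np.newaxis, :, :] - pos[:, np.newaxis, :]       # (N, N, 3)
        r2 = np.sum(dx * dx, axis=-1) + softening**2              # (N, N)
        inv_r3 = r2 ** -1.5
        np.fill_diagonal(inv_r3, 0.0)                              # no self-force
        return G * np.einsum("ij,j,ijk->ik", inv_r3, mass, dx)    # (N, 3)


    if __name__ == "__main__":
        rng = np.random.default_rng(42)
        n = 2048
        pos = rng.standard_normal((n, 3))
        mass = rng.uniform(0.5, 1.5, n)
        acc = gravitational_acceleration(pos, mass)
        print("mean |a| =", np.linalg.norm(acc, axis=1).mean())

A kernel of this size is also a natural unit for the benchmarking activities: the same function can be timed across different node types and problem sizes to compare platforms.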

Spoke 1 is organized into 5 work packages, called Flagships (FLs), and two living labs (UniTO Living Lab: https://hpc4ai.unito.it/livinglab-swi-icsc/). INAF work is performed within flagship FL3, Workflows & I/O, Cloud-HPC convergence, digital twins, and flagship FL5, Mini-applications & Benchmarking, in collaboration with UNITO, CINECA, UNIPI and UNICT.
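As a small, hedged illustration of the I/O side of FL3, the following sketch writes a toy simulation snapshot to a chunked, compressed HDF5 file with h5py and reads back a single chunk; the dataset layout, names and sizes are invented for the example, and real workflows would typically use parallel (MPI-enabled) HDF5 or other fast-I/O libraries tuned for the target Exascale systems.

    # Illustrative I/O sketch: writing a simulation snapshot as a chunked,
    # compressed HDF5 file. Dataset names and sizes are invented for the
    # example; production pipelines would use parallel (MPI) HDF5 tuned
    # for the target system.
    import numpy as np
    import h5py

    rng = np.random.default_rng(0)
    n_particles = 1_000_000

    with h5py.File("snapshot_000.h5", "w") as f:
        grp = f.create_group("PartType1")                  # hypothetical layout
        grp.create_dataset("Coordinates",
                           data=rng.random((n_particles, 3), dtype=np.float64),
                           chunks=(65536, 3), compression="gzip")
        grp.create_dataset("Velocities",
                           data=rng.standard_normal((n_particles, 3)),
                           chunks=(65536, 3), compression="gzip")
        f.attrs["Time"] = 0.0                              # snapshot metadata

    # Read back one chunk without loading the whole dataset into memory:
    with h5py.File("snapshot_000.h5", "r") as f:
        first_chunk = f["PartType1/Coordinates"][:65536]
        print(first_chunk.shape)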

Spoke 2

Fundamental Research and Space Economy

Led by INFN, co-led by INAF.

New algorithms for data reconstruction, analysis and simulations of astroparticle experiments, also based on AI techniques. Portability of applications on GPU/CPU and on heterogeneous architectures. Development of data interpretation tools, statistical analysis, and high-performance techniques for accessing high-energy and astroparticle physics data. Development of algorithms and methodologies for integrating heterogeneous data (e.g. from different instruments/observatories, and multi-messenger data).
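One common pattern for GPU/CPU portability is to write the array code once and select the backend at runtime; the sketch below uses NumPy on the CPU and CuPy when an NVIDIA GPU is available. CuPy and the toy power-spectrum step are only illustrative choices made for this example, not tools prescribed by the Spoke.

    # Minimal CPU/GPU portability sketch: the same array code runs with NumPy
    # on the CPU or CuPy on an NVIDIA GPU, selected at runtime. CuPy is only
    # one possible backend choice, used here for illustration.
    import numpy as np

    try:
        import cupy as cp
        xp = cp if cp.cuda.runtime.getDeviceCount() > 0 else np
    except Exception:      # CuPy missing or no usable GPU
        xp = np


    def whitened_power_spectrum(signal):
        """Toy reconstruction step: FFT power spectrum of a 1-D time series."""
        spec = xp.abs(xp.fft.rfft(signal)) ** 2
        return spec / spec.mean()


    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        ts = xp.asarray(rng.standard_normal(1 << 20))   # works on both backends
        ps = whitened_power_spectrum(ts)
        # Bring the result back to host memory if it lives on the GPU.
        ps_host = cp.asnumpy(ps) if xp is not np else ps
        print(ps_host[:5])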

Main INAF Activities

Image reconstruction for high-energy telescopes (IACT-Cherenkov and satellites) with AI/machine learning and GPU solutions; profiling and optimization of simulations for IACTs and image analysis with neural networks; blockchain and the space market; optimization of time-series analysis, with parallelization and CUDA and FPGA porting, for optical intensity interferometry; optimization of parallel codes for the analysis of data from accelerometer instrumentation on satellites; scalability; multi-threading and distributed computing and optimization of simulation pipelines on parallel systems for CAD and GEANT4 models; integration of heterogeneous analysis pipelines and a graphical interface for data from GW (gravitational wave) interferometers and multifrequency electromagnetic data.

Spoke 3

Astrophysics and Cosmos Observations

Led by INAF, co-led by INFN.

The main objectives are the exploitation of cutting-edge solutions in HPC and Big Data processing and analysis for problems of interest in the following research areas: Cosmology; Stars and Galaxies; Space Physics (Earth, Solar and Planetary); Radio Astronomy; Observational Astrophysics and Time Domain; High Energy Astrophysics; Cosmic Microwave Background; Large Scale Structure, Clusters and Galaxies; Multi-Messenger Astrophysics; Numerical Simulations and Modeling. The Spoke will pursue a user-driven approach, in order to stay tightly coupled to the community, and will adopt a co-design methodology for the development of the selected applications.

Its detailed objectives are the following:

Objective 1: High and extreme performance computing, developing software solutions for data analysis and numerical simulations able to effectively exploit HPC resources in view of the Exascale era.

Objective 2: Big Data processing and visualization, adopting innovative approaches (e.g. Artificial Intelligence, inference via Bayesian statistics) for the analysis of large and complex data volumes and for their exploration (e.g. in-situ visualization), capable of efficiently exploiting HPC solutions; a minimal illustrative sketch follows this list.

Objective 3: High-performance storage, Big Data management and archiving, applying Open Science principles and implementing them in the Big Data archives.

Objective 4: Training and dissemination, creating a community of scientists and code developers prepared to adopt and exploit innovative computational solutions.
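The sketch announced under Objective 2 is given below: a minimal random-walk Metropolis sampler, written in Python/NumPy, that infers the slope and intercept of a noisy linear relation. It only illustrates the Bayesian-inference ingredient; actual Spoke 3 applications use far more sophisticated samplers and data and run at scale on HPC resources.

    # Minimal illustration of Bayesian parameter inference via a random-walk
    # Metropolis sampler, estimating the slope and intercept of a noisy linear
    # relation. Real applications use far more advanced samplers and run at
    # scale on HPC resources; this is only a conceptual sketch.
    import numpy as np

    rng = np.random.default_rng(3)

    # Synthetic data: y = a*x + b + noise, with true values a=2.0, b=-1.0
    x = np.linspace(0.0, 10.0, 50)
    y = 2.0 * x - 1.0 + rng.normal(0.0, 1.0, x.size)


    def log_posterior(theta):
        """Gaussian likelihood (unit noise) with flat priors on (a, b)."""
        a, b = theta
        resid = y - (a * x + b)
        return -0.5 * np.sum(resid**2)


    n_steps, step = 20_000, 0.05
    chain = np.empty((n_steps, 2))
    theta = np.array([0.0, 0.0])
    logp = log_posterior(theta)
    for i in range(n_steps):
        prop = theta + rng.normal(0.0, step, 2)
        logp_prop = log_posterior(prop)
        if np.log(rng.uniform()) < logp_prop - logp:   # Metropolis acceptance
            theta, logp = prop, logp_prop
        chain[i] = theta

    burn = n_steps // 4
    print("posterior mean (a, b):", chain[burn:].mean(axis=0))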

Spoke 10

Quantum Computing

Led by the Politecnico di Milano, co-led by the University of Padova.

Quantum computing has enormous potential in terms of speed and data handling. Quantum computers are capable of solving complex problems that no classical computer could ever solve in a timely manner, a capability known as “quantum supremacy”.

However, despite rather high expectations regarding possible areas of application and the new business models that may result from it, full technological maturity is still far off. Overcoming certain challenges linked to the reliability of the components and to the complexity of programming, issues that must be solved to allow the practical use of quantum computers, is crucial.

Spoke 10 operates along three lines of activity: the first involves the creation of applications that use quantum computers as accelerators to solve otherwise intractable problems; the second focuses on the development of hardware and software tools that facilitate the programming of quantum computers and their interoperability with traditional computers; the objective of the third is the design of large and scalable quantum computers.

The exploitation of high-end, groundbreaking computing technologies is essential for observational and theoretical research in Astrophysics. INAF will explore the application of Quantum Computing in Astrophysics by probing the feasibility and suitability of representative problems, experimenting with quantum technologies on selected algorithms implemented to effectively exploit such solutions.
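As a purely pedagogical illustration of how a quantum algorithm can first be prototyped classically, the NumPy sketch below simulates the state vector of a two-qubit circuit (a Hadamard followed by a CNOT, preparing a Bell state) and samples measurement outcomes; it is not INAF's actual workflow, only an example of the kind of small, well-understood algorithm on which feasibility studies can start before moving to quantum SDKs or real hardware.

    # Pedagogical sketch: classical state-vector simulation of a two-qubit
    # circuit (Hadamard + CNOT -> Bell state), with sampled measurements.
    # Illustrates how quantum algorithms can be prototyped classically before
    # targeting real quantum hardware or a dedicated SDK.
    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)     # Hadamard gate
    I2 = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0],                   # control = qubit 0
                     [0, 1, 0, 0],                   # target  = qubit 1
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    state = np.zeros(4)
    state[0] = 1.0                                   # start in |00>
    state = np.kron(H, I2) @ state                   # H on qubit 0
    state = CNOT @ state                             # entangle: (|00>+|11>)/sqrt(2)

    probs = np.abs(state) ** 2
    rng = np.random.default_rng(7)
    samples = rng.choice(["00", "01", "10", "11"], size=1000, p=probs)
    counts = {k: int(np.sum(samples == k)) for k in ("00", "01", "10", "11")}
    print(counts)   # expected: roughly 500 "00" and 500 "11"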

INAF Innovation Grant Projects

Job Opportunities