
Page 1

The LHC Computing Grid – February 2008

The Worldwide LHC Computing Grid

Dr Ian Bird

LCG Project Leader

15th April 2009

Visit of Spanish Royal Academy of Sciences

Page 2


The LHC Data Challenge

• Once the accelerator is completed, it will run for 10-15 years

• The experiments will produce about 15 million gigabytes (15 petabytes) of data each year (about 20 million CDs!)

• LHC data analysis requires computing power equivalent to ~100,000 of today's fastest PC processors

• This requires many cooperating computer centres, as CERN can only provide ~20% of the capacity (a rough check of these numbers is sketched below)
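A back-of-envelope check of these figures. Only the CD capacity (~700 MB) is assumed; the other numbers come straight from the slide:

```python
# Back-of-envelope check of the LHC data-challenge numbers quoted above.
data_per_year_gb = 15e6      # ~15 million gigabytes per year (from the slide)
cd_capacity_gb = 0.7         # assumed ~700 MB per CD (not stated on the slide)

cds_per_year = data_per_year_gb / cd_capacity_gb
print(f"CDs per year: ~{cds_per_year / 1e6:.0f} million")       # ~21 million, i.e. "about 20 million CDs"

processors_needed = 100_000  # ~100,000 fast PC processors (from the slide)
cern_share = 0.20            # CERN provides ~20% of the capacity (from the slide)
print(f"Processors CERN can provide:  ~{processors_needed * cern_share:,.0f}")
print(f"Processors needed elsewhere:  ~{processors_needed * (1 - cern_share):,.0f}")
```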


Page 3


Solution: the Grid

• Use the Grid to unite computing resources of particle physics institutions around the world

The World Wide Web provides seamless access to information that is stored in many millions of different geographical locations

The Grid is an infrastructure that provides seamless access to computing power and data storage capacity distributed over the globe


Page 4


How does the Grid work?

• It makes multiple computer centres look like a single system to the end-user

• Advanced software, called middleware, automatically finds the data the scientist needs and the computing power to analyse it.

• Middleware balances the load across the different resources. It also handles security, accounting, monitoring and much more (a simplified matchmaking sketch follows).
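A purely illustrative sketch of the matchmaking idea, not the actual LCG middleware (whose services include workload management, data catalogues, information systems and security). Site names and numbers are invented for the example: among the sites that hold the required dataset, route the job to the one with the most spare capacity.

```python
# Illustrative grid-style matchmaking: send a job to a site that holds the
# required dataset and currently has the most free CPUs. All values invented.

sites = {
    "CERN":   {"free_cpus": 500,  "datasets": {"run123_raw", "run124_raw"}},
    "PIC":    {"free_cpus": 1200, "datasets": {"run123_raw"}},
    "GridKa": {"free_cpus": 300,  "datasets": {"run124_raw"}},
}

def match_job(dataset: str) -> str:
    """Return the least-loaded site that holds `dataset`."""
    candidates = [name for name, s in sites.items() if dataset in s["datasets"]]
    if not candidates:
        raise RuntimeError(f"no site holds {dataset}")
    return max(candidates, key=lambda name: sites[name]["free_cpus"])

print(match_job("run123_raw"))  # -> "PIC": most free CPUs among sites holding the data
```

Real middleware does far more (replica catalogues, certificate-based security, accounting, retries), but at its core the scheduling decision is this kind of match between job requirements and published site capacity.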


Page 5

View of the ATLAS detector (under construction)

150 million sensors deliver data … 40 million times per second
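A rough order-of-magnitude estimate shows why this raw stream cannot simply be recorded and has to be reduced by online trigger systems before anything reaches the computing grid. The one-byte-per-sensor figure is an illustrative assumption, not a detector specification:

```python
# Order-of-magnitude estimate of the raw ATLAS data stream.
sensors = 150e6            # ~150 million sensors (from the slide)
crossings_per_s = 40e6     # data delivered ~40 million times per second (from the slide)
bytes_per_sensor = 1       # illustrative assumption only

raw_rate = sensors * crossings_per_s * bytes_per_sensor   # bytes per second
print(f"Raw rate: ~{raw_rate / 1e15:.0f} PB/s")           # ~6 PB/s, far too much to store
# Online trigger systems discard all but a small fraction of collisions,
# bringing the recorded rate down to the GB/s scale handled by Tier 0.
```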


Page 6

Tier 0 at CERN: Acquisition, First pass reconstruction, Storage & Distribution


Data rate: 1.25 GB/sec (ions)
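What that rate means in accumulated volume, assuming continuous running (which in practice only holds during data-taking periods):

```python
# Daily volume at the quoted 1.25 GB/s rate.
rate_gb_per_s = 1.25
seconds_per_day = 24 * 3600
print(f"~{rate_gb_per_s * seconds_per_day / 1000:.0f} TB per day")  # ~108 TB per day
```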


Page 7


LHC Computing Grid project (LCG)

• More than 140 computing centres

• 12 large centres for primary data management: CERN (Tier-0) and eleven Tier-1s

• 38 federations of smaller Tier-2 centres

– 7 Tier-2 sites in Spain, supporting ATLAS, CMS and LHCb

• 35 countries involved (the tier structure is sketched below)
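Purely as an illustration of the tier structure just listed, the hierarchy can be pictured as a nested mapping. The Tier-0 and Tier-1 roles paraphrase these slides; the Tier-2 role is the typical one and is not stated here. Only one example site is shown per tier (PIC being the Spanish Tier-1 referred to on this slide):

```python
# Illustrative sketch of the LCG tier hierarchy; the site lists are truncated
# examples, not the full topology of 140+ centres.

lcg_tiers = {
    "Tier-0": {"CERN": "acquisition, first-pass reconstruction, storage & distribution"},
    "Tier-1": {"PIC (Barcelona)": "primary data management"},                 # 1 of 11 Tier-1s
    "Tier-2": {"Spanish Tier-2 federation": "simulation and end-user analysis"},  # 1 of 38 federations
}

for tier, sites in lcg_tiers.items():
    for name, role in sites.items():
        print(f"{tier:7s}  {name:25s}  {role}")
```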


(Map: PIC, the Spanish Tier-1 centre)

Page 8

Data Transfer

• Data distribution from CERN to Tier-1 sites
– Target rates easily exceeded for extended periods
– For all experiments and to all Tier-1 sites
– Also between Tier-1 and Tier-2 sites

Tier-2s and Tier-1s are inter-connected by the general-purpose research networks. Any Tier-2 may access data at any Tier-1.

(Diagram: the Tier-1 centres (TRIUMF, ASCC, FNAL, BNL, Nordic, CNAF, SARA, PIC, RAL, GridKa, IN2P3), each connected to multiple Tier-2 sites)

Page 9

Grid activity

• Workload continues to increase

– At the scale needed for physics

• The distribution of work across Tier-0, Tier-1 and Tier-2 really illustrates the importance of the grid system

– The Tier-2 contribution is around 50%; more than 85% of the work is done outside CERN (the implied split is worked out below)
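What those two figures imply for the remaining share, as simple arithmetic (the Tier-1 number is derived, not quoted on the slide):

```python
# Implied split of the delivered work between the tiers.
tier2_share = 0.50       # Tier-2 contribution ~50% (from the slide)
external_share = 0.85    # >85% of the work is done outside CERN (from the slide)

cern_share = 1 - external_share              # at most ~15% at CERN
tier1_share = external_share - tier2_share   # remainder, delivered by the Tier-1 centres
print(f"CERN:   <= {cern_share:.0%}")   # <= 15%
print(f"Tier-1: ~  {tier1_share:.0%}")  # ~ 35%
print(f"Tier-2: ~  {tier2_share:.0%}")  # ~ 50%
```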

(Charts: workload delivered by Tier-2 sites and by Tier-0 + Tier-1 sites)

Page 10

First events

Page 11

Impact of the LHC Computing Grid in Europe

• LCG has been the driving force for the European multi-science Grid EGEE (Enabling Grids for E-sciencE)

• EGEE is now a global effort, and the largest Grid infrastructure worldwide

• Co-funded by the European Commission (total cost ~170 M€ over 6 years, of which ~100 M€ EU funding)

• EGEE already used for >100 applications, including…

Medical imaging, education and training, bio-informatics


Page 12

Enabling Grids for E-sciencE

(From: The EGEE project, Bob Jones, EGEE'08, 22 September 2008)

EGEE-III

Main objectives:
– Expand/optimise existing EGEE infrastructure, include more resources and user communities
– Prepare migration from a project-based model to a sustainable federated infrastructure based on National Grid Initiatives

Flagship Grid infrastructure project co-funded by the European Commission

Duration: 2 years. Consortium: ~140 organisations across 33 countries. EC co-funding: 32 M€.

Page 13


EGEE Achievements - Infrastructure

• EGEE production Grid infrastructure: steady growth over the lifetime of the project
• Improved reliability

How can we reduce the effort required to operate this expanding infrastructure?

Page 14


EGEE Achievements - Applications

• >270 VOs from several scientific domains:
– Astronomy & Astrophysics
– Civil Protection
– Computational Chemistry
– Computational Fluid Dynamics
– Computer Science/Tools
– Condensed Matter Physics
– Earth Sciences
– Fusion
– High Energy Physics
– Life Sciences

• Further applications under evaluation

Applications have moved from testing to routine and daily usage

~80-95% efficiency

How do we match the expectations of the growing user communities? Will we have enough computing resources to satisfy their needs?

Page 15


Archeology, Astronomy, Astrophysics, Civil Protection, Computational Chemistry, Earth Sciences, Finance, Fusion, Geophysics, High Energy Physics, Life Sciences, Multimedia, Material Sciences, …

>250 sites, 48 countries, >50,000 CPUs, >20 PetaBytes, >10,000 users, >150 VOs, >150,000 jobs/day

EGEE-III partners in Spain: CESGA, CSIC, UNIZAR, UCM, IFAE/PIC, CIEMAT, UPV, RED.ES

Page 16

Sustainability

• Need to prepare for permanent Grid infrastructure
• Ensure a high quality of service for all user communities
• Independent of short project funding cycles
• Infrastructure managed in collaboration with National Grid Initiatives (NGIs)
• European Grid Initiative (EGI)
