
The LCG Applications Area

Torre Wenaus, BNL/CERN, LCG Applications Area Manager

http://cern.ch/lcg/peb/applications

XXII HTASC Meeting, September 27, 2002


Outline

- LCG and Applications Area introduction
- Project management: organization, planning, resources
- Applications area activities
  - Software Process and Infrastructure (SPI)
  - Math libraries
  - Persistency project (POOL)
  - Architecture blueprint RTAG
- Conclusion


LHC Software – General Principles

- OO paradigm using C++
  - Modularity makes collaborative development easier
  - 'Implementation hiding' eases maintenance and evolution (a minimal sketch follows below)
- Take software engineering seriously
  - Treat it as a project: planning, milestones, resource management
  - Build for adaptability over a ~20yr project lifetime
  - Careful management of software configuration
  - Reproducibility of analyses
- Employ common solutions
  - HENP community-developed tools; open source; commercial
  - Minimize in-house development
  - But when necessary, develop common solutions (the LCG Project)
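As an illustration of the 'implementation hiding' principle, here is a minimal C++ sketch using the pimpl idiom. The TrackFitter class and its trivial "fit" are hypothetical examples, not LCG code; the point is only that the concrete implementation can evolve without touching client code.

    // Minimal single-file sketch of "implementation hiding" (pimpl idiom).
    // Clients of TrackFitter depend only on its interface; the concrete
    // algorithm can change without forcing client recompilation.
    #include <iostream>
    #include <numeric>
    #include <vector>

    class TrackFitter {
    public:
        TrackFitter();
        ~TrackFitter();
        double fit(const std::vector<double>& hits) const;
    private:
        class Impl;   // defined out of line; invisible to clients
        Impl* impl_;
    };

    // --- normally in a separate .cc file --------------------------------
    class TrackFitter::Impl {
    public:
        double fit(const std::vector<double>& hits) const {
            // Placeholder "fit": average of the hit positions.
            if (hits.empty()) return 0.0;
            return std::accumulate(hits.begin(), hits.end(), 0.0) / hits.size();
        }
    };

    TrackFitter::TrackFitter() : impl_(new Impl) {}
    TrackFitter::~TrackFitter() { delete impl_; }
    double TrackFitter::fit(const std::vector<double>& hits) const { return impl_->fit(hits); }

    int main() {
        std::vector<double> hits;
        hits.push_back(1.0); hits.push_back(2.0); hits.push_back(3.0);
        TrackFitter fitter;
        std::cout << "fit result: " << fitter.fit(hits) << std::endl;
        return 0;
    }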


LCG Areas of Work

- Computing System: Physics Data Management; Fabric Management; Physics Data Storage; LAN Management; Wide-area Networking; Security; Internet Services
- Grid Technology: Grid middleware; standard application services layer; inter-project coherence/compatibility
- Applications Support & Coordination: application software infrastructure (libraries, tools); object persistency and data management tools; common frameworks (simulation, analysis, ..); adaptation of physics applications to the Grid environment; Grid tools, portals
- Grid Deployment: Data Challenges; Grid operations; network planning; regional centre coordination; security & access policy


The LHC Computing Grid Project Structure

[Organisation chart: the Project Leader reports to the Project Overview Board; the Software and Computing Committee (SC2) defines the work plan, aided by RTAGs; the Project Execution Board (PEB) carries out the work through work packages (WPs).]


Workflow around the organisation chart

[Diagram: workflow between the work packages (WPn), the PEB, the SC2 and the RTAGs. The SC2 issues a mandate to an RTAG, which returns prioritised requirements (~2 months); the PEB turns these into a project plan and updated workplan, with workplan feedback from the SC2; the work packages then produce releases (Release 1, Release 2, ...) on a ~4 month cycle, with status reports and review feedback flowing back over time.]

SC2 Sets the requirements

SC2 approves the workplan

SC2 monitors the status

PEB develops workplan

PEB manages LCG resources

PEB tracks progress


Coherent Architecture

- Applications common projects must follow a coherent overall architecture
- The software needs to be broken down into manageable pieces, i.e. down to the component level
- Component-based, but not a bag of disjoint components
  - Components are designed for interoperability through clean interfaces (a minimal sketch follows below)
  - This does not preclude a common implementation foundation, such as ROOT, for different components
  - The 'contract' in the architecture is to respect the interfaces
  - No hidden communication among components
- The starting point is existing products, not a clean slate
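A minimal sketch of what 'interoperability through clean interfaces' can mean in practice: a client component depends only on an abstract interface, so implementations (including, say, a ROOT- or database-backed one) are interchangeable and all communication is visible in the interface. The IMagneticField/Propagator names are illustrative, not part of the actual blueprint.

    #include <iostream>

    // The 'contract': components talk to each other only through abstract interfaces.
    class IMagneticField {
    public:
        virtual ~IMagneticField() {}
        virtual double fieldAt(double x, double y, double z) const = 0;  // Tesla
    };

    // One implementation; another could be swapped in without the client noticing.
    class UniformField : public IMagneticField {
    public:
        explicit UniformField(double b) : b_(b) {}
        double fieldAt(double, double, double) const { return b_; }
    private:
        double b_;
    };

    // Client component: no hidden communication, no knowledge of the concrete type.
    class Propagator {
    public:
        explicit Propagator(const IMagneticField& field) : field_(field) {}
        double bendingRadius(double pt) const {
            // rough r[m] = pt[GeV] / (0.3 * B[T])
            return pt / (0.3 * field_.fieldAt(0, 0, 0));
        }
    private:
        const IMagneticField& field_;
    };

    int main() {
        UniformField solenoid(4.0);   // hypothetical 4 T solenoid
        Propagator prop(solenoid);
        std::cout << "bending radius for pt=1 GeV: " << prop.bendingRadius(1.0) << " m\n";
        return 0;
    }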


Cohesion Across Projects

[Diagram: the individual projects are bound together by a coherent overall architecture; by standard services, tools, infrastructure and interface glue, code management and support, QA, ...; and by web-based project information, documentation, browsers, build and release access, bug tracking, ...]


Distributed Character of Components

- Persistency framework
  - Naming based on logical filenames; replica catalog and management (a sketch follows below)
- Conditions database
  - Inherently distributed (but configurable for local use)
- Interactive frameworks
  - Grid-aware environment; 'transparent' access to grid-enabled tools and services
- Statistical analysis, visualization
  - Integral parts of the distributed analysis environment
- Framework services
  - Grid-aware message and error reporting, error handling, grid-related framework services
- Distributed production tools
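To make the 'naming based on logical filenames / replica catalog' point concrete, here is a hedged interface sketch: clients refer to data by logical filename (LFN) and a catalog, possibly grid-distributed, resolves it to physical replicas (PFNs). The names and LFN/PFN strings are invented for illustration; the real POOL/EDG interfaces differ in detail.

    #include <cstddef>
    #include <iostream>
    #include <map>
    #include <string>
    #include <vector>

    // Abstract replica catalog: logical filename -> physical filenames.
    class ReplicaCatalog {
    public:
        virtual ~ReplicaCatalog() {}
        virtual std::vector<std::string> replicasOf(const std::string& lfn) const = 0;
    };

    // Trivial in-memory implementation; a grid-based one (e.g. backed by an EDG
    // replica catalog service) would implement the same interface.
    class LocalCatalog : public ReplicaCatalog {
    public:
        void add(const std::string& lfn, const std::string& pfn) { map_[lfn].push_back(pfn); }
        std::vector<std::string> replicasOf(const std::string& lfn) const {
            std::map<std::string, std::vector<std::string> >::const_iterator it = map_.find(lfn);
            return it == map_.end() ? std::vector<std::string>() : it->second;
        }
    private:
        std::map<std::string, std::vector<std::string> > map_;
    };

    int main() {
        LocalCatalog catalog;
        catalog.add("lfn:higgs/run42.root", "rfio://castor.cern.ch/data/run42.root");
        catalog.add("lfn:higgs/run42.root", "file:/scratch/run42.root");

        const std::vector<std::string> pfns = catalog.replicasOf("lfn:higgs/run42.root");
        for (std::size_t i = 0; i < pfns.size(); ++i)
            std::cout << "replica " << i << ": " << pfns[i] << "\n";
        return 0;
    }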


Organisation of activities

[Diagram: the applications area is organised into activity areas; each activity area contains projects, and each project contains work packages, with overall management, coordination, architecture, integration and support provided by the area leader and project leaders.]

Example:
- Activity area: Physics data management
- Projects: Hybrid event store, Conditions DB, ...
- Work Packages: the component breakdown and work plan lead to Work Package definitions; ~1-3 FTEs per WP


Collaboration with experiments

- Direct technical collaboration between experiment participants, IT, ROOT, and new LCG personnel; as far as the technical work goes, project lines should be more important than line management categories
- 'Hosting' of projects in experiments, where appropriate
- Completely open activities and communication
- Issues at all levels dealt with in regular open meetings as much as possible
- Mechanism in place for formally involving experiment architects and dealing with contentious issues: the Architects Forum


Architects Forum (AF)

- Members: experiment architects + applications manager (chair)
- Decides the difficult issues
  - Most of the time the AF will converge on a decision
  - If not, the applications manager takes the decision
  - Such decisions can be accepted or challenged; challenged decisions go to the full PEB, then if necessary to SC2
  - We all abide happily by an SC2 decision
- Forum meetings also cover general current issues and exchange of views
- Forum decisions and actions are documented in public minutes


Planning

- Project breakdown and schedule implemented in a planning tool (XProject); the content is still being developed
- High-level milestones developed and approved:
  - November 2002: alpha release of the hybrid event store (POOL)
  - June 2003: general release of the hybrid event store
  - November 2003: distributed production environment using grid services
  - May 2004: distributed end-user interactive analysis from a Tier 3
  - March 2005: full-function release of the persistency framework
- Current planning activity: develop the level-2 schedule and deliverables in process/infrastructure and persistency; the PEB goal is to have detailed planning complete by the end of the year
- Global long-term workplan outlined last spring
- Individual project workplans being developed as projects are initiated:
  - POOL 2002 workplan submitted and approved
  - Software Process and Infrastructure workplan to SC2 on Oct 11


Overall Workplan Priorities

- Considerations for developing the workplan and priorities:
  - Experiment need and priority
  - Suitability to a common project
  - Timing: when will the conditions be right to initiate a common project?
    - Do established solutions exist in the experiments? Are they open to review, or are they entrenched?
  - Availability of resources
  - Allocation of effort: is there existing effort which would be better spent doing something else?
  - Availability and maturity of associated third-party software, e.g. grid software
- Pragmatism and seizing opportunity: a workplan derived from a grand design would not fit the reality of this project


Top Priorities

1. Establish the process and infrastructure project (Active)

2. Address core areas essential to building a coherent architecture, and develop the architecture
   - Object dictionary (Active)
   - Persistency (Active)
   - Interactive frameworks, analysis tools (also driven by assigning personnel optimally) (Pending)

3. Address priority common project opportunities
   - Driven opportunistically by a combination of experiment need, appropriateness to a common project, and 'the right moment' (existing but not entrenched solutions in some experiments):
     - Detector description and geometry model (RTAG in progress)
   - Driven by need and available manpower:
     - Simulation tools (RTAG in progress)

Initiate: first half 2002


Near term priorities

- Build outward from the core top-tier components
  - Conditions database (Pending, but a common tool exists)
  - Framework services, class libraries (RTAG in progress)
- Address common project areas of less immediate priority
  - Math libraries (Active)
  - Physics packages (RTAG in progress)
- Extend and elaborate the support infrastructure
  - Software testing and distribution (Active)

Initiate: second half 2002

Deliver: fall/late 2002, basic working persistency framework (On schedule)


Medium term

The core components have been addressed, architecture and component breakdown laid out, work begun. Grid products have had another year to develop and mature. Now explicitly address physics applications integration into the grid applications layer.

Distributed production systems. End-to-end grid application/framework for production.

Distributed analysis interfaces. Grid-aware analysis environment and grid-enabled tools.

Some common software components are now available. Build on them.

Lightweight persistency, based on the persistency framework. Release of the LCG benchmarking suite.

Initiate: First half 2003


Long term

Longer-term items waiting for their moment (cf. considerations):

'Hard' ones, perhaps made easier by a growing common software architecture: the event processing framework.

Addressing the evolution of how we write software: OO language usage.

Longer-term needs and capabilities emerging from R&D: advanced grid tools, online notebooks, ...

Initiate: Second half 2003 and later


Resources

- Original manpower-required estimates from the September 2001 proposal; to be completely revisited, but they give some idea of the numbers and foreseen distribution
  - Total: 36 FTEs, of which 23 are new LCG hires
    - Application software infrastructure: 5 FTEs
    - Common frameworks for simulation & analysis: 13 FTEs
    - Support for physics applications: 9 FTEs
    - Physics data management: 9 FTEs
- 'Mountain of manpower will be with us very soon and will leave in 3 years'
  - Tentative project assignments according to abilities
  - Setting up tools and infrastructure
  - 'Some new people could go to an experiment to get up to speed, whilst more experienced people could go from the experiments to the project'
- 'How does the support issue affect the project?'
  - Support is needed for the lifetime of the software
  - Long-term commitments to support the software are needed


Personnel status

- 10 new LCG hires in place and working; 5 more identified, starting between now and December
- Manpower ramp is on schedule
- Contributions from UK, Spain, Switzerland, Germany, Sweden, Israel, US
- We are urgently working on accruing enough scope (via RTAGs) to employ this manpower optimally; everyone is working productively
- ~10 FTEs from IT (DB and API groups) also participating
- ~7 FTEs from experiments (CERN EP and outside CERN) also participating, primarily in the persistency project at present
- Important experiment contributions also in the RTAG process


Activities: Candidate RTAGs by activity area

- Application software infrastructure: software process; math libraries; C++ class libraries; software testing; software distribution; OO language usage; benchmarking suite
- Common frameworks for simulation and analysis: simulation tools; detector description and model; interactive frameworks; statistical analysis; visualization
- Support for physics applications: physics packages; data dictionary; framework services; event processing framework
- Grid interface and integration: distributed analysis; distributed production; online notebooks
- Physics data management: persistency framework; conditions database; lightweight persistency


Candidate RTAG timeline

[Table: candidate RTAG timeline across 02Q1-03Q4, giving each RTAG's estimated duration in months and possible merges: Simulation tools (1), Detector description & model (2), Conditions database (1), Data dictionary (2), Interactive frameworks (2), Statistical analysis (1), Detector & event visualization (2), Physics packages (2), Framework services (2), C++ class libraries (2), Event processing framework, Distributed analysis interfaces (2), Distributed production systems (2), Small scale persistency (1), Software testing (1), Software distribution (1), OO language usage (2), LCG benchmarking suite (1), Online notebooks (2).]


Projects started or starting, prioritized order

- Software process and infrastructure (SPI) – Alberto Aimar
- Persistency framework (POOL) – Dirk Duellmann
- Core framework services (RTAG in progress)
- Simulation (RTAG in progress)
- Detector description (RTAG in progress)
- Event generators (RTAG in progress)
- Analysis tools (RTAG soon)
- Math libraries – Fred James


SPI


LCG SPI Status (Alberto Aimar, IT/API)

- Everything is done in collaboration with LCG and the LCG projects (POOL), the LHC experiments, and big projects (Geant4, ROOT, etc.)
- Currently available or being developed, and used by POOL
- Everything is done assessing what is already available in the laboratory (experiments, IT, etc.) and in free software in general
- Users are involved in the definition of the solutions; everything is public on the SPI web site
- Development of specific tools is avoided; IT services are used
- A workplan is being prepared for the next LCG SC2 committee

Services:
- AFS delivery area
- CVS server
- Build platforms (Linux, Sun, Windows)

Components:
- Code documentation
- Software documentation
- Coding and design guidelines
- CVS organization
- Workpackage testing (a minimal sketch follows below)

Other services and components started:
- Project web
- Memory testing
- Configuration management
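As a hedged illustration of what 'workpackage testing' can look like at its simplest, here is a self-contained, assert-based regression test in plain C++. The majorVersion helper under test is hypothetical, and this does not represent SPI's actual test tooling.

    #include <cassert>
    #include <cstdlib>
    #include <iostream>
    #include <string>

    // Hypothetical helper under test: extract the major version from "X.Y.Z".
    int majorVersion(const std::string& v) {
        return std::atoi(v.substr(0, v.find('.')).c_str());
    }

    // A component-level regression test: run as part of the workpackage build,
    // it aborts (non-zero exit) if behaviour regresses.
    int main() {
        assert(majorVersion("0.1.0") == 0);
        assert(majorVersion("3.02.07") == 3);
        assert(majorVersion("10.0") == 10);
        std::cout << "majorVersion regression test passed\n";
        return 0;
    }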


SPI implementation

Each service/component of the infrastructure has:
- A responsible person in the project
- A similar, simple approach:
  - Define the goal of the component
  - Standard procedures and documentation

Standard procedure:
- Survey the possible/existing solutions in HEP and in free software
- Meet the people expert in, and responsible for, the component in real projects (LCG, experiments, or big projects)
- Discuss and agree/decide/verify a solution
- Present the solution
- Implement the solution
- Use it in the LCG SPI project itself
- Use it in a real project (LCG, an experiment, or a big project)


POOL


Data Persistency RTAG

- Commenced in January, completed April 5
- Mandated to write the product specification for the Persistency Framework for Physics Applications at LHC
- "Collaborative, friendly atmosphere"; "Real effort to define a common product"
- Focused on the architecture of a persistence management service
  - Define components and their interactions in terms of abstract interfaces that any implementation must respect
  - Provide the foundation for a near-term common project reasonably matched to the current capabilities of ROOT, with a relational layer above it


Persistency Project Timeline

- Started officially 19 April, led by Dirk Duellmann (IT/DD)
  - Initially staffed with 1.6 FTE (1 from CMS, 0.6 (Dirk) from IT)
  - MySQL scalability and reliability test
  - Requirement list and experiment deployment plans
- Persistency Workshop 5-6 June at CERN
  - More requirement and implementation discussions
  - Work package breakdown and release sequence proposed
  - Additional experiment resource commitments received
  - A name, POOL: Pool of persistent objects for LHC
- Since the beginning of July
  - Real design discussions started in the work packages; ~5 FTE actively participating since then (LHCb, CMS, ATLAS, IT/DD)
  - Project hardware resources defined and deployed (10 nodes, 2 disk servers)
  - Software infrastructure defined and becoming progressively available
  - Work plan for 2002 presented to SC2 in August and approved
  - Coding!


Hybrid Data Store – Schematic View

[Diagram: schematic view of the hybrid data store. The experiment-specific event model (Event, TrackList, Tracker, Calorimeter, Tracks, HitList, Hits) is handled through a Persistency Manager, which consults an Object Dictionary Service for the experiment-specific object model descriptions. Storage: pass object(s), get ID(s); retrieval: pass ID(s), get object(s). The Persistency Manager uses a Storage Manager (object streaming service, ID-to-file DB, file records, data objects, fopen etc.) and a distributed Locator Service / Replica Manager (Grid) to locate files; a Dataset Locator with a Dataset DB maps a dataset name to its files, and human interaction happens at the dataset level. A code sketch of the storage/retrieval flow follows.]
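The storage/retrieval flow in the schematic (pass objects, get IDs; pass IDs, get objects) can be sketched as below. This is a simplified illustration with an in-memory "storage manager", not the actual POOL API; the class and method names are invented.

    #include <iostream>
    #include <map>
    #include <sstream>
    #include <string>

    // Toy "persistency manager": store an object, get back an ID (token);
    // later pass the ID back and get the object. A real implementation would
    // delegate to a storage manager (e.g. ROOT I/O) and a file/replica catalog.
    class PersistencyManager {
    public:
        std::string store(const std::string& objectData) {
            std::ostringstream id;
            id << "token-" << store_.size();          // opaque identifier handed to the caller
            store_[id.str()] = objectData;
            return id.str();
        }
        std::string retrieve(const std::string& id) const {
            std::map<std::string, std::string>::const_iterator it = store_.find(id);
            return it == store_.end() ? std::string("<not found>") : it->second;
        }
    private:
        std::map<std::string, std::string> store_;    // stands in for files managed by a storage manager
    };

    int main() {
        PersistencyManager pm;
        const std::string id = pm.store("Event{run=1, event=42}");
        std::cout << "stored under " << id << "\n";
        std::cout << "retrieved: " << pm.retrieve(id) << "\n";
        return 0;
    }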


POOL Work Packages

- Storage Manager and Object References
  - ROOT I/O based storage manager and persistent references
  - Capable of storing objects 'foreign' to ROOT ('any' C++ class)
- File Catalog and Grid Integration
  - MySQL, XML and EDG based implementations
- Collections and Metadata
  - Collection implementations for RDBMS and ROOT I/O
- Dictionary and Conversion (a minimal dictionary sketch follows below)
  - Transient and persistent dictionaries
  - Cross-population between the ROOT I/O dictionary and dictionary import/export
- Infrastructure, integration and testing
  - Project-specific development, integration and test infrastructure
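The 'Dictionary and Conversion' work package rests on being able to describe arbitrary ('foreign') C++ classes at run time. Below is a heavily simplified sketch of such a class description; the real POOL/ROOT dictionaries are far richer, and all names here are invented.

    #include <cstddef>
    #include <iostream>
    #include <string>
    #include <vector>

    // A user ("foreign") class with no ROOT base class.
    struct Hit {
        int    channel;
        double energy;
    };

    // Minimal run-time class description: member name, type name and byte offset.
    struct MemberInfo {
        std::string name;
        std::string type;
        std::size_t offset;
    };

    struct ClassInfo {
        std::string             name;
        std::vector<MemberInfo> members;
    };

    // Hand-written dictionary entry; in a real system this would be generated
    // from headers and used by the storage manager to stream the object.
    ClassInfo describeHit() {
        ClassInfo info;
        info.name = "Hit";
        MemberInfo m1 = { "channel", "int",    offsetof(Hit, channel) };
        MemberInfo m2 = { "energy",  "double", offsetof(Hit, energy)  };
        info.members.push_back(m1);
        info.members.push_back(m2);
        return info;
    }

    int main() {
        const ClassInfo info = describeHit();
        std::cout << "class " << info.name << "\n";
        for (std::size_t i = 0; i < info.members.size(); ++i)
            std::cout << "  " << info.members[i].type << " " << info.members[i].name
                      << " @ offset " << info.members[i].offset << "\n";
        return 0;
    }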


Prioritized Experiment Focus of Interest

- ROOT I/O integration with a grid-aware catalog (ALL)
- Transparent navigation (ATLAS/CMS/LHCb)
  - ALICE: maybe, but only in some places
- EDG (ALL), AliEn (ALICE), Magda (ATLAS)
- MySQL as the RDBMS implementation until the first release (ALL)
- Consistency between streaming data and metadata (CMS/ATLAS)
  - At application-defined checkpoints during a job
- Early separation of persistent and transient dictionaries (ATLAS/LHCb)
- Initial release supports persistency for non-TObjects (ATLAS/LHCb) without changes to user class definitions
- Support for shallow (catalog-only) data copies (CMS), formerly known as cloned federations
- Support for deep (extracted part-of-object-hierarchy) copies (CMS)


Release Sequence

- End September - V0.1 - Basic Navigation
  - All core components for navigation exist and interoperate: StorageMgr, Refs & CacheMgr, Dictionary, FileCatalog (see the Ref sketch below)
  - Some remaining simplifications: assume TObject on read/write (simplified conversion)
- End October - V0.2 - Collections
  - First collection implementation integrated; supports implicit and explicit collections on either an RDBMS or ROOT I/O
  - Persistency for foreign classes working: persistency for non-TObject classes without the need for user code instrumentation
  - EDG/Globus FileCatalog integrated
- End November - V0.3 - Meta Data & Query (external release)
  - Annotation and query added for event, event collection and file based metadata
  - Early testing & evaluation of this release by ATLAS, CMS and LHCb anticipated
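'Refs & CacheMgr' refers to persistent references that load the pointed-to object on demand. The template below is only a toy illustration of that lazy-navigation idea; it is not the POOL Ref API, and ownership handling is deliberately omitted.

    #include <iostream>
    #include <string>

    // Toy loader interface: given a token, produce the object (in POOL this
    // would go through the cache manager and storage manager).
    template <class T>
    class Loader {
    public:
        virtual ~Loader() {}
        virtual T* load(const std::string& token) = 0;
    };

    // Toy persistent reference: remembers a token, reads the object the first
    // time it is dereferenced ("navigation"), then caches the pointer.
    template <class T>
    class Ref {
    public:
        Ref(Loader<T>& loader, const std::string& token)
            : loader_(&loader), token_(token), object_(0) {}
        T* operator->() {
            if (!object_) object_ = loader_->load(token_);   // lazy load on first use
            return object_;
        }
    private:
        Loader<T>*  loader_;
        std::string token_;
        T*          object_;   // cached; ownership handling omitted in this sketch
    };

    // --- tiny demo -------------------------------------------------------
    struct Track { double pt; };

    class DemoLoader : public Loader<Track> {
    public:
        Track* load(const std::string& token) {
            std::cout << "loading " << token << " from storage\n";
            Track* t = new Track;   // leaked in this sketch for brevity
            t->pt = 7.5;
            return t;
        }
    };

    int main() {
        DemoLoader loader;
        Ref<Track> ref(loader, "token-0");
        std::cout << "pt = " << ref->pt << "\n";   // triggers the load
        std::cout << "pt = " << ref->pt << "\n";   // cached, no second load
        return 0;
    }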


POOL V0.1 – End September

- Main goal: navigation for TObject (ROOT object) instances operational
  - All core components in place; at least one implementation of each component operational
  - Navigation for non-ROOT objects in the next release
  - Complete the first release cycle!
- Expect to complete this month or in the first days of next month
  - Interface discussion finished
  - A large fraction of the implementation code already exists and is in the repository
  - Remaining work should be mainly component integration...


POOL V0.1 Release Steps & Schedule

- Complete/update component documentation - by 15th Sept. (Completed Sep 17)
  - POOL (internal) component descriptions: what will be done in this release? what is still left for later?
  - Documents are announced on the POOL list and appear on the POOL web site
  - Version numbers for any external packages are fixed (ROOT, MySQL, MySQL++, ...)
- Component code freeze for the release candidate - by 25th Sept. (Imminent)
  - All component code tagged in CVS
  - All documented features are implemented and have a test case
  - Compiles at least on the highest priority release platform (= Linux RH 7.2 and gcc 2.95.2?)
  - Survives regression tests at the component level
- System & integration testing, and later packaging - by 30th Sept. (Next week)
  - Any remaining platform porting
  - Integration tests & "end user" examples run on all platforms
- Code review - early October
- Start planning the next release cycle


Math libraries


Math Libraries

- Many different libraries in use
  - General purpose (NAG-C, GSL, ...)
  - HEP-specific (CLHEP, ZOOM, ROOT)
  - Modern libraries dealing with matrices and vectors (Blitz++, Boost, ...)
- Financial considerations
  - NAG-C: 300 kCHF/yr initially, dropping to 100 kCHF/yr after 4-5 years
- Comparative evaluation of NAG-C and GSL is difficult and time-consuming (a small GSL usage sketch follows below)
  - Collect information on what is used/needed
  - Evaluation of functionality and performance is very important
- HEP-specific libraries expected to evolve to meet future needs
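For a flavour of the kind of general-purpose functionality being compared, here is a small example of adaptive numerical integration with GSL's QAGS routine (compile and link with -lgsl -lgslcblas). This is only an illustration of GSL usage, not an LCG evaluation result.

    #include <cmath>
    #include <cstdio>
    #include <gsl/gsl_integration.h>

    // Integrand: f(x) = exp(-x*x), integrated over [0, 5] (close to sqrt(pi)/2).
    static double gaussian(double x, void* /*params*/) {
        return std::exp(-x * x);
    }

    int main() {
        gsl_integration_workspace* w = gsl_integration_workspace_alloc(1000);

        gsl_function f;
        f.function = &gaussian;
        f.params   = 0;

        double result = 0.0, abserr = 0.0;
        gsl_integration_qags(&f, 0.0, 5.0, 1e-10, 1e-10, 1000, w, &result, &abserr);

        std::printf("integral = %.10f  (estimated error %.2e)\n", result, abserr);
        gsl_integration_workspace_free(w);
        return 0;
    }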


Math library recommendations & status

- Establish a support group to provide advice and information about existing libraries, and to identify and develop new functionality
  - Group to be established in October
- Experiments should specify the libraries and modules they use
  - Only LHCb has provided this information so far
- A detailed study should be undertaken to assess the needed functionality and how to provide it, particularly via free libraries such as GSL
  - A group in India is undertaking this study; the agreement was just signed


Blueprint RTAG


Applications Architectural Blueprint RTAG

Preamble: without some overall view of LCG applications, the results from individual RTAGs, and thus the LCG work, may not be optimally coherent. Hence the need for an overall architectural ‘blueprint’. This blueprint will then serve to spawn other RTAGs leading to specific proposals, and ensuring some degree of overall consistency.

Mandate: define the architectural 'blueprint' for LCG applications:
- Define the main architectural domains ('collaborating frameworks') of LHC experiments and identify their principal components. (For example: Simulation is such an architectural domain; Detector Description is a component which figures in several domains.)
- Define the architectural relationships between these 'frameworks' and components, including Grid aspects, identify the main requirements for their inter-communication, and suggest possible first implementations. (The focus here is on the architecture of how major 'domains' fit together, not the detailed architecture within a domain.)
- Identify the high-level deliverables and their order of priority.
- Derive a set of requirements for the LCG.


Meetings and Presentations

- June 12: Pere Mato, Rene Brun
- June 14: Rene Brun, Vincenzo Innocente
- July 3: Torre Wenaus
- July 8: Pere Mato, Vincenzo Innocente
- July 12: Pere Mato, Vincenzo Innocente
- July 23: Paul Kunz
- August 5: Tony Johnson
- August 6: Bob Jacobsen, Lassi Tuura
- August 8
- August 9: Andrea Dell’Acqua
- September 9
- September 10
- October 1


Report Outline

- Executive summary
- SC2 mandate
- Response of the RTAG to the mandate
- Blueprint scope and requirements
- Use of ROOT (as outlined in the last status report)
- Blueprint architecture design precepts: high-level architectural issues and approaches
- Blueprint architectural elements: specific architectural elements, suggested patterns, examples
- Domain decomposition
- Metrics (quality, usability, maintainability, modularity)
- Schedule and resources
- Specific recommendations to the SC2
- Recommended reading list


Requirements

- Lifetime
- Languages
- Distributed applications
- TGV and airplane work
- Modularity of components
- Use of interfaces
- Interchangeability of implementations
- Integration
- Design for end-users
- Re-use of existing implementations
- Software quality at least as good as any LHC experiment
- Platforms


How will ROOT be used?

- LCG software will be developed as a ROOT user
- This will draw on a great ROOT strength: users are listened to very carefully!
  - The ROOT team has been very responsive to needs for new and extended functionality coming from the persistency effort
- Drawing on ROOT in a user-provider relationship matches the reality of the ROOT development model of a very small number of 'empowered' developers
  - The ROOT development team is small, and they like it that way
- While ROOT will be used at the core of much LCG software for the foreseeable future, we agree there needs to be a line with ROOT proper on one side and 'LCG software' on the other
- Despite the user-provider relationship, LCG software may nonetheless place architectural, organizational or other demands on ROOT


Software Structure

[Diagram: layered software structure - foundation libraries at the base; the basic framework and optional libraries above them; specialized frameworks (simulation framework, reconstruction framework, visualization framework, other frameworks, ...) built on the basic framework; applications at the top.]


Blueprint architecture design precepts

- Software structure: foundation; basic framework; specialized frameworks
- Component model: APIs, collaboration ('master/slave', 'peer-to-peer'), physical/logical module granularity, plug-ins, abstract interfaces, composition vs. inheritance, ...
- Service model: uniform, flexible access to functionality
- Object models: dumb vs. smart objects, enforced policies with run-time checking, a clear and bullet-proof ownership model (see the sketch below)
- Distributed operation
- Dependencies
- Interface to external components: generic adapters, ...
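A small sketch of the 'composition vs. inheritance' and 'clear ownership' precepts: the framework owns its services, and an algorithm is composed with (borrows) the services it needs rather than inheriting from them. All class names here are illustrative, not blueprint definitions.

    #include <iostream>
    #include <string>

    // A service with a narrow interface.
    class MessageService {
    public:
        virtual ~MessageService() {}
        virtual void report(const std::string& msg) = 0;
    };

    class CoutMessageService : public MessageService {
    public:
        void report(const std::string& msg) { std::cout << "[msg] " << msg << "\n"; }
    };

    // Composition: the algorithm holds a non-owning reference to the service it
    // uses. It does not inherit from MessageService and never deletes it.
    class PedestalAlgorithm {
    public:
        explicit PedestalAlgorithm(MessageService& msg) : msg_(msg) {}
        void execute() { msg_.report("pedestal subtraction done"); }
    private:
        MessageService& msg_;
    };

    // Ownership is concentrated in one place: the framework creates and destroys services.
    class Framework {
    public:
        Framework() : msgSvc_(new CoutMessageService) {}
        ~Framework() { delete msgSvc_; }
        MessageService& messageService() { return *msgSvc_; }
    private:
        MessageService* msgSvc_;   // owned by the framework, borrowed by algorithms
    };

    int main() {
        Framework fw;
        PedestalAlgorithm alg(fw.messageService());
        alg.execute();
        return 0;
    }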


Blueprint Architectural Elements

- Object dictionary and whiteboard
- Component bus
- Scripting language (ROOT CINT and Python both available)
- Component configuration
- Basic framework services
  - Framework infrastructures: creation of objects (factories), lifetime, multiplicity and scope (singleton, multiton, smart pointers), communication & discovery (e.g. registries), ... (a registry/factory sketch follows below)
  - Core services: component management, incident management, monitoring & reporting, GUI manager, exception handling, ...
- System services
- Foundation and utility libraries
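A hedged sketch of the 'factories' and 'registries' idea: components register a factory function under a name, and the framework creates instances by name, which is also the basis of a plug-in mechanism. The registry shown is deliberately minimal and not any specific LCG design.

    #include <iostream>
    #include <map>
    #include <string>

    class Component {
    public:
        virtual ~Component() {}
        virtual std::string name() const = 0;
    };

    // Registry mapping a component name to a factory function.
    class ComponentRegistry {
    public:
        typedef Component* (*Factory)();
        void registerFactory(const std::string& name, Factory f) { factories_[name] = f; }
        Component* create(const std::string& name) const {
            std::map<std::string, Factory>::const_iterator it = factories_.find(name);
            return it == factories_.end() ? 0 : it->second();
        }
    private:
        std::map<std::string, Factory> factories_;
    };

    // A concrete component and its factory (in a plug-in this would live in a shared library).
    class HistogramService : public Component {
    public:
        std::string name() const { return "HistogramService"; }
    };
    Component* makeHistogramService() { return new HistogramService; }

    int main() {
        ComponentRegistry registry;
        registry.registerFactory("HistogramService", &makeHistogramService);

        Component* c = registry.create("HistogramService");   // discovery + creation by name
        if (c) std::cout << "created " << c->name() << "\n";
        delete c;
        return 0;
    }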


[Diagram: example component assembly. Specialized domains (EventGeneration, DetectorSimulation, Reconstruction, Analysis, Persistency, InteractiveServices, GridServices) contain components such as EvtGen, Engine, Geometry, Event Model, Algorithms, Calibration, Fitter, StoreMgr, FileCatalog, NTuple, GUI, Modeler, Scripting, Scheduler, Monitor and PluginMgr; they sit on Core Services with the Dictionary and Whiteboard, on Foundation and Utility Libraries, and on external packages (ROOT, GEANT4, DataGrid, Python, Qt, MySQL).]


Domain Decomposition

- Object dictionary and object model
- Persistency and data management
- Event processing framework
- Event model
- Event generation
- Detector simulation
- Detector geometry and materials
- Trigger/DAQ
- Event reconstruction
- Detector calibration
- Scripting, interpreter
- GUI toolkits
- Graphics
- Analysis tools
- Math libraries and statistics
- Job management
- Core services
- Foundation and utility libraries
- Grid middleware interfaces

In the slide's colour coding, brown marks domains at least partly foreseen as common projects; grey marks those not foreseen as common projects.


Recommendations

Recommendations taking shape in the report so far...
- Aspects of ROOT development of expected importance for LCG software are listed
- Initiation of a common project on core services
- Implementation of Python as the interactive/scripting environment and 'component bus' (as an optional part of the architecture); an embedding sketch follows below
- Adopt the full STL
- Review CLHEP quickly, with a view to repackaging it as discrete components and deciding which components the LCG employs and develops
- Adopt AIDA (several questions surrounding this are still to be clarified)
- Adopt Qt as the GUI library
- Support standard Java compiler(s) and associated needed tools
- Develop a clear process (and policies, criteria) for adopting third-party software
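To illustrate the 'Python as component bus' recommendation, below is a minimal sketch of embedding the Python interpreter in a C++ application using the standard Python/C API (link against the Python library). How LCG components would actually be exposed to Python, e.g. via generated bindings, is not shown and is an open design question in this sketch.

    #include <Python.h>

    // Minimal embedding: start the interpreter, let a script "glue" components
    // together, shut the interpreter down. Real bindings for C++ components
    // (dictionaries, wrapper generators) are beyond this sketch.
    int main() {
        Py_Initialize();

        PyRun_SimpleString(
            "print 'hello from the embedded interpreter'\n"   // Python 2 syntax, as in 2002
            "value = 6 * 7\n"
            "print 'component bus says', value\n");

        Py_Finalize();
        return 0;
    }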


Four experiments, Four Viewpoints

- A lot of accord between ATLAS, CMS and LHCb on the architecture emerging in the blueprint
  - There are points of difference, but not fundamental ones
  - Differences are between CMS and LHCb; no daylight between LHCb and ATLAS
- The ALICE point of view is distinct!
  - ALICE sees the planned work as primarily a duplication of work already done over the last 7 years and available in ROOT
  - But ALICE and ROOT are prepared to work in the framework of the LCG software being a customer of ROOT, with ALICE contributing substantially to ROOT
- We do not have full accord among the four experiments, but we have come to a plan that should yield a good, productive working relationship among all


Blueprint RTAG Status

- Final report for the Oct 11 SC2 meeting
- The report draft (~27 pages) is close to complete
- Plan to invite a few people with a 'physics analysis user, non-expert in software' perspective to review the draft and give comments


Conclusion

- So far so good!
- Good engagement and support from the experiments at all stages
  - SC2 and RTAG participation, project participation
  - Use of LCG software written into experiment planning
  - Now the LCG has to deliver!
- LCG hires on track and contributing at a substantial level
- IT personnel involved: fully engaged in the DB group, ramping up in the API group
- EP/SFT group being established Oct 1 to host the LCG applications software activity in EP
- Physical gathering of LCG applications participants in building 32 has begun
- Progress is slower than we would like, but the direction and approach seem to be OK
- First product deliverable, POOL, is on schedule


Supplemental


Detector Geometry and Materials Description RTAG

Write the product specification for detector geometry and materials description services.

Specify scope: e.g. Services to define, provide transient access to, and store the geometry and materials descriptions required by simulation, reconstruction, analysis, online and event display applications, with the various descriptions using the same information source

Identify requirements, including end-user needs such as ease and naturalness of use of the description tools, readability, and robustness against errors, e.g. provision for named constants and derived quantities

Explore commonality of persistence requirements with conditions data management

Identify where experiments have differing requirements and examine how to address them within common tools

Address migration from current tools


Physics Generators RTAG

There are several areas common to the LHC experiments involving the modeling of the physics for the LHC embodied in Monte Carlo (MC) generators. To best explore the common solutions needed and how to engage the HEP community external to the LCG, it is proposed that an RTAG be formed to address:

How to maintain a common code repository for the generator code and related tools such as PDFLIB.

The development or adaptation of generator-related tools (e.g. HepMC) for LHC needs.

How to provide support for the tuning and the evaluation of the Monte Carlo generators and for their maintenance. The physics evaluation is presently being addressed by the MC4LHC group, which is a joint effort of CERN/TH and of the four experiments.

The integration of the Monte Carlo generators into the experimental software frameworks.

The structure of possible forums to facilitate interaction with the distributed external groups who provide the Monte Carlo generators.


Simulation RTAG Mandate (1)

Define the requirements and a validation procedure for the physics processes as needed by the LHC experiments.

Processes should include electromagnetic, hadronic, and the various backgrounds.

The requirements for the physics models and the energy range they should cover should be addressed

Provide guidelines on performance, memory, and speed.


Simulation RTAG Mandate (2)

Study the requirements for integrating the simulation engines (i.e. Geant4, FLUKA, etc.) in the application programs. Topics to be addressed should include:
- Mechanisms for specifying and configuring the physics processes
- Mechanisms for interfacing to the physics generators
- Mechanisms for keeping track of MC truth information
- Mechanisms for specifying the detector geometry
- Mechanisms for outputting event data
- Support for general simulation services such as transport mechanisms, shower parameterisations, and for an inverse tracking and reconstruction package "a la GEANE"
- Other issues related to interfacing to general services of the various software frameworks, such as event visualisation, job configuration, user interfaces, etc.