Management & Coordination
Paul Avery, Rick Cavanaugh (University of Florida, [email protected])
Ian Foster, Mike Wilde (University of Chicago, Argonne)
GriPhyN NSF Project Review, 29–30 January 2003, Chicago
29 Jan 2003 | Mike Wilde, University of Chicago | [email protected]
GriPhyN Management
• Paul Avery (Florida) – co-Director
• Ian Foster (Chicago) – co-Director
• Mike Wilde (Argonne) – Project Coordinator
• Rick Cavanaugh (Florida) – Deputy Coordinator
Management and Coordination
• Challenges
• Management structure
• External advisory committee
• How we work
• How we are coordinated
• Meetings
• Managing and tracking progress
• External collaborations
• Summary
GriPhyN Project Challenges
• We balance and coordinate:
– CS research with “goals, milestones & deliverables”
– GriPhyN schedule, priorities & risks with those of the 4 experiments
– General tools developed by GriPhyN with specific tools developed by the 4 experiments
– Data Grid design, architecture & deliverables with those of other Grid projects
• Appropriate balance requires tight management, close coordination, and trust
• We have (so far) met these challenges, but doing so requires constant attention and good will
GriPhyN Management (organization chart)
• Project Directors: Paul Avery, Ian Foster
• Project Coordination: Mike Wilde, Rick Cavanaugh (Deputy)
• External Advisory Committee
• External links: Physics Experiments; Internet2; DOE Science; NSF PACIs; EDG, LCG & other Grid projects; iVDGL (Rob Gardner)
• Outreach/Education: Manuela Campanelli
• Industrial Connections: Ian Foster / Paul Avery
• Architecture: Carl Kesselman
• VDT Development (Coord.: M. Livny)
– Requirements, Definition & Scheduling (Miron Livny)
– Integration, Testing, Documentation, Support (Alain Roy)
– Globus Project & NMI Integration (Carl Kesselman)
• CS Research (Coord.: I. Foster)
– Virtual Data (Mike Wilde)
– Request Planning & Scheduling (Ewa Deelman)
– Execution Management (Miron Livny)
– Measurement, Monitoring & Prediction (Valerie Taylor)
• Applications (Coord.: R. Cavanaugh)
– ATLAS (Rob Gardner)
– CMS (Rick Cavanaugh)
– LIGO (Albert Lazzarini)
– SDSS (Alexander Szalay)
• Inter-Project Coordination: R. Pordes
– HICB (Larry Price)
– HIJTB (Carl Kesselman)
– PPDG (Ruth Pordes)
– TeraGrid, NMI, etc. (Carl Kesselman)
– International (EDG, etc.) (Ruth Pordes)
External Advisory Committee
• Members
– Fran Berman (SDSC Director)
– Dan Reed (NCSA Director)
– Joel Butler (former head, FNAL Computing Division)
– Jim Gray (Microsoft)
– Bill Johnston (LBNL, DOE Science Grid)
– Fabrizio Gagliardi (CERN, EDG Director)
– David Williams (former head, CERN IT)
– Paul Messina (former CACR Director)
– Roscoe Giles (Boston U, NPACI-EOT)
• Met with us 3 times: 4/2001, 1/2002, 1/2003
– Extremely useful guidance on project scope & goals
How We Work (1)
• System architecture & Virtual Data Toolkit serve as two overarching organizational mechanisms
• Project activities are all defined in relation to these organizing principles:
– Research: explore new techniques to guide evolution of the system architecture and VDT
– Development: construct VDT software
– Experimentation: deploy, apply & evaluate VDT software and/or new techniques in the context of application challenges
• Intimate coordination with experiments for requirements, evaluation, dissemination
How We Work (2)
(Flow diagram: Computer Science Research ↔ Virtual Data Toolkit ↔ Partner Physics Projects → Larger Science Community. Techniques & software flow from CS research into the VDT; requirements, prototyping & experiments flow back. Partner physics projects take the VDT into production deployment, and technology transfer carries it to the larger science community. The VDT builds on the Globus, Condor, NMI, EU DataGrid, and PPDG communities. Other linkages: work force, CS researchers, industry.)
GriPhyN Architecture
(Diagram: researchers, production managers, and science reviewers discover and share data and instruments through Applications. Requests for virtual data pass through composition, planning, and execution — production analysis with parameters, executables, and data — onto Grid services and a Grid fabric of storage elements. The Virtual Data Toolkit supplies these services: the Chimera virtual data system, the Pegasus planner, DAGMan, the Globus Toolkit, Condor, Ganglia, etc., plus performance monitoring.)
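The composition → planning → execution pipeline above can be sketched in miniature. This is an illustrative toy only: the dependency table and transformation names below are invented, not GriPhyN code. In the real toolkit, Chimera records how each data product is derived, a planner such as Pegasus turns the abstract derivation graph into a concrete DAG, and DAGMan executes the DAG over grid jobs rather than in-memory values.

```python
from graphlib import TopologicalSorter

# Composition: declare how each (hypothetical) data product is derived
# from its inputs — the role Chimera's virtual data catalog plays.
derivations = {
    "raw": [],                    # produced by the instrument
    "calibrated": ["raw"],        # calibration step
    "histogram": ["calibrated"],  # analysis step
    "summary": ["histogram", "calibrated"],
}

# Planning: order the derivations so every input exists before it is
# used — the role of a planner like Pegasus.
plan = list(TopologicalSorter(derivations).static_order())

# Execution: run each step in planned order — DAGMan's role. Here each
# "transformation" just records its lineage as a string.
data = {}
for product in plan:
    inputs = [data[d] for d in derivations[product]]
    data[product] = f"{product}({', '.join(inputs)})" if inputs else product
```

After running, `data["summary"]` carries its full derivation history, which is the essence of the virtual-data idea: any product can be regenerated, or audited, from the recorded recipe.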
How We Are Coordinated
• The activities of this large, multidisciplinary group are coordinated through frequent, many-channel communication:
– Face-to-face meetings in large & small groups
– Formal and informal documents defining requirements, challenge problems, and testbeds
– Email lists, phone calls, web sites, videoconferences
– Cooperation on challenge problems and on technology and application demonstrations
– Cooperation on software releases
• Explicit, long-term pairings with experiments to ensure communication & collaboration
Meetings in 2000–2001
• GriPhyN/iVDGL meetings
– Oct. 2000: All-hands (Chicago)
– Dec. 2000: Architecture (Chicago)
– Apr. 2001: All-hands, EAC (USC/ISI)
– Aug. 2001: Planning (Chicago)
– Oct. 2001: All-hands, iVDGL (USC/ISI)
• Numerous smaller meetings
– CS–experiment
– CS research
– Liaisons with PPDG and EU DataGrid
– US-CMS and US-ATLAS computing reviews
– Experiment meetings at CERN
Meetings in 2002
• GriPhyN/iVDGL meetings
– Jan. 2002: EAC, Planning, iVDGL (Florida)
– Mar. 2002: Outreach Workshop (Brownsville)
– Apr. 2002: All-hands (Argonne)
– Jul. 2002: Reliability Workshop (ISI)
– Oct. 2002: Provenance Workshop (Argonne)
– Dec. 2002: Troubleshooting Workshop (Chicago)
– Dec. 2002: All-hands technical (ISI + Caltech)
– Jan. 2003: EAC (SDSC)
• Numerous other 2002 meetings
– iVDGL facilities workshop (BNL)
– Grid activities at CMS & ATLAS meetings
– Several computing reviews for US-CMS, US-ATLAS
– Demos at IST2002, SC2002
– Meetings with the LCG (LHC Computing Grid) project
– HEP coordination meetings (HICB)
Managing and Tracking Progress
• Project milestones
– Initially prepared for NSF at the time of the award
– A yearly “annual workplan” maps these milestones to more detailed subgroup goals
> Experiment subgroups: ATLAS, CMS, LIGO, SDSS
> Technology subgroups: CS areas & VDT
– Tracked regularly by the management team
– Reported to the project at all-hands meetings
• Plus:
– VDT releases as a technology coordination point
– Application data challenges & demonstrations as cross-cutting coordination mechanisms
Progress Towards Project Goals
Global Context: Grid Projects
• U.S. infrastructure projects
– GriPhyN (NSF)
– iVDGL (NSF)
– Particle Physics Data Grid (DOE)
– PACIs and TeraGrid (NSF)
– DOE Science Grid (DOE)
– NSF Middleware Infrastructure (NSF)
– National Virtual Observatory (NSF)
• Major EU & Asia projects
– European Data Grid (EDG) (EU, EC)
– EDG-related national projects (UK, Italy, France, …)
– CrossGrid (EU, EC)
– DataTAG (EU, EC)
– VIRGO, GEO, European Virtual Observatory projects
– LHC Computing Grid (LCG) (CERN)
– Japanese Grid projects
– Korean Grid project
U.S. Project Coordination: Trillium
• Trillium = GriPhyN + iVDGL + PPDG
– Large overlap in leadership, people, experiments
• Benefits of coordination
– Common software base + packaging: VDT + PACMAN
– Low overhead for collaborative or joint projects
– Wide deployment of new technologies
– Stronger, more extensive outreach effort
• Forum for US Grid projects
– Joint view, strategies, meetings, and work
– Unified entity for dealing with EU & other Grid projects
International Coordination
• EU DataGrid & DataTAG
– Advisory boards and formal collaboration
• HICB: HEP Inter-Grid Coordination Board
– HICB-JTB: Joint Technical Board
– GLUE
• Participation in the LHC Computing Grid (LCG)
• International networks
– Standing Committee on Inter-regional Connectivity
– Digital Divide projects, IEEAF
Summary
• GriPhyN has succeeded in creating a team from 60+ people at 12 institutions, linked also with major application & CS projects
– Multidisciplinary teams that actually interact on a daily basis, and even like each other!
– Real and substantial progress on computer science, technology, and science infrastructure
• We take this for granted, but we believe it is in fact a remarkable achievement
• We’re not exactly sure what makes it work, but work it does
– One major reason must surely be the caliber and leadership roles of the participants