ALICE DAQ Progress Report
Comprehensive Review IV
P. Vande Vyvre – CERN/PH
ALICE DAQ
• Institutes, Responsibilities, Milestones
• DAQ Architecture
• DDL and D-RORC
• MOOD (Data quality monitoring)
• DAQ Fabric
• Data Challenge V
• Installation, test and commissioning
Institutes & Responsibilities
• Birmingham:
  – TRG/DAQ simulation
  – RORC receiver cards
• KFKI-Budapest:
  – DDL: optical links and RORC receiver cards
  – Radiation tolerance tests (with the Technical University/Budapest and the Institute of Nuclear Research, ATOMKI/Debrecen)
• CERN:
  – DATE: DAQ software framework
  – DAQ fabric
• Zagreb:
  – TRG/DAQ simulation
  – AFFAIR: performance monitoring package
• Split:
  – TRG/DAQ simulation
  – Storage
• Collaborating institutes: Istanbul (data quality monitoring)
Milestones and outcome of CR3
LHCC milestones:
• Detector readout with DDL: January 2003
• TDR preparation status: March 2003
• Common TDR submission to LHCC: December 2003
• DDL pre-production for detector test and commissioning: 1Q 2004
• D-RORC pre-production for detector test and commissioning: 1Q 2004
• DAQ reference system in DAQ lab: 4Q 2004
• DAQ systems for surface tests and commissioning:
  – SXL2 (mounting hall on the surface of Point 2): 1Q 2005
  – Si. lab (ITS surface test): 2005
Final DAQ milestones:
• Jan 2006: final DAQ system ready with all functionalities, 20% of final performance
• Nov 2006: 30% for pp and the first HI run
• Oct 2008: 100% for the second HI run (needs and budget)
Physics requirements

Pb–Pb beam:       Rate      Max. event size
– Central         20 Hz     86.0 MB
– Minimum bias    20 Hz     20.0 MB
– Dimuon          1600 Hz   0.5 MB
– Dielectron      200 Hz    9.0 MB
pp beam:
– Minimum bias    100 Hz    2.5 MB

Running modes:
A: DAQ
B: DAQ + HLT analysis
C: DAQ + HLT trigger

[Dataflow figure, with bandwidths of 25 GB/s, 2.50 GB/s and 1.25 GB/s at successive stages: Trigger Levels 0/1 and 2 and the High-Level Trigger steer the flow from the Detector through the Front-end Buffer, Detector Data Link (DDL), Readout Buffer, Local Data Concentrators (LDC), Event-Building Network, Global Data Collectors (GDC) and Storage Network to Permanent Data Storage (PDS).]
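The per-class throughput implied by the table above is simple arithmetic (rate × maximum event size). The sketch below tallies it; this is illustrative only, not an official ALICE figure — the trigger classes do not all run concurrently at maximum event size, and the 25 / 2.50 / 1.25 GB/s figures on the diagram refer to different stages of the dataflow.

```python
# Back-of-envelope check of the trigger-class rates quoted on this slide.
# Each class contributes rate (Hz) x maximum event size (MB).

PBPB_CLASSES = {
    "central":    (20,   86.0),   # (rate in Hz, max event size in MB)
    "min-bias":   (20,   20.0),
    "dimuon":     (1600,  0.5),
    "dielectron": (200,   9.0),
}

def class_rate_mb_s(rate_hz, size_mb):
    """Throughput of one trigger class in MB/s."""
    return rate_hz * size_mb

total = sum(class_rate_mb_s(r, s) for r, s in PBPB_CLASSES.values())
for name, (r, s) in PBPB_CLASSES.items():
    print(f"{name:11s}: {class_rate_mb_s(r, s):7.1f} MB/s")
print(f"Pb-Pb total : {total:7.1f} MB/s")   # 4720.0 MB/s if every class ran at max size
print(f"pp min-bias : {class_rate_mb_s(100, 2.5):7.1f} MB/s")
```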
DAQ architecture

[Architecture diagram: the Central Trigger Processor (CTP) distributes L0, L1a and L2 decisions through the Local Trigger Units (LTU) and the TTC system to the front-end readout electronics (FERO); BUSY signals and a Rare/All throttle feed back towards the trigger. Data travel as event fragments over the DDLs into the LDCs (with load balancing via the EDM), are assembled into sub-events, built into events across the Event Building Network in the GDCs, and written as files through the Storage Network to the TDS and PDS. The HLT farm is attached through H-RORCs and FEPs.]

System scale (as labelled on the diagram):
• 262 DDLs + 123 DDLs; 329 D-RORCs
• 175 detector LDCs; 50 GDCs; 25 TDS; 5 DSS
• DAQ/HLT interface: 10 DDLs, 10 D-RORCs, 10 HLT LDCs
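The event fragment → sub-event → event hierarchy on the diagram can be sketched as follows. This is a hypothetical illustration, not DATE code, and the function names are invented: each LDC groups the fragments arriving on its links by event tag into sub-events, and a GDC merges the sub-events carrying the same tag from all LDCs into a full event.

```python
# Toy model of the two stages of event building (invented code, not DATE).

def build_subevents(fragments):
    """LDC role: group the fragments seen on local links by event tag."""
    subevents = {}
    for tag, payload in fragments:
        subevents.setdefault(tag, []).append(payload)
    return subevents

def build_events(subevents_per_ldc):
    """GDC role: merge the sub-events of each tag across all LDCs."""
    events = {}
    for ldc_subevents in subevents_per_ldc:
        for tag, frags in ldc_subevents.items():
            events.setdefault(tag, []).extend(frags)
    return events

# Two fragments of event 1 and one of event 2 arrive at one LDC:
print(build_subevents([(1, "a"), (2, "b"), (1, "c")]))
# A GDC then merges the sub-events shipped by two LDCs:
print(build_events([{1: ["a"]}, {1: ["b"], 2: ["c"]}]))
```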
Key building blocks and concepts
• Protocol-less push-down strategy
• System throttling by Xon/Xoff signals
• Detector interface via a standard link (DDL)
  – Readout
  – Control and download
• Software framework (DATE)
  – Dataflow (data-driven, according to TRG-generated event tags)
  – Control (FSM and messages)
  – Monitoring
• Distributed DAQ fabric
  – LDC
  – GDC
  – DSS
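The Xon/Xoff throttling above can be modelled as a receiver that deasserts its flow-control signal when its buffer fills and reasserts it once the buffer drains. This is a toy sketch with invented class names and threshold values, not ALICE firmware:

```python
from collections import deque

class ThrottledStage:
    """Toy model of Xon/Xoff back-pressure: the receiver drops Xon when
    its buffer is nearly full; the sender must hold fragments until Xon
    is reasserted. Thresholds are hypothetical."""
    XOFF_AT = 8   # assert Xoff at this occupancy
    XON_AT = 4    # reassert Xon once occupancy drains below this

    def __init__(self):
        self.buf = deque()
        self.xon = True

    def push(self, fragment):
        if not self.xon:
            return False              # sender is throttled
        self.buf.append(fragment)
        if len(self.buf) >= self.XOFF_AT:
            self.xon = False          # stop the sender
        return True

    def pop(self):
        frag = self.buf.popleft() if self.buf else None
        if not self.xon and len(self.buf) < self.XON_AT:
            self.xon = True           # resume the sender
        return frag

stage = ThrottledStage()
accepted = sum(stage.push(i) for i in range(20))
print(accepted, stage.xon)   # 8 False: pushes beyond the 8th were refused
```

The hysteresis between the two thresholds avoids the signal toggling on every single fragment.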
D-RORC Hardware Architecture

[Block diagram: an APEX FPGA connects two DDL media interfaces (optical I/F, 250 MB/s each) and an LVDS busy interface to a 64-bit/66 MHz PCI/PCI-X host interface (528 MB/s); configuration flash and JTAG for configuring; CMC interface with connectors P11–P14.]
D-RORC Hardware
• D-RORC with integrated DIU ports: 2 DDL channels; integration with the HLT system
• D-RORC with plug-in DIU: read-out of a single DDL channel; detector integration
D-RORC and DIU
Setup: PC + single-channel D-RORC (CPU: 2 × Xeon 2400 MHz; kernel 2.4.20-30.7).
Testing the integration of the D-RORC and the DIU using the new library (v4.2):
• rorc_receive –g 3 ... (internal loopback)
• rorc_receive –g 1 ... (DIU loopback)
D-RORC and DATE
• 2 D-RORC cards on the same PCI bus
• Data generation: front-end emulator cards, or the internal data generator
Data splitter for DAQ/HLT interface

[Setup: an FEE emulator feeds a PC with a twin-channel D-RORC (Intel P3 800 MHz, kernel 2.4.20-24.7, running rorc_receive) on the detector side; the split stream is received on the DAQ and HLT sides by a PC with a single-channel D-RORC (2 × Xeon 2400 MHz, kernel 2.4.20-30.7, running rorc_receive).]
MOOD
• Monitoring Of Online Data
• Detector debugger (raw data visualizer)
• Written in C/C++
• Based on ROOT and DATE
• Data quality monitoring framework for all ALICE detectors
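As an illustration of the kind of summary statistics such a monitor maintains — this is invented example code, not MOOD itself — an event-size distribution like the one shown in MOOD's tabs reduces to binning sizes as events stream by:

```python
# Hypothetical sketch of one data-quality statistic: an event-size
# histogram accumulated over the events read from the online stream.

def size_histogram(event_sizes, bin_width=100):
    """Bin event sizes (in bytes) into a {bin_start: count} histogram."""
    hist = {}
    for size in event_sizes:
        bin_start = (size // bin_width) * bin_width
        hist[bin_start] = hist.get(bin_start, 0) + 1
    return hist

print(size_histogram([120, 180, 250, 90], bin_width=100))
# {100: 2, 200: 1, 0: 1}
```

In the real framework the equivalent accumulation is done with ROOT histograms fed by the DATE monitoring library.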
MOOD: Current Detector Scheme
Detectors currently implemented:
• ITS – SDD
• TPC sector
• HMPID photocathode
Detectors to be implemented:
• All detectors individually
• Test setups
• ALICE as a whole
TPC Sector Test
[Screenshot: pad plane, charge plane, event-size distribution, 3D view, charge plane in 3D, toolbar.]
HMPID Photocathode
[Screenshots (2 slides): tabs for Photocathode, 3D View, Size Distribution, Event Dump, Logs…]

ITS – SDD
[Screenshots (2 slides): tabs for SDD Display, 3D View, Size Distribution, Event Dump, Logs…]
DAQ Reference System
• 1 FC switch, 1 GE switch
• 4 LDCs (Local Data Concentrators): rack-mount PCs of 1U, 2U and 4U height, equipped with 6 D-RORCs and 6 DDLs
• 2 GDCs (Global Data Collectors): rack-mount PCs, 1U height, equipped with FC cards
• 2 TDS (Transient Data Storage): rack-mount disk arrays, 2U height, IDE and FC disks
• 1 DSS (DAQ Services): rack-mount PC, 4U height, hot-swap SCSI disks
• 1 KVM switch
• Standard LAN
Reference Setup – Front (L3 rack)
• KVM switch: Raritan Paragon UMT2161
• 2× LDCs: 4U, dual Xeon, 2× D-RORC DDL cards
• 2× GDCs: 1U, dual Xeon, QLogic QLA2310F cards
• 2× disk arrays: Infortrend IFT-6330 and DotHill SANnet II
• DSS: 4U, quad Xeon, 3× 36 GB SCSI disks
• 2× Gigabit Ethernet switches: 3Com SuperStack 3
• Fibre Channel switch: Brocade SilkWorm 3800
Purpose: a system in the DAQ lab to detect and address integration issues, and for development and support.
Reference Setup – Rear (L3 rack)
• Power distributor
• Cat5 cables: Ethernet and KVM, RJ45 connectors
• Optical cables: DDL and Fibre Channel, 2 Gbit/s multimode, LC-LC connectors
• Mounting rails (!)
ADC V Hardware Architecture
• ~80 CPU servers: 2 × 2.4 GHz Xeon, 1 GB RAM, Intel 8254EM Gigabit NIC in PCI-X 133 (Intel PRO/1000), CERN Linux 7.3.3
• 4 × 7 disk servers: 2 × 2.0 GHz Xeon, 1 GB RAM, Intel 82544GC
• 32 IA-64 HP rx2600 servers: 2 × 1 GHz Itanium-2, 2 GB RAM, Broadcom NetXtreme BCM5701 (tg3), RedHat Advanced Workstation 2.1, 6.4 GB/s to memory, 4.0 GB/s to I/O
• 10 tape servers
• Network: 3Com 4900 switches (16 × Gbit) at the edge with 4 × GE uplinks; Enterasys E1 OAS (12 × Gbit, 1 × 10 Gbit) and Enterasys ER16 (16 slots, 4/8 × Gbit or 1 × 10 Gbit per slot) forming a 10GE backbone; 32 × GE to the IA-64 servers
Achievements (1)
• System size (limited by lack of resources in the LCG testbed)
• System scalability
• Performance test with ALICE data traffic:
  – ALICE-like traffic
  – ALICE-like events: simulated data used, with realistic (sub-)event sizes on tape (ALICE year 1)
• DATE load balancing demonstrated and used
• Sustained bandwidth to tape not achieved:
  – Peak 350 MB/s; sustained 280 MB/s over 1 day
  – Reached production-quality level only in the last week of the test
• IA-64 machines from Openlab successfully integrated in ADC V; simulated raw data used for the performance test; data read back from CASTOR and verified
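For scale, the sustained rate quoted above corresponds to roughly 24 TB written to tape per day:

```python
# Daily tape volume implied by the sustained rate from the slide above.
SUSTAINED_MB_S = 280
SECONDS_PER_DAY = 24 * 3600

total_mb = SUSTAINED_MB_S * SECONDS_PER_DAY
print(f"{total_mb / 1e6:.1f} TB written to tape per day")  # 24.2 TB
```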
Achievements (2)
• Network:
  – Between LDCs and GDCs: stable and scalable, including trunking
  – Between GDCs and disk servers: unreliable
  – Trunking not scaling as expected
  – Broken module in the Enterasys switch, replaced twice
  – 10 Gbit Ethernet backbone; new generation of NICs (Intel PRO/1000)
  – Broadcom NICs unreliable; replaced by Intel PRO/1000
• Storage:
  – Hardware problem on the disk servers
  – Several last-minute workarounds needed (scripts for monitoring and reconfiguring)
Performance Goals (1)
[Chart: DAQ bandwidth in MB/s per year, 1998–2006, for ALICE traffic and for uniform traffic; achieved: 650 MB/s.]
Performance Goals (2)
[Chart: MB/s to mass storage per year, 1998–2006 — initial goals, achieved bandwidth, revised goals, and LCG tape bandwidth; achieved: 300 MB/s.]
Open issues and future goals
• CASTOR:
  – Recovery from a malfunctioning disk server
  – New stager
  – A special daemon between the CPUs and the disk servers, instead of the standard RFIO daemon, was needed to achieve adequate performance; it should be merged back into the main development line
• DAQ: increase performance
• Network:
  – First prototypes of 10 Gbit Ethernet equipment from Enterasys unreliable
  – Enterasys support not effective in this case
• Meeting scheduled with the LCG PEB to present the results and address the open issues
DAQ Commissioning (1)
• What:
  – The DAQ itself
  – How the DAQ will help the commissioning of other systems
• Where:
  – ACR: ALICE Control Room
  – PX24-CR1: ALICE DAQ counting room, located in the access shaft
  – SXL2: mounting hall on the surface of Point 2
  – UX25: experimental underground area
• When:
  – 1Q 2005: all DAQ functionalities, hardware at the final location; 20 DDLs for the readout of 2 TPC sectors in SXL2, plus other detectors
  – Jan 2006: final DAQ system ready with all functionalities, 20% of performance
  – Nov 2006: 30% of performance
  – Oct 2008: 100% of performance (needs and budget)
DAQ Commissioning (2)
• Tests at the construction site of hardware elements:
  – Verification of the DDLs and D-RORCs
  – Test of the cards with a test station made of a PC and the DDL test software
• Tests at the development sites of software elements:
  – AFFAIR, DATE, DDL software, CASTOR
  – Combined tests well before installation: test beams, Data Challenges
• Standalone tests in the experimental area:
  – DAQ: possibility of injecting data at every stage of the dataflow
  – Each segment of the dataflow first tested in isolation, then in combination with the other elements
• DAQ integration at Point 2:
  – Integration with ECS
  – Tests with TRG, HLT, DCS
  – Detector test and commissioning
• Tests with cosmic and pulser triggers:
  – From June 2005: a TTC-based trigger to trigger autonomous DDL data sources for global tests of the DAQ involving DDLs
  – From January 2006: a cosmic or pulser trigger for the commissioning of detectors involving Trigger and DAQ
DAQ Tools for Detector Commissioning
• Lab test:
  – DDL Simulator: standalone daughter-card
  – Detector readout with DDL and DATE
• Beam test:
  – Detector readout by DDL and DATE
  – VME for the trigger and older electronics (e.g. the silicon telescope)
• Point 2:
  – DAQ system at Point 2 in 2004
  – Detector test and commissioning
  – System will evolve in size and performance according to the needs
  – Concurrent tests of several detectors (~3 in 2004, all in 2006)
  – Complete capabilities from the start
  – Control from the ACR or from any computer
• Other tests: ITS integrated test
[Figure: DDL Simulator connected to the detector readout card under test.]
[Figure, same slide repeated: the DDL Simulator daughter-card takes input data from a pattern generator and drives the SIU (DDL) optical fibre to the LDC.]
[Figure, same slide repeated: beam-test setup with DATE V4 — detector readout → DDL SIU → DIU → RORC in an LDC (PC/Linux); silicon-telescope readout with trigger logic in an LDC (VME/Linux); both LDCs → Event Building Network → GDC → data storage in the Computing Centre.]
DAQ/Detector integration status
[Table: per-detector status of the DATE DDL data generator, DDL readout and MOOD data quality monitoring for SPD, SSD, SDD, TPC (in progress), TRD, TOF, HMPID, Muon (in progress), PHOS, ZDC, FMD, T0 and V0.]
DAQ for Test & Commissioning
[Site map of Point 2: SX and SXL halls (ALICE sub-detector assembly), SR hall (networking), access shaft with counting rooms PX24/CR1 (DAQ), PX24/CR2 (HLT), PX24/CR3 (DCS) and PX24/CR4 (misc.), UX25 (experimental area), ACR and work rooms WR1/WR2; DDL patch panel, DDL and LAN connections.]
1Q 2005: DAQ system at Point 2
• 12 DDLs for the TPC
• 8 DDLs for the others
• 3 partitions
2005: similar system in the Si. lab
Final DAQ System
[Same Point 2 site map as on the previous slide.]
Installation staging (% of final DAQ performance):
• 2006: 20%
• 2007: 30%
• 2008: 100%