
Page 1

CHEP 2003, La Jolla, 23-28 March 2003 1 A. Polini

ZEUS MVD and GTT Group: ANL, Bonn Univ., DESY Hamburg-Zeuthen, Hamburg Univ., KEK (Japan), NIKHEF, Oxford Univ., Bologna, Firenze, Padova, Torino Univ. and INFN, UCL, Yale, York.

The Architecture of the ZEUS Micro Vertex Detector DAQ and Second Level Global Track Trigger

Alessandro Polini, DESY/Bonn

Page 2

Outline

– The ZEUS Silicon Micro Vertex Detector
– ZEUS Experiment Environment and Requirements
– DAQ and Control Description
– The Global Track Trigger
– Performance and first experience with real data
– Summary and Outlook

Page 3

Detector Layout

Forward Section (410 mm): the forward section consists of 4 wheels with 28 wedged silicon sensors per layer, providing r-φ information.

Barrel Section (622 mm): the barrel section provides 3 layers of support frames (ladders) which hold 5 full modules each, 600 square sensors in total, providing r-φ and r-z space points.

(Beam-line labels: e± 27.5 GeV, p 920 GeV.)

Page 4

The ZEUS Detector

Bunch crossing interval: 96 ns. HERA beams: e± 27.5 GeV, p 920 GeV.

ZEUS 3-level trigger system (rate 10⁷ Hz → 500 Hz → 40 Hz → 5 Hz):
– Component front-ends (CAL, CTD, other components) with 5 μs pipelines; input rate ~10⁷ Hz
– Component FLTs (CAL FLT, CTD FLT) feed the Global First Level Trigger; GFLT accept/reject, output ~500 Hz
– Component SLTs (CAL SLT, CTD SLT, other components) with event buffers feed the Global Second Level Trigger; GSLT accept/reject, ~10 ms, output ~40 Hz
– Event Builder assembles accepted events for the Third Level Trigger CPU farm (~0.7 s); output ~5 Hz to offline tape

Page 5

MVD DAQ and Trigger Design

ZEUS experiment designed at the end of the ’80s:
– First high-rate (96 ns) pipelined system
– With a flexible 3-level trigger
– Main building blocks were transputers (20 Mbit/s)

Ten years later, the MVD:
– 208,000 analog channels
– MVD available for triggering from the 2nd level trigger on

DAQ design choices:
– Use off-the-shelf products whenever possible
– VME embedded systems for readout; priority scheduling absolutely needed → LynxOS (real-time OS)
– Commercial Fast/Gigabit Ethernet network
– Linux PCs for data processing

Page 6

Detector Front-end

Front-end chip: HELIX 3.0*
– 128-channel analog pipelined programmable readout chip, specifically designed for the HERA environment
– Highly programmable for wide and flexible usage
– ENC [e] ≈ 400 + 40·C [pF] (no radiation damage included, S/N ~13)
– Data read out and multiplexed (96 ns) over the analog output
– Internal test pulse and failsafe token-ring (8 chips) capability

(Photo: front-end hybrid and silicon sensors; dimensions 125 mm × 64 mm.)

* Univ. Heidelberg, NIM A 447 (2000) 89
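The quoted noise figure can be put into numbers. A minimal sketch, assuming a MIP signal of roughly 24,000 electrons (a typical value for ~300 μm of silicon, used here for illustration, not a quoted ZEUS number):

```python
# ENC [electrons] ~ 400 + 40 * C [pF], the HELIX 3.0 figure quoted above.
def enc_electrons(load_pf):
    """Equivalent noise charge for a given input load capacitance."""
    return 400.0 + 40.0 * load_pf

def signal_to_noise(load_pf, mip_signal_e=24000.0):
    """S/N for an assumed MIP signal; 24,000 e is an illustrative value."""
    return mip_signal_e / enc_electrons(load_pf)

# A ~35 pF load (an assumption, e.g. daisy-chained sensors plus fanout)
# lands near the quoted S/N ~ 13:
print(round(signal_to_noise(35.0), 1))  # -> 13.3
```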

Page 7

The ADC Modules and the Readout

Read-out:
– Custom-made ADC modules*: 9U VME board + private bus extensions
– 8 detector modules per board (~8,000 channels)
– 10-bit resolution
– Common-mode, pedestal and noise subtraction
– Strip clustering
– 2 separate data buffers:
  – cluster data (for trigger purposes)
  – raw/strip data for accepted events

Design event data sizes:
– Max. raw data size: 1.5 MB/event (~208,000 channels)
– Strip data: 3-sigma noise threshold (~15 kB)
– Cluster data: ~8 kB

* KEK/Tokyo, NIM A 436 (1999) 281
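The per-event processing chain above (pedestal subtraction, common-mode correction, 3-sigma noise cut, clustering of adjacent strips) can be sketched in software. This is an illustrative model, not the board firmware; the toy data and the median common-mode estimator are assumptions:

```python
def correct(raw, pedestals):
    """Pedestal-subtract, then remove a per-event common-mode shift
    estimated here as the median of the pedestal-subtracted strips."""
    sig = [r - p for r, p in zip(raw, pedestals)]
    common_mode = sorted(sig)[len(sig) // 2]
    return [s - common_mode for s in sig]

def clusters(signal, noise, nsigma=3.0):
    """Group adjacent strips above nsigma*noise into (first_strip, charge)."""
    out, start, charge = [], None, 0.0
    for i, s in enumerate(signal):
        if s > nsigma * noise[i]:
            if start is None:
                start, charge = i, 0.0
            charge += s
        elif start is not None:
            out.append((start, charge))
            start = None
    if start is not None:
        out.append((start, charge))
    return out

raw = [510, 512, 509, 560, 575, 511, 508, 510]   # toy ADC counts
sig = correct(raw, [500] * 8)
print(clusters(sig, [2.0] * 8))  # -> [(3, 113.0)], one cluster on strips 3-4
```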

Page 8

VME Data Gathering and Control

– Data gathering and readout control using the LynxOS 3.01 real-time OS on network-booted Motorola MVME2400/MVME2700 PPC VME computers
– VME functionality via the purpose-built VME driver/library uvmelib*: multi-user VME access, contiguous memory mapping and DMA transfers, VME interrupt handling and process synchronization
– System is interrupt driven (data transfer on ADCM data-ready via DMA)
– Custom-designed VME “all-purpose latency clock + interrupt board”; full DAQ-wide latency measurement system
– Data transfer over Fast/Gigabit Ethernet using TCP/IP connections; data transferred as a binary stream with an XDR header
– Data-file playback capability (Monte Carlo or dumped events)

* http://mvddaq.desy.de

(Diagram: MVD VME readout crates with LynxOS CPUs, ADCM modules, analog links and NIM + latency clock boards; GSLT 2TP VME interface; slow control + latency clock modules; CPU boot server and control; front-end network.)
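The "binary stream with an XDR header" convention can be illustrated with standard tools: XDR integers are 4-byte big-endian, so a header packs portably. The field names (run, event, size) are illustrative assumptions, not the actual MVD header layout:

```python
import struct

def pack_event(run, event, payload):
    """Prepend an XDR-style header (three big-endian 32-bit unsigned
    ints: run, event number, payload size) to a binary payload."""
    return struct.pack(">III", run, event, len(payload)) + payload

def unpack_event(buf):
    """Decode the header and slice out the payload."""
    run, event, size = struct.unpack(">III", buf[:12])
    return run, event, buf[12:12 + size]

msg = pack_event(42314, 938, b"\x01\x02\x03")
print(unpack_event(msg))  # -> (42314, 938, b'\x01\x02\x03')
```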

Page 9

UVMElib* Software Package

Exploits the Tundra Universe II VME bridge features:
– 8 independent windows for R/W access to the VME bus
– Flexible interrupt and DMA capabilities

Library layered on an enhanced uvme driver:
– Mapping of VME addresses AND contiguous PCI RAM segments
– Each window (8 VME, N PCI) is addressed by a kernel uvme_smem_t structure:

typedef struct {
    int      id;        /* symbolic mapping */
    unsigned mode;      /* addressing mode: A32D32, A24D32, A16D16, ... SYS/USR */
    int      size;
    unsigned physical;  /* for DMA transfers */
    unsigned virtual;   /* for R/W operations */
    char     name[20];
} uvme_smem_t;

Performance:
– 18 MB/s DMA transfer on a standard VMEbus
– Less than 50 μs response to a VME IRQ

Additional features:
– Flexible interrupt usage via global system semaphores
– Additional semaphores for process synchronization
– DMA PCI↔VME transfer queuing
– Easy system monitoring via semaphore status and counters

* http://mvddaq.desy.de/
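The uvme_smem_t descriptor above can be mirrored byte-for-byte from user code. A sketch, assuming big-endian 32-bit ints and addresses (a guess at the LynxOS/PPC ABI, not a documented uvmelib detail):

```python
import struct

# id, mode, size, physical, virtual, name[20] -- big-endian 32-bit fields
UVME_SMEM_FMT = ">iIiII20s"

def pack_smem(id_, mode, size, physical, virtual, name):
    """Build the 40-byte image of a uvme_smem_t window descriptor."""
    return struct.pack(UVME_SMEM_FMT, id_, mode, size,
                       physical, virtual, name.encode().ljust(20, b"\0"))

# Hypothetical window: 64 kB at VME address 0x80000000, mapped at 0x30000000.
rec = pack_smem(0, 0x0D, 0x10000, 0x80000000, 0x30000000, "adcm0")
print(struct.calcsize(UVME_SMEM_FMT))  # -> 40
```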

Page 10

Interfaces to the Existing ZEUS Environment

– The ZEUS experiment is based on transputers
– Interfaces for data gathering from other detectors, and the connection to the existing Global Second Level Trigger, use ZEUS 2TP* modules
– All newer component interfaces use Fast/Gigabit Ethernet
– The VME-TP connections are planned to be upgraded to Linux PCs + PCI-TP interfaces

* NIKHEF, NIM A 332 (1993) 263

Page 11

The Global Tracking Trigger

Trigger requirements:
– Higher-quality track reconstruction and rate reduction
– Z-vertex resolution: 9 cm (CTD only) → 400 μm (MVD+CTD+GTT)
– Decision required within the existing SLT latency (<15 ms)

Development path:
– MVD participation in the GFLT not feasible: readout latency too large
– Participation at the GSLT possible:
  – Tests of pushing ADC data over Fast Ethernet give acceptable rate/latency performance
  – But track and vertex information is poor due to the low number of planes
– Expand scope to interface data from other tracking detectors:
  – Initially the Central Tracking Detector (CTD), overlapping with the barrel detectors
  – Later the Straw Tube Tracker (STT), overlapping with the wheels
– Implement the GTT as a PC farm with TCP data and control paths

(Figure: dijet MC event.)

Page 12

The MVD Data Acquisition System and GTT

(Diagram: analog data from the MVD HELIX front-end and patch boxes reach the ADCM modules in three MVD VME readout crates — crate 0 (MVD top, C+C master), crate 1 (MVD bottom, C+C slave) and crate 2 (MVD forward, C+C slave) — each with LynxOS CPUs and NIM + latency clock + control boards. A VME HELIX driver crate receives the Global First Level Trigger, busy and error signals. VME-TP connections bring data from the Central Tracking Detector and the forward Straw Tube Tracker read-outs via 2TP modules. The Global Tracking Trigger processors (GFLT rate 800 Hz) with GTT control + fan-out sit on the Fast/Gigabit Ethernet network together with the main MVD DAQ server, local control and Event Builder interface (~100 Hz to the ZEUS Event Builder), the GSLT 2TP interface returning the Global Second Level Trigger decision, slow control + latency clock modules, the VME CPU boot server and control, and the ZEUS run control and online monitoring environment.)

Page 13

GTT Hardware

Implementation:
– MVD readout: 3 Motorola MVME2400 450 MHz
– CTD/STT interfaces: NIKHEF 2TP VME-transputer modules + Motorola MVME2400 450 MHz
– PC farm: 12 Dell PowerEdge 4400 dual 1 GHz
– GTT/GSLT result interface: Motorola MVME2700 367 MHz
– GSLT/EVB trigger result interface: Dell PowerEdge 4400 dual 1 GHz, Dell PowerEdge 6450 quad 700 MHz
– Network switches: 3 Intel Express 480T Fast/Gigabit, 16 ports

Thanks to Intel Corp., who provided the high-performance switch and PowerEdge hardware via a Yale grant.

(Photos: CTD/STT interface, MVD readout, PC farm and switches, GTT/GSLT interface, EVB/GSLT result interface.)

Page 14

GTT Algorithm Description

Modular algorithm design:
– Two concurrent algorithms (barrel/forward) foreseen
– One event processed per host
– Multithreaded event processing: data unpacking, concurrent algorithms, time-out
– Test and simulation results: 10 computing hosts required
– “Control credit” distribution, not round-robin

At present the barrel algorithm is implemented; the forward algorithm is in the development phase.
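The "control credit" distribution mentioned above can be sketched as follows: each host advertises a number of free credits, the fan-out always sends the next event to the host with the most free credits, and credits return as events complete, so a slower host automatically receives less load. A toy model (host names, credit counts and the all-at-once credit return are invented; the real GTT protocol is more involved):

```python
from collections import Counter

def pick(free):
    """Host with the most free credits (alphabetical tie-break)."""
    return max(sorted(free), key=lambda h: free[h])

def dispatch(n_events, credits):
    """Assign n_events to hosts; when every credit is in flight we
    model all hosts finishing at once and their credits returning."""
    free = dict(credits)
    chosen = []
    for _ in range(n_events):
        if max(free.values()) == 0:
            free = dict(credits)  # credits come back
        host = pick(free)
        free[host] -= 1
        chosen.append(host)
    return Counter(chosen)

# A 4-credit (fast) host absorbs twice the load of a 2-credit (slow)
# one -- unlike plain round-robin, which would split events evenly:
print(dispatch(12, {"fast": 4, "slow": 2}))  # -> Counter({'fast': 8, 'slow': 4})
```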

Page 15

Barrel Algorithm Description

Find tracks in the CTD and extrapolate into the MVD to resolve pattern-recognition ambiguities:
– Find segments in the axial and stereo layers of the CTD
– Match axial segments to get r-φ tracks
– Match MVD r-φ hits
– Refit the r-φ track including the MVD r-φ hits

After finding 2-D tracks in r-φ, look for 3-D tracks in z vs. axial track length s:
– Match stereo segments to the track in r-φ to get positions for the z-s fit
– Extrapolate to the inner CTD layers
– If available, use the coarse MVD wafer position to guide the extrapolation
– Match MVD z hits
– Refit the z-s track including the z hits

Constrained or unconstrained fit:
– Pattern recognition is better with constrained tracks
– Secondary vertices require unconstrained tracks
– Unconstrained track refit after MVD hits have been matched
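One step of the chain above, matching MVD r-φ hits to an extrapolated CTD track, can be sketched with a straight-line extrapolation and a fixed road. The real algorithm fits curved tracks in the magnetic field; the road width and hit values here are invented for illustration:

```python
def match_hits(phi0, dphi_dr, hits, road=0.01):
    """Keep (r, phi) hits within `road` of the extrapolated track
    phi(r) = phi0 + dphi_dr * r, evaluated at each hit's radius."""
    return [(r, phi) for r, phi in hits
            if abs(phi - (phi0 + dphi_dr * r)) < road]

# Two hits sit on the road; the third is an off-track (noise) hit:
hits = [(6.0, 0.106), (9.0, 0.135), (12.0, 0.300)]
print(match_hits(phi0=0.05, dphi_dr=0.01, hits=hits))
# -> [(6.0, 0.106), (9.0, 0.135)]
```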

Page 16

First Tracking Results

– Dijet Monte Carlo vertex resolution: including the MVD, ~400 μm from MC (plot in mm)
– GTT event display: Run 42314, Event 938
– Physics-data vertex distribution: Run 44569, showing the nominal vertex and proton beam-gas interactions at collimator C5

Page 17

First Tracking Results (cont.)

Same vertex-resolution and vertex-distribution plots as the previous slide, here with yet another background event shown in the GTT event display.

Page 18

First GTT Latency Results

2002 HERA running after the luminosity upgrade was compromised by high background rates:
– Mean data sizes from the CTD and MVD were much larger than the design values

September 2002 runs were used to tune the data-size cuts:
– Allowed the GTT to run with acceptable mean latency and tails at the GSLT
– The design rate of 500 Hz appears possible

(Plots, latencies in ms: CTD VME readout latency with respect to the MVD; MVD VME SLT readout latency; GTT latency after complete trigger processing; mean GTT latency vs. GFLT rate per run (Hz); MVD-GTT latency as measured by the GSLT; low-data-occupancy rate tests, HERA data and Monte Carlo.)

Page 19

MVD Slow Control

CANbus is the principal fieldbus used: 2 ESD CAN-PCI/331* dual-CANbus adapters in 2 Linux PCs. Each slow-control sub-system uses a dedicated CANbus:
– Silicon detector / radiation monitor bias voltage: 30 ISEG EHQ F0025p 16-channel supply boards** + 4 ISEG ECH 238L UPS 6U Euro crates
– Front-end hybrid low voltage: custom implementation based on the ZEUS LPS detector supplies (INFN Torino)
– Cooling and temperature monitoring: custom NIKHEF SPICan box***
– Interlock system: Frenzel+Berg EASY-30 CAN/SPS****

MVD slow-control operation:
– Channel monitoring is performed typically every 30 s
– CAN emergency messages are implemented
– A wrong slow-control state disables the experiment’s trigger
– CERN ROOT-based tools are used where operator control and monitoring are required

* http://www.esd.electronic  ** http://www.iseg-hv.com  *** http://www.nikhef.nl/user/n48/zeus_doc.html  **** http://www.frenzel-berg.de/produkte/easy.html
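The operating policy above (periodic readback, trigger veto on a wrong state) reduces to a simple per-cycle check. A sketch; channel names and limits are illustrative assumptions, not the real MVD settings:

```python
def out_of_range(readings, limits):
    """Channels whose reading lies outside their allowed [lo, hi] window."""
    return [ch for ch, v in readings.items()
            if not (limits[ch][0] <= v <= limits[ch][1])]

def trigger_enabled(readings, limits):
    """Any bad channel puts slow control in a wrong state -> veto trigger."""
    return not out_of_range(readings, limits)

limits = {"bias_V": (39.0, 41.0), "hybrid_T_C": (5.0, 30.0)}
print(trigger_enabled({"bias_V": 40.1, "hybrid_T_C": 21.0}, limits))  # -> True
print(out_of_range({"bias_V": 40.1, "hybrid_T_C": 35.0}, limits))    # -> ['hybrid_T_C']
```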

Page 20

Summary and Outlook

– The MVD and GTT systems have been successfully integrated into the ZEUS experiment
– 267 runs with 3.1 million events recorded between 31/10/02 and 18/02/03 with the MVD on and DQM (~700 nb⁻¹)
– The MVD DAQ and GTT architecture, built as a synthesis of custom solutions and commercial off-the-shelf equipment (real-time OS + Linux PCs + Gigabit network), works reliably
– The MVD DAQ and GTT performance (latency, throughput and stability) is satisfactory
– Next steps: enable use of the barrel algorithm result at the GSLT; finalize development and integration of the forward algorithm

So far, very encouraging results. Looking forward to routine high-luminosity data taking. The shutdown ends in June 2003...