Data rate reduction in ALICE


Page 1:

Data rate reduction in ALICE

Page 2:

Data volume and event rate

TPC detector:

• data volume = 300 Mbyte/event

• data rate = 200 Hz

(300 Mbyte/event × 200 Hz = 60 Gbyte/sec of raw data)

Bandwidth through the readout chain:

• front-end electronics: 60 Gbyte/sec

• DAQ – event building: 15 Gbyte/sec

• Level-3 system: < 1.2 Gbyte/sec

• permanent storage system: < 2 Gbyte/sec

Page 3:

Data rate reduction

• Volume reduction
  – regions-of-interest and partial readout
  – data compression:
    • entropy coder
    • vector quantization
    • TPC-data modeling

• Rate reduction
  – (sub)-event reconstruction and event rejection before event building

Page 4:

Regions-of-interest and partial readout (1)

• Selection of TPC sector and η-slice based on TRD track candidate

• Momentum filter for D0 decay tracks based on TPC tracking

Page 5:

Regions-of-interest and partial readout (2)

• Momentum filter for D0 decay tracks based on TPC tracking:

pT > 0.8 GeV/c vs. all pT

Page 6:

Data compression: Entropy coder

Variable-length coding: short codes for frequent values, long codes for infrequent values.

Results: NA49: compressed event size = 72%; ALICE: 65%

(Arne Wiebalck, diploma thesis, Heidelberg)

Probability distribution of 8-bit TPC data
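The gain from variable-length coding can be sketched with a toy Huffman coder over a skewed 8-bit ADC-value distribution (the distribution and numbers below are illustrative only; the real ALICE code tables are built from measured TPC spectra like the one in the figure):

```python
import heapq
from collections import Counter

def huffman_code(freq):
    """Build a Huffman code {symbol: bitstring} from symbol frequencies."""
    heap = [(f, i, [s]) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    code = {s: "" for s in freq}
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, syms1 = heapq.heappop(heap)   # two least frequent subtrees
        f2, _, syms2 = heapq.heappop(heap)
        for s in syms1:
            code[s] = "0" + code[s]          # extend codes as trees merge
        for s in syms2:
            code[s] = "1" + code[s]
        heapq.heappush(heap, (f1 + f2, tiebreak, syms1 + syms2))
        tiebreak += 1
    return code

# Toy skewed distribution: small ADC values dominate, as after zero suppression.
samples = [0] * 600 + [1] * 200 + [2] * 100 + [3] * 50 + [255] * 50
code = huffman_code(Counter(samples))
coded_bits = sum(len(code[s]) for s in samples)
raw_bits = 8 * len(samples)
print(f"compressed size = {100 * coded_bits / raw_bits:.0f}% of raw")  # 21%
```

The more sharply the value distribution is peaked, the shorter the average code; the 65–72% figures above come from real, less extreme spectra.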

Page 7:

Data compression: TPC - RCU

• TPC front-end electronics system architecture and readout controller unit.

• Pipelined Huffman Encoding Unit, implemented in a Xilinx Virtex 50 chip*

* T. Jahnke, S. Schoessel and K. Sulimma, EDA group, Department of Computer Science, University of Frankfurt

Page 8:

Data compression: Vector quantization

• Sequence of ADC-values on a pad = vector

• Vector quantization = transformation of vectors into codebook entries: each input vector is compared against the codebook and only the index of the best match is stored

• Quantization error: distance between the vector and its codebook entry

Results: NA49: compressed event size = 29%; ALICE: 48%–64%

(Arne Wiebalck, diploma thesis, Heidelberg)
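The compare-against-codebook step can be sketched as a nearest-neighbour lookup (the codebook entries and test vector below are made-up toy data; real codebooks would be trained on TPC events):

```python
def quantize(vector, codebook):
    """Return (index, squared error) of the nearest codebook entry."""
    def sqdist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    best = min(range(len(codebook)), key=lambda i: sqdist(vector, codebook[i]))
    return best, sqdist(vector, codebook[best])

# Toy codebook of pulse shapes (one vector = ADC-value sequence on a pad).
codebook = [
    (0, 0, 0, 0),        # empty pad
    (5, 40, 30, 8),      # narrow pulse
    (10, 60, 55, 20),    # wide pulse
]
index, err = quantize((6, 42, 28, 9), codebook)
print(index, err)
```

Only the index (a few bits) is stored instead of the full vector; the quantization error is what makes the method lossy.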

Page 9:

Data compression: TPC-data modeling

• Fast local pattern recognition:
  – simple local track model (e.g. helix) → track parameters
  – comparison to raw data

• Track and cluster modeling:
  – local track parameters
  – analytical cluster model
  – quantization of deviations from track and cluster model

Result: NA49: compressed event size = 7%
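The modeling idea can be sketched as: fit a local track model, then store only the model parameters plus coarsely quantized deviations of the measured cluster positions from the model prediction. The straight-line model and step size below are illustrative assumptions (the slides mention e.g. a helix):

```python
def compress_residuals(positions, rows, step=0.05):
    """Least-squares line through (row, position) points; residuals are
    quantized to integer multiples of `step` (small, cheap to entropy-code)."""
    n = len(rows)
    mean_r = sum(rows) / n
    mean_p = sum(positions) / n
    slope = sum((r - mean_r) * (p - mean_p) for r, p in zip(rows, positions)) \
        / sum((r - mean_r) ** 2 for r in rows)
    intercept = mean_p - slope * mean_r
    residuals = [round((p - (slope * r + intercept)) / step)
                 for r, p in zip(rows, positions)]
    return (slope, intercept), residuals

def decompress(params, residuals, rows, step=0.05):
    slope, intercept = params
    return [slope * r + intercept + q * step
            for q, r in zip(residuals, rows)]

rows = [0, 1, 2, 3]
positions = [1.02, 1.48, 2.03, 2.49]   # roughly linear cluster centroids
params, residuals = compress_residuals(positions, rows)
restored = decompress(params, residuals, rows)
# Reconstruction error is bounded by half the quantization step.
assert all(abs(a - b) <= 0.025 for a, b in zip(positions, restored))
```

The better the track and cluster models describe the data, the narrower the residual distribution, which is what pushes the compressed event size down to a few percent.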

Page 10:

Event rejection

[Diagram: trigger and data flow; time/causality runs through L0, L1, L2, HLT to DAQ]

• Trigger chain: other trigger detectors provide the L0 pretrigger and L1; the TRD trigger (~2 kHz) and the global trigger provide the L2 accept.

• L0: TPC readout; L1: readout of the other detectors.

• Zero-suppressed TPC data, read out sector-parallel (216 links, 83 MB/evt).

• HLT, seeded by TRD e+e- tracks: tracking of e+e- candidates inside the TPC; select regions of interest; verify the e+e- hypothesis, else reject the event. Output: track segments and space points, e+e- tracks plus ROIs.

• On-line data reduction (tracking, reconstruction, partial readout, compression): 0.5-2 MB/evt or 4-40 MB/evt.

• Binary lossless data compression (RLE, Huffman, LZW, etc.): 45 MB/evt.

• Detector raw data readout for debugging.

Event sizes and number of links: TPC only.
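Binary lossless coders such as RLE exploit the long zero runs left in zero-suppressed data; a minimal run-length coder might look like this (toy sketch, not the ALICE implementation):

```python
def rle_encode(data):
    """Encode a byte sequence as (value, run-length) pairs."""
    out = []
    for b in data:
        if out and out[-1][0] == b and out[-1][1] < 255:
            out[-1] = (b, out[-1][1] + 1)   # extend the current run
        else:
            out.append((b, 1))              # start a new run
    return out

def rle_decode(pairs):
    return [b for b, n in pairs for _ in range(n)]

# Zero-suppressed samples: long runs of zeros between short pulses.
data = [0] * 20 + [12, 47, 33] + [0] * 30 + [9] + [0] * 10
pairs = rle_encode(data)
assert rle_decode(pairs) == data
print(f"{len(data)} samples -> {len(pairs)} (value, length) pairs")
```

Pure run-length coding only helps where runs are common, which is why it is combined with Huffman coding and model-based methods above.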

Page 11:

Fast pattern recognition

Essential part of the HLT system:

– crude complete event reconstruction → monitoring, event rejection

– redundant local tracklet finder for cluster evaluation and data modeling → efficient data compression

– selection of (η, φ, pT)-slices → ROI

– momentum filter → ROI

– high precision tracking for selected track candidates → event rejection

Page 12:

Requirements on the TPC-RORC design concerning HLT tasks

• Transparent mode
  – transferring raw data to DAQ

• Processing mode
  – Huffman decoding
  – unpacking
  – 10-to-8 bit conversion
  – pattern recognition:
    • cluster finder
    • Hough transformation tracker
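The 10-to-8 bit conversion can be sketched as nonlinear companding: large ADC values, whose intrinsic fluctuations are proportionally larger, are mapped with coarser steps. The square-root law below is an illustrative assumption, not the actual ALICE mapping (which would be a detector-specific lookup table):

```python
def make_tables():
    """Build 10-bit -> 8-bit companding tables using a square-root law."""
    # Forward table: compress 0..1023 into 0..255 with finer steps near zero.
    enc = [min(255, round((v ** 0.5) * 255 / (1023 ** 0.5)))
           for v in range(1024)]
    # Inverse table: representative 10-bit value for each 8-bit code.
    dec = [round((c * (1023 ** 0.5) / 255) ** 2) for c in range(256)]
    return enc, dec

enc, dec = make_tables()
assert enc[0] == 0 and enc[1023] == 255
# Relative round-trip error stays small even for large amplitudes:
v = 900
assert abs(dec[enc[v]] - v) / v < 0.02
```

In hardware both directions reduce to a single table lookup, which is why the step fits comfortably into the RORC processing mode.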

Page 13:

TPC PCI-RORC

• Simple PCI-RORC: PCI bus – PCI bridge – glue logic – DIU interface – DIU card

• HLT TPC PCI-RORC: the same data path plus an FPGA coprocessor with SRAM
  – backwards compatibility
  – fully programmable

[Diagram: PCI bridge (S5935) and glue logic (ALTERA) between the PCI bus and the DIU interface/DIU card; the HLT variant adds the FPGA coprocessor and SRAM]

Page 14:

[Diagram: HLT data flow from front-end electronics to global node]

• detector front-end electronics → RCU: raw data, 10-bit dynamic range, zero-suppressed; Huffman encoding (and vector quantization)

• RORC: Huffman decoding, unpacking, 10-to-8 bit conversion; fast cluster finder with simple unfolding and flagging of overlapping clusters → cluster list, raw data

• receiver node (preprocessing per sector): fast vertex finder; fast track finder initialization (e.g. Hough transform) → Hough histograms, peak finder

• global node: vertex position

Page 15:

FPGA coprocessor: cluster finder

• Fast cluster finder
  – up to 32 padrows per RORC
  – up to 141 pads/row and up to 512 timebins/pad
  – internal RAM: 2×512×8 bit
  – timing (in clock cycles of e.g. 5 nsec)¹: #(cluster-timebins per pad) / 2 + #clusters; outer padrow: 150 nsec/pad, 21 μsec/row
  – centroid calculation: pipelined array multiplier

1. Timing estimates by K. Sulimma, EDA group, Department of Computer Science, University of Frankfurt
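The centroid calculation is a charge-weighted mean over a cluster's samples; in software the arithmetic that the slide maps onto a pipelined array multiplier looks like this (toy numbers):

```python
def centroid(samples):
    """Charge-weighted mean position of a cluster.

    samples: list of (timebin, adc) pairs belonging to one cluster.
    """
    total = sum(adc for _, adc in samples)
    return sum(t * adc for t, adc in samples) / total

# A small pulse spread over four timebins on one pad.
cluster = [(100, 10), (101, 50), (102, 40), (103, 8)]
print(centroid(cluster))
```

The multiply-accumulate structure (one product per sample, one running sum) is what makes this step pipeline so well in an FPGA.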

Page 16:

FPGA coprocessor: Hough transformation

• Fast track finder: Hough transformations²
  – (row, pad, time)-to-(2/R, φ, η) transformation
  – (n-pixel)-to-(circle-parameter) transformation
  – feature extraction: local peak finding in parameter space

2. E.g. see "Pattern Recognition Algorithms on FPGAs and CPUs for the ATLAS LVL2 Trigger", C. Hinkelbein et al., IEEE Trans. Nucl. Sci. 47 (2000) 362.
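A circle Hough transform for tracks from the vertex can be sketched as follows: each hit votes, for every candidate emission angle, for the curvature of the vertex-constrained circle that would pass through it, and peaks in the (curvature, angle) histogram are track candidates. Binning, ranges, and the toy hits below are illustrative assumptions, not the ALICE parameters:

```python
import math

def hough_circle(hits, n_phi=64, n_kappa=64, kappa_max=0.1):
    """Vote in (kappa, phi0) space for circles through the origin.

    For a hit at polar (r, phi), a circle of curvature kappa through the
    origin with emission angle phi0 satisfies kappa = 2*sin(phi - phi0)/r.
    """
    hist = [[0] * n_phi for _ in range(n_kappa)]
    for x, y in hits:
        r = math.hypot(x, y)
        phi = math.atan2(y, x)
        for j in range(n_phi):
            phi0 = -math.pi / 2 + math.pi * j / n_phi  # scan emission angles
            kappa = 2 * math.sin(phi - phi0) / r
            i = int((kappa + kappa_max) / (2 * kappa_max) * n_kappa)
            if 0 <= i < n_kappa:
                hist[i][j] += 1                        # one vote per hit
    return hist

# Toy track: points on a circle of radius R=50 through the origin
# (curvature kappa = 1/R = 0.02, emission angle phi0 = 0).
R = 50.0
hits = [(R * math.sin(t), R * (1 - math.cos(t)))
        for t in (0.2, 0.4, 0.6, 0.8, 1.0)]
hist = hough_circle(hits)
peak = max((v, i, j) for i, row in enumerate(hist) for j, v in enumerate(row))
print("peak votes:", peak[0])  # all five hits land in one (kappa, phi0) bin
```

Because every hit of a genuine track falls into the same parameter-space bin, the maxima search that follows reduces to local peak finding in the histogram.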

Page 17:

[Diagram: track finding per sector on the receiver node]

• input from RORC: raw data, 8-bit dynamic range, decoded and unpacked; vertex position, cluster list

• slicing of padrow-pad-time space into sheets of pseudo-rapidity, subdividing each sheet into overlapping patches

• fast track finder A: track follower

• fast track finder B: 1. Hough transformation, 2. Hough maxima finder, 3. tracklet verification → track segments, sub-volumes in r, φ, η

• cluster deconvolution and fitting → updated vertex position, updated cluster list, track segment list

Page 18:

Hough transform (1)

• Data flow

Page 19:

Hough transform (2)

• η-slices

Page 20:

Hough transform (3)

• Transformation and maxima search

Page 21:

FPGA coprocessor: Implementation of Hough transform

Page 22:

FPGA coprocessor prototype

• FPGA candidates
  – Altera Excalibur (256 kbyte SRAM)
  – Xilinx Virtex II (3.9 Mbit dual-port SRAM + 1.9 Mbit distributed SRAM, 420 MHz)
  – external high-speed SRAM

[Diagram: prototype board: PCI bus – PCI bridge – glue logic – FPGA coprocessor with SRAM and FEP RAM – DIU/SIU interfaces and cards – RCU]