USCMS HCAL
Electronics Issues
Drew Baden, University of Maryland
CMS Electronics Week, November 2002

Page 1: USCMS HCAL

Electronics Issues
Drew Baden
University of Maryland

USCMS HCAL

Page 2: USCMS HCAL

FE/DAQ Electronics

[Block diagram: on-detector front end — the RBX (Readout Box) behind the shield wall, with HPD and FE modules (QIE/CCA/GOL channels) — sends fibers at 1.6 Gb/s, 3 QIE channels per fiber, to the read-out crate (12 HTRs per readout crate, 2 DCCs). The HTRs send trigger primitives (32 bits @ 40 MHz, 16 bits @ 80 MHz) to the CAL REGIONAL TRIGGER; the DCC sends data to DAQ over S-Link at 64 bits @ 25 MHz; TTC provides the clock (CLK); SBS links the crate to the rack CPU.]

Page 3: USCMS HCAL

Readout VME Crate

• "BIT3" board
  – Slow monitoring over VME
  – Commercial VME/PCI interface to CPU

• FanOut board
  – Takes TTC stream in
  – Clones and fans out timing signals

• HTR (HCAL Trigger and Readout) board
  – Spy output over VME
  – FE fiber input
  – TPG output (SLBs) to CRT
  – DAQ/TP data output to DCC

• DCC (Data Concentrator Card) board
  – Input from HTRs
  – Spy output
  – Output to DAQ

[Diagram: VME crate holding BIT3, FanOut, HTRs, and DCCs; fiber at 1.6 Gb/s from the front-end electronics into the HTRs; 20 m copper at 1.2 Gb/s from the HTRs to the Calorimeter Regional Trigger; TTC fiber into the FanOut; DCC output to DAQ.]

Page 4: USCMS HCAL

HCAL Racks

• HCAL will need 8 racks, 2 crates/rack
• ~200 HTR cards
• ~3000 fibers and ~525 SLBs with TPG cables
• All I/O via front panels
• Doors on front of rack required
  – Sufficient standoff to satisfy the fiber-curvature requirement
  – Keeps people from pulling out the fibers
• Two 6U panels for cable/fiber support
• Computer access in front of rack for fiber/TPG installation
  – Wireless in counting room?
  – Laptop/monitor/keyboard mounted somewhere close?

Page 5: USCMS HCAL

TPG Cable Issues

• Amphenol skew-clear cables work OK at 20 m
  – Skew spec ~125 ps @ 20 m running at 1.2 Gbaud
  – Eye pattern will survive, BER = 10^-15
  – Each cable carries 2 pairs – need 2 cables per SLB connector
  – $100/cable + ~$150 for assembly/testing (custom connector molding)
  – Electrically these are very nice cables, but…
  – Formidable mechanical challenges – 6 of these beasts per HTR!
• We are investigating quad cable, much thinner
  – Single cable, ~$180 for 20 m
  – Would not require custom molding – much cheaper, ~$30 for assembly
  – However… skew is almost 2× worse for 20 m (230 ps)
  – Amphenol spec says this will give 10^-15 BER for 15 m @ 1.6 Gbaud
  – They were not clear about 1.2 Gbaud – we will measure
• If at all possible, a 15 m spec will:
  – Save money (~$100k)
  – Give breathing room on BER
  – Save 1 clock tick in L1 latency
  – Decrease mechanical risks on all boards
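As a back-of-the-envelope check (not on the slide), the quoted skew numbers can be compared against the unit interval at 1.2 Gbaud; expressing skew as a fraction of a bit is an illustrative figure of merit, not Amphenol's spec method:

```python
# Compare pair-to-pair skew against the unit interval (UI) at 1.2 Gbaud.
ui_ps = 1e12 / 1.2e9            # one bit at 1.2 Gbaud: ~833 ps

skew_clear_ps = 125             # skew-clear cable @ 20 m (from the slide)
quad_ps = 230                   # candidate quad cable @ 20 m (~2x worse)

skew_clear_ui = skew_clear_ps / ui_ps   # ~0.15 UI
quad_ui = quad_ps / ui_ps               # ~0.28 UI of eye closure from skew alone
```

The roughly doubled UI fraction for the quad cable is consistent with the slide's concern that it only meets the 10^-15 BER spec at 15 m.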

Page 6: USCMS HCAL

VME Rack Layout

• 56U total rack height, 55U used
  – Note: can recover 3U by using 1U exchangers

• Rack computer (3U)
  – Air circulation has to be front → back, à la DAQ crate
• Recirculation/monitoring (4U)
• Extra heat exchanger
• 2 VME crate zones:
  – Cable support (6U) – front panel only; fibers and TPG cables are formidable
  – VME crate (9U)
  – Air/water heat exchanger (2U)
  – Fan tray (2U)
• Power supply zone (6U)
  – Cheaper, robust, safe; as in D0/CDF
  – Air transport issue here
  – Will have to build wire harness
  – Put AC circuit breakers here?
• Return air guide (2U)

[Rack layout diagram, top to bottom: Return Air Guide 2U; then two identical VME crate zones, each consisting of Air/Water Heat Exchanger 2U, a 9U VME crate (SBS, TTC fanout, 13 HTRs, 2 DCCs), Fan Tray 2U, and Cable Strain Relief 6U; then Air/Water Heat Exchanger 2U; Recirculation Fan & Rack Protection 4U; Rack Computer (dual Intel/Linux) 3U; Power Supply Zone 6U.]

Page 7: USCMS HCAL

Power Consumption Estimates

• VME crate ~1.2 kW (2 crates/rack only)
• HTR: 70 W/slot
  – 7 A @ 5 V = 35 W
  – 11 A @ 3.3 V ≈ 36 W
  – Includes 6 SLBs, but many cards will have fewer
  – 13 or fewer HTRs/crate = 910 W
• Fanout card: ~20 W/slot
  – 0.5 A @ 5 V = 2.5 W
  – 4.5 A @ 3.3 V ≈ 15 W
• DCC: ~60 W/double slot
  – 5 A @ 5 V = 25 W
  – 10 A @ 3.3 V = 33 W
  – S-Link64 current draw is a wild guess
  – 2 DCCs/crate = 120 W
• Add power for rack computer, monitor, and fans to get rack power: 1 kW max
• Total power dissipated by entire rack ~3.5 kW
  – Note: current CMS power consumption is ~2 kW/crate, >6 kW/rack
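The per-card current draws above can be rolled up into the crate and rack totals; this is a sketch of the slide's own arithmetic (the 1 kW computer/fan allowance is the slide's stated maximum):

```python
# Rack power budget from the per-slot current draws quoted above.
def card_watts(draws):
    """Sum (amps, volts) pairs into a per-slot power figure."""
    return sum(amps * volts for amps, volts in draws)

htr = card_watts([(7.0, 5.0), (11.0, 3.3)])     # ~71 W/slot
fanout = card_watts([(0.5, 5.0), (4.5, 3.3)])   # ~17 W/slot
dcc = card_watts([(5.0, 5.0), (10.0, 3.3)])     # ~58 W/double slot

crate = 13 * htr + fanout + 2 * dcc             # 13 HTRs + fanout + 2 DCCs: ~1.1 kW
rack = 2 * crate + 1000                         # 2 crates/rack + 1 kW computer/fans
```

The result (~3.1 kW) is comfortably under the slide's ~3.5 kW envelope and well below the >6 kW/rack CMS figure.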

Page 8: USCMS HCAL

Production Schedule

• Front-End
  – CCM: Jun – Sep 03
  – FE cards: Jul – Oct 03
  – RBX: Aug 03 – May 04
  – HPD deliveries from now until Apr 04

• HTR
  – Pre-production Jan 03; production Apr 03 – Sep 03

• DCC
  – Motherboards nearly complete; logic cards by Aug 03
  – Awaiting final specs on S-Link64

• Fanout card
  – Complete soon after QPLL, Q2/03

Page 9: USCMS HCAL

HCAL TriDAS Integration Status

• First integration completed, summer 02
  – FE → HTR → DCC → S-Link → CPU
  – All links well established
  – No obvious clocking problems
  – Work needed on synch monitoring and reporting
  – Improvements expected using a crystal for the TI refclk
  – Will always have the TTC/QPLL clock as backup…
• HTR firmware fairly mature
  – Switch to Virtex-2 all but complete
• TPG and BCID ready but not tested
  – To commence when the next HTR version and the Wisconsin TPG boards are delivered (est. Q1 2003 for testing to commence)
  – Will be the main effort when the next HTR version arrives, Jan 2003

Page 10: USCMS HCAL

HTR Production Schedule

[Gantt chart, Oct 2002 – Dec 2005: firmware; board layout; fab/assembly (20 boards will be built but not assembled); pre-production HTR board; checkout; board layout and fab/assembly if needed; production prototype; checkout; production; testbeam?; vertical slice?; SLB production.]

Issues:
• Parts availability for HTR
  – Stratos LCs, FPGAs, etc. – so far so good; should make the schedule OK
  – QPLL not needed for the HTR, since we have a Fanout card per crate
• Firmware requirements
  – Will learn a lot in 2003… very far along now

Page 11: USCMS HCAL

Integration Goals 2003

• Continued development of HTR and DCC firmware

• Commission TPG path
  – Firmware requirements/logic, LUTs, synchronization, SLB output…
  – Monitoring, error reporting, etc.
• Preliminary US-based integration at FNAL, Q1/03
  – Full system as in the previous testbeam
  – Except TPG, which will be done initially at UMD and moved to FNAL if appropriate
• Testbeam in the summer (to begin in spring)
  – Same goals as summer 02
  – Support the calibration effort and continue commissioning the system
• Operate a "vertical slice" for an extended period, Q4/03
  – Fully pipelined: monitoring, TPG, DAQ, synchronization, clocking…
• Develop software to support DAQ activities
  – Testbeam software improvements
  – Software for commissioning the HTR needed
  – Allow us to verify fiber mapping, download LUTs, firmware version, etc.
• By end of 2003 we will have most of the HCAL TriDAS functionality

Page 12: USCMS HCAL

Installation Schedule

[Schedule chart, by quarter 2003–2006: production; testbeam; vertical slice; racks and crates installed in USC; HCAL alcove tests (HB/E/O?); TriDAS integration; 1st integration with L1; HB in UXC; cable HB/F; HF in UXC; HF mated; cable HF; HE+ and HE– in UXC; cable HE; HO in UXC. With no detector, rely on the emulator for further commissioning/debugging.]

Page 13: USCMS HCAL

Installation Requirements

• Production cards will be available, all systems
• Front-end emulator will be important
  – No other way to light up the fibers during installation
  – Design very close to the actual front-end card (GOL, not TI)
  – Built by FNAL, with close interaction with UMD on the board
  – UMD firmware
• The HCAL mapping nightmare will have to be implemented very carefully
• Will need to be able to connect to the rack CPU from inside the shield wall as we plug the fibers in one at a time
• Will need audio communication between operators inside the shield wall and at the VME racks

Page 14: USCMS HCAL

HCAL Installation

• We have a modest amount of stuff to be installed in USC:
  – 8 VME racks, 16 crates
  – ~3000 fibers into the front of the HTR cards (fibers laid by CERN personnel?)
  – ~525 TPG cables from the HTRs to the RCT
• We will provide a technician for installation
• We will have 3 senior physicists at CERN:
  – Pawel de Barbaro, Laza Dragoslav, Dick Kellogg
• Other personnel: post-doc(s), student(s), US-based physicist(s)…
• In USC:
  – Need a place for someone to sit in front of the HTRs while fibers are being plugged in
  – Access via the HCAL rack computer for VME access to cards
  – Wireless in the counting house? Mounted monitor/keyboard to interface with the rack computer? Both might be good…
• Will there be cabinets, work benches, etc.?

Page 15: USCMS HCAL

Installation Manpower Estimates

• Drawing on D0 Level 2 experience for the current Tevatron Run 2a…
• Each significant card requires on-site expertise:
  – Probably 1–2 postdoc-level (or above) and 1 engineer
  – Maybe the same engineer for both DCC and HTR…
• HCAL will have an electronics setup at CERN
• Total personnel estimate:
  – Front End: 1
  – HTR: 2
  – DCC: 2
  – Miscellaneous (grad students, transients, etc.): maybe 4?
• Very difficult to say with any accuracy

Page 16: USCMS HCAL

HCAL Clocking

• System goals:
  – FE fiber physical-layer synchronization locking
  – FPGA clock phase-locked with the LHC clock
  – Be able to achieve TPG alignment
  – Keep track of and handle BC0/EVN
  – Correct tagging of the L1A bucket inside the Level 1 pipeline
• Known issues:
  – Random 4–5 clock latency within the TI deserializer
  – Quality of TTC/QPLL clock jitter
  – Whether we can use crystals for the TI refclk
• Unknown issues:
  – Good point!

Page 17: USCMS HCAL

FE Clocking

• TTCex fiber input to the CCM
  – Agilent fiber receiver + TTCrx chip + QPLL
  – 40 MHz clean clock converted to PECL
• 40 MHz clean PECL clock driven by a 1:9 clock driver onto the backplane to the FE module

Page 18: USCMS HCAL

FE Link

• Issue:
  – FE uses GOL Tx and TI SerDes Rx (TLK2501)
  – TLK2501 requires:
    • Refclk jitter < 40 ps pk-pk (equivalent to 6.5 kHz bandwidth on the PLL)
    • Frequency offset < ±100 ppm (equivalent to ±4 kHz on fLHC)
  – NB: commercial applications always use crystals
• Solutions:
  – Use a crystal for the refclk, or…
  – QPLL, jitter spec < 50 ps

http://proj-qpll.web.cern.ch/proj-qpll/qpllHome.htm
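The ±4 kHz figure follows directly from the ±100 ppm tolerance applied to the LHC bunch-crossing frequency; a quick sanity check (the 40.0789 MHz value is taken from the crystal spec later in this talk):

```python
# Frequency-offset tolerance of the TLK2501 refclk, applied to fLHC.
f_lhc = 40.0789e6              # LHC bunch-crossing frequency, Hz

offset_hz = 100e-6 * f_lhc     # +/-100 ppm -> roughly +/-4 kHz window
```

Any refclk source (crystal or QPLL-cleaned TTC clock) must sit inside this window for the deserializer to lock.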

Page 19: USCMS HCAL

HTR Schematic

[HTR board schematic: fiber inputs through LC optical receivers into TI deserializers, eight per FPGA, feeding two Xilinx XC2V FPGAs (8-way each); six SLBs; output to the DCC via P1 and LVDS to the Level 1 Cal Trigger via P2 — no P3! A separate VME FPGA handles the VME interface.]

Page 20: USCMS HCAL

Clocking Schematic

[Clocking schematic: a single-width VME TTC Fanout Board carries a TTC mezzanine (TTCrx), a QPLL, and an 80 MHz LVPECL crystal; 1-to-8 fanouts distribute TTC, BC0, 40 MHz, and 80 MHz over Cat 6/7 quad cable (allows LVDS/PECL) to each HTR, where they reach the six SLBs, the 16 TI deserializers, and the FPGA; the HTR's TTC mezzanine drives the TTC broadcast bus.]

• Start with the Fanout card
  – TTCrx Maryland mezzanine card or CERN TTCrm daughterboard
  – QPLL
  – Fanout on Cat6/7 quad twisted pair: TTC, BC0, 40 MHz, 80 MHz
• In the HTR:
  – Send the TTC signal to the TTCrx mezzanine board, for access to all TTC signals
  – Send the 80 MHz clean clock (cleaned by the QPLL) to a mux
  – Select the 80 MHz clean clock OR the crystal for the TI deserializers


Page 21: USCMS HCAL

HCAL TriDAS Clock Scheme

[Clock-scheme diagram: the Fanout Card (TTCrx + QPLL) drives 4 twisted pairs — TTC, BC0, CC40, CC80, where 'CC' means Clean Clock — over Cat6/7 with RJ45 connectors to each HTR board; on the HTR, a TTC mezzanine distributes the TTC broadcast, L1A, BCR, EVR, and CLK40 to the Xilinx FPGAs and SLBs.]

Page 22: USCMS HCAL

Fanout – HTR scheme

[Fanout-board/HTR clocking schematic. Fanout board: O/E receiver into a TTCrx (or daughter card) plus QPLL; low-jitter fanout ×15 of CLK40 and CLK80 in 3.3 V PECL (MC100LVE310; NB100LVEP221 is LVDS-compatible; cf. AN1568/D Fig. 11, onsemi.com); TTC and BC0 fanned out ×15 as LVDS (e.g. DS90LV110); ~15 RJ45 connectors (cables and connectors TBD, possibly on the bottom layer) over Cat6E or Cat7 cable; 2 test points for CLK40 and BC0; an FPGA fans out Brdcst<7:2>, BrcstStr, BCntRes, and L1A. HTR side: a DS90LV001 buffer and PCK953 LVPECL-to-LVTTL fanouts (top layer) drive 8 clocks to the TLKs plus test points; differential to the 6 SLBs and to the 2 Xilinx FPGAs (with termination, single-ended to the Xilinx); an 80.0789 MHz 3.3 V differential-PECL crystal with an MC100LVEL37 (CK, CK/2) provides the alternative refclk; the TTC daughter card delivers Brdcst<7:2>, BrcstStr, L1A, and BCntRes to the Xilinx FPGAs and SLBs. 9U front-panel space = 325 mm → ~21.5 mm per connector.]

Notes: The SLBs require fanout of CLK40 and BC0. The FE link possibly requires CLK80. The PECL fanout was tested in TB2002. One Cat6E cable (low crosstalk) replaces the 2 Cat5 cables used in TB2002. TTC and BC0 remain LVDS, as in Weiming's board. The HTR needs the broadcast bus, BCntRes, and L1A: from the TTCrx if we get it to work; otherwise we have to fan them out.


Tullio Grassi <[email protected]>

Page 23: USCMS HCAL

TTCrx Mezzanine card

• Very simple card:
  – 2 PMC connectors
  – TTCrx chip
  – TTC signal receiver and driver on the motherboard

• Used by the HTR, DCC, and Fanout cards

Page 24: USCMS HCAL

TTC Distribution – Fanout Card

• Currently HCAL has 6 TTC partitions
  – Each partition requires a TTCvi and TTCex
• Each HCAL VME crate will have a single TTCrx receiving data directly from the TTCex, on a single VME card (the Fanout Card)
  – Fans the TTC signal out to the HTR mezzanine cards carrying TTCrx chips
  – Uses quad twisted-pair Cat6/7 cable
  – TTC and BC0 fanout using LVDS
  – Also fans out the 40 and 80 MHz clean clocks over LVPECL
  – Cost savings and simplification
• TTC monitoring by the Fanout card over VME

Page 25: USCMS HCAL

TTC Monitoring

• Chris Tully has built a very nice TTC monitoring board:
  – 6U VME form-factor board
  – Needs only 5 V power, so it could be used as a standalone monitor with an appropriate battery
  – Hosts a TTCrm module
• Front-panel LEDs display:
  – TTC system activity
  – History of broadcasts
  – Event counter/bunch counter values
• Useful for debugging and monitoring

Page 26: USCMS HCAL

TTC Display Board

[Figure: the TTC display board]

Page 27: USCMS HCAL

Random Latency Issue

• Texas Instruments TLK2501 SerDes
  – Runs with an 80 MHz frame clock – 20 bits/frame, 1.6 GHz bit clock
  – 625 ps bit time
  – Latency from the data sheet: "The serial-to-parallel data receive latency…fixed once the link is well established. However…variations due to… The minimum…is 76 bit times…the maximum is 107 bit times…"
  – Latency is therefore 47.5 to 66.875 ns – a 19.4 ns spread that could cross a 40 MHz bucket boundary!
• How to fix? Two ways:
  – The SLB "knows" this latency – we will read it out after each reset
  – The HCAL LED has a fast rise time
    • Can pulse during the abort gap and align channels
    • Requires LED pulsing alignment
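The latency window follows directly from the datasheet's 76–107 bit-time range at the 1.6 Gb/s line rate; a short check of the numbers above:

```python
# TLK2501 receive-latency window from the datasheet figures quoted above.
bit_ns = 1 / 1.6               # 625 ps per bit at 1.6 Gb/s

lat_min = 76 * bit_ns          # 47.5 ns
lat_max = 107 * bit_ns         # 66.875 ns
spread = lat_max - lat_min     # 19.375 ns
```

Since the spread is close to one full 25 ns bunch-crossing period, the post-reset latency can land on either side of a bucket boundary, which is why it must be read back from the SLB (or calibrated with LED pulses) after every reset.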

Page 28: USCMS HCAL

TPG Alignment

• TPG alignment performed in the SLB
  – Necessary: all HTRs will send a common BC0 to the SLBs within each of the 16 VME crates
• Calibration procedure to be performed for crate-to-crate alignment
  – Initial alignment with LEDs, laser, etc.
  – Final alignment with LHC first-beam data
• CMS should consider pushing for initial beam with only 1 bucket populated
  – This would ensure successful alignment

Page 29: USCMS HCAL

DAQ Alignment

• DAQ data must also be aligned
  – Must know the L1A bucket for zero suppression
• Solution: as discussed on the previous slide
  – Read from the SLB
  – FE sends a known ID with a fixed offset relative to BC0 during the abort gap
  – Compare the two for error checking
• DAQ check on BC0 in the DCC for alignment
  – Will send BC0 and EVN with the data to DAQ

Page 30: USCMS HCAL

MISC Errors

• What happens if the DCC finds a mismatch in the EVN?
  – The DCC will then issue a reset request to the aTTS system
  – Details not yet defined, but it is fully programmable
• Fiber link/synchronization errors (GOL/TI)
  – Work out protocols to inform the DCC
  – Reset requests to aTTS as well
• FE clock/GOL PLL link errors
  – If the GOL loses synch, the transmitter will send out IDLE characters
  – IDLE characters are illegal in a pipelined system!
  – The HTR will trap on IDLE as a signal that the FE/GOL is having trouble