

LUSI Controls and Data Acquisition

LCLS eXperimental End Stations (XES)

Gunther Haller


June 13, 2007

Slide 2

XES Slow Controls

Controls list exists for AMO; the corresponding list for LUSI still needs to be compiled.

Experiments have ~100 movers.

Ultra-high vacuum, power supplies, etc.

Items requiring special consideration:

Inhibit the next beam pulse from the XES via the Machine Protection System (MPS)

Use a chopper to rate-limit the beam

Synchronize the gas jet and laser to the beam pulse

Synchronize the mechanical chopper to the beam


Slide 3

Simplified Example: AMOS 4A (replace the beamline components for LUSI)

Block diagram: motion, vacuum-pump, camera, valve, and power-supply controllers (Allen-Bradley controllers, VME crates, Motorola 6100 PPC IOCs running RTEMS/EPICS) handle control on an XES subnet. An accelerator timing interface and an accelerator 120 Hz beam-data interface (also VME crates with Motorola 6100 PPC IOCs, RTEMS/EPICS) distribute triggers, timing, and beam data from the accelerator. DAQ modules and a CE module (PPCs, RTEMS/EPICS) provide the control and data path to the front-end electronics and send science data to the local cache and SCCS. The experimenter interacts through EPICS, MATLAB, SPEC, etc.

Slide 4

AMO for 4B

Slide 5

eXperimental End Station System Overview

Block diagram: the XES hutch contains the detector, instrument(s), hutch IOCs and instruments, experiment-specific IOCs and instruments, and the fast DAQ, with X-ray transport continuing to diagnostics and the next hutch. The linac/LCLS side (injector, linac, undulator, X-ray beam, XTOD) is operated from the MCC main control room through the linac/LCLS control GUI, with the MPG/LCLS event generator and the EPICS archive. Event receivers (EVRs) deliver hardware triggers, and a beam-line processor distributes 120 Hz beam-line and timestamp data over a dedicated Ethernet to a 120 Hz IOC and a 120 Hz archive engine. The XES controller, data-acquisition subsystem(s), visual data monitor, and experimenter sit on an XES private subnet behind a Channel Access gateway; digitized detector data flows to the XES data cache and on to SCCS (DAQ archive, slow and fast instrument archives, EELOG, other databases). Machine protection (MPS) ties the hutch back to the accelerator, and LCLS data retrieval and analysis/viewing are available from anywhere over the SLAC WAN.

Slide 6

XES Detail


Block diagram of the XES hutch data acquisition. Two detector types are shown in the experiment chamber: a spectrometer (TOF/momentum instrument read out through a cPCI ADC into an SBC DAQ) and a CCD pixel detector (front-end board read out over fast serial links into a CE-module DAQ with PPCs and a 10 Gb Ethernet uplink); both are configured through EPICS config PVs and feed spectrometer and CCD monitors. The XES controller carries an EVR VME card, a beam-line-data PMC, and an MPS PMC with FPGA; RTOS and non-RTOS SBC/PC nodes, the Channel Access gateway, and the visual data monitor sit on the XES private subnet, and most internal links are 1 Gb Ethernet.

Numbered and lettered data paths:

1. Channel Access (Ethernet)
2. Beam line 120 Hz data
3. EVR (fiber)
4. MPS (reflective-memory fiber)
5. SLAC WAN (Ethernet)
6. DAQ data to SCCS
A. EPICS & local control (hutch subnet)
B. Distributed EVR hardware triggers
C. Beam line & timestamp data (dedicated Ethernet)
D. ADC control & digitized data
E. Detector control & digitized data
F. DAQ data to cache
G. Visual monitor data

Slide 7

Experimental Data

Generation, storage, retrieval, and analysis of experimental data are the "product" of the LCLS.

LSST, if funded, will produce ~30 TB of data per night.

The AMOS experiment may eventually take data at 120 Hz from:

6 spectrometers at ~15 kB each

5 CCDs at ~1 MB each

That is ~700 MB/second, or ~2.4 TB/hour, or ~58 TB per 24-hour running period (a rough cross-check is sketched below).

Larger CCD densities are already planned.

LCLS experiments are relatively short-term and so will follow the technology more quickly than LSST.
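As a rough cross-check of these figures, a back-of-the-envelope calculation using the detector counts and per-event sizes from the bullets above; any overhead beyond the raw payload is not included, so the result comes out slightly below the ~700 MB/s quoted on the slide:

```python
# Back-of-the-envelope AMOS data-rate estimate (counts and sizes from the bullets above).
N_SPEC, SPEC_BYTES = 6, 15e3   # 6 spectrometers, ~15 kB each
N_CCD, CCD_BYTES = 5, 1e6      # 5 CCDs, ~1 MB each
RATE_HZ = 120                  # LCLS repetition rate

bytes_per_pulse = N_SPEC * SPEC_BYTES + N_CCD * CCD_BYTES
rate = bytes_per_pulse * RATE_HZ                 # bytes per second
print(f"{rate / 1e6:.0f} MB/s")                  # ~610 MB/s raw payload
print(f"{rate * 3600 / 1e12:.1f} TB/hour")       # ~2.2 TB/hour
print(f"{rate * 86400 / 1e12:.0f} TB per day")   # ~53 TB per 24-hour run
```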

Slide 8

Data Handling

Use scalable technology developed for LSST for CCD data.

Capitalize on years of BaBar experience with hierarchical storage and management of HEP data.

Many aspects of traditional HEP computing and data management are applicable to XES/LUSI.

Large parts of the BaBar data management system were developed at SLAC.

Significant expertise for this type of system exists at SLAC.

BaBar stores:

~1 TB/day of raw data

>>1 TB/day of derived data products

~1.5 PB of total BaBar data (old data purged)

Extend/integrate several SLAC-developed and EPICS Java-based technologies for data retrieval and analysis:

Java Analysis Studio (JAS)

Accelerator Integrated Data Access (AIDA)

EPICS Archive Viewer

Slide 9

Detector Intro

PAD (Pixel Array Detector), Cornell

6 x 6 array of tiles, 192 x 192 pixels per tile

One frame is 192 x 192 x 6 x 6 x 16 bits = 2.6 MB

At 120 Hz: ~312 MB/s

6 x 6 x 192 x 192 x 16 bits x 120 Hz = 2.54 Gbit/s serial transmission

One day: 24 x 3600 s x 2.5 x 10^9 bit/s ≈ 27 TB

But: will eventually go to ~1.3 GB/s (11 MB/frame)

XAMPS

1,024 x 1,024 array

Row-by-row readout: 64 readout I/Os

Eight 8-channel 20 MHz 14-bit ADCs + range bit

~2 GB/s raw data

~250 MB/s at 120 Hz (1024 x 1024 x 2 bytes x 120 Hz)
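A short script reproducing the arithmetic above; the geometries and bit depths are taken from the bullets, everything else is just unit conversion:

```python
# Detector data-rate arithmetic for the Cornell PAD and XAMPS (numbers from the bullets above).
RATE_HZ = 120

# Cornell PAD: 6x6 tiles of 192x192 pixels, 16 bits per pixel
pad_pixels = 6 * 6 * 192 * 192
pad_frame_bytes = pad_pixels * 2
print(f"PAD frame:     {pad_frame_bytes / 1e6:.1f} MB")                    # ~2.7 MB
print(f"PAD @ 120 Hz:  {pad_frame_bytes * RATE_HZ / 1e6:.0f} MB/s")        # ~318 MB/s
print(f"PAD serial:    {pad_pixels * 16 * RATE_HZ / 1e9:.2f} Gbit/s")      # ~2.55 Gbit/s
print(f"PAD per day:   {pad_frame_bytes * RATE_HZ * 86400 / 1e12:.0f} TB") # ~28 TB

# XAMPS: 1024x1024 pixels, 2 bytes per pixel
xamps_frame_bytes = 1024 * 1024 * 2
print(f"XAMPS @ 120 Hz: {xamps_frame_bytes * RATE_HZ / 1e6:.0f} MB/s")     # ~252 MB/s
# Rough upper bound on the raw ADC stream, assuming 16 bits per sample:
print(f"XAMPS raw ADC:  {64 * 20e6 * 16 / 8 / 1e9:.1f} GB/s")              # 64 I/Os x 20 MHz
```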




Slide 10

Cornell PAD Configuration

Block diagram: an EPICS CAS Linux box communicates with the CE box (PPC/RAM, EPICS), which connects over fiber lanes to the FE custom board (FPGA/decoder), an experiment-specific board without processor, RTEMS, or EPICS, which in turn drives the PAD electronics and PADs.

Configuration stored in the CE box:
Mode bits: reset / gain cal / CI / dark-image cal / data acquisition
Control bits: change in V/I or gain
Number of gain readings
Number of dark images
Number of CI readings
Number of charge injections
Gain mapping file
Frequency of image data
Integration window
Reference V/I settings file
Calibration-constant extraction algorithm
On-line monitoring parameters (frequency, etc.)
Image-processing code: for on-line monitoring and for off-line storage (if different)
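For illustration, the stored configuration above maps naturally onto a single record held in the CE box. A minimal sketch, where all field names and types are assumptions inferred from the list rather than the actual CE-box data structures:

```python
from dataclasses import dataclass, field

@dataclass
class PadConfiguration:
    """Hypothetical CE-box configuration record for the Cornell PAD (illustrative only)."""
    mode: str = "data_acquisition"   # reset / gain_cal / ci / dark_image_cal / data_acquisition
    change_vi_or_gain: bool = False  # control bit
    n_gain_readings: int = 0
    n_dark_images: int = 0
    n_ci_readings: int = 0
    n_charge_injections: int = 0
    gain_mapping_file: str = ""        # per-pixel gain map
    image_data_frequency_hz: float = 120.0
    integration_window: float = 0.0
    reference_vi_file: str = ""        # reference V/I settings
    calibration_algorithm: str = ""    # calibration-constant extraction algorithm
    online_monitoring: dict = field(default_factory=dict)       # frequency, etc.
    image_processing_code: dict = field(default_factory=dict)   # on-line vs. off-line variants
```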

Slide 11

Cornell PAD Configuration

(Same block diagram as the previous slide, now showing the PAD control-signal and data-readout path between the FE board and the PAD electronics.)

Configuration commands sent from the CE box to the FE board:
Command (address) + data field
Reference V/I (288 bytes)
Pixel gain (~750 kB)
Integration window?

Configuration stored in the CE box: same list as the previous slide.

Configuration read-back:
Configuration registers
Temperature (read only)


Slide 12

Cornell PAD Configuration

Most configuration is stored in the CE box.

PAD configuration (V/I/gain) is commanded from the CE box via the FE board to the PAD electronics.

The FE board just converts the command code and data field into PAD-specific low-level control signals.

No configuration is "stored" in the FE board.

For (non-destructive) configuration read-back, the FE board simply forwards the data received from the PAD up to the CE box.

(Block diagram, configuration-command list, stored-configuration list, and read-back list are the same as on the previous two slides.)
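The command-plus-data-field framing above suggests a simple address/payload protocol on the CE-box-to-FE-board link. A minimal sketch under that assumption; the command codes, header layout, and helper below are purely illustrative and not the actual PAD protocol:

```python
import struct

# Hypothetical command codes for the CE-box -> FE-board link (illustrative only).
CMD_SET_REFERENCE_VI = 0x01   # carries the 288-byte reference V/I block
CMD_SET_PIXEL_GAIN   = 0x02   # carries the ~750 kB pixel-gain map
CMD_READ_CONFIG      = 0x10   # no payload; FE board echoes configuration registers back

def pack_command(cmd: int, payload: bytes = b"") -> bytes:
    """Frame = 1-byte command (address) + 4-byte payload length + payload."""
    return struct.pack(">BI", cmd, len(payload)) + payload

# Example: download the reference V/I settings, then request a non-destructive read-back.
frames = [
    pack_command(CMD_SET_REFERENCE_VI, bytes(288)),
    pack_command(CMD_READ_CONFIG),
]
```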

Slide 13

Cornell PAD Calibration & Run Commands

(Same block diagram as the previous slides: EPICS CAS Linux box, CE box, fiber lanes, FE custom board, PAD electronics, with outputs to on-line monitoring and off-line storage.)

Data packaged and transmitted once per run:
Configuration common to the run

Data packaged and transmitted for each image:
LCLS beam parameters
Configuration not common to the run
Temperatures
Raw and/or processed image

Run control in the CE box:
Calibration mode, e.g.: iterate over n images, changing the configuration for the next image set
Beam-acquisition mode: receive the pre-beam signal from LCLS, then transmit a "Take Raw Image" command to the FE board
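A sketch of the per-run versus per-image packaging listed above; the record and field names are illustrative assumptions, since the slide does not specify a concrete layout:

```python
from dataclasses import dataclass
from typing import Optional
import numpy as np

@dataclass
class RunRecord:
    """Packaged and transmitted once per run (illustrative only)."""
    run_id: int
    common_configuration: dict      # configuration common to the run

@dataclass
class ImageRecord:
    """Packaged and transmitted for each image (illustrative only)."""
    run_id: int
    lcls_beam_parameters: dict      # per-pulse beam data from LCLS
    per_image_configuration: dict   # configuration not common to the run
    temperatures: list              # detector temperatures
    raw_image: Optional[np.ndarray] = None        # raw image, if kept
    processed_image: Optional[np.ndarray] = None  # processed image, if produced
```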


Slide 14

Cornell PAD Calibration & Run Commands (continued)

The run-control state machine runs in the CE box.

One raw image is produced for each "Read Raw Image" command sent from the CE box.

The FE board converts the "Read Raw Image" command into the low-level signals controlling the PAD and multiplexes the data up to the CE box (the low-level electrical-signal-generation state machine is implemented in VHDL).

(Same block diagram and run/image data-packaging lists as the previous slide, with two additions: the "Read Raw Image" run command sent from the CE box to the FE board, and the raw pixel data returned, one image per "Read Raw Image" command received.)
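A compact sketch of what the CE-box run-control state machine could look like; the two modes follow the description above, but every name, the trigger hook, and the FE-board interface are assumptions for illustration only:

```python
from enum import Enum, auto

class RunState(Enum):
    IDLE = auto()
    CALIBRATION = auto()       # iterate over n images, changing configuration per image set
    BEAM_ACQUISITION = auto()  # wait for the LCLS pre-beam signal, then request one raw image

class RunControl:
    """Hypothetical CE-box run-control loop (illustrative only)."""

    def __init__(self, fe_board, n_cal_images=10):
        self.fe_board = fe_board        # assumed to expose send_command(name, data=None)
        self.n_cal_images = n_cal_images
        self.state = RunState.IDLE

    def run_calibration(self, config_sets):
        """Calibration mode: iterate over image sets, reconfiguring between them."""
        self.state = RunState.CALIBRATION
        for config in config_sets:
            self.fe_board.send_command("configure", config)
            for _ in range(self.n_cal_images):
                self.fe_board.send_command("read_raw_image")
        self.state = RunState.IDLE

    def on_pre_beam(self):
        """Beam-acquisition mode: called when the LCLS pre-beam signal arrives (e.g. via EVR)."""
        self.state = RunState.BEAM_ACQUISITION
        self.fe_board.send_command("read_raw_image")   # one image per beam pulse
        self.state = RunState.IDLE
```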

Slide 15

Correction Discussion

First we need to understand what the correction really is: just offset and gain, or more?

The present plan seems to be to take dark images in separate runs and then take 120 Hz beam images.

Number of images read out in 8 ms: 1

6 x 6 x 192 x 192 = 1.3 Mpixel

Assume the data are split over 2 fibers (PPCs): 650 kpixel per PPC

Assume a subtraction and a multiplication per pixel (offset & gain correction)

About 4 instructions per pixel (without much effort in pipelining)

Note that in our case (CE box) we are not processor-I/O limited for algorithms such as calibration, as most commercial systems are.

Need 4 memory accesses per pixel, or 4 x 650k x 2 bytes x 120 Hz ≈ 0.625 GB/s

The CE has 12 GB/s of processor-to-memory bandwidth.

The CE box can manage (just with C++ on the PowerPC):

8 ns/pixel, i.e. 1 Mpixel in 8 ms (only 650 kpixel are needed)

Or, allowing 12 ns per pixel: more than 5 instructions per pixel

But, if there is still concern:

The 6 x 6 array does not have to use only 2 fibers

There is plenty of processing margin

Eventually the system will go to 4x readout; the approach is scalable

Bottom line: this looks OK in C++ on RTEMS, which is easier to develop, modify, and maintain.

There is no requirement for VHDL or DSPs, although those options are available in the CE box.
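For concreteness, the offset-and-gain correction discussed above is one subtraction and one multiplication per pixel. A minimal NumPy sketch of the operation and of the memory-traffic accounting; the array shapes follow the 6 x 6 x 192 x 192 geometry, while the NumPy implementation itself is only illustrative (the CE box would do this in C++ on the PowerPC):

```python
import numpy as np

# 6x6 tiles of 192x192 pixels, flattened: ~1.3 Mpixel per frame
N_PIXELS = 6 * 6 * 192 * 192
RATE_HZ = 120

rng = np.random.default_rng(0)
raw  = rng.integers(0, 2**14, N_PIXELS).astype(np.float32)  # raw ADC counts
dark = rng.integers(0, 100, N_PIXELS).astype(np.float32)    # dark image (offset)
gain = rng.uniform(0.9, 1.1, N_PIXELS).astype(np.float32)   # per-pixel gain map

# Offset & gain correction: one subtraction and one multiplication per pixel.
corrected = (raw - dark) * gain

# Memory-traffic estimate following the slide's accounting (4 accesses of 2 bytes per pixel).
traffic = 4 * N_PIXELS * 2 * RATE_HZ            # bytes per second for the full array
print(f"~{traffic / 1e9:.2f} GB/s total")       # ~1.27 GB/s; ~0.64 GB/s per PPC over two fibers
```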





Slide 16

Science Processing

Processing can be done in:
the CE box
the online farm
SCCS

There may be advantages to performing some processing in the CEs:
Large CPU-to-memory bandwidth
VHDL can be used in the FPGA (~200 DSP blocks available within the FPGA)
It depends on what processing needs to be performed
Needs further work

(Diagram: the EPICS CAS Linux boxes and the CE box (PPC/RAM) feed an online processing farm with local storage over 10 Gb/s fiber, which in turn feeds SCCS for off-line storage and processing.)

Slide 17

CE Pizza Box

Block diagram of the CEM pizza box: an FPGA hosts two CE units (CE 1 and CE 2), each with a PPC plus VHDL and its own RLDRAM, and each connected to an FE board (FE board 1 and FE board 2) over PGP/Aurora 2.5 Gb/s fibers. External connections per CE: an EVR timing fiber, LVDS I/O for the synchronization signal from the EVR, 100 Mb/1 Gb Ethernet for control and monitoring, 1 Gb Ethernet for beam parameters, and 10 Gb Ethernet for science data.

Slide 18

Appendix


Slide 19

Archive Data Management System Overview

Block diagram: archive configuration interfaces, reachable from anywhere on the Channel Access networks and on the SCRAMNet network, configure the slow archive engines, the fast archive engine, and the data-acquisition module(s) in the XES hutch. Archived data lands in the slow, fast, and DAQ archives at SCCS, alongside the XES DAQ data cache, the modeling and other databases, and the global archive dictionary, under global archive configuration management spanning LCLS operations and experiment operations.

Numbered data flows:

1. EPICS Channel Access
2. DAQ archive data
3. Slow archive data
4. Fast archive data
5. DAQ archive configuration
6. Slow archive configuration
7. Fast archive configuration
8. Offsite/cache archive transfer
9. Global archive dictionary management

Slide 20

Data Volume Makes DAQ Archive More Complicated

Diagram: the SBC and CE DAQ nodes connect over 1 GigE and 10 GigE links to Ethernet network switches, which feed a local cache farm and 10 GigE links to the computer center.

Slide 21

Archive Retrieval/Analysis System Overview

Block diagram: LCLS analysis/viewing clients anywhere go through LCLS data retrieval, which reads the global archive dictionaries, issues data-retrieval requests, locates the appropriate server and storage, and pulls data from the slow-archive, fast-archive, DAQ-archive, and relational-database servers at SCCS (including tape-data requests), staging results in a data-retrieval temporary cache.

Numbered data flows:

1. Global archive dictionary read
2. Data retrieval request
3. Locate server & storage
4. Tape data request
5. Database data
6. Slow archive data
7. Fast archive data
8. DAQ archive data
9. Results