
LHC Experiments' Software

Lucia Silvestris
INFN - Bari

LISHEP 2006
International School on High Energy Physics
Rio de Janeiro - Brazil

Large Hadron Collider & Experiments

The startup

- Large Hadron Collider: 27 km in circumference
- The trigger is a challenging task for the LHC!



LHC Detector Requirements

- Very good electromagnetic calorimetry for electron and photon identification (H -> gamma gamma)
- Good hadronic calorimetry for jet reconstruction and missing transverse energy measurement
- Efficient, high-resolution tracking for particle momentum measurements, b-quark tagging, tau tagging, and vertexing (primary and secondary vertices)
- Excellent muon identification with precise momentum reconstruction



A Generic Multipurpose LHC Detector

[Cross-section of a generic detector showing the signatures of muons, electrons, hadrons, neutrinos and photons]

About 10 interaction lengths of material are needed to shield the muon system from hadrons produced in p-p collisions.



Experiments at LHC

- CMS: Compact Muon Solenoid
- ATLAS: A Toroidal LHC ApparatuS
- LHCb: study of CP violation in B-meson decays at the LHC
- ALICE: A Large Ion Collider Experiment




LHC startup plan

- Stage 1, initial commissioning: 43x43 to 156x156 bunches, N = 3x10^10, zero to partial squeeze, L = 3x10^28 - 2x10^31 cm^-2 s^-1
- Stage 2, 75 ns operation: 936x936 bunches, N = 3-4x10^10, partial squeeze, L = 10^32 - 4x10^32 cm^-2 s^-1
- Stage 3, 25 ns operation: 2808x2808 bunches, N = 3-5x10^10, partial to near full squeeze, L = 7x10^32 - 2x10^33 cm^-2 s^-1



Pilot Run: Luminosity

- 30 days, maybe less (?); 43x43 bunches, then 156x156 bunches
- Assumes an LHC efficiency of 20% (optimistic!)

[Plot: luminosity (10^30 cm^-2 s^-1), integrated luminosity (pb^-1) and events/crossing over the ~30 days of the pilot run; the luminosity climbs from ~10^28 to ~10^31 cm^-2 s^-1 and the integrated luminosity from ~0.1 to ~10 pb^-1, with increasing pile-up]



Startup plan and Software

- Turn-on is fast
- Pile-up increases rapidly
- Timing evolution (43x43 bunches to 75 ns to 25 ns)
- LOTS of physics

For all detectors:
- Commission detector and readout
- Commission trigger systems
- Calibrate/align detector(s)
- Commission computing and software systems
- Rediscover the Standard Model

Software needed: simulation, reconstruction, trigger, monitoring, calibration/alignment (calculation and application), user-level data objects, selection, analysis, visualization, and software development tools.

LHC startup: CMS/ATLAS

Integrated luminosity with the current LHC plans (2008 run)

[Plot: luminosity (10^30 cm^-2 s^-1), integrated luminosity (pb^-1) and events/crossing over the first ~20 weeks; the luminosity rises from ~10^31 to ~10^33 cm^-2 s^-1 and about 1.9 fb^-1 is accumulated, with physics milestones marked along the way: top re-discovery, Higgs (?), Z' -> muons, SUSY]

- Assumes an LHC efficiency of 30% (optimistic!)
- ~1 fb^-1 (optimistic?)



Physics Startup plans

- ALICE: minimum-bias proton-proton interactions
  - Standard candle for the heavy-ion runs
- LHCb: B_s mixing, repeat the sin2β measurement
  - If the Tevatron has not done it already
- ATLAS/CMS: measure jet, Z and W production; 15 pb^-1 will yield 30K W's and 4K Z's into leptons
  - Measure cross sections and the W and Z charge asymmetry (PDFs; top!)
  - Luminosity?



Startup physics (ALICE)

Multiplicity paper outline:
- Introduction
- Detector system: Pixel (& TPC)
- Analysis method
- Presentation of data: dN/dη and multiplicity distribution (√s dependence)
- Theoretical interpretation: ln²(s) scaling?, saturation, multi-parton interactions, ...
- Summary

p_T paper outline:
- Introduction
- Detector system: TPC, ITS
- Analysis method
- Presentation of data: p_T spectra and p_T-multiplicity correlation
- Theoretical interpretation: soft vs. hard, mini-jet production, ...
- Summary

Two papers can be published 1-2 weeks after LHC startup.

Where are we?

Common Software



LCG Application Area

Deliver the common physics applications software for the LHC experiments (http://lcgapp.cern.ch/)

Organized to ensure focus on real experiment needs:
- Experiment-driven requirements and monitoring
- Architects in management and execution
- Open information flow and decision making
- Participation of experiment developers
- Frequent releases enabling iterative feedback

Success is defined by adoption and validation of the products by the experiments:
- Integration, evaluation, successful deployment



Software Domain Decomposition

- Core: PluginMgr, Dictionary, MathLibs, I/O, Interpreter, GUI, 2D Graphics, Geometry, Histograms, Fitters
- Simulation: Foundation, Utilities, Engines, Generators
- Data Management: Persistency, FileCatalog, Framework, DataBase
- Grid Services: Batch, Interactive
- Other components shown: OS binding, 3D Graphics, NTuple, Physics, Collections, Conditions
- Experiment Frameworks: Simulation Program, Reconstruction Program, Analysis Program, Event, Detector, Calibration, Algorithms



Simplified Software Decomposition

[Layered diagram: Applications sit on top of the Experiment Frameworks; Core Libraries, Simulation, Data Management and Distributed Analysis form the common software layer; non-HEP-specific software packages are at the base]

- Every experiment has a framework for basic services and various specialized frameworks: event model, detector description, visualization, persistency, interactivity, simulation, etc.
- Many non-HEP libraries are widely used
- Applications are built on top of the frameworks and implement the required algorithms
- Core libraries and services are widely used and provide basic functionality
- Specialized domains are common among the experiments (Common SW vs. Experiment SW)



Application Area Projects

- ROOT - Core Libraries and Services
  - Foundation class libraries, math libraries, framework services, dictionaries, scripting, GUI, graphics, SEAL libraries, etc.
- POOL - Persistency Framework
  - Storage manager, file catalogs, event collections, relational access layer, conditions database, etc.
- SIMU - Simulation project
  - Simulation framework, physics validation studies, MC event generators, Garfield, participation in Geant4 and Fluka
- SPI - Software Process Infrastructure
  - Software and development services: external libraries, Savannah, software distribution, support for build, test, QA, etc.



ROOT: Core Library and Services

ROOT activity at CERN is fully integrated in the LCG organization (planning, milestones, reviews, resources, etc.)

The main change during the last year has been the merge of the SEAL and ROOT projects:
- Single development team
- Adiabatic migration of the software products into a single set of core software libraries
- 50% of the SEAL functionality has been migrated into ROOT (MathLib, reflection, Python scripting, etc.)

ROOT is now at the "root" of the software for all the LHC experiments

Web page: http://root.cern.ch/



ROOT: Core Library and Services

Current work packages (SW components):
- BASE: foundation and system classes, documentation and releases
- DICT: reflection system, meta classes, CINT and Python interpreters
- I/O: basic I/O, Trees, queries
- PROOF: parallel ROOT facility, xrootd
- MATH: mathematical libraries, histogramming, fitting
- GUI: graphical user interfaces and object editors
- GRAPHICS: 2-D and 3-D graphics
- GEOM: geometry system
- SEAL: maintenance of the existing SEAL packages

Web page: http://root.cern.ch/



ROOT: I/O

Recent developments of ROOT I/O and Trees

General I/O:
- STL collections
- Data compression using reduced precision (save space while keeping sufficient precision)
- Alternatives to default constructors
- Other I/O improvements
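As an illustration of the reduced-precision option, here is a minimal sketch (assuming ROOT's Double32_t typedef, which is stored as a 32-bit float on disk while appearing as a double in memory; the class itself is made up):

#include "TObject.h"

// Made-up event-data class, for illustration only.
class MyHit : public TObject {
public:
   Double_t   fTime;     // stored with full double precision
   Double32_t fEnergy;   // stored as a 32-bit float on disk: saves space
   ClassDef(MyHit, 1)    // ROOT dictionary/version macro
};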



ROOT: I/O - TTree Extensions

New features:
- Fast merging
- Indexing of TChains
- TTree interface enhancements
- TRef and pool::Reference

Browsing extensions:
- Split objects, unsplit objects, collections
- Can now also see simple member functions, transient members and persistent members

Main focus:
- Consolidation (thread safety)
- Generic object reference support

Important features requested by the experiments have been implemented.
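To make the TTree machinery concrete, here is a minimal sketch of writing a tree and reading it back through a TChain, using standard ROOT calls (file, tree and branch names are made up):

#include "TFile.h"
#include "TTree.h"
#include "TChain.h"

void write_and_read()
{
   // Write: create a tree with one branch and fill a few entries
   TFile out("events.root", "RECREATE");
   TTree tree("Events", "demo tree");
   double energy = 0;
   tree.Branch("energy", &energy, "energy/D");
   for (int i = 0; i < 100; ++i) { energy = 0.5 * i; tree.Fill(); }
   tree.Write();
   out.Close();

   // Read back: a TChain can span several files containing the same tree
   TChain chain("Events");
   chain.Add("events.root");
   chain.SetBranchAddress("energy", &energy);
   for (Long64_t i = 0; i < chain.GetEntries(); ++i) chain.GetEntry(i);
}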




ROOT: Math

New developments of the ROOT mathematical libraries:
- New Vector package (3D and 4D)
- SMatrix (for small matrices with fixed size)

Fitting and minimization:
- Minuit2 (C++)
- Linear fitter
- Robust fitter
- SPlot (unfolding)
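A short sketch of how the new vector and SMatrix classes are typically used (header names and typedefs as in the ROOT::Math GenVector/SMatrix packages; the numerical values are made up):

#include "Math/Vector4D.h"
#include "Math/SMatrix.h"

void math_demo()
{
   // 4D vector in the (pt, eta, phi, E) representation
   ROOT::Math::PtEtaPhiEVector muon(25.0, 1.2, 0.5, 40.0);
   double mass = muon.M();                  // invariant mass

   // Small fixed-size 3x3 matrix
   ROOT::Math::SMatrix<double, 3, 3> cov;   // elements start at zero
   cov(0, 0) = 1.0; cov(1, 1) = 2.0; cov(2, 2) = 0.5;
   (void)mass; (void)cov;                   // silence unused-variable warnings
}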



ROOT: Graphics - Detector Geometries

[Geometry visualizations of the ALICE, LHCb, ATLAS and CMS detectors]

ROOT: Graphics - Events

[Event display examples]



Data Management

Files, based on ROOT I/O:
- Targeted at complex data structures: event data, analysis data
- Based on Reflex object dictionaries
- Management of object relationships: file catalogues
- Interface to Grid file catalogs and Grid file access

Relational databases:
- Oracle, MySQL, SQLite
- Suitable for conditions, calibration, alignment and detector-description data, possibly produced by online systems
- Complex use cases and requirements, multiple 'environments', difficult to satisfy with a single solution

Isolating applications from the database implementations with a standardized relational database interface:
- facilitates the life of the application developers
- no change in the application to run in different environments
- encodes "good practices" once for all

Focus is moving to deployment and experiment support.



Persistency Framework

The AA/POOL project delivers a number of "products":
- POOL: object and reference persistency framework
- CORAL: generic database access interface
- COOL: detector conditions database
- ORA: mapping of C++ objects onto relational databases

[Component diagram: user code sits on top of the POOL and COOL APIs; the POOL storage manager, collections and file catalog are built on ROOT I/O and on RDBMS back-ends (Oracle, SQLite, MySQL) accessed through CORAL]

- Object storage and references are successfully used in large-scale production in ATLAS, CMS and LHCb
- Need to focus on database access and deployment on the Grid: basically starting now

http://pool.cern.ch/





CORAL: Generic Database Access Interface

[Architecture diagram: client software uses the CORAL interfaces (C++ abstract classes, the user-level API) and the CORAL C++ types (row buffers, Blob, Date, TimeStamp, ...); a common implementation provides the developer-level interfaces; plug-in libraries, loaded at run time and interacting only through the interfaces, implement the RDBMS back-ends (Oracle, SQLite, Frontier, MySQL), the authentication services (XML, environment), the lookup services (XML, LFC), and the relational, monitoring and connection services]

http://pool.cern.ch/coral/




CORAL: Generic Database Access Interface

A software system for vendor-neutral access to relational databases:
- C++, SQL-free API

CORAL is integrated in the software of the LHC experiments (CMS, ATLAS and LHCb) directly (i.e. in online applications) and indirectly (COOL, POOL).

Example 1: table creation

coral::ISchema& schema = session.nominalSchema();
coral::TableDescription tableDescription;
tableDescription.setName( "T_t" );
tableDescription.insertColumn( "I", "long long" );
tableDescription.insertColumn( "X", "double" );
schema.createTable( tableDescription );

The same code generates back-end specific SQL, e.g. for MySQL:

CREATE TABLE T_t ( I BIGINT, X DOUBLE PRECISION )

and for Oracle:

CREATE TABLE "T_t" ( I NUMBER(20), X BINARY_DOUBLE )
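For completeness, here is a hedged sketch of filling and reading back the same table through the CORAL table editor and query interfaces (continuing the example above; based on the ITableDataEditor/IQuery/ICursor classes, whose exact signatures may differ between CORAL releases):

// Insert one row into the table created above
coral::ITable& table = schema.tableHandle( "T_t" );
coral::AttributeList rowBuffer;
table.dataEditor().rowBuffer( rowBuffer );           // fill in the row layout
rowBuffer["I"].data<long long>() = 42;
rowBuffer["X"].data<double>()    = 3.14;
table.dataEditor().insertRow( rowBuffer );

// Read it back with a vendor-neutral query
std::auto_ptr<coral::IQuery> query( table.newQuery() );
coral::ICursor& cursor = query->execute();
while ( cursor.next() ) {
  const coral::AttributeList& row = cursor.currentRow();
  long long i = row["I"].data<long long>();
  double    x = row["X"].data<double>();
}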



Conditions Database

Databases to store time-varying data

COOL:
- holds conditions data for reconstruction and analysis
- access to data from PVSS, the local file catalog (LFC) and bookkeeping
- implementations on Oracle, MySQL and SQLite

[Architecture diagram: the experiment framework and sub-detectors use the conditions DB access layer (COOL) for time-varying multi-version data (~offline) and time-varying single-version data (~online), with common software and conventions for the experiment conditions data; COOL is built on the C++ relational access layer (CORAL) with Oracle (OCI), MySQL (MyODBC) and SQLite back-ends; deployment and data distribution are handled by the 3D project]

Now in the deployment phase (ATLAS and LHCb):
- fully integrated in the experiment frameworks

Benefits from other LCG projects:
- CORAL, SEAL/ROOT and the 3D project

http://pool.cern.ch/CondDB/




Simulation

MC generators:
- MC generators specialized in different physics domains, developed by different authors
- Need to guarantee support for the LHC experiments and collaboration with the authors

Simulation engines:
- Geant4 and Fluka are well-established products

Common additional utilities required by the experiments:
- Interoperability between MC generators and simulation engines
- Interactivity, visualization and analysis facilities
- Geometry and event data persistency
- Comparison and validation (between engines and with real data)

http://lcgapp.cern.ch/project/simu




Simulation Framework Utilities

- HepMC: C++ event record for Monte Carlo generators
- GDML: Geometry Description Markup Language
  - Geometry interchange format or geometry source
  - GDML writers and readers exist for Geant4 and ROOT
- Geant4 geometry persistency
  - Saving/retrieving Geant4 geometries with ROOT I/O
- FLUGG: using Geant4 geometry from FLUKA
- Framework for comparing simulations
  - Example applications have been developed
- Python interface to Geant4
  - Provides Python bindings to G4 classes
  - Steering Geant4 applications from Python scripts
- Utilities for MC truth handling
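As a small illustration of the HepMC event record, here is a sketch of building an event with one vertex and one outgoing particle (using the HepMC 2 GenEvent/GenVertex/GenParticle classes; the particle content is made up):

#include "HepMC/GenEvent.h"
#include "HepMC/GenVertex.h"
#include "HepMC/GenParticle.h"

HepMC::GenEvent* make_event()
{
   HepMC::GenEvent* evt = new HepMC::GenEvent();
   // one production vertex at the origin
   HepMC::GenVertex* vtx = new HepMC::GenVertex( HepMC::FourVector(0., 0., 0., 0.) );
   evt->add_vertex( vtx );
   // one outgoing muon (PDG id 13), status 1 = final state
   vtx->add_particle_out(
      new HepMC::GenParticle( HepMC::FourVector(1.0, 2.0, 30.0, 30.1), 13, 1 ) );
   return evt;   // the event owns its vertices and particles
}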



Simulation Components

[Diagram: MC generators (e.g. Pythia) produce HepMC events, which are stored in the MCDB and in MC-truth ROOT files; the geometry is described in GDML (edited with a text editor), with readers and writers for both Geant4 and the ROOT TGeo package (geom.root); the Geant4 and FLUKA engines (the latter via FLUGG) are steered from Python scripts]



Distributed Data Analysis

A full spectrum of different analysis applications will co-exist:
- Data analysis applications using the full functionality provided by the experiment's framework (analysis tools, databases, etc.)
  - Require a large fraction of the available software packages and are very demanding on computing and I/O
  - Typically batch processing
- Final analysis of ntuple-like data (ROOT trees)
  - Fast turn-around (interactive)
  - Easy migration from local to distributed processing (PROOF)

Tools to help the physicists are being made available:
- Large-scale Grid job submission (GANGA)
- Parallelization of the analysis jobs (PROOF); see the sketch below
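A hedged sketch of moving an interactive ROOT tree analysis onto PROOF (the master host, file pattern and selector are made up; the TProof/TChain calls follow the standard ROOT interface):

#include "TProof.h"
#include "TChain.h"

void run_analysis()
{
   // Build a chain of trees, exactly as for a local interactive analysis
   TChain chain("Events");
   chain.Add("data/run*.root");               // made-up file pattern

   // Attach the chain to a PROOF cluster and process it in parallel
   TProof::Open("proofmaster.example.org");   // made-up master host
   chain.SetProof();                          // route Process() through PROOF
   chain.Process("MySelector.C+");            // made-up TSelector, compiled with ACLiC
}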



Application Area Highlights - SPI

SPI is concentrating on the following areas:
- Savannah service (bug tracking, task management, etc.)
  - >160 hosted projects, >1350 registered users (doubled in one year)
  - Web page: http://savannah.cern.ch/
- Software services (installation and distribution of software)
  - >90 external packages installed in the external service
- Software development service
  - Tools for development, testing, profiling, QA
- Web, HyperNews, documentation

SPI web page: http://lcgapp.cern.ch/project/spi/



SPI - Software Configuration

An LCG configuration is a combination of packages and versions which are coherent and compatible.

- Configurations are given names like "LCG_40"
- Experiments build their application software based on a given LCG configuration
- Interfaces to the experiments' configuration systems are provided (SCRAM, CMT)
- Concurrent configurations are an everyday situation
- Configurations are decided in the Architects Forum (AF)



SPI - Software Releases

The AA/experiments software stack is quite large and complex; many steps and many teams are involved (release order: non-HEP-specific software packages, then the core libraries, then simulation, data management and distributed analysis, then the experiment frameworks and applications).

Only 2-3 production-quality releases per year are affordable:
- Complete documentation, complete platform set, complete regression tests, test coverage, etc.

Feedback is required before the production release is made:
- No clear solution yet on how to achieve this
- Currently under discussion

Bug-fix releases as often as needed:
- Quick reaction time and minimal time to release

Where are we?

Individual Experiments




Experiments Software Architecture

&

Frameworks



Frameworks: ATLAS+LHCb (I)

ATLAS+LHCb: Athena/Gaudi

[Gaudi component diagram: the Application Manager steers the Algorithms; Algorithms read and write event data in the Transient Event Store through the Event Data Service, while detector data and histograms are handled analogously by the Detector Data Service (Transient Detector Store) and the Histogram Service (Transient Histogram Store); Converters and Persistency Services move data between the transient stores and the data files; further components include the Message Service, the JobOptions Service, the Particle Properties Service and other services]



Frameworks: Alice (II)



Framework CMS: Component Architecture (III)

CMS: new framework in 2005

Five types of dynamically loadable processing components:
- Source: provides the Event to be processed
- OutputModule: stores the data from the Event
- Producer: creates new data to be placed in the Event
- Filter: decides if processing should continue for an Event
- Analyzer: studies properties of the Event

Components only communicate via the Event.

Components are configured at the start of a job using a ParameterSet.
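To make the component model concrete, here is a minimal sketch of an Analyzer (assuming the edm::EDAnalyzer base class and the header layout of the CMS framework; the class name and details are made up and release-dependent, and the plugin registration macro is omitted):

#include <string>
#include "FWCore/Framework/interface/EDAnalyzer.h"
#include "FWCore/Framework/interface/Event.h"
#include "FWCore/ParameterSet/interface/ParameterSet.h"

// Made-up analyzer: configured from a ParameterSet, it talks to the rest
// of the job only through the Event.
class DemoAnalyzer : public edm::EDAnalyzer {
public:
  explicit DemoAnalyzer(const edm::ParameterSet& pset)
    : label_(pset.getParameter<std::string>("moduleLabel")) {}

  virtual void analyze(const edm::Event& event, const edm::EventSetup&) {
    // Retrieve a product by module label, as in the access examples below:
    // Handle<TrackVector> tracks;
    // event.getByLabel(label_, tracks);
  }

private:
  std::string label_;
};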




Framework CMS: Processing Model (IV)

- The Source creates the Event
- The Event is passed to the execution paths
- A Path is an ordered list of Producer/Filter/Analyzer modules
- Producers add data to the Event
- The OutputModule is given the Event if certain Paths run to completion, and writes it to a POOL file



Framework CMS: Accessing Event Data (VI)

The Event class allows multiple ways to access data:

// Ask by module label and default product label
Handle<TrackVector> trackPtr;
event.getByLabel( "tracker", trackPtr );

// Ask by module and product label
Handle<SimHitVector> simPtr;
event.getByLabel( "detsim", "pixel", simPtr );

// Ask by type
vector<Handle<SimHitVector> > allPtr;
event.getByType( allPtr );

// Ask by Selector
ParameterSelector<int> coneSel( "coneSize", 5 );
Handle<JetVector> jetPtr;
event.get( coneSel, jetPtr );




Framework CMS: Job Configuration (IX)

Job configuration is done in the configuration file. After configuration is complete, all components will have been loaded into the application.

process RECO = {
  source = PoolSource {
    string filename = "test.root"
  }
  module tracker = TrackFinderProducer {}
  module out = PoolOutputModule {
    string filename = "test2.root"
  }
  path p = { tracker, out }
}



Simulation and Detector Description in the Experiments



Simulation (I)

Geant4: a success story; deployed by all experiments.
- Functionality essentially complete; detailed physics studies performed by all experiments
- Very reliable in production (crash rate better than 1 in 10^5)
- Good collaboration between the experiments and the Geant4 team
- Lots of feedback on physics (e.g. from test beams)
- LoH (Level of Happiness): very high

Detector geometries: LHCb ~18 million volumes, ALICE ~3 million volumes



Simulation: ATLAS (II)

ATLAS Detector Description



Simulation: ATLAS (III)



Simulation: ALICE (IV)

FLUKA VMC implementation completed:
- Testing well advanced
- TGeo/FLUKA validation completed
- Good agreement with G3 and test beam

FLUKA VMC will be used in the next ALICE physics data challenge.

Plan to use Geant4 as an alternative simulation engine:
- under development



Simulation: CMS (V)

The CMS detector description system (DDD) provides an application-independent way to describe the geometry:
- Simulation, reconstruction, event display, etc. use by definition the same geometry

Geometry data are stored in a database with a hierarchical versioning system.

Alignment corrections are applied with reference to a given baseline geometry.



Simulation: CMS (VI)

Event generator framework interfaces multiple packages:
- including the GENSER distribution provided by LCG-AA

Simulation with Geant4 since end of 2003:
- >100M events fully simulated since mid-2005
- ~1 in 10^6 crashes in the latest productions

Digitization tested and tuned with test beam.

Processing chain: Generation -> Detector Simulation -> Digitization
- MC truth collection: information from the particle gun or physics generator about vertices and particles, stored in HepMC format
- Hit collection: hit objects with timing, position and energy-loss information, based on Geant4
- Digi collection: digi objects which include realistic modeling of the electronics signal




Simulation (II)

Tuning to data: ongoing; very good progress made.
- CMS HCAL: brass/scintillator
- ATLAS TileCal: Fe/scintillator
- Geant4 / data comparison for e/π



GEANT4

Improvements in Geant4 8:
- Improvements in the multiple scattering process
  - Addressing issues with electron transport
- Speedups for initialisation/navigation
  - Option to re-optimise only the parts that change with the run
  - New voxelisation options being studied for regular geometries
- Overlap checks at geometry construction (see the sketch below)
- Revised implementation of particles
  - Impacting advanced users who customize them
- Refinements in hadronic physics
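A minimal sketch of the construction-time overlap check, assuming the standard G4PVPlacement constructor with its optional surface-check flag (solid names, material and dimensions are made up):

#include "G4Box.hh"
#include "G4LogicalVolume.hh"
#include "G4PVPlacement.hh"
#include "G4NistManager.hh"
#include "G4ThreeVector.hh"

// Place a made-up daughter box inside a mother volume and ask Geant4 to
// check for overlaps at construction time (last constructor argument).
G4VPhysicalVolume* PlaceWithOverlapCheck(G4LogicalVolume* motherLV)
{
  G4Material* air = G4NistManager::Instance()->FindOrBuildMaterial("G4_AIR");
  G4Box* box = new G4Box("DemoBox", 100., 100., 100.);     // half-lengths in mm
  G4LogicalVolume* boxLV = new G4LogicalVolume(box, air, "DemoBoxLV");
  return new G4PVPlacement(0,                        // no rotation
                           G4ThreeVector(0., 0., 0.),
                           boxLV, "DemoBoxPV", motherLV,
                           false, 0,                 // not replicated, copy number 0
                           true);                    // perform the overlap check
}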


End Lecture 1