High Performance Computing (HPC) and Activities of Computer Centre, IIT Kanpur


A presentation to the Board of Governors, IIT Kanpur
May 28, 2010
Amalendu Chandra
Head, Computer Centre
IIT Kanpur
Acknowledgments

BoardofGovernors,IITKanpur

Board

of

Governors,

IIT

Kanpur
CC Engineers, HPC Group
IWD Engineers
CC@IITK has a glorious history
Computer Centre
This centre was established in 1964 and started in Western Labs under the Department of Electrical Engineering. It moved to its present building in 1969, when it was recognized as an independent department in the Institute.

The IBM-1620 was the first computer acquired by IIT Kanpur. Next was the IBM-7044 in 1966, followed by an IBM-1401. Several specialized computers such as the IBM-1800, PDP-1 etc. were added in subsequent years.

The next major upgrade was the addition of the DEC-1090 mainframe computer in 1979, which was the first time-sharing computer of IIT Kanpur. This was the first computer which had terminals.

In 1989, CC purchased Superminicomputers of the HP 9000 series.
History of Computer Centre
In 1987, the first PC lab was set up, providing a DOS environment.

The Convex-220 was set up in 1990; CC got its Mini Supercomputer. The IBM SP2, the first parallel computer, was set up in 1999.

Email service was run under the ERNET project in the early 1990s and was subsequently moved to the Computer Centre around 1994. In 1995, the Campus Network was upgraded to a 100 Mbps Fiber Backbone and 10 Mbps UTP Access Network.

A 64 Kbps Internet link was set up in 1998, and today the bandwidth has increased to more than 1 Gbps. Linux Clusters (SUN & HP) were set up in 2004-05.
Current Staff of Computer Centre
Principal Computer Engineer : 2 + 1 (on deputation)
Senior Computer Engineer : 3 + 3 (OA)
Computer Engineer : 1+1 (retiring soon)
Jr. Technical Superintendent : 3
Jr. Technician : 2
Facilities Provided by CC
Computer Centre provides state-of-the-art Computing, E-mail, Internet and other facilities 24 hours a day and 365 days a year for more than 7500 users.

Major facilities provided by CC are:
•Computing hardware
•Application software
•Campus Network
•Email and Internet
•Linux and Windows Labs
•File Storage and Backup Services
•Hosting of IITK website
•Office Automation (under DD)
•Technical support via phone and email
•Maintenance of PCs and peripherals
Activities
•4-5 hours of Classes for UG and PG per day at CC
•Students work on computing and CAD assignments using CC Lab facilities
•User training / Education / Familiarization / Help with compilation coding etc.
•Workshops and Seminars
•Troubleshooting
•Research and Related Activity

•Maintenance of Hardware and Software
•Software installation and upgradation
•Development and Installation of regularly used software (in-house)
•Mass user registration / authentication and login id distribution for support
•Support for
–Alumni Office, DRPG Lists, Office Automation, OARS, Regular Courses Mailing, Conferences, Short Term Courses, Mass Internal Mailing and announcements, No Dues, supply and repair of PCs for administration and other sections, web hosting
•Security
•Mail, Domain Name Service, Networking, Software Download sites, Mirror Site maintenance
Computing Facility
Computer Centre has 3 4-CPU master nodes and 146 dual-CPU compute nodes in the Cluster, amounting to 292 cores of computing power connected over a 1 Gbps LAN.
Applications - Gaussian, Linda, Charmm, FEM, Diff. Equ. Solving, Molpro, parallel libraries etc.
Environment - Parallel as well as sequential on open source OS
A new facility of about 3000 cores of very fast compute nodes with a 40 Gbps Infiniband interconnect is in the pipeline.
Storage and Backup
Storage
One 33 TB HP StorageWorks EVA 8000 Enterprise Virtual Array and one 6 TB SUN StorEdge 6120

File Service
One HP StorageWorks Clustered File System (PolyServe Symmetric Cluster File System) and one Virtual File Service (PolyServe Matrix Server)
Backup Service
One Backup Server for Users' home directories and Users' Mail with HP MSL6000 DP
Backup Policy: Daily incremental backup, saved for one week; weekly incremental backup, saved for one month; and monthly full backup, saved for one year.
New Storage:
100 TB for HPC
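The three-tier retention policy above can be sketched as a small scheduling rule. This is a minimal illustration, not the actual backup software; the choice of the 1st of the month for the full backup and Sunday for the weekly incremental is an assumption, since the slides state only the frequencies and retention periods.

```python
from datetime import date

# Retention periods from the stated policy, in days.
RETENTION = {"daily": 7, "weekly": 30, "monthly": 365}

def backup_job(d: date) -> tuple[str, int]:
    """Return (backup type, retention in days) for a given date.

    Assumed schedule (not stated in the slides): the monthly full
    backup runs on the 1st of the month, the weekly incremental
    on Sundays, and the daily incremental on all other days.
    """
    if d.day == 1:
        kind = "monthly"       # full backup, kept one year
    elif d.weekday() == 6:     # Sunday
        kind = "weekly"        # incremental, kept one month
    else:
        kind = "daily"         # incremental, kept one week
    return kind, RETENTION[kind]

print(backup_job(date(2010, 5, 1)))   # 1st of the month: monthly full
print(backup_job(date(2010, 5, 28)))  # an ordinary Friday: daily incremental
```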
Linux and Windows Environment
Linux Working Environment
•Three Labs equipped with 143 latest-configuration PCs with Ubuntu and Fedora.
•9 Computational Servers.
•Software Installed for Simulations and Modelling, Optimization etc.
Windows Working Environment
•Domain Controller and License Server, SAMBA Server and Deployment
Server
•Two Labs equipped with 75 Latest
Configuration PCs
•Software installed across a wide range of categories
Email Setup
•Approximately 7500 mail boxes

•Quota ranging from 500 MB to 1500 MB
•Around 250000 mails sent and received every day, out of which 90% of incoming mail is SPAM, which is filtered by a Barracuda SPAM firewall
•Both Linux Postfix and Microsoft Exchange platforms provided
•The setup is state-of-the-art enterprise class, providing high availability and fault tolerance.
Email Architecture
[Architecture diagram: mail for iitk.ac.in and other served domains (iitkalumni.org, security.iitk.ac.in, antaragni.iitk.ac.in, techkriti.org, cse.iitk.ac.in etc.) enters from the Internet through an unauthenticated SMTP server and a Spam Filter Pair to an SMTP Mail Hub Pair; from there it is routed to the MS Exchange Server, a local lists server and the Local Mail Store, and served to registered users via the Web Mail Server (HTTP), MS Exchange protocols, IMAP, POP and SMTP, with authenticated and unauthenticated SMTP/POP clients.]
Current Network Setup
Institute Gigabit LAN (Local Area Network) with more than 15000 nodes covering the Academic Area and Student Hostels
2 Core Switches, ~50 Distribution Switches, ~800 Access Switches
Gigabit Fiber Optic Backbone Network with more than 21 Kms of Fiber laid in the Campus
Fully Managed Network
1 Gbps Internet Bandwidth from Airtel
Backup Bandwidth from Reliance and NKN
Overlay Wi-Fi Network in the Academic Area
~500 Access Points
Internet Application Servers for providing Web, Mail, Proxy, DNS and other Internet services to the users
Network Architecture
[Architecture diagram: Internet links from Airtel and BSNL MUXes terminate on a Cisco 7200 router behind a Fortigate 3600A UTM pair (HA mode) and an L2 switch; two Cisco 6500 core switches feed Cisco DS 3750 distribution switches, Cisco AS 2960G access switches and Cisco AP 1131 access points.]
Network: Immediate and Long Term Plans
Immediate Plans:
Expand the network to accommodate new buildings / facilities and increasing student strength
Provide 1 Gbps LAN in the residential area
Long Term Plans:
Build an All-IP Network to provide an integrated Voice-Video-Data network
Provide IP based Voice/Video phones and Desktop Conferencing facility
Cyber Security
In view of the requirement to strengthen Cyber Security, the following steps have been taken in the recent past:
All the switches have been replaced with managed switches. This allows binding an IP address with every network port, which helps in tracing the machine/individual who has been involved in any hacking.
A CCTV IP camera has been installed at the entrance of CC to record the entry and exit of users.
A Fortigate UTM (Unified Threat Management) device has been installed at the Internet Gateway to monitor and control the Internet traffic and allow tighter control on the traffic to Internet Application Servers for better security.
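The IP-per-port binding on managed switches described above can be illustrated with a toy lookup: given a per-port registration table, an offending IP address is traced back to a physical network port. The table entries and switch names below are hypothetical examples, not real IITK data.

```python
# Hypothetical per-port bindings: (switch, port) -> registered IP.
PORT_BINDINGS = {
    ("dist-sw-12", 7): "172.31.4.21",
    ("dist-sw-12", 8): "172.31.4.22",
    ("acc-sw-30", 15): "172.31.9.101",
}

def trace_port(ip: str):
    """Return the (switch, port) bound to an IP, or None if unbound.

    An IP absent from the table (e.g. a self-assigned static
    address) is exactly what a DHCP-only allocation policy is
    meant to rule out.
    """
    for port, bound_ip in PORT_BINDINGS.items():
        if bound_ip == ip:
            return port
    return None

print(trace_port("172.31.4.22"))   # -> ('dist-sw-12', 8)
print(trace_port("172.31.99.5"))   # -> None (untracked address)
```

In practice this mapping lives in the switches' management plane rather than a Python dictionary; the sketch only shows why binding every port to one IP makes tracing a machine straightforward.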
Measures to Improve Cyber Security
Block all the unused TCP/UDP ports for Internet Application Servers on the Internet Gateway Firewall.
Implement electronic Access Control based on identity in all CC labs. In addition, install CCTV cameras in all the labs.
Implement a DHCP-based IP address allocation policy. A machine should be able to use the network only if it has been allocated an IP address through DHCP.
Implement Network authentication for both wired and wireless networks. Make provision for issuing temporary ids for visitors.
Implement a Wireless Intrusion Detection and Prevention system.
Implement more secure authentication than the currently followed scheme.
Implement better security for system logs on the individual servers. This will help trace attacks.
Activities Underway
Windows and Linux Labs for more than 200 users in New Core Building
Gigabit LAN connectivity in Residences
New Mail Storage (25 TB)
New UPS (900 KVA)
GPU servers and high-end workstations
New HPC facility

Why HPC ?
To solve complex problems in science and engineering
Higher resolution simulations for longer time
Sometimes experiments cannot be done!
Computational experiments can be used to simulate extreme conditions
Vast expertise in Numerical Methods
[Diagram: HPC at the centre, surrounded by Applications, Hardware, Support Systems, Basic science, Visualization and Numerical algorithms.]
More than 100 faculty members across various disciplines are involved in computing.
[Bar chart: faculty involved in computing overall (CHM+PHY+AE+BSBE+CE+CHE+EE+ME+MME); bars of 255 and 125 shown.]
HPC Research @IITK
Multiscale, Adaptive Finite Element Methods using Domain Decomposition
Flow Past Bodies with Complex Geometry and Corners
Large Eddy Simulation of Turbulence
Vortex Dominated Flows and Heat Transfer
Flow Induced Vibrations
Analysis of Aircraft Structures
Pseudo-spectral Turbulence Simulations
Geo-seismic Prospecting
Enhanced Oil Recovery
Stress Analysis and Composite Materials
Virtual Reality
Computational Chemistry
Nanoblock Self Assembly
Molecular Simulation (Molecular Dynamics & Monte Carlo Methods)
Vibration and Control
Semiconductor Physics, Feynman Integrals
Thermal and Hydraulic Turbomachinery
Numerical Weather Prediction
Statistical Thermodynamics
Geometric Optimization of Large Organic Systems
Turbulence Modelling through RANS
Neural Networks
Impurities in Anti-Ferro Magnets
Electronic Structure Calculations
Aggregation and Etching
Quantum Simulations
Raman Scattering
Spin Fluctuation in Quantum Magnets
Robotics
Multi-Body Dynamics
Thin Film Dynamics
Optical/EM Field Calculations
Parallel Spectral Element Methods
Computer Aided Tomography
Nuclear Magnetic Resonance
Awards and Honours
S.S. Bhatnagar Prize
Fellowship of Academies:
FNA; FASc; FNAE
Research Fellowships:
J.C. Bose; Swarnajayanti; Raja Ramanna
Status of Central HPC Facilities at Academic Institutions
•IISc Bangalore: 8192 processors IBM BLUE GENE (17 TF)

•IIT Bombay : 380 Nodes: Xeon Dual Core (partly on infiniband)
•IIT Madras : 256 Nodes, Xeon dual core (partly on infiniband)
•JNCASR : 128 nodes, Xeon dual core (infiniband)
•Univ. of Hyderabad : P690 SMP server (32 processors x 4)

•IISER Pune : 64 Nodes Xeon Quadcore (Infiniband)

•IIT Hyderabad: 64 Nodes Xeon Quadcore (Infiniband) (6 TF)
•IIT Kanpur : 144 nodes, AMD Opteron single core
5 year old Hardware (< 1 TF)
Our goal is to …

•be the best in the country and one of the best in the world in HPC
•carry out cutting edge research on computational science and engineering
•develop large parallel software for research applications
•collaborate within IITK and with neighbouring academic institutions
•have training programs for students and scientists.
The New HPC Setup
[Diagram: The Main Cluster of 260 nodes (dual-processor, Nehalem quad-core) on a 40 Gbps Infiniband network, with 100 TB of disk storage and HPC servers; smaller test clusters (Nehalem quad-core/GPU); a Visualization Lab with high-end graphics workstations.]
System Integration
[Diagram: the 260-node Linux cluster (master, management and compute nodes) connects through an Infiniband switch layer and Gigabit switches to multi-node servers, the smaller test clusters, the 100 TB disk storage and the IITK network.]
New HPC Facility at IITK
The integrated facility will have a total of 372 nodes and a projected delivered performance of ~30 TF.
Should be the best HPC facility among all academic Institutes in the country.
Second best among Government organizations (C-DAC has got a 38 TF cluster)
We might break into the top 500 globally!
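The ~30 TF figure is consistent with a back-of-envelope peak estimate. This is only a sketch: the slides give node counts, but the clock speed and the 4 double-precision FLOPs/cycle/core (SSE) figure for Nehalem are assumptions.

```python
# Rough theoretical peak for the proposed 372-node facility.
# Assumed (not stated in the slides): 2.93 GHz Nehalem parts,
# 4 double-precision FLOPs per cycle per core.
nodes = 372            # total nodes in the integrated facility
sockets_per_node = 2   # dual processor
cores_per_socket = 4   # quad core
ghz = 2.93             # assumed clock speed
flops_per_cycle = 4    # assumed SSE throughput

cores = nodes * sockets_per_node * cores_per_socket     # 2976 cores
peak_tf = cores * ghz * flops_per_cycle / 1000          # GFLOPS -> TF
print(f"theoretical peak ~ {peak_tf:.0f} TF")           # ~ 35 TF
```

A delivered ~30 TF against a ~35 TF theoretical peak corresponds to a plausible sustained efficiency on real workloads.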
Layout of the area for HPC facility at ground floor of CC
[Floor plan: Proposed HPC Data Centre, with compass directions N and E marked.]
List of work associated with the HPC set-up
1. Layout of the proposed area for HPC facility
2. PAC requirements for the facility
3. AC (Non-PAC) requirements
4. UPS requirements
5. Total power requirements
6. UPS/battery/control panel rooms
7. How to provide the required power from main/DG set
8. Civil work in the proposed HPC area
9. Civil work in the UPS/Battery rooms
10. Electrical work/laying of lines and panels
11. Fire safety issues
12. Building Management System (BMS)
CC and IWD components of HPC-related work
1.HPC Systems CC
2. UPS (700 KW) CC
3. PAC (630 kW) CC
4. Fire safety and BMS CC
5. AC (non-PAC) IWD
6. Substation (1.5 MW) IWD
7. DG set (1.5 MW) IWD
8. Electrical equipment/distribution IWD
9. Civil work*/laying of electrical lines IWD
*Flooring work in main HPC area will be done by PAC vendor
A HUB for Collaborative Research
[Diagram: the HPC Centre at IITK as a hub linked to BHU, Allahabad University, HCRI Allahabad, Delhi University, JNU, AMU Aligarh, SGPIMS Lucknow, CDRI Lucknow, Lucknow University, Kanpur University + HBTI, MNNIT Allahabad and NIIT Allahabad.]
Training and Workshops
•Visitors program

•Summer schools/workshops
•International/national conferences
•HPC users meeting
•Academic Program at IIT Kanpur

•Masters in Computational Science and Engineering
•Doctoral Program
Proposal submitted to MHRD
Inspire Young Minds ….
Computer Centre as both Service and Academic Centre
Immediate need for HPC
Four Project Scientists (DST, advertisement made)
System Administrator, Secretary (Institute)
For General CC jobs: Engineers, Technical staff
Number of users has gone up, number of CC personnel has gone
down
Immediate need of more office space for HPC and CC staff. Also, a seminar room, visitors room and test labs should be in place for this transformation to occur.
Need more space, more manpower
A proposal with more details has been sent to the Space Committee.
CC as the Nucleus of IITK Activities
CC
Smart Card
Thank you