• 64 total nodes
• 128 2.4 GHz Intel Xeon CPUs
• 64 GB memory
• 2.4 TB disk space
• private Gigabit network
• Gigabit connection to campus backbone
Remote Power Control
• single point of control
• cascaded startups
• remote on/off
Snapshot from a test run of MM5 atmospheric modeling system on the cluster.
Parallel N-body simulation
Spacetime patterns in parallel programs
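The poster does not include the simulation source, so as an illustration only, here is a minimal serial sketch of the kind of N-body force computation that a parallel run distributes across nodes (the function name and simple Euler integration are assumptions, not the cluster's actual code, which would partition bodies across processes with message passing):

```python
import math

def nbody_step(pos, vel, mass, dt, G=1.0):
    """Advance all bodies one time step: compute pairwise gravitational
    accelerations (the O(n^2) part that parallel versions split up),
    then update velocities and positions with a simple Euler step."""
    n = len(pos)
    acc = [[0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = pos[j][0] - pos[i][0]
            dy = pos[j][1] - pos[i][1]
            r = math.sqrt(dx * dx + dy * dy)
            a = G * mass[j] / (r * r)   # acceleration magnitude on body i
            acc[i][0] += a * dx / r
            acc[i][1] += a * dy / r
    for i in range(n):
        vel[i][0] += acc[i][0] * dt
        vel[i][1] += acc[i][1] * dt
        pos[i][0] += vel[i][0] * dt
        pos[i][1] += vel[i][1] * dt
    return pos, vel
```

In a cluster run, each node would own a slice of the bodies and exchange positions with the others every step, which is why the private Gigabit network matters for performance.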
Setup and installation
• PIs: Amit Jain (Computer Science) and Paul Michaels (Geophysics)
• Porting and tuning of research codes by graduate and undergraduate students working closely with researchers across campus. Current efforts are listed below (with faculty/researcher names in parentheses):
• Hydraulic Tomography by Kevin Nuss (with Tom Clemo, Geophysics)
• MM5 by Kevin Nuss and Robert Walters (with Paul Dawson, Mechanical Engineering)
• Nanotechnology by Juan Carlos and Kevin Nuss (with Charles Hanna, Physics)
• Modeling of Ocean Currents by Hongyi Hu (with Jodi Mead, Mathematics)
• Waveform Relaxation by Hongyi Hu (with Barbara Zubik-Kowal, Mathematics)
• Modeling multiphase flow by Oralee Nudson (with Stephen Brill, Mathematics)
• Bayesian Analysis of Phylogeny (Jim Smith, Biology)
• and others...
• Remote Power Control project by Brady Catherman (sophomore, CS)
• Cluster Monitoring Software by Joey Mazzarelli (senior, CS)
• Cisco switch configuration by Luke Hindman (sophomore, CS)
• David Zuercher (senior, CS), Dan Crow (senior, CS), and Jeff Cope (sophomore, CS) helped with cluster installation.
Funded by National Science Foundation Major Research Instrumentation Grant No. 0321233
A Beowulf Cluster is a supercomputer made out of commodity parts.
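The cluster's actual codes run via message passing (typically MPI) over the private network; that environment is not reproducible here, so the following toy sketch uses Python's multiprocessing purely as a stand-in to show the pattern: a master splits a job into chunks, "nodes" compute independently, and partial results are gathered. All names (`node`, `master`) are illustrative assumptions, not the cluster's software.

```python
from multiprocessing import Process, Queue

def node(rank, chunk, out):
    # Each "node" works on its own slice of the data and
    # sends its partial result back to the master.
    out.put((rank, sum(chunk)))

def master(data, n_nodes=4):
    # Split the data, launch one worker per "node", gather the
    # partial sums -- the scatter/compute/gather shape of many
    # cluster jobs.
    out = Queue()
    step = (len(data) + n_nodes - 1) // n_nodes
    workers = [Process(target=node, args=(r, data[r * step:(r + 1) * step], out))
               for r in range(n_nodes)]
    for w in workers:
        w.start()
    total = sum(out.get()[1] for _ in workers)
    for w in workers:
        w.join()
    return total
```

On a real Beowulf cluster the workers are separate commodity machines and the queue is replaced by network message passing, but the decomposition idea is the same.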
Project goals:
• Establish a resource for researchers on campus with large computing needs.
• Help researchers convert their code to run on the cluster.
• Research performance bottlenecks.
• Develop tools to improve the usability of clusters.
Beowulf Cluster