Page 1:


MPI on Argo-new

Venkatram Vishwanath

[email protected]

Electronic Visualization Laboratory

University of Illinois at Chicago

Page 2:

ARGO-NEW Cluster Configuration

• 64-node heterogeneous cluster
  – 16 dual-core Opterons (argo1-1 to argo4-4)
  – 16 single-processor Xeons (argo5-1 to argo8-4)
  – 32 dual-core, dual-processor Xeons (argo9-1 to argo16-4)
• Gigabit Ethernet connectivity between nodes
• PBS batch scheduling system

Page 3:

Logging in to ARGO-new

• Access Argo-new via ssh from a machine at UIC
• Your account name is your UIC username
  – e.g. ssh -l <your_UIC_username> argo-new.cc.uic.edu
• Remote access from home is possible
  – Log in from bert, ernie, or icarus first

Page 4:

Setting up the Environment Variables

• MPICH2 (Argonne National Laboratory) and PGI (Portland Group) are installed on Argo-new
• This talk focuses on MPICH2
• MPICH2-related environment settings (csh/tcsh):
  – setenv MPI_PATH /usr/common/mpich2-1.0.1
  – setenv LD_LIBRARY_PATH $MPI_PATH/lib:$LD_LIBRARY_PATH
  – setenv PATH $MPI_PATH/bin:$MPI_PATH/include:$PATH
• The equivalent settings in bash:
  – export MPI_PATH=/usr/common/mpich2-1.0.1
  – export LD_LIBRARY_PATH=$MPI_PATH/lib:$LD_LIBRARY_PATH
  – export PATH=$MPI_PATH/bin:$MPI_PATH/include:$PATH
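To make these settings persistent across logins, they can be appended to your shell start-up file. A minimal sketch, assuming bash and ~/.bashrc (for csh/tcsh, add the setenv lines above to ~/.cshrc instead):

  echo 'export MPI_PATH=/usr/common/mpich2-1.0.1' >> ~/.bashrc
  echo 'export LD_LIBRARY_PATH=$MPI_PATH/lib:$LD_LIBRARY_PATH' >> ~/.bashrc
  echo 'export PATH=$MPI_PATH/bin:$MPI_PATH/include:$PATH' >> ~/.bashrc
  source ~/.bashrc    # or log out and log back in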

Page 5:

Configure the .mpd.conf file

• Create a .mpd.conf file in your home directory
• Add a single line "secretword=##&&****" (any secret string after the '=')
• [vvishw1@argo-new ~]$ cat .mpd.conf
  secretword=sjdkfhsdkjf
• Set the permissions of .mpd.conf to 600
  – MPI will NOT work if the permissions of .mpd.conf are 755
• [vvishw1@argo-new ~]$ ls -al .mpd.conf
  -rw------- 1 vvishw1 student 23 Feb 13 17:15 .mpd.conf
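A minimal sketch of creating the file from the shell (the secret word below is only a placeholder; choose your own):

  echo "secretword=mysecret123" > ~/.mpd.conf
  chmod 600 ~/.mpd.conf
  ls -al ~/.mpd.conf    # should show -rw------- permissions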

Page 6:

A Typical MPICH2 Session on Argo-new

• Log in to Argo-new

• Write and compile your MPI program

• Set up the MPD ring

• Run the MPI Program using the PBS scheduler

• Bring down the MPD ring

• Log out

Page 7:

Compile your MPI Program

• MPICH2 provides compiler wrapper scripts for C and C++
• Use mpicc in place of gcc, and mpicxx in place of g++

#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include "mpi.h"

int main(int argc, char **argv)
{
    int rank, size;
    char buf[1024];

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);   /* rank of this process */
    MPI_Comm_size(MPI_COMM_WORLD, &size);   /* total number of processes */

    memset(buf, '\0', 1024);
    gethostname(buf, 1024);                 /* which node are we running on? */

    printf("Hello from %s, I'm rank %d; size is %d\n", buf, rank, size);

    MPI_Finalize();
    return 0;
}

Use the following flags:

• -I $MPI_PATH/include
• -L $MPI_PATH/lib -lmpich

e.g. mpicc -o hellow testHostname.c -I $MPI_PATH/include -L $MPI_PATH/lib -lmpich

Page 8:

Set up the MPD ring

• The MPD ring needs to be set up on the nodes where the MPI program will run
• Launch the MPD daemons:
  – rsh <a_host_in_hostfile> "/usr/common/mpich2-1.0.1/bin/mpdboot -n <total_hosts_in_file> -f <path_to_hostfile> -r rsh -v"
  – e.g. rsh argo1-1 "/usr/common/mpich2-1.0.1/bin/mpdboot -n 4 -f $HOME/mpd.hosts -r rsh -v"
• Check the status of the MPD daemons using mpdtrace:
  – rsh argo1-1 "/usr/common/mpich2-1.0.1/bin/mpdtrace"
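Putting it together, a minimal sketch of booting and checking a 4-node ring (the hosts match the example above; adjust the host list and the -n count to the nodes you actually intend to use):

  # mpd.hosts lists one host per line
  printf "argo1-1\nargo1-2\nargo1-3\nargo1-4\n" > $HOME/mpd.hosts

  # Boot the ring, then verify that all four daemons are up
  rsh argo1-1 "/usr/common/mpich2-1.0.1/bin/mpdboot -n 4 -f $HOME/mpd.hosts -r rsh -v"
  rsh argo1-1 "/usr/common/mpich2-1.0.1/bin/mpdtrace"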

Page 9:

Run the MPI Program using the PBS scheduler

• Submit a job to the scheduler using qsub:
  – qsub -l nodes=<list_of_nodes> <job_script_file>
  – e.g. qsub -l nodes=argo1-1+argo1-2+argo1-3+argo1-4 my_script.sh
• A typical job script contains the mpiexec invocation:
  – mpiexec -machinefile <complete_machinefile_path> -np <number_of_processes> <complete_path_to_executable>
  – e.g. mpiexec -machinefile /home/homes51/vvishw1/my_machinefile -np 4 /home/homes51/vvishw1/workspace/hellow
• qsub returns a status message such as "33112.argo-new.cc.uic.edu"; here 33112 is your job id
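A minimal sketch of what my_script.sh could contain for the example above (the /home/homes51/vvishw1 paths come from the example and will differ for your account):

  #!/bin/sh
  # my_script.sh: run the 4-process hello-world example under PBS
  mpiexec -machinefile /home/homes51/vvishw1/my_machinefile -np 4 /home/homes51/vvishw1/workspace/hellow

Submit it with the qsub command shown above; remember that the nodes listed in qsub, the machinefile, and the MPD ring must agree (see the next slide).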

Page 10:

A few things to remember

• The list of nodes given to qsub should match the nodes in the machinefile
• The list of nodes given to qsub should be a subset of the nodes in the MPD ring
• Restrict the number of submitted jobs to 3

Page 11:

Argo-new stats online

• Argo-new stats are available online at:
  – http://tigger.uic.edu/htbin/argo_acct.pl


Page 12:

Typical Machine Files

• A machinefile listing four physical nodes:
  [vvishw1@argo-new ~]$ cat my_machinefile
  argo1-1
  argo1-2
  argo1-3
  argo1-4

• To simulate 8 logical nodes on 4 physical nodes (":2" means two processes per host):
  [vvishw1@argo-new ~]$ cat my_machinefile
  argo1-1:2
  argo1-2:2
  argo1-3:2
  argo1-4:2

Page 13:

Other Useful PBS commands

• qstat <job_id> – status of the job
• qstat -f <job_id> – detailed job information
• qstat -u <username> – information on all jobs submitted by that user
• qdel <job_id> – delete a submitted job
• qnodes – status of all the Argo-new nodes
  – Use this as a hint for choosing the nodes in your MPD ring

Page 14:

The output of your MPI program

• Standard output and standard error are both redirected to files
• The files are named [script name].o[job id] and [script name].e[job id]
• Example:
  – stdout in my_script.o3208
  – stderr in my_script.e3208
• The .e (stderr) file should usually be empty

Slide courtesy Paul Sexton

Page 15:

Stop the MPD environment

• The MPD ring needs to be brought down using mpdallexit
• rsh <host_in_hostfile> "/usr/common/mpich2-1.0.1/bin/mpdallexit"
  – e.g. rsh argo1-1 "/usr/common/mpich2-1.0.1/bin/mpdallexit"

Page 16:

Miscellaneous Topics

• Simulating topologies
  – MPI_Cart_create helps create communicators with Cartesian topologies (2D grids, hypercubes, etc.)
• Measuring time
  – MPI_Wtime returns the wall-clock time in seconds; take the difference of two calls around the code being timed
• Ideally, performance results should be statistically significant, so run multiple trials
• The computation time of a parallel run is the maximum over all nodes (e.g., combine per-rank times with MPI_Reduce and MPI_MAX)
• Use MPE for in-depth performance analysis of MPI programs