Software:Frontenac
The Frontenac cluster includes a wide variety of software and compilers. There are several new ways of accessing and using software, which are documented here. For a list of software available on the SW cluster, please see our SW software page.
The "module" system
Frontenac uses a different method of loading software than the SW cluster: the Lmod module system. We have switched to this system to ensure consistency with other Compute Canada systems and to provide a better user interface to our software. The module system is built on much the same concepts as the "use" system on the SW cluster.
On a large compute cluster, it is impossible to have every software package loaded by default, all the time. Some software has multiple versions, some packages conflict with each other, and some need to be configured differently for different use cases. Environment modules solve this problem by treating each software package, together with all of its associated files, as a distinct unit that is loaded on demand. Modules also handle the loading of dependencies. For instance, the R programming language is loaded via the "r" module; any dependencies are handled behind the scenes by the module system, without any user intervention.
How to use the module system
What you want to do | Lmod command (Frontenac cluster) | "Use" command (SW cluster) |
---|---|---|
See all available software | module avail | use -l |
See a short description of what each package does | module spider | <no equivalent> |
Load the software package "packageName" | module load packageName | use packageName |
Use a specific version of a software package | module load packageName/version | use packageName-version |
View currently loaded packages | module list | <no equivalent> |
Unload a package | module unload packageName | <no equivalent> |
Unload all packages | module purge | use none |
Please note that all commands are case-sensitive. For comprehensive documentation on the module system (including how to write your own modules), refer to the official Lmod documentation: http://lmod.readthedocs.io/en/latest/
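As a quick illustration, a typical interactive session might look like the sketch below. The package names and versions are only examples drawn from the software table further down this page; substitute whatever you actually need.

```bash
# See what is available and read package descriptions
module avail
module spider gcc        # versions and a short description of the gcc module

# Load software: a default version, then a specific version of another package
module load gcc
module load r/3.3.3

# Inspect and clean up the environment
module list              # show currently loaded modules
module unload r          # unload a single package
module purge             # unload everything
```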
Local vs. Compute Canada software
Software on the Frontenac cluster can come from two locations: a local software stack, or Compute Canada's centralized software stack. The Compute Canada stack is standardized: it contains a set of software that is compiled and set up identically on every cluster where it is installed. This is a fantastic tool for reproducibility and for scaling your work across multiple clusters, since the same software works the same way regardless of where you use it. There is also a large amount of locally installed software; this is how most software requiring licensing or other special local considerations is installed. Using either set of software is identical: just run module load softwareName.

Please note that if you are the first user to run a Compute Canada software package on a node (or it has not been used in some time), the software may initially appear to "hang" for several seconds on launch. This is normal: the software is being downloaded and cached on the local system. To tell whether a piece of software comes from the centralized stack, run which <some_command>. If the output begins with /cvmfs, it is part of the Compute Canada software stack.
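For example, a minimal check might look like this (the path in the comment is only illustrative; the important part is the /cvmfs prefix):

```bash
# Load a package and check where its executable comes from
module load gcc
which gcc
# A path starting with /cvmfs (e.g. /cvmfs/soft.computecanada.ca/...) means the
# Compute Canada central stack; any other prefix means a locally installed package.
```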
List of all installed software
This is a comprehensive list of software installed on the Frontenac cluster. Note that some software is loaded by default. If you just want the default version of a package, you do not need to include the version when loading the module; for instance, module load gcc will load the default version of GCC (version 5.4.0).
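As an illustration (assuming the versions listed in the table below, which may change as software is updated):

```bash
# Default version -- no version suffix needed:
module load gcc          # equivalent to: module load gcc/5.4.0

# Or request a specific version explicitly:
module load gcc/4.8.5
```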
Software Name | Package Name / version | Usage Notes |
---|---|---|
ABINIT | abinit/8.2.2 | |
ABySS | abyss/1.9.0 | |
AGOUTI | agouti/v0.3.3 | |
ALLPATHS-LG | allpaths-lg/52488 | |
Anaconda Python distribution | anaconda/2.7.13, anaconda/3.5.3 | |
ARPACK | arpack-ng/3.4.0 | |
BamTools | bamtools/2.4.1 | |
BCFtools | bcftools/1.4 | |
beagle-lib | beagle-lib/2.1.2 | |
BEAST | beast/2.4.0 | |
BEDTools | bedtools/2.26.0 | |
Bioperl | bioperl/1.7.1 | |
BLAST | blast+/2.6.0 | |
BLAT | blat/3.5 | |
Boost | boost/1.60.0 | |
Boost MPI libraries | boost-mpi/1.60.0 | |
Bowtie | bowtie/1.1.2 | |
Bowtie2 | bowtie2/2.3.0 | |
BWA | bwa/0.7.15 | |
CLHEP | clhep/2.3.1.1 | |
Cufflinks | cufflinks/2.1.1 | |
Eclipse | eclipse/4.6.0 | |
Eigen | eigen/3.3.2 | |
FastQC | fastqc/0.11.5 | |
FFTW | fftw/3.3.5 | |
Gaussian | gaussian/g09e1_sse4, gaussian/g16a3_sse4 | |
GCC | gcc/4.8.5, gcc/5.4.0 | |
GDAL | gdal/2.1.3 | |
Geant4 | geant4/10.02.p03 | |
GEOS | geos/3.6.1 | |
GLPK | glpk/4.61 | |
GMAP + GSNAP | gmap-gsnap/2017-04-13 | |
GROMACS | gromacs/4.6.7, gromacs/5.0.7, gromacs/5.1.4, gromacs/2016.3 | |
GSL | gsl/2.2.1 | |
HDF5 | hdf5/1.8.18 | |
HDF5 MPI libraries | hdf5-mpi/1.8.18 | |
HPL | hpl/2.2 | |
htslib | htslib/1.4 | |
igraph | igraph/0.7.1 | |
IMPUTE2 | impute2/2.3.2 | |
Intel Math Kernel Library (MKL) | imkl/11.3.4.258, imkl/2017.1.132 | |
Intel Compiler Suite (local version) | ics/2017u1 | |
Intel Compiler Suite (CC version) | intel/2016.4, intel/2017.1 | |
InterProScan | interproscan/5.23-62.0 | |
JAGS | jags/4.2.0 | |
JasPer | jasper/1.900.1 | |
Java 8 | java/1.8.0_121 | |
Jellyfish | jellyfish/2.2.6 | |
Julia | julia/0.5.1 | |
Libxc | libxc/3.0.0 | |
MACH | mach/1.0.18 | |
MATLAB | matlab/R2017a | |
MATLAB Compiler Runtime | mcr/R2013a, mcr/R2014a, mcr/R2014b, mcr/R2015a, mcr/R2015b, mcr/R2016a, mcr/R2016b | |
METIS | metis/4.0.3, metis/5.1.0 | |
Minimac2 | minimac2/2014.9.15 | |
Minimac3 | minimac3/2.0.1 | |
Mothur | mothur/1.39.4 | |
MrBayes | mrbayes/3.2.6 | |
NetCDF | netcdf/4.4.1.1 | |
NetCDF MPI libraries | netcdf-mpi/4.4.1.1 | |
NetCDF C++ | netcdf-c++/4.2 | |
NetCDF C++ MPI libraries | netcdf-c++-mpi/4.2 | |
NetCDF C++4 MPI libraries | netcdf-c++4-mpi/4.3.0 | |
NetCDF Fortran MPI libraries | netcdf-fortran-mpi/4.4.4 | |
Octave | octave/4.2.1 | |
OpenMPI | openmpi/2.0.2 | |
ParaView | paraview/5.3.0 | |
ParaView Offscreen support | paraview-offscreen/5.3.0 | |
Perl 5 | perl/5.22.2 | |
PETSc | petsc/3.7.5, petsc-64bits | |
PGI compilers | pgi/17.3 | |
Picard Tools | picard/2.1.1 | |
Python | python/2.7.13, python/3.5.2 | |
Python 2 Scipy Stack | python27-scipy-stack/2017a | |
Qhull | qhull/2015.2 | |
qrupdate | qrupdate/1.1.2 | |
Qt GUI toolkit | qt/4.8.7, qt/5.6.1 | |
Quantum ESPRESSO | quantumespresso/6.0 | |
R | r/3.3.3 | |
Ray | ray/2.3.1 | |
Redundans | redundans/a6621dc | |
Repast for High Performance Computing | repasthpc/2.2.0 | |
SAMTools | samtools/0.1.20, samtools/1.3.1 | |
SIESTA | siesta/4.0 | |
Apache Spark | spark/2.1.0 | |
Stacks | stacks/1.45 | |
SuiteSparse | suitesparse/4.5.4 | |
SuperLU | superlu/5.1.1 | |
Intel Threading Building Blocks (TBB) | tbb/2017.2.132 | |
TopHat 2 | tophat/2.1.1 | |
TransDecoder | transdecoder/3.0.1 | |
Trimmomatic | trimmomatic/0.36 | |
Trinity | trinity/2.4.0 | |
WPS | wps/3.8.0, wps/3.8.1 | |
WRF | wrf/3.8.0, wrf/3.8.1 | |