Index

Argument checking
MPI Routine Argument Checking

Breakpoint setting
Setting Breakpoints

Code hangs
My code runs correctly until it reaches MPI_Finalize() and then it hangs.

Combining MPI with tools
How do I combine MPI with insert favorite tool here?

Components
MPI Components

Continuing processes
Continuing and Stepping Processes

Debuggers
dbx and gdb
Using dbx and gdb with MPI programs
ProDev WorkShop
Using the ProDev™ WorkShop Debugger with MPI Programs

Distributed applications
Launching a Distributed Application

Features
MPI Features

Frequently asked questions
Troubleshooting and Frequently Asked Questions

Getting started
Getting Started

histx tool
histx+

Internal statistics
MPI Internal Statistics

Introduction
Introduction

Memory placement and policies
Memory Placement and Policies

Memory use size problems
The ps(1) command says my memory use (SIZE) is higher than expected.

Modifying code for MPI_Wait
Must I modify my code to replace calls to MPIO_Wait() with MPI_Wait() and recompile?

MPI launching problems
What does MPI: could not run executable mean?

MPI-2 compliance
MPI-2 Standard Compliance

MPI-2 spawn functions
to launch applications
Using MPI-2 Spawn Functions to Launch an Application

MPI_REQUEST_MAX too small
I keep getting error messages about MPI_REQUEST_MAX being too small, no matter how large I set it.

mpimon tool
Performance Co-Pilot (PCP)

mpirun command
to launch application
Using mpirun to Launch an MPI Application

mpirun failing
What are some things I can try to figure out why mpirun is failing?

mpivis tool
Performance Co-Pilot (PCP)

MPMD applications
Launching a Multiple Program, Multiple Data (MPMD) Application on the Local Host

MPT software installation
How can I get the MPT software to install on my machine?

perfex tool
perfex

Performance Co-Pilot (PCP)
Performance Co-Pilot (PCP)

ProDev WorkShop debugger
Using the ProDev™ WorkShop Debugger with MPI Programs

profile.pl tool
profile.pl

Profiling interface
Profiling Interface

Profiling library freeware
Profiling Interface

Profiling tools
Jumpshot
Third Party Products
mpimon
Performance Co-Pilot (PCP)
mpivis
Performance Co-Pilot (PCP)
perfex
Using Profiling Tools with MPI Applications
SpeedShop
Using Profiling Tools with MPI Applications
third party
Third Party Products
Vampir
Third Party Products

Programs
compiling and linking, IRIX
Compiling and Linking IRIX MPI Programs
compiling and linking, Linux
Compiling and Linking Linux MPI Programs
debugging methods
Debugging MPI Applications
launching distributed
Launching a Distributed Application
launching multiple
Launching a Multiple Program, Multiple Data (MPMD) Application on the Local Host
launching single
Launching a Single Program on the Local Host
launching with mpirun
Using mpirun to Launch an MPI Application
MPI-2 spawn functions
Using MPI-2 Spawn Functions to Launch an Application
SHMEM
Compiling and Running SHMEM Applications on IRIX Systems
Compiling and Running SHMEM Applications on Linux Systems
with TotalView
Using TotalView with MPI programs

Rerunning processes
Rerunning a Process

SHMEM applications for IRIX
Compiling and Running SHMEM Applications on IRIX Systems

SHMEM applications for Linux
Compiling and Running SHMEM Applications on Linux Systems

SHMEM information
Where can I find more information about SHMEM?

Single copy optimization
avoiding message buffering
Avoiding Message Buffering - Enabling Single Copy
using global memory
Using Global Memory for Single Copy Optimization
using the XPMEM driver
Using the XPMEM Driver for Single Copy Optimization

SpeedShop tool
SpeedShop

Stack traceback information
Why do I see “stack traceback” information when my MPI job aborts?

stdout and/or stderr not appearing
I am not seeing stdout and/or stderr output from my MPI application.

Stepping processes
Continuing and Stepping Processes

TotalView
Using TotalView with MPI programs

Troubleshooting
Troubleshooting and Frequently Asked Questions

Tuning
avoiding message buffering
Avoiding Message Buffering - Enabling Single Copy
buffer resources
Tuning MPI Buffer Resources
enabling single copy
Avoiding Message Buffering - Enabling Single Copy
for running applications across multiple hosts
Tuning for Running Applications Across Multiple Hosts
memory placement and policies
Memory Placement and Policies
MPI/OpenMP hybrid codes
Tuning MPI/OpenMP Hybrid Codes
reducing run-time variability
Reducing Run-time Variability
using dplace
Using dplace for Memory Placement
using global memory
Using Global Memory for Single Copy Optimization
using MPI_DSM_CPULIST
MPI_DSM_CPULIST
using MPI_DSM_DISTRIBUTE
MPI_DSM_DISTRIBUTE (Linux only)
using MPI_DSM_MUSTRUN
MPI_DSM_MUSTRUN (IRIX only)
using MPI_DSM_PPM
MPI_DSM_PPM
using MPI_DSM_VERBOSE
MPI_DSM_VERBOSE
using PAGESIZE_DATA and PAGESIZE_STACK
PAGESIZE_DATA and PAGESIZE_STACK
using the XPMEM driver
Using the XPMEM Driver for Single Copy Optimization

Using MPIO_Wait and MPIO_Test
Must I use MPIO_Wait() and MPIO_Test()?

Window use
Finding Windows