The next MPI Forum meeting will be in Portland, OR, USA, in early March.
One of the major topics on the agenda will be voting on the MPI-3.1 standard.
You might be wondering what’s new in MPI-3.1.
I’m glad you asked.
Tags: HPC, mpi, MPI Forum, MPI-3, MPI-3.1
Jeff Hammond has recently started developing the BigMPI library.
BigMPI is intended to handle all the drudgery of sending and receiving large messages in MPI.
In Jeff’s own words:
[BigMPI is an] interface to MPI for large messages, i.e., those where the count argument exceeds INT_MAX but is still less than SIZE_MAX. BigMPI is designed for the common case where one has a 64b address space and is unable to do MPI communication on more than 2³¹ elements despite having sufficient memory to allocate such buffers.
Tags: BigMPI, mpi, MPI Forum
Here are some MPI quick-bites for this week:
- The MPI_MPROBE proposal was voted into MPI-3 a few weeks ago. Yay! (see this quick slideshow for an explanation of what MPI_MPROBE is)
- The Hardware Locality project just released hwloc v1.2. This new version now includes distance metrics between objects in the topology tree. W00t!
- Support for large counts looks good for getting passed into MPI-3; it’s up for its first formal reading at the upcoming Forum meeting.
- The same is true for the new MPI-3 one-sided stuff; it, too, is up for its first formal reading at the upcoming Forum meeting (they haven’t sent around their new PDF yet, but they will within a week or so — stay tuned here for updates).
- Likewise, the new Fortran-08 bindings are up for their first Forum presentation next meeting. We solved all of the outstanding Fortran issues with the F77 and F90 bindings… with the possible exception of non-blocking communication code movement. That one is still being debated with the Fortran language standardization body — it’s a complicated issue!
- Finally — the new MPI tools interface chapter is up for a first formal reading, too.
That’s a lot of first formal readings in one meeting…
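For the curious, the MPI_MPROBE idea mentioned above can be sketched as follows: a matched probe removes the message from the matching queue, so the subsequent receive is guaranteed to get exactly the probed message (unlike MPI_PROBE, where another thread can race in and steal it). A minimal two-rank sketch, assuming an MPI-3 implementation (compile with mpicc, run with mpiexec -n 2):

```c
#include <mpi.h>
#include <stdlib.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);
    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0) {
        int payload[4] = { 1, 2, 3, 4 };
        MPI_Send(payload, 4, MPI_INT, 1, 0, MPI_COMM_WORLD);
    } else if (rank == 1) {
        MPI_Message msg;
        MPI_Status status;
        /* Matched probe: dequeues the message handle, so no other
           thread can match this message between probe and receive. */
        MPI_Mprobe(0, 0, MPI_COMM_WORLD, &msg, &status);

        int count;
        MPI_Get_count(&status, MPI_INT, &count);
        int *buf = malloc(count * sizeof(int));

        /* Receive the specific message matched by the probe above. */
        MPI_Mrecv(buf, count, MPI_INT, &msg, &status);
        free(buf);
    }

    MPI_Finalize();
    return 0;
}
```

The probe-then-allocate pattern is the whole point: the receiver doesn't need to know the message size in advance.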
Tags: HPC, hwloc, mpi, MPI Forum, MPI-3