Incompatibility of mpi4py and multiprocessing in depletion #3640

Description

@paulromano

The openmc.deplete module relies on calls to the OpenMC shared library via openmc.lib to perform neutron transport, update material compositions, etc., which means parallel execution has to be coordinated using mpi4py and the Python script itself should be launched with mpiexec. The actual solve of the Bateman equations is done using the CRAM method, which is implemented entirely in Python (using scipy.sparse); the parallelization strategy there is to divide the full list of materials being depleted over MPI processes (via mpi4py) and then over local worker processes via multiprocessing. The default configurations of mpi4py and multiprocessing do not play well together due to the following sequence of events:

  1. Importing mpi4py (specifically mpi4py.MPI) results in MPI_Init being called automatically.
  2. On most platforms and Python versions, creating a multiprocessing pool spawns worker processes via an OS fork.
  3. MPI processes that have already been initialized generally should not be forked, as this results in implementation-dependent behavior and can lead to deadlocks. A minimal sketch of this sequence is shown below.
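For reference, here is a minimal sketch of the problematic pattern. The names (`solve_bateman`, the material list) are illustrative stand-ins, not OpenMC's actual internals; the point is only the import-then-fork ordering. Run with something like `mpiexec -n 2 python repro.py`:

```python
import multiprocessing

from mpi4py import MPI  # implicitly calls MPI_Init at import time


def solve_bateman(material):
    # Stand-in for the per-material CRAM solve done with scipy.sparse
    return material ** 2


if __name__ == "__main__":
    comm = MPI.COMM_WORLD

    # Each MPI rank takes a slice of the materials being depleted...
    materials = list(range(comm.rank, 100, comm.size))

    # ...and then fans out over local worker processes. With the default
    # "fork" start method on Linux, each worker is forked from an
    # already-initialized MPI process, which is where deadlocks can occur.
    with multiprocessing.Pool() as pool:
        results = pool.map(solve_bateman, materials)
```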

A few workarounds currently available, each sketched after this list, are to:

  • Disable use of multiprocessing in depletion
  • Use a start method other than "fork" in multiprocessing
  • Disable automatic initialization/finalization from mpi4py
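The second and third workarounds map onto documented mechanisms in the standard library and mpi4py; the first is an OpenMC-side switch, shown here as an assumption (the exact flag should be checked against the openmc.deplete source for the version in use):

```python
# Workaround 1: disable multiprocessing in depletion entirely. The exact
# switch is an assumption; check openmc/deplete/pool.py in the version in use.
# import openmc.deplete
# openmc.deplete.pool.USE_MULTIPROCESSING = False

# Workaround 2: choose a start method other than "fork". "spawn" launches
# fresh interpreters, so workers never inherit the parent's MPI state. This
# must run once, before any pool is created.
import multiprocessing
multiprocessing.set_start_method("spawn")

# Workaround 3: disable mpi4py's automatic MPI_Init/MPI_Finalize and manage
# them explicitly. The rc settings must be applied before mpi4py.MPI is
# imported anywhere in the process.
import mpi4py
mpi4py.rc.initialize = False
mpi4py.rc.finalize = False
from mpi4py import MPI

MPI.Init()
# ... depletion run; any fork-based pools must still be created before
# MPI.Init(), or use the spawn start method as above ...
MPI.Finalize()
```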

Further down the line, we may want to consider using free-threading in Python 3.14+ as this should allow us to get parallelization without creating new processes that cause problems with MPI.
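As a rough sketch of what that could look like, the process pool would be replaced by a thread pool. Note this only pays off on a free-threaded (no-GIL) build; on a standard build, CPU-bound scipy work in threads would largely serialize unless the solve releases the GIL internally:

```python
from concurrent.futures import ThreadPoolExecutor


def solve_bateman(material):
    # Stand-in for the per-material CRAM solve
    return material ** 2


materials = list(range(100))

# Threads share the parent process, so no new processes are created and
# the MPI-fork hazard disappears entirely.
with ThreadPoolExecutor() as executor:
    results = list(executor.map(solve_bateman, materials))
```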
