Forward models and MCMC inference for planetary interior structure, with a focus on tidal response and gravitational moments of gas giant planets and icy satellites.
Author: Benjamin Idini (bidini@ucsc.edu)
- Idini, B. & Stevenson, D. J. (2021). Dynamical tides in Jupiter as revealed by Juno. The Planetary Science Journal, 2(2), 69. doi:10.3847/PSJ/abe715
- Idini, B. & Nimmo, F. (2024). Resonant stratification in Titan's global ocean. The Planetary Science Journal.
interiorize provides:
- Interior structure models — density and composition profiles for n=1 polytropes with heavy-element gradients (`CalcInterior`, `StitchInterior`).
- Tidal response solvers — dynamical and quasi-static tide ODE systems for polytropes and uniform-density oceans, with and without Coriolis.
- MCMC inference — emcee-based sampler with parallel tempering, MPI support, and convergence diagnostics for fitting Jupiter's gravitational moments.
- Seismology utilities — GYRE wrapper for normal-mode frequencies and tidal coupling integrals.
Requires Python ≥ 3.10. Core dependencies (from `requirements.txt`):
| Package | Purpose |
|---|---|
| numpy, scipy | numerics |
| matplotlib | plotting |
| sympy | Clebsch–Gordan coefficients |
| emcee | MCMC sampling |
| schwimmbad | MPI pool abstraction |
| mpi4py | MPI parallelism (MCMC only) |
| pygyre | GYRE output reader (seismology) |
| corner, tabulate, tqdm, pyfiglet | MCMC output utilities |
```bash
# 1. Clone the repository
git clone https://github.com/bidini/interiorize.git
cd interiorize

# 2. (Recommended) create a virtual environment
python -m venv .venv
source .venv/bin/activate   # Windows: .venv\Scripts\activate

# 3. Install dependencies
pip install -r requirements.txt

# 4. Install the package in editable mode
pip install -e .
```

Note: `mpi4py` requires a working MPI installation (e.g. OpenMPI or MPICH), even on a laptop. Install it first:

- macOS: `brew install open-mpi`
- Ubuntu/Debian: `sudo apt install libopenmpi-dev`
- Then: `pip install mpi4py`

If you only need the core forward models and do not plan to run MCMC, you can skip `mpi4py` by removing it from `requirements.txt` before `pip install`.
```python
from interiorize.interior_calculations import CalcInterior

theta = [0.3, 0.2, 0.6, 0.02]   # [Zc, hc, Renv, Zenv]
m = CalcInterior(theta)

print(f"R = {m.R/6.9911e9:.4f} R_Jup")   # should be ~1
print(f"M = {m.M/1.898e30:.4f} M_Jup")   # should be ~1
```

Run the unit tests:

```bash
pytest test/test_interior.py test/test_solvers.py -v
```

Expected: 4 passed. (test_mpi.py and test_models.py require mpi4py and tabulate respectively; they are integration tests, not unit tests.)
```python
from interiorize.interior_calculations import CalcInterior, StitchInterior

# 4-parameter model: dilute core only
theta4 = [0.3, 0.2, 0.6, 0.02]   # [Zc, hc, Renv, Zenv]
m = CalcInterior(theta4)

# Access derived quantities (computed lazily on first use)
print(m.R)     # radius (cm)
print(m.rho)   # density profile (g cm⁻³)
print(m.J2)    # J2 gravitational moment

# 7-parameter model: dilute core + outer EOS perturbation
theta7 = [0.3, 0.2, 0.6, 0.02, 0.05, 0.1, 0.85]
m2 = StitchInterior(theta7)
```

Edit the options dict at the top of one of the example scripts in `sampler/examples/` (e.g. `mcmc_toy.py`):
```python
opt = {
    'mpi': False,       # use multiprocessing, not MPI
    'nwalkers': 32,     # number of MCMC walkers
    'ite_burn': 500,    # burn-in steps
    'ite_run': 2000,    # production steps
    ...
}
```

Then run directly:

```bash
python sampler/examples/mcmc_toy.py
```

Output (chain HDF5 files and diagnostic plots) is written to the directory configured in `opt['output_dir']`.
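Once a run finishes, the saved chains should be checked for convergence. A numpy-only sketch of an integrated autocorrelation-time estimate for a single flattened chain is below; for real runs, emcee's `get_autocorr_time` on the stored backend is the standard tool, and this crude windowed sum is only illustrative:

```python
import numpy as np

def integrated_autocorr(x):
    """Rough integrated autocorrelation time of a 1-D chain (illustrative)."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    # normalized autocorrelation function at lags 0..n-1
    acf = np.correlate(x, x, mode="full")[n - 1:] / (np.arange(n, 0, -1) * x.var())
    tau = 1.0
    for r in acf[1:]:
        if r < 0:          # crude windowing: stop at the first negative lag
            break
        tau += 2.0 * r
    return tau
```

A well-mixed, nearly independent chain gives tau ≈ 1, while strongly correlated walkers give tau ≫ 1; emcee's rule of thumb is to run for at least ~50 tau steps.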
For production runs with many walkers across multiple nodes, set 'mpi': True
in the options dict and launch with mpirun or your scheduler's MPI launcher.
Recommended walker count: use 4–8 × the number of free parameters. For the standard 4-parameter model, 32 walkers is sufficient on a laptop; for the 7–8-parameter models, 200 walkers on 4–8 MPI ranks gives good mixing.
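That rule of thumb is easy to encode. A hypothetical helper (not part of interiorize) that picks a walker count, also respecting emcee's requirement of an even number of walkers of at least 2 × ndim:

```python
def recommended_nwalkers(ndim, factor=8, minimum=32):
    """Walker count from the 4-8x-ndim rule of thumb above.

    emcee additionally requires an even number of walkers and at
    least 2*ndim of them, so round up to satisfy both constraints.
    """
    n = max(factor * ndim, 2 * ndim, minimum)
    return n + (n % 2)   # force an even count
```

For example, `recommended_nwalkers(4)` returns 32 and `recommended_nwalkers(8)` returns 64; a larger `factor` reproduces the heavier production settings quoted above.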
```bash
#!/bin/bash
#SBATCH --job-name=interiorize_mcmc
#SBATCH --nodes=4
#SBATCH --ntasks-per-node=16   # 64 MPI ranks total
#SBATCH --cpus-per-task=1
#SBATCH --mem=8G
#SBATCH --time=24:00:00
#SBATCH --output=mcmc_%j.log

# Load modules (adjust for your cluster's module system)
module load python/3.11
module load openmpi/4.1

# Activate environment
source /path/to/venv/bin/activate

# Each MPI rank evaluates one walker per step
mpirun -np $SLURM_NTASKS python sampler/examples/mcmc_8d.py
```

The equivalent PBS/Torque script:

```bash
#!/bin/bash
#PBS -N interiorize_mcmc
#PBS -l nodes=4:ppn=16
#PBS -l walltime=24:00:00
#PBS -j oe

cd $PBS_O_WORKDIR
source /path/to/venv/bin/activate
mpirun -np 64 python sampler/examples/mcmc_8d.py
```

interiorize uses schwimmbad to abstract
the worker pool. When 'mpi': True, SamplingJob creates an MPIPool and
emcee distributes one log-probability evaluation per walker to each MPI rank.
The master rank (rank 0) drives the sampler loop; all other ranks wait for work.
The pattern in the runner is:
```python
# sampling_job.py (simplified)
import sys

import emcee
from schwimmbad import MPIPool as Pool

with Pool() as pool:
    if not pool.is_master():
        pool.wait()       # worker ranks block here, serving log-prob requests
        sys.exit(0)
    sampler = emcee.EnsembleSampler(nwalkers, ndim, log_prob, pool=pool)
    sampler.run_mcmc(p0, nsteps)
```

Tips for HPC efficiency:

- Use at least as many MPI ranks as walkers (ideally nranks = nwalkers).
- Each `CalcInterior` evaluation is single-threaded (~10 ms); scaling is linear with ranks up to nwalkers.
- Set `OMP_NUM_THREADS=1` and `MKL_NUM_THREADS=1` to avoid BLAS oversubscription when running many single-threaded workers per node:

  ```bash
  export OMP_NUM_THREADS=1
  export MKL_NUM_THREADS=1
  mpirun -np 64 python sampler/examples/mcmc_8d.py
  ```

- Chain progress is saved to HDF5 backends so runs can be resumed after interruption without losing completed steps.
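The resume behavior boils down to: if the backend file already holds completed steps, continue from the stored state instead of the initial guess. A stdlib-only sketch of that pattern (emcee's `HDFBackend` is the real implementation; the JSON state file and `run_chain` helper here are purely illustrative):

```python
import json
import os

def run_chain(state_path, total_steps, step_fn, x0):
    """Advance a toy chain to total_steps, resuming from state_path if present."""
    if os.path.exists(state_path):
        with open(state_path) as f:
            state = json.load(f)          # resume: {'step': int, 'x': float}
    else:
        state = {"step": 0, "x": x0}      # fresh start
    while state["step"] < total_steps:
        state["x"] = step_fn(state["x"])  # one update of the chain state
        state["step"] += 1
        with open(state_path, "w") as f:  # checkpoint after every step
            json.dump(state, f)
    return state
```

Calling `run_chain` a second time with a larger `total_steps` picks up at the checkpointed step rather than repeating completed work, which is exactly what the HDF5 backend buys you after a job is killed at the walltime limit.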
```
interiorize/
├── interiorize/
│   ├── interior_calculations.py   # CalcInterior, StitchInterior forward models
│   ├── solvers.py                 # Chebyshev BVP/IVP solver primitives
│   ├── polytrope.py               # Tidal ODE coefficients for n=1 polytrope
│   ├── poincare.py                # Dynamical tide in uniform-density ocean
│   ├── hough.py                   # Internal gravity waves in stratified shell
│   ├── static_stability.py        # Slow-tide and Brunt–Väisälä helpers
│   └── seismo.py                  # GYRE wrapper for normal-mode calculation
├── sampler/
│   ├── sampling_job.py            # SamplingJob class (MCMC runner + diagnostics)
│   ├── pdf_models.py              # Log-probability models (likelihood + prior)
│   ├── ptsampler.py               # Parallel-tempering sampler
│   └── examples/                  # Ready-to-run example scripts
├── test/
│   ├── test_interior.py           # Unit tests: CalcInterior, StitchInterior
│   └── test_solvers.py            # Unit tests: Chebyshev solver
├── requirements.txt
└── setup.py
```
See LICENSE.txt.