GenSBI#



Project Status

GenSBI is currently nearing the end of its alpha cycle. The API is approaching stability but may still change in the future.

Getting Started#

New to GenSBI?

Start here:

  1. Installation - Get GenSBI installed

  2. Quick Start Guide - 15-minute introduction

  3. My First Model Tutorial - Complete step-by-step walkthrough

Standard Installation (CPU / Compatible)#

pip install gensbi

High-Performance Installation (CUDA 12)#

If you have a compatible NVIDIA GPU, install with CUDA 12 support for significantly faster training:

pip install gensbi[cuda12]

For more installation options, see the Installation Guide.
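After installing, it can be useful to confirm that the package and its JAX backend are importable and to see which accelerator JAX will use. This is a minimal sketch, not part of the GenSBI API; it assumes the package imports as `gensbi` and degrades gracefully if either package is missing:

```python
# Hypothetical post-install check (not part of the GenSBI API):
# reports whether gensbi and jax are importable, and which JAX backend is active.
import importlib.util

def backend_report():
    lines = []
    for pkg in ("gensbi", "jax"):
        found = importlib.util.find_spec(pkg) is not None
        lines.append(f"{pkg}: {'installed' if found else 'not found'}")
    if importlib.util.find_spec("jax") is not None:
        import jax
        # 'gpu' here indicates the CUDA install was picked up; 'cpu' otherwise.
        lines.append(f"JAX backend: {jax.default_backend()}")
    return lines

print("\n".join(backend_report()))
```

If the CUDA 12 install succeeded, the reported backend should be `gpu` rather than `cpu`.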

Key Documentation Sections#

📚 Basics#

Learn the core concepts and how to use GenSBI effectively:

📖 Examples#

See GenSBI in action with complete working examples:

All examples are available in the GenSBI-examples repository.

🔧 API Reference#

Detailed API documentation for all classes and functions:

👥 Contributing#

Want to contribute? Check out the guides:

Examples#

Figure: two-moons posterior sampling.

Some key examples include:

Getting Started:

Unconditional Density Estimation:

  • flow_matching_2d_unconditional.ipynb Open In Colab
    Demonstrates how to use flow matching in 2D for unconditional density estimation.

  • diffusion_2d_unconditional.ipynb Open In Colab
    Demonstrates how to use diffusion models in 2D for unconditional density estimation.
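The flow-matching objective behind these notebooks can be sketched in a few lines. The following is a conceptual NumPy illustration of the training loss, not the GenSBI API: a vector field is regressed onto the straight-line velocity between noise and data samples.

```python
# Conceptual sketch of the flow-matching objective (illustration, not GenSBI API).
# A vector field v(x_t, t) is trained to match the straight-line velocity
# between a base (noise) sample x0 and a data sample x1.
import numpy as np

def flow_matching_loss(v_field, x1, rng):
    """Monte Carlo estimate of E || v(x_t, t) - (x1 - x0) ||^2."""
    x0 = rng.standard_normal(x1.shape)        # base (noise) samples
    t = rng.uniform(size=(x1.shape[0], 1))    # random times in [0, 1]
    xt = (1.0 - t) * x0 + t * x1              # point on the linear path
    target = x1 - x0                          # velocity of that path
    return np.mean(np.sum((v_field(xt, t) - target) ** 2, axis=-1))

rng = np.random.default_rng(0)
data = rng.standard_normal((256, 2))          # toy 2D "dataset"
zero_field = lambda xt, t: np.zeros_like(xt)  # trivial untrained vector field
loss = flow_matching_loss(zero_field, data, rng)
```

At sampling time, a trained field is integrated as an ODE, dx/dt = v(x, t), from noise at t = 0 to a data-like sample at t = 1; the notebooks above do this with learned neural vector fields.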

Conditional Density Estimation:

  • two_moons_flow_simformer.ipynb Open In Colab
    Uses the Simformer model for posterior density estimation on the two-moons benchmark.

  • two_moons_flow_flux.ipynb Open In Colab
    Uses the Flux1 model for posterior density estimation on the two-moons benchmark.

  • gaussian_linear_flow_flux1joint.ipynb Open In Colab
    Uses the Flux1Joint model for posterior density estimation on the Gaussian Linear benchmark.

  • slcp_flow_simformer.ipynb Open In Colab
    Uses the Simformer model for posterior density estimation on the SLCP benchmark.

See the Examples page for the complete list and detailed descriptions.

AI Usage Disclosure

This project used large language models, specifically Google Gemini and GitHub Copilot, to assist with code suggestions, documentation drafting, and grammar corrections. All AI-generated content has been manually reviewed and verified by the human authors to ensure accuracy and adherence to scientific standards.

Citing GenSBI#

If you use this library, please consider citing this work as well as the original methodology papers; see the references.

@misc{GenSBI,
  author       = {Amerio, Aurelio},
  title        = {{GenSBI}: Generative models for Simulation-Based Inference},
  year         = {2025},
  publisher    = {GitHub},
  journal      = {GitHub repository},
  howpublished = {\url{https://github.com/aurelio-amerio/GenSBI}}
}

Similar packages#

GenSBI is designed to provide a numerically efficient JAX implementation of flow and diffusion models, complementing existing SBI libraries. You might also want to check out:

  • sbi: A comprehensive PyTorch-based package for simulation-based inference. It implements neural posterior estimation (NPE), neural likelihood estimation (NLE), and neural ratio estimation (NRE) methods. It is an excellent choice for a wide range of SBI tasks and supports amortized as well as sequential inference.

  • swyft: An official implementation of Truncated Marginal Neural Ratio Estimation (TMNRE). It is designed to be highly efficient for marginal posterior estimation and scales well to complex simulations, leveraging dask and zarr for handling large datasets.

  • ltu-ili: The “Learning the Universe” Implicit Likelihood Inference library. It unifies multiple SBI backends (including sbi, pydelfi, and lampe) under a single interface, making it easy to benchmark different methods. It is particularly focused on applications in astrophysics and cosmology.

  • sbijax: A simulation-based inference library built on top of JAX. It implements standard neural simulation-based inference methods (NPE, NLE, NRE) as well as ABC, leveraging JAX’s just-in-time compilation and automatic differentiation for high-performance inference. Its API is inspired by the sbi package.