We show that latent diffusion models are robust to compression in the context of physics emulation, reducing computational cost while consistently outperforming non-generative alternatives.
We aim to usher in a new class of machine learning for scientific data by building models that leverage shared concepts across disciplines. We will develop, train, and release such foundation models for use by researchers worldwide.