Trim: A foundation model for physics.

What is it?

Trim is building an AI model that can simulate real-world physical systems evolving over time. For example, given the starting position of waves on a beach, the model predicts how those waves evolve forward in time.

What’s wrong with a traditional physics simulation?

Traditional physics simulations take exponentially longer to run as the number of spatial dimensions grows, and polynomially longer as the simulation grid grows. The Trim Transformer's linear attention scales linearly in computation time with respect to both the number of dimensions and the grid size.
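To make the scaling gap concrete, here is a back-of-the-envelope floating-point-operation count for a single attention layer over n grid points with feature dimension d. The function names and the dropped constant factors are illustrative assumptions, not Trim's actual cost model:

```python
def softmax_attention_flops(n: int, d: int) -> int:
    # Standard attention forms the n x n matrix Q K^T (~n^2 * d multiply-adds),
    # then multiplies it by V (~n^2 * d more), so the cost is quadratic in n.
    return 2 * n * n * d

def linear_attention_flops(n: int, d: int) -> int:
    # Linear attention reassociates the product: K^T V is only d x d
    # (~n * d^2), and Q times that d x d matrix is another ~n * d^2,
    # so the cost grows linearly with the number of grid points n.
    return 2 * n * d * d
```

The ratio between the two is roughly n/d, so at a million grid points with a modest feature dimension the linear variant is tens of thousands of times cheaper per layer.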

These architectural advantages cut latency by several orders of magnitude for latency-sensitive tasks, such as an autonomous vehicle choosing its path, and make previously infeasible computations, such as detecting gravitational waves, tractable.

How does it work?

We train our models by running traditional physics simulations and feeding the results into our training pipeline. The Trim Transformer is a custom implementation of Galerkin-type attention. One way to think of our models is as a constant-time lossy lookup table.
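For readers unfamiliar with Galerkin-type attention, the core idea is to drop the softmax and reassociate the matrix product so the sequence-length cost becomes linear, with layer normalization applied to the keys and values instead. The sketch below is a minimal single-head numpy illustration of that general technique, not Trim's actual implementation:

```python
import numpy as np

def galerkin_attention(Q: np.ndarray, K: np.ndarray, V: np.ndarray) -> np.ndarray:
    """Softmax-free, linear-complexity attention over n grid points.

    Q, K, V have shape (n, d). Computing K^T V first gives a small
    d x d matrix, so the total cost is O(n * d^2) rather than O(n^2 * d).
    """
    n = Q.shape[0]
    # Layer-normalize keys and values along the feature dimension,
    # replacing the softmax normalization of standard attention.
    K = (K - K.mean(-1, keepdims=True)) / (K.std(-1, keepdims=True) + 1e-5)
    V = (V - V.mean(-1, keepdims=True)) / (V.std(-1, keepdims=True) + 1e-5)
    return Q @ (K.T @ V) / n
```

Because the intermediate K^T V matrix has a fixed size independent of the grid, doubling the number of grid points roughly doubles the compute instead of quadrupling it.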

Our blog explains our architecture and the design challenges we’ve overcome.