JAX is a high-performance numerical computing library from Google that combines a NumPy-like API with automatic differentiation, JIT compilation via XLA, and hardware acceleration across CPUs, GPUs, and TPUs. JAX adopts a functional programming paradigm built on pure functions and immutable arrays, which enables composable transformations such as grad, vmap, pmap, and jit for building scalable machine learning and scientific computing pipelines.

A key design principle is that JAX does not impose a framework: it provides transformations that compose with neural network libraries such as Flax, Haiku, or Equinox, so researchers can build models with complete control over the training loop. This combination of flexibility and performance has made JAX especially popular in ML research.
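A minimal sketch of how these transformations compose, using a hypothetical mean-squared-error loss as the example function (the loss, weights, and data here are illustrative, not from any particular library):

```python
import jax
import jax.numpy as jnp

# A pure function: no side effects, and jnp arrays are immutable.
def loss(w, x, y):
    pred = jnp.dot(x, w)          # linear prediction for one example
    return jnp.mean((pred - y) ** 2)

# grad differentiates loss with respect to its first argument (w).
grad_loss = jax.grad(loss)

# vmap maps grad_loss over a batch axis of x and y while broadcasting w,
# and jit compiles the whole composed function via XLA.
batched_grad = jax.jit(jax.vmap(grad_loss, in_axes=(None, 0, 0)))

w = jnp.ones(3)
x = jnp.arange(12.0).reshape(4, 3)  # batch of 4 examples, 3 features each
y = jnp.ones(4)

g = batched_grad(w, x, y)
print(g.shape)  # (4, 3): one per-example gradient with respect to w
```

Because each transformation returns an ordinary function, the order of composition is up to the user; the same grad_loss could instead be wrapped only in jit for single-example use, or in pmap to shard the batch across devices.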