JAX is a Python library for high-performance machine learning research. It combines the familiarity of NumPy with hardware acceleration and composable function transformations. Google researchers have built and trained models like Gemini and Gemma on JAX, and it’s also used by researchers for a wide range of advanced applications. This talk will provide an introduction to JAX and the Flax neural network library, showcasing recent features and how to get started.
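For readers new to JAX, here is a minimal sketch (not taken from the talk) of the NumPy-style API and composable function transformations described above; the function and variable names are purely illustrative, and Flax builds its neural network layers on top of these same primitives.

import jax
import jax.numpy as jnp

def loss(w, x, y):
    # Ordinary NumPy-style code: jnp mirrors the numpy API.
    pred = jnp.dot(x, w)
    return jnp.mean((pred - y) ** 2)

# Transformations compose: differentiate, then compile the result.
grad_loss = jax.jit(jax.grad(loss))

w = jnp.zeros(3)
x = jnp.ones((8, 3))
y = jnp.ones(8)
print(grad_loss(w, x, y))  # runs unchanged on CPU, GPU, or TPU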
Resources:
JAX → https://goo.gle/4jYJ0a0
JAX AI Stack → https://goo.gle/3GMHPvH
Speakers: Yufeng Guo, Robert Dyro
Check out the AI session track from Google I/O 2025 → https://goo.gle/io25-ai-yt
Check out all of the sessions from Google I/O 2025 → https://goo.gle/io25-sessions-yt
Subscribe to Google for Developers → https://goo.gle/developers
Event: Google I/O 2025
Products Mentioned: AI/Machine Learning