As enterprises invest more in generative AI, they need a toolchain that is easy to use, scalable, and backed by enterprise support to take full advantage of these innovations.

In this workshop, NVIDIA's Adam Tetelman and Canonical's Bartlomiej Poniecki-Klotz delve into the NVIDIA NGC Catalog and NVIDIA AI Enterprise, a suite of AI tools and frameworks, including NVIDIA NIM microservices, that integrates with cloud-native projects such as Kubernetes and KServe while offering security and enterprise support.

The speakers demonstrate how to take the state-of-the-art open Meta Llama 3.1 8B model, run inference with NVIDIA NIM, scale the model dynamically with the open-source KServe project, and then apply parameter-efficient fine-tuning (PEFT) techniques by deploying LoRA adapters trained with the NVIDIA NeMo Customizer microservice, so that multiple fine-tuned versions of the open model run on a single Kubernetes cluster.

Everything shown in the demo can be easily reproduced with readily available resources, thanks to the large community involvement around open model development and deployment.

Learn more at https://canonical.com/solutions/ai and https://canonical.com/data
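As a rough illustration of the inference step described above: NIM exposes an OpenAI-compatible HTTP API, so a deployed Llama 3.1 8B endpoint can be queried with a standard chat-completions request. This is a minimal sketch, not the workshop's exact code; the `BASE_URL` and the `meta/llama-3.1-8b-instruct` model name are assumptions for your particular deployment.

```python
# Minimal sketch: query a NIM endpoint (e.g. one fronted by a KServe
# InferenceService) using its OpenAI-compatible chat-completions API.
# BASE_URL and the model name are assumptions for a local deployment.
import json
import urllib.request

BASE_URL = "http://localhost:8000"  # assumed service address


def build_chat_request(prompt: str,
                       model: str = "meta/llama-3.1-8b-instruct") -> dict:
    """Build an OpenAI-style chat-completions payload for the endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 128,
    }


def chat(prompt: str, model: str = "meta/llama-3.1-8b-instruct") -> str:
    """POST the prompt to the NIM endpoint and return the model's reply."""
    payload = build_chat_request(prompt, model)
    req = urllib.request.Request(
        f"{BASE_URL}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


# Usage (requires a running NIM service, so not executed here):
#   reply = chat("Summarize KServe in one sentence.")
```

For the multi-LoRA part of the demo, the same request shape typically applies: the fine-tuned variant is selected by passing the adapter's name in the `model` field instead of the base model name, which is how the OpenAI-compatible API distinguishes adapters served from one cluster.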
#linux