Operationalizing – Continuously monitor your LLM apps in production


In the final video of this 5-part series, Takuto and Vishnu walk you through monitoring your LLM apps in production. Shifts in data and consumer behavior can degrade your systems over time; Azure AI Studio's model monitoring makes it easier to track safety and quality. Vishnu Pamula demonstrates the LLM flows shown previously, integrating with CI/CD pipelines, tracking metrics, monitoring LLM performance, and integrating with GitHub (https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-end-to-end-llmops-with-prompt-flow?view=azureml-api-2).
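To make the idea of tracking quality metrics over time concrete, here is a minimal, illustrative sketch of the kind of drift check a production monitor performs. All names and thresholds below are assumptions for illustration only; Azure AI Studio's model monitoring provides this capability as a managed service rather than hand-rolled code.

```python
# Illustrative sketch only: a toy quality-metric monitor that alerts when
# the recent average of a score (e.g. a 1-5 quality rating) drops below a
# baseline established during evaluation. Not an Azure AI Studio API.
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class MetricMonitor:
    baseline: float          # average score observed during pre-production evaluation
    window: int = 50         # number of recent samples to average
    tolerance: float = 0.5   # allowed drop below baseline before alerting
    scores: list = field(default_factory=list)

    def record(self, score: float) -> bool:
        """Record a new score; return True when the monitor should alert."""
        self.scores.append(score)
        recent = self.scores[-self.window:]
        # Alert only once a full window is available and its mean has
        # fallen more than `tolerance` below the baseline.
        return len(recent) >= self.window and mean(recent) < self.baseline - self.tolerance

# Example: quality scores gradually degrading in production.
monitor = MetricMonitor(baseline=4.2, window=5, tolerance=0.5)
alerts = [monitor.record(s) for s in [4.1, 4.0, 3.5, 3.4, 3.3, 3.2]]
# alerts -> [False, False, False, False, True, True]
```

A managed monitor adds scheduling, built-in safety and quality metrics, and alert routing on top of this basic pattern, which is what the video walks through.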

