Generative AI in Production
Contact us to book this course
Learning Track
Generative AI
Delivery methods
On-Site, Virtual
Duration
1 day
MLOps is a set of practices for productionizing traditional ML systems in enterprise applications. Generative AI raises new challenges in managing and productionizing applications at scale, and the field of generative AI operations seeks to address these challenges. In this course, you learn about the challenges that arise when deploying and productionizing generative AI-powered applications. You learn how to secure your generative AI-powered applications. Finally, you discuss best practices for logging and monitoring your generative AI-powered applications in production.
Course objectives
- Understand the challenges in productionizing applications using generative AI
- Manage experimentation and evaluation for LLM-powered applications
- Productionize LLM-powered applications
- Secure generative AI applications
- Implement logging and monitoring for LLM-powered applications
Audience
- Developers, DevOps engineers, and machine learning engineers who wish to operationalize generative AI-powered applications
Prerequisites
- Completion of the Application Development with LLMs on Google Cloud course, or equivalent knowledge.
Course outline
- Generative AI operations
- Traditional MLOps vs. GenAIOps
- Components of an LLM system
- RAG/ReAct architecture
- Application deployment options
- Deployment, packaging, and versioning
- Lab: Deploying an Agentic Application on Cloud Run
- Maintenance and updates
- Testing and evaluation
- CI/CD pipelines for gen AI-powered apps
- Lab: Tracking Versions of Generative AI Applications
- Security challenges
- Prompt security
- Sensitive Data Protection and DLP API
- Model Armor
- Lab: Securing Generative AI-Powered Applications
- Cloud Operations
- Cloud Logging
- Monitoring
- Cloud Trace
- Agent Analytics and AgentOps
- Putting it all together
- Lab: Logging, Monitoring, and Agent Analytics