
Next Generation Open Machine Learning Operations with Practicus AI

Jul 10, 2024

Introduction

  • Practicus AI provides solutions for modern machine learning operations (MLOps).
  • It leverages open source cloud-native technology to avoid vendor lock-in.

Key Concepts

Operational Requirements

  • Successful AI systems must satisfy diverse operational needs.
  • Different teams bring distinct expertise and tool requirements.
  • Data comes from multiple sources spread across various clouds and on-premises locations.
  • Modern data mesh architectures provide a unified user experience with federated governance.

Deployment & Consumption

  • AI models can be deployed and consumed using Practicus AI:
    • Multiple personas can choose different deployment methods.
    • The infrastructure as code is open, well documented, and repeatable (see the consumption sketch after this list).
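
As a concrete illustration of the consumption side, the sketch below calls a model that has been deployed as a REST API. The endpoint URL, access token, and payload fields are hypothetical placeholders, not the actual Practicus AI request format.

```python
import requests

# Hypothetical endpoint and token; the real URL and auth scheme come from
# the model prefix and access settings configured in the platform.
MODEL_URL = "https://models.example.com/churn/v1/predict"
API_TOKEN = "replace-with-your-access-token"

payload = {
    "rows": [
        {"tenure_months": 14, "monthly_charges": 79.5, "contract": "month-to-month"},
    ]
}

resp = requests.post(
    MODEL_URL,
    json=payload,
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    timeout=10,
)
resp.raise_for_status()
print(resp.json())  # response shape depends on the deployed model
```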

User Experience

  • Example: Load data from a data lake and build a model to predict customer churn using AutoML.
  • Models can be deployed as APIs for immediate consumption by others.
  • Supports exporting the AutoML code to Jupyter for modification or for building custom models (an illustrative stand-in follows this list).
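
To give a rough idea of what exported notebook code can look like, here is an illustrative stand-in that uses pandas and scikit-learn instead of a specific AutoML engine; the churn.csv file and its columns are hypothetical.

```python
# Illustrative stand-in for code exported from an AutoML run to Jupyter.
# Assumes a hypothetical churn.csv with a binary "churned" label; the real
# exported notebook depends on the AutoML engine and data source used.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score

df = pd.read_csv("churn.csv")                      # data pulled from the data lake
X = pd.get_dummies(df.drop(columns=["churned"]))   # simple one-hot encoding
y = df["churned"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

model = GradientBoostingClassifier()               # swap in or tune a custom model here
model.fit(X_train, y_train)

auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Holdout AUC: {auc:.3f}")
```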

Technical Implementation

Model Management

  • Practicus AI uses model prefixes, names, and versions as logical elements.
    • Logical elements are tied to Kubernetes deployments.
    • Allows for dynamic service mesh setup.
  • Multiple versions can be deployed simultaneously for A/B testing (see the traffic-weight sketch after this list).
  • The platform supports managing hundreds of models and versions per day.
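
The sketch below illustrates the idea behind traffic weights for A/B testing: requests arriving at one model prefix are split across versions according to their weights. In the platform this routing is handled by the service mesh; the version names and weights here are made up.

```python
import random

# Simplified illustration of splitting traffic between versions deployed
# under one model prefix. Weights and version names are hypothetical.
VERSIONS = {"v1": 0.9, "v2": 0.1}   # 90% to the stable version, 10% to the candidate

def pick_version(weights: dict[str, float]) -> str:
    names = list(weights)
    return random.choices(names, weights=[weights[n] for n in names], k=1)[0]

counts = {name: 0 for name in VERSIONS}
for _ in range(10_000):
    counts[pick_version(VERSIONS)] += 1
print(counts)   # roughly {'v1': 9000, 'v2': 1000}
```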

Demo: Web Admin UI

  • Steps for hosting models (summarized as a configuration sketch after this list):
    • Click on model hosting and add a new physical deployment.
    • Define CPU, RAM, and optionally activate auto-scaling.
    • Grant access to groups or individual users.
    • Create model prefixes, which define URLs and production settings.
    • Define models and versions with deployment, stage, and traffic weight settings.
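
The following is a hypothetical Python-dict representation of the settings walked through above, useful for seeing how the pieces fit together; the field names are illustrative and are not the platform's actual configuration schema.

```python
# Illustrative summary of the Web Admin UI steps; field names are made up.
model_hosting_config = {
    "deployment": {
        "name": "churn-serving",
        "cpu": "2",                 # CPU request per replica
        "memory": "4Gi",            # RAM request per replica
        "autoscaling": {"enabled": True, "min_replicas": 1, "max_replicas": 5},
        "access": {"groups": ["data-science"], "users": ["alice@example.com"]},
    },
    "prefix": {
        "name": "churn",            # defines the URL path and production settings
    },
    "models": [
        {"name": "churn-model", "version": "v1", "stage": "production", "traffic_weight": 90},
        {"name": "churn-model", "version": "v2", "stage": "staging",    "traffic_weight": 10},
    ],
}
```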

Advanced Scenarios

  • Practicus AI supports multi-location deployments:
    • Supports different public and private clouds with varied data sources.
    • Single interface for business users to explore systems and data sources.
    • Developers use fine-grained access control tokens.
  • Global APIs provide high availability, with traffic automatically rerouted during failures (see the failover sketch below).
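
As a simplified, client-side illustration of the failover idea behind a global API, the sketch below tries regional endpoints in order and falls back when one is unreachable. In practice the platform's routing layer handles this automatically; the URLs are placeholders.

```python
import requests

# Placeholder regional endpoints for the same logical model.
ENDPOINTS = [
    "https://eu.models.example.com/churn/v1/predict",
    "https://us.models.example.com/churn/v1/predict",
]

def predict_with_failover(payload: dict) -> dict:
    last_error = None
    for url in ENDPOINTS:
        try:
            resp = requests.post(url, json=payload, timeout=5)
            resp.raise_for_status()
            return resp.json()
        except requests.RequestException as err:
            last_error = err      # endpoint unreachable or unhealthy; try the next region
    raise RuntimeError(f"All endpoints failed: {last_error}")
```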

Open Source Benefits

  • Because deployments run on standard open source, cloud-native components, the system keeps operating even if Practicus AI is uninstalled.
  • Legacy/proprietary systems can be modernized by wrapping them with Practicus AI MLOps (see the wrapper sketch below).
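
As a minimal sketch of the wrapping idea, the example below exposes a legacy scoring function behind a plain REST endpoint (using Flask purely for illustration), so the MLOps layer can manage it like any other model. The route, payload shape, and legacy_score function are hypothetical.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

def legacy_score(record: dict) -> float:
    """Stand-in for a call into a legacy or proprietary scoring system."""
    return 0.5  # placeholder result

@app.route("/legacy-model/predict", methods=["POST"])
def predict():
    # Expose the legacy logic behind a plain REST endpoint so it can be managed
    # like any other model (prefix, versions, traffic weights).
    rows = request.get_json(force=True).get("rows", [])
    return jsonify({"predictions": [legacy_score(r) for r in rows]})

if __name__ == "__main__":
    app.run(port=8080)
```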

Summary

  • Practicus AI offers a comprehensive solution for MLOps with open-source technology.
  • It enables modernization of existing systems and seamless deployment and consumption of AI models.

Contact

  • Questions and further inquiries are welcome.