Minikube Setup Guide — NVIDIA Dynamo Documentation
# Minikube Setup Guide

Don't have a Kubernetes cluster? No problem! You can set up a local development environment using Minikube. This guide walks through setting up everything you need to run the Dynamo Kubernetes Platform locally.
## 1. Install Minikube
First things first! Start by installing Minikube. Follow the official Minikube installation guide for your operating system.
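As a quick reference, the Linux x86_64 path from the official guide looks roughly like the sketch below; for other platforms and architectures, follow the guide's instructions for your OS.

```bash
# Download the latest Minikube binary for Linux x86_64 and install it to /usr/local/bin
curl -LO https://github.com/kubernetes/minikube/releases/latest/download/minikube-linux-amd64
sudo install minikube-linux-amd64 /usr/local/bin/minikube

# Confirm the binary is on your PATH
minikube version
```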
## 2. Configure GPU Support (Optional)
Planning to use GPU-accelerated workloads? You’ll need to configure GPU support in Minikube. Follow the Minikube GPU guide to set up NVIDIA GPU support before proceeding.
> **Tip:** Make sure to configure GPU support before starting Minikube if you plan to use GPU workloads!
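For the docker driver, the host-side setup boils down to making the NVIDIA runtime available to Docker. A minimal sketch, assuming the NVIDIA driver and the NVIDIA Container Toolkit are already installed (see the Minikube GPU guide for the full, authoritative steps):

```bash
# Check that the host sees the GPU and the NVIDIA driver is loaded
nvidia-smi

# Register the NVIDIA runtime with Docker, then restart Docker to pick it up
sudo nvidia-ctk runtime configure --runtime=docker
sudo systemctl restart docker
```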
## 3. Start Minikube

Time to launch your local cluster!

```bash
# Start Minikube with GPU support (if configured)
minikube start --driver docker --container-runtime docker --gpus all --memory=16000mb --cpus=8

# Enable required addons
minikube addons enable istio-provisioner
minikube addons enable istio
minikube addons enable storage-provisioner-rancher
```
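A quick sanity check at this point can save time later. The following is an optional sketch, assuming the default single-node cluster whose node is named `minikube`:

```bash
# All three addons should report "enabled"
minikube addons list | grep -E 'istio|storage-provisioner-rancher'

# If GPU support was configured, the node should advertise nvidia.com/gpu capacity
kubectl describe node minikube | grep -i 'nvidia.com/gpu'
```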
## 4. Verify Installation

Let's make sure everything is working correctly!

```bash
# Check Minikube status
minikube status

# Verify the Istio installation
kubectl get pods -n istio-system

# Verify the storage class
kubectl get storageclass
```
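If the Istio pods are still coming up, you can optionally wait for them to report Ready before moving on; one way to do that:

```bash
# Block until every pod in istio-system reports Ready (times out after 5 minutes)
kubectl wait --for=condition=Ready pod --all -n istio-system --timeout=300s
```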
## Next Steps
Once your local environment is set up, you can proceed with the Dynamo Kubernetes Platform installation guide to deploy the platform to your local cluster.