We’re excited to announce that DigitalOcean Gradient™ AI Platform now integrates natively with LlamaIndex, one of the most popular frameworks for building retrieval-augmented generation (RAG) applications.
This means you can now connect your Gradient AI Platform Knowledge Base and LLMs directly to LlamaIndex workflows, using the abstractions you already know. No additional infrastructure. No complex setup. Just install two packages and start building.
If you’ve built RAG applications before, you know the drill: provision a vector database, set up an embedding pipeline, manage credentials across services, and stitch everything together. It’s a lot of overhead before you write a single line of application logic.
With these new integrations, we’ve done the heavy lifting. Your Knowledge Base handles document ingestion, chunking, and embeddings. The LlamaIndex retriever connects directly to it. Add our LLM integration, and you have a complete RAG pipeline running on managed DigitalOcean infrastructure.
llama-index-retrievers-digitalocean-gradientai: Connect to your Knowledge Base as a LlamaIndex retriever. Supports hybrid search (keyword + semantic), metadata filtering, and async operations.
llama-index-llms-digitalocean-gradientai: Use Gradient AI Platform-hosted LLMs in your LlamaIndex workflows. Supports streaming responses and async operations for high-throughput applications.
Both packages work with LlamaIndex query engines, chat engines, callbacks, and the broader ecosystem.
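Getting started should be a single install step. Assuming both packages are published on PyPI under the names above, something like:

```bash
pip install llama-index-retrievers-digitalocean-gradientai llama-index-llms-digitalocean-gradientai
```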
From there, configure your Gradient AI Platform credentials and drop the retriever and LLM into your existing LlamaIndex code. Check out our documentation for a complete walkthrough and code examples.
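As a rough illustration of what that wiring looks like, here is a minimal sketch. The query-engine plumbing uses the standard LlamaIndex core API; the Gradient AI Platform class names, constructor parameters, model name, and environment variables shown are assumptions for illustration only, so check the documentation linked above for the actual names.

```python
import os

# Core LlamaIndex pieces (stable public API).
from llama_index.core.query_engine import RetrieverQueryEngine

# Hypothetical imports: module paths follow LlamaIndex's package naming
# convention, but the exact class names are assumptions.
from llama_index.retrievers.digitalocean_gradientai import GradientAIRetriever  # hypothetical class name
from llama_index.llms.digitalocean_gradientai import GradientAILLM              # hypothetical class name

# Point the retriever at an existing Knowledge Base; document ingestion,
# chunking, and embeddings are handled on the DigitalOcean side.
retriever = GradientAIRetriever(
    knowledge_base_id=os.environ["GRADIENTAI_KB_ID"],        # hypothetical parameter / env var
    api_key=os.environ["DIGITALOCEAN_ACCESS_TOKEN"],         # hypothetical parameter / env var
)

# Use a Gradient AI Platform-hosted model for response synthesis.
llm = GradientAILLM(
    model="llama3.3-70b-instruct",                           # hypothetical model name
    api_key=os.environ["DIGITALOCEAN_ACCESS_TOKEN"],
)

# Standard LlamaIndex wiring: retriever + LLM -> query engine.
query_engine = RetrieverQueryEngine.from_args(retriever, llm=llm)

response = query_engine.query("How do I rotate my database credentials?")
print(response)
```

From here, the same retriever and LLM can be dropped into chat engines, callbacks, or any other part of the LlamaIndex ecosystem mentioned above.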
These integrations open up a range of possibilities:
Support assistants grounded in your product documentation
Internal tools that query company wikis and runbooks
Code assistants with context from private repositories
Research tools for document-based Q&A
If you’re already using LlamaIndex, you can integrate the Gradient AI Platform into your existing application. If you’re starting fresh, you now have a fully managed path from Knowledge Base to production RAG app.
This is just the beginning. We’re continuing to expand Gradient AI Platform integrations with popular AI frameworks, and we’d love to hear what you’re building.