Introduction
When I first released the OpenAI module for Backdrop CMS, my goal was simple: bring practical, editor-friendly AI tools to Backdrop. Summaries, taxonomy suggestions, alt-text generation, content analysis: all the things Drupal developers were starting to experiment with, but built in a way that fit naturally into Backdrop.
A lot has changed since then. The AI landscape is moving fast. New providers appear constantly. APIs evolve. Pricing models shift. Local models are suddenly realistic. It became clear pretty quickly that tying a CMS integration to a single vendor was going to be limiting.
Rebuilding the OpenAI Module
So I rebuilt the OpenAI module. The result is a new version that’s no longer just “OpenAI.” It’s now a provider-agnostic AI integration framework for Backdrop CMS.
From One Provider to Many
At the core of the new architecture is a simple idea: the module shouldn’t care which AI provider you’re using. Instead of hard-coding everything to OpenAI, the base openai module now defines a common interface that every provider must implement. Each provider lives in its own sub-module and supplies an adapter class that conforms to that interface.
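As a rough sketch of that contract, the shared interface might look something like the following. The interface name, namespace, and method signatures here are illustrative assumptions, not the module’s actual API:

```php
<?php
// Illustrative sketch only: names and signatures are assumptions,
// not the module's real code.
namespace Backdrop\OpenAI;

interface AiProviderInterface {

  // Send a chat-style conversation and return the model's text reply.
  public function chat(array $messages, array $options = array());

  // Return an embedding vector (array of floats) for a piece of text.
  public function embed($text);

  // List the model identifiers this provider exposes.
  public function listModels();

}
```

Because every sub-module implements the same methods, calling code can hold a reference to "the provider" without knowing whether it is talking to a hosted API or a model running on localhost.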
Right now, supported providers include:
- OpenAI
- Ollama (local models, fully self-hosted)
- OpenRouter
- Groq
- Google Gemini
- Anthropic (Claude)
For providers like OpenRouter, Groq, and Ollama that expose OpenAI-compatible endpoints, the adapter simply routes requests through the existing OpenAI PHP client. For providers like Gemini and Anthropic, which use completely different APIs, the adapter implements their native endpoints directly.
That means:
- You can switch providers without rewriting integrations
- You can experiment with new backends safely
- You can mix hosted and local models depending on your needs
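For the OpenAI-compatible providers, an adapter can be little more than a base-URL and API-key swap. A hypothetical sketch (the `AiProviderInterface` and `OpenAIClient` names are placeholders, and the default model is only an example):

```php
<?php
// Hypothetical adapter: routes an OpenAI-compatible provider (Groq)
// through an OpenAI-style HTTP client by pointing it at Groq's host.
class GroqProvider implements AiProviderInterface {

  protected $client;

  public function __construct($api_key) {
    // Groq exposes OpenAI-compatible endpoints under its own base URL,
    // so the existing OpenAI client can be reused unchanged.
    $this->client = new OpenAIClient($api_key, 'https://api.groq.com/openai/v1');
  }

  public function chat(array $messages, array $options = array()) {
    $response = $this->client->chatCompletion(array(
      'model' => isset($options['model']) ? $options['model'] : 'llama-3.1-8b-instant',
      'messages' => $messages,
    ));
    // Return just the assistant's text, matching the interface contract.
    return $response['choices'][0]['message']['content'];
  }

}
```

Providers with native APIs, like Gemini and Anthropic, need fuller adapters that translate between the shared interface and their own request and response shapes, but the calling code never sees that difference.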
Why Provider-Agnostic Matters
This isn’t just an architectural exercise. AI providers are not stable in the way traditional APIs are: models change frequently, endpoints evolve, and pricing can shift overnight. By separating the provider layer from the core logic, Backdrop sites are no longer locked into one vendor or one strategy.
Today it might be OpenAI. Tomorrow it might be Claude. Next year it might be something running entirely on your own hardware. Your Backdrop code doesn’t have to care.
Backdrop Is Not “Just Drupal 7”
This module also proves something else that I care a lot about: Backdrop is not frozen in time. Backdrop core intentionally stays accessible. It doesn’t require namespaces everywhere, force dependency injection into every form builder, or demand Composer for every site. But contrib modules are free to adopt modern strategies.
This module uses:
- Composer for dependency management
- Namespaces
- Interfaces
- Classes and adapters
- Clean separation between core logic and provider implementations
In other words: modern PHP architecture, inside a system that still lets you write simple procedural code when that’s the right tool. That combination is incredibly powerful.
This Is a Platform, Not Just a Module
What started as an OpenAI integration is now a small platform inside Backdrop: a stable provider interface, pluggable backend implementations, and shared infrastructure for chat, embeddings, and content tools. This architecture is already being used for:
- Chatbots and assistants
- Embeddings and vector search experiments
- Content analysis and automation tools
- SEO audits, alt-text generation, and summaries
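From a consumer’s point of view, code built on the platform only ever talks to the shared interface. A hypothetical usage sketch (the `openai_get_provider()` helper stands in for however the module hands back the site’s configured provider; the real function name may differ):

```php
<?php
// Hypothetical usage: the calling code never names a concrete vendor.
// Whichever provider the site admin configured is returned here.
$provider = openai_get_provider();

// Summarize a node body. Swapping OpenAI for Ollama or Claude later
// requires no change to this code.
$summary = $provider->chat(array(
  array('role' => 'system', 'content' => 'Summarize the following text in two sentences.'),
  array('role' => 'user', 'content' => $node_body),
));
```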
And it opens the door to much more, especially as local and open-source models continue to improve.
Try It Out
The provider-agnostic OpenAI module and its provider sub-modules are available on GitHub and through Backdrop contrib.
