Joule vs Oracle AI vs Custom—Architectural Fit for Mixed SAP/Legacy Landscape?

We’re a mid-sized manufacturer running SAP S/4HANA for finance and inventory, but still have legacy systems handling procurement and quality that aren’t going anywhere soon. Leadership wants AI capabilities embedded across the landscape—conversational interfaces for analysts, automated document processing for AP, and predictive insights for demand planning.

We’ve been evaluating three broad paths: SAP Joule for native S/4HANA integration, Oracle AI agents if we consider moving some modules to Fusion down the road, or building custom AI layers on top of what we have using APIs and external LLMs with RAG. Each has obvious appeal but also real concerns. Joule ties us deeper into SAP’s roadmap and only covers the SAP footprint. Oracle would mean a partial platform migration we’re not ready for. Custom gives flexibility but demands skills we don’t have in-house and raises questions about long-term support and governance.

Has anyone architected a hybrid approach in a mixed environment like this? How did you decide what to buy versus build, and how did you handle integration between vendor AI and legacy systems that lack modern APIs?

From an infrastructure perspective, think about where the AI workloads will run. Joule and Oracle AI are cloud-native and handle scaling for you; a custom stack has to run somewhere, and legacy on-prem infrastructure rarely has the compute headroom for inference and embedding workloads. We ended up with a hybrid architecture: ERP on-prem, AI processing in the cloud on Azure, and a secure API gateway in between. That let us use modern LLMs and vector databases without migrating the ERP itself. The tradeoffs are latency and added complexity in data synchronization, but we gained the flexibility to experiment with different models and RAG configurations without touching production ERP.
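To make the gateway piece concrete, here's a stripped-down sketch of the routing logic ours implements. Every name in it (hosts, paths, the `route` function) is illustrative, not from any real product; the point is that cloud AI traffic only reaches an explicit allowlist of ERP endpoints:

```python
# Sketch of the API gateway's routing layer (hypothetical hosts/paths).
# Cloud AI services may only call an explicit allowlist of ERP endpoints;
# analyst chat traffic is forwarded straight to the cloud AI layer.

ERP_BACKEND = "https://erp.internal.example.com"
AI_BACKEND = "https://ai-gw.azure.example.com"

# The only ERP paths the cloud AI side is permitted to hit.
ERP_ALLOWLIST = {
    "/api/v1/inventory/levels",
    "/api/v1/finance/ledger",
}

def route(path: str, source: str) -> str:
    """Return the backend URL for a request, or raise if it's not permitted."""
    if source == "cloud-ai":
        if path not in ERP_ALLOWLIST:
            raise PermissionError(f"cloud AI may not call {path}")
        return ERP_BACKEND + path
    if source == "analyst-ui":
        # Analyst conversational traffic never touches the ERP directly.
        return AI_BACKEND + path
    raise ValueError(f"unknown request source: {source}")
```

The allowlist is what kept our security team comfortable: experiments on the AI side can't accidentally grow new paths into production ERP.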

One thing to consider is data readiness before you commit to any architecture. We learned the hard way that our legacy procurement system had duplicate vendor records and inconsistent field mapping, which broke the custom AI agent we tried to deploy. It took three months of data cleansing before the LLM could reliably generate purchase requisitions. If your legacy systems don’t have clean, accessible data, neither Joule nor a custom solution will perform well. Start with a data quality audit across all systems—ERP and legacy—and map out what governance you’ll need regardless of which AI path you choose.
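For the duplicate-vendor problem specifically, even a crude normalize-and-group pass surfaces most of the offenders before any AI work starts. This is a rough sketch of the check we ran (field names are illustrative):

```python
# Crude duplicate-vendor detection: normalize names, group by the result.
# Field names ("id", "name") are illustrative, not a real schema.
import re
from collections import defaultdict

def normalize(name: str) -> str:
    """Lowercase, strip punctuation and common legal suffixes."""
    name = re.sub(r"[^\w\s]", "", name.lower())
    name = re.sub(r"\b(inc|llc|gmbh|ltd|corp)\b", "", name)
    return " ".join(name.split())

def find_duplicates(vendors: list) -> list:
    """Group vendor IDs whose names normalize to the same string."""
    groups = defaultdict(list)
    for v in vendors:
        groups[normalize(v["name"])].append(v["id"])
    return [ids for ids in groups.values() if len(ids) > 1]

vendors = [
    {"id": "V001", "name": "Acme Tooling, Inc."},
    {"id": "V002", "name": "ACME TOOLING INC"},
    {"id": "V003", "name": "Baxter Metals GmbH"},
]
print(find_duplicates(vendors))  # [['V001', 'V002']]
```

Real cleansing needs fuzzy matching and tax-ID cross-checks on top of this, but running something this simple early tells you whether you have a three-week problem or a three-month one.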

We piloted Joule in finance and it worked well for standard tasks, but when we tried extending it to procurement workflows that span our legacy vendor management system, it hit a wall. Joule doesn’t natively integrate with non-SAP systems, so we had to build custom skills in Joule Studio that call external APIs. That ended up being more effort than just building a standalone agent from scratch. If your differentiating workflows live in legacy systems, vendor AI probably won’t reach them without significant custom work anyway. You might end up building custom either way—just with different starting points.
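To give a sense of what that custom-skill glue looked like: the skill hands you a structured intent, and you translate it into the legacy system's flat request format yourself. Everything below (field names, the endpoint, the validation rules) is illustrative, not a real Joule or legacy API:

```python
# Shape of the glue behind one custom skill (all names hypothetical).
# The skill supplies a structured intent; we map it to the legacy vendor
# system's flat request format and validate the fields ourselves.

LEGACY_ENDPOINT = "https://legacy-vms.internal.example.com/vendor/search"

REQUIRED_FIELDS = {"vendor_name", "plant"}

def build_legacy_request(intent: dict) -> dict:
    """Translate a structured skill intent into the legacy request body."""
    params = intent.get("parameters", {})
    missing = REQUIRED_FIELDS - params.keys()
    if missing:
        raise ValueError(f"skill intent missing fields: {sorted(missing)}")
    return {
        "url": LEGACY_ENDPOINT,
        "body": {
            "VEND_NAME": params["vendor_name"].upper(),  # legacy expects caps
            "WERKS": params["plant"],                    # SAP-style plant code
            "MAX_ROWS": params.get("limit", 20),
        },
    }
```

None of this is hard individually, but multiply it by every field of every legacy workflow and you can see why it rivaled building a standalone agent.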

Don’t underestimate the governance and skills gap. We found that even if you buy Joule or Oracle AI, you still need people who understand how to configure RAG pipelines, manage knowledge catalogs, and validate AI outputs. And if you go custom, the expertise bar is even higher. Industry surveys consistently find close to half of organizations citing lack of AI expertise as a barrier, and that was true for us. We ended up partnering with a consulting firm for the first two pilots and used that time to upskill internal teams. Also, establish AI governance policies early—role-based access, bias checks, audit logs—because once AI starts taking autonomous actions, you need those controls in place.

We’re in a similar spot with Teamcenter PLM and SAP coexisting. What worked for us was treating vendor AI as the baseline for commodity workflows—like invoice matching or simple ledger queries—and reserving custom builds for the processes that actually differentiate us competitively. That meant Joule handled standard finance tasks in S/4, but we built a custom AI layer for engineering change orchestration across PLM and ERP because that workflow is unique to our product development cycle. The key was honest assessment: if the process isn’t a competitive edge, buy it; if it is, build it with APIs and RAG pointing to your own knowledge base.
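The "build it with RAG pointing to your own knowledge base" part is less exotic than it sounds. Here's a minimal sketch of the retrieval step, with plain keyword overlap standing in for a real embedding model and vector database; the knowledge-base chunks are made-up examples:

```python
# Minimal retrieval sketch: keyword overlap stands in for embeddings and
# a vector DB. The pattern is the point: retrieve from YOUR knowledge
# base, then hand the top chunks to the LLM as grounding context.

def score(query: str, doc: str) -> float:
    """Fraction of query terms that appear in the document."""
    q, d = set(query.lower().split()), set(doc.lower().split())
    return len(q & d) / len(q) if q else 0.0

def retrieve(query: str, docs: list, k: int = 2) -> list:
    """Return the k best-matching knowledge-base chunks for a query."""
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:k]

kb = [  # illustrative knowledge-base chunks
    "ECO-114 process: engineering change orders sync from PLM to ERP nightly",
    "Invoice matching tolerances are configured per company code",
    "Change orders affecting released BOMs require quality sign-off",
]
top = retrieve("engineering change order approval", kb)
# `top` gets prepended to the LLM prompt as context before generation
```

Swap the scoring for an embedding model and the list for a vector store and you have the skeleton of the custom layer; the differentiation comes entirely from what's in `kb`, not from the plumbing.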

Have you mapped out which use cases actually need agentic AI versus just conversational assistance? A lot of what leadership asks for—like conversational interfaces for analysts—can be handled with simpler co-pilot tools that don’t require multi-step orchestration or autonomous actions. Save the complex agent architectures for workflows that genuinely need planning and execution across systems, like automated purchase order creation or exception handling. We wasted time trying to build agents for tasks that didn’t need them. Start with an MVP for one high-value, well-defined workflow—document processing in AP sounds like a solid candidate—prove the ROI, then expand. That also buys you time to sort out data quality and integration patterns before scaling enterprise-wide.
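The triage in this thread boils down to a short checklist; here it is as a crude function (the criteria are just our rules of thumb, not any vendor's framework). A workflow only justified full agent architecture for us when it needed all three:

```python
# Crude agent-vs-copilot triage from this thread (rules of thumb only).
# A workflow earns an agent architecture only if it needs multi-step
# planning AND autonomous actions AND spans multiple systems.

def recommend(multi_step: bool, autonomous_actions: bool,
              crosses_systems: bool) -> str:
    if multi_step and autonomous_actions and crosses_systems:
        return "agent"
    return "copilot"

# Conversational analyst queries: single-step, read-only, one system.
print(recommend(False, False, False))  # copilot
# Automated PO creation spanning ERP and legacy procurement.
print(recommend(True, True, True))     # agent
```

It sounds obvious written down, but applying it honestly is what stopped us from over-building.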