In conversations with finance teams navigating automation, a familiar pattern often emerges. Leaders know their accounting operations need to evolve, but the path forward isn’t always clear. The sheer scope of a transformation can be paralyzing.
You can break out of that paralysis and start making strides once you realize you don’t need to overhaul your entire accounting function overnight.
I recommend a more pragmatic approach: Begin with a narrow focus, apply agile methods and build momentum through small, structured wins. Agile, originally a software development methodology, works exceptionally well in finance when adapted thoughtfully. Applied to accounting, it can give you a structured way to modernize processes without sacrificing efficient daily operations.
When you get it right, the transformation can feel like magic — not because it’s effortless but because of how dramatically it simplifies the work.
Step 1: Define your project and assemble your team
Agile begins with a clear purpose. What part of your accounting cycle is ripe for change? It might be:
Reducing manual effort in preparing recurring journal entries
Standardizing reconciliations for high-risk balance sheet accounts
Improving visibility and control over intercompany eliminations
Once you’ve selected your initial focus, identify a small, cross-functional team. That might include one or two accountants who manage the process today, a member of your IT or automation team and a team lead or controller to serve as the product owner.
Your goal is to scope out a project small enough to deliver real progress in a few weeks, rather than trying to automate everything.
Step 2: Choose your sprint cadence
Agile teams work in time-boxed cycles called sprints. In software, sprints typically last two weeks, and the same cadence works well in finance. In my experience, two staggered sprints per month allow you to maintain momentum without interfering with the month-end or quarterly close cycle.
The key is to make the sprint regular and predictable. Every two weeks, your team should:
Review what was completed
Set clear, achievable goals for the next sprint
Prioritize the next set of tasks
Assign ownership based on capacity
This rhythm helps you maintain forward progress even amid daily demands and the ebbs and flows of a typical fiscal year.
Step 3: Start with process selection and discovery
Your first sprint should focus on understanding the process you want to improve. Let’s say you choose to automate a recurring journal entry. This first step isn’t writing scripts. You need to understand how the process works today (pain points included), what systems and data are involved, what artifacts exist and what volume and complexity you’re dealing with.
Say the entry allocates depreciation. You need to uncover: how the entry is generated today, what triggers it and when in the accounting period, which accounts it impacts, what documentation and validations exist and who reviews or adjusts it before it’s posted to the general ledger. You might also need to gather artifacts like Excel templates, email approval flows or ERP screenshots. These are your starting points for making sure your automation reflects a real workflow rather than an ideal one.
Don’t underestimate the importance of the discovery phase in making sure your automation efforts are grounded in reality.
Step 4: Break down tasks and build your backlog
Once you’ve scoped your process and gathered what you need, it’s time to translate your findings into tasks. Some examples:
Map the current workflow in a flowchart and make sure you cover any places where the process could fail or have to start over
Identify fields and logic needed for journal entry automation, so you know the required data and calculations
Review automation platform capabilities (e.g., templates or connectors)
Write acceptance criteria for a successful automation — this is how you’ll prove your new automation is working
Prepare test data or validate entry logic, including several examples of the kinds of data you’re likely to see so the most probable cases are covered (a minimal validation sketch follows this list)
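To make the acceptance-criteria and test-data tasks concrete, here’s a minimal sketch in Python of what automated checks for a recurring depreciation entry might look like. The account names, tolerance and field layout are hypothetical placeholders for illustration, not a reference to any particular ERP or automation platform.

```python
# Minimal sketch: acceptance checks for an automated recurring journal entry.
# Account names, field layout and tolerances are hypothetical examples.

from datetime import date

EXPECTED_ACCOUNTS = {"6100-Depreciation Expense", "1700-Accumulated Depreciation"}
AMOUNT_TOLERANCE = 0.05  # flag entries that swing more than 5% vs. the prior period


def validate_entry(entry: dict, prior_amount: float) -> list[str]:
    """Return the list of failed acceptance criteria (an empty list means pass)."""
    failures = []

    debits = sum(l["amount"] for l in entry["lines"] if l["side"] == "debit")
    credits = sum(l["amount"] for l in entry["lines"] if l["side"] == "credit")
    if round(debits - credits, 2) != 0:
        failures.append("Entry is not balanced")

    accounts = {l["account"] for l in entry["lines"]}
    if accounts != EXPECTED_ACCOUNTS:
        failures.append(f"Unexpected accounts: {sorted(accounts - EXPECTED_ACCOUNTS)}")

    if entry["posting_date"] > entry["period_end"]:
        failures.append("Posting date falls outside the accounting period")

    if prior_amount and abs(debits - prior_amount) / prior_amount > AMOUNT_TOLERANCE:
        failures.append("Amount deviates more than 5% from the prior period")

    return failures


# Example test data covering the most probable case
entry = {
    "posting_date": date(2025, 6, 30),
    "period_end": date(2025, 6, 30),
    "lines": [
        {"account": "6100-Depreciation Expense", "side": "debit", "amount": 1250.00},
        {"account": "1700-Accumulated Depreciation", "side": "credit", "amount": 1250.00},
    ],
}
print(validate_entry(entry, prior_amount=1250.00))  # [] means all criteria pass
```

Checks like these double as your acceptance criteria: if every probable input passes, you have evidence the automation works.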
Tasks that can’t be finished in this sprint go into your backlog. You can reprioritize that backlog after each sprint based on what you’ve learned or what’s most urgent.
Some tasks may expose gaps in how the process works today, and that’s a good thing. Agile sprints are built for learning, not perfection.
Step 5: Communicate, adjust and demo progress
A key agile principle is transparency. Short, regular check-ins — say, 15 minutes twice a week — keep everyone aligned and aware of blockers. No need for slides or long updates. A quick “What’s done, what’s next and what’s in the way?” is usually enough.
At the end of the sprint, reconvene for a demo. Even if you didn’t automate the entire process, showing a prototype or workflow map can energize your team and stakeholders. Use what you learn to shape the next sprint.
Where to start? Go for high pain, low complexity
If you’re not sure where to begin, I often recommend focusing on account reconciliations. They’re a consistent source of friction, especially for temporary account balances or frequently adjusted liabilities, yet many can be standardized or automated with relatively little effort.
For example, bank reconciliations follow a predictable pattern. Accrual accounts only need simple threshold logic. And intercompany receivables/payables might just require timing alignment.
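As an illustration of that threshold logic, here’s a hedged sketch in Python of how an accrual reconciliation check might be expressed. The threshold values and account figures are assumptions chosen for the example, not a prescription.

```python
# Sketch: simple threshold logic for accrual account reconciliations.
# Thresholds and account figures are illustrative assumptions.

ABS_THRESHOLD = 500.00  # differences below this absolute amount can auto-certify
PCT_THRESHOLD = 0.01    # differences above 1% of the ledger balance need review


def needs_review(ledger_balance: float, supporting_balance: float) -> bool:
    """Flag the account for manual review when either threshold is breached."""
    difference = abs(ledger_balance - supporting_balance)
    pct = difference / max(abs(ledger_balance), 1.0)
    return difference > ABS_THRESHOLD or pct > PCT_THRESHOLD


accounts = [
    ("2100-Accrued Utilities", 48_200.00, 48_050.00),  # small difference: auto-certify
    ("2110-Accrued Bonuses", 310_000.00, 295_000.00),  # large difference: route to a preparer
]
for name, ledger, support in accounts:
    status = "review" if needs_review(ledger, support) else "auto-certify"
    print(f"{name}: {status}")
```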
Journal entries are another good candidate, particularly if they’re recurring and related to depreciation, allocations or amortizations. Their fixed logic and regular intervals make them perfect for early wins.
The record-to-report (R2R) cycle contains many interconnected subprocesses that are ideal for incremental automation. Applying agile to this domain brings visibility and momentum to your transformation efforts while minimizing risk and burnout.
Agile is how finance gets things done
Finance doesn’t often borrow from the world of software development, but it should. The pressure is real today to modernize, optimize and transform while still closing the books on time — no small feat. Agile gives your accounting team a way to improve processes iteratively, without waiting for perfect conditions or massive budgets. They get a repeatable structure and still have space for experimentation. Once they see how agile can turn a painful process into a streamlined one, you’ll have the buy-in you need to scale your automation strategy across your finance organization.
You won’t need a wand, just the right structure, people and mindset. Those create the real magic.
A cross-functional team of researchers has spent months developing a next-generation machine learning (ML) model designed to predict how a new compound behaves across multiple biological targets. It’s the kind of computational power that can accelerate drug discovery by weeks or months and bring life-saving therapies to market faster.
Despite an optimized IT infrastructure and cloud environment, the simulation doesn’t start because the latest compound batch data hasn’t been validated in SAP. The experiment metadata is still siloed in spreadsheets, and the model can’t ingest incomplete or inconsistent values. In other words, the fluid connection required between systems isn’t there.
As you may well know if you work in this industry, this isn’t a hypothetical delay. Data readiness can’t be treated as a side task, although it too often is. When it is, it doesn’t matter how advanced an AI model you have. With regulatory pressures high, the cost of a subtle misalignment is steep.
Whether you’re simulating compounds, ensuring patient records are anonymized and audit-ready or forecasting inventory, critical processes break down when data stays disconnected. Leading healthcare and pharmaceutical organizations are attempting to solve this common problem by rethinking how data moves from SAP to ML platforms to analytics and back.
Life sciences’ parallel pipelines: Innovation and execution
In life sciences organizations like yours, innovation happens on two fronts. On one side, your R&D teams use AI and massive datasets to accelerate discovery. ML models in AWS SageMaker or Schrödinger Suite predict promising compound structures, while simulation platforms test toxicity and efficacy before running a single experiment.
On the other side, your clinical and supply chain teams ensure those discoveries reach patients safely and cost-effectively while following all compliance regulations. They manage everything from patient enrollment to cold chain logistics to regulatory filing, with each process powered by SAP supply chain and life sciences solutions and custom platforms.
These processes live in very different domains, but they share a common dependency: structured, timely, accurate data. And in too many organizations, that data still moves manually or asynchronously between systems.
Where the cracks appear
When SAP data isn’t orchestrated, critical handoffs break down. Molecular data must be manually pulled from SAP R&D Management to feed AI pipelines. Trial operations build forecasts on outdated enrollment data. Lab results live in one system and regulatory documentation in another, with no feedback loop. Business users wait on IT to reconcile siloed datasets and generate reports.
Drug discovery is increasingly computational, but that doesn’t mean the work is fully automated. Whether you’re managing experiments or kits, the pain is the same: unreliable flow, lost time and elevated risk. Without intelligent orchestration, pipelines either fall apart or deliver fragmented, stale information. This directly undermines the performance of AI models, introducing bias or missing key correlations. Essentially, you end up making decisions with outdated datasets or, worse, hallucinations. Predictive models built to accelerate discovery or optimize trial logistics can quickly fall out of compliance with data lineage and validation requirements.
Meanwhile, if you cling to these fragmented or manually stitched data pipelines, you face another growing disadvantage: You can’t match the speed of your competitors. Those who are investing in intelligent, adaptive data orchestration are moving faster while proving the trustworthiness of their AI-driven insights.
High-fidelity orchestration is the foundation of competitive agility and relevance in your industry.
Research, meet orchestration
Orchestration is what makes AI scale in R&D. Your SAP environment becomes the launchpad for faster, smarter research (a simplified sketch of this flow follows the list), enabling you to:
Continuously extract experimental and batch data from SAP R&D Management and SAP Analytics Cloud
Send compound specs to AWS SageMaker or Schrödinger Suite for modeling
Coordinate modeling jobs and return results to Databricks for consolidation
Push insight summaries about ranked candidates back into SAP
Trigger alerts to research leads about successful outcomes or red flags and send validated results to SAP Datasphere
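To make that flow tangible, here’s a minimal orchestration sketch in plain Python. Every function is a hypothetical stub standing in for the SAP extraction, modeling and write-back steps; in production these would run as scheduled, monitored jobs on an orchestration platform rather than as a linear script.

```python
# Simplified sketch of the R&D orchestration flow described above.
# Every function below is a hypothetical stub, not a real SAP or vendor API.

def extract_compound_batches():
    """Pull validated experimental and batch records from the ERP (stubbed)."""
    return [{"compound_id": "CMP-001", "assay": "kinase-panel", "purity": 0.98}]


def run_modeling_jobs(batches):
    """Send compound specs to the modeling platform and collect predictions (stubbed)."""
    return [{"compound_id": b["compound_id"], "score": 0.87} for b in batches]


def consolidate(results):
    """Rank candidates and build an insight summary (stubbed)."""
    return sorted(results, key=lambda r: r["score"], reverse=True)


def write_back_and_alert(ranked):
    """Push summaries back to the ERP and notify research leads (stubbed)."""
    for candidate in ranked:
        print(f"Posted {candidate['compound_id']} (score {candidate['score']:.2f}); research lead notified")


def research_pipeline():
    batches = extract_compound_batches()
    if not batches:  # quality gate: never model on missing or unvalidated data
        raise RuntimeError("No validated batch data available; downstream jobs held")
    write_back_and_alert(consolidate(run_modeling_jobs(batches)))


research_pipeline()
```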
Clinical delivery, intelligently aligned
On the delivery side, timing is everything. Clinical trial operations depend on up-to-date patient enrollment data, trial protocols and inventory levels across distributed trial sites. If systems aren’t aligned, sites risk running out of supplies or holding expired stock.
With proper orchestration (see the sketch after this list):
Enrollment data from SAP Intelligent Clinical Supply Management flows into forecasting tools
ML models in Azure ML or Databricks predict site-specific demand
Stock levels in SAP Integrated Business Planning (IBP) or S/4HANA Materials Management (MM) are cross-checked automatically
If risk is flagged, replenishment is triggered and stakeholders are notified
Trial performance metrics update automatically in SAP Analytics Cloud
All data is centralized in SAP Business Data Cloud (BDC) for regulatory compliance and real-time insight
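A hedged sketch of the replenishment check at the heart of that flow might look like the following. Site names, demand figures and the safety buffer are invented for illustration; a live pipeline would source them from the planning and clinical supply systems named above.

```python
# Sketch: cross-check forecasted site demand against stock and flag replenishment.
# Site data, figures and the safety buffer are illustrative assumptions only.

SAFETY_WEEKS = 2  # keep at least two extra weeks of forecasted demand on hand

sites = [
    {"site": "Site-014", "weekly_demand_forecast": 120, "on_hand_kits": 180},
    {"site": "Site-027", "weekly_demand_forecast": 95, "on_hand_kits": 600},
]


def replenishment_actions(sites, horizon_weeks=4):
    """Return a replenishment order for every site whose stock won't cover the horizon."""
    actions = []
    for s in sites:
        required = s["weekly_demand_forecast"] * (horizon_weeks + SAFETY_WEEKS)
        shortfall = required - s["on_hand_kits"]
        if shortfall > 0:
            actions.append({"site": s["site"], "order_qty": shortfall})
    return actions


for action in replenishment_actions(sites):
    # In a live pipeline, this step would raise the order and notify stakeholders
    print(f"Replenish {action['site']}: order {action['order_qty']} kits")
```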
Data-driven defense against disruption
When the unexpected hits, data orchestration is the difference between rerouting and reacting.
Take supply chain disruptions, which are a matter of when, not if, in pharma. A shortage of active ingredients, a vendor backlog, a shipping delay — any of these can jeopardize production schedules or trial timelines.
The real risk isn’t the event itself but what happens when your systems can’t respond in time. With orchestrated data pipelines between SAP S/4HANA, SAP IBP and platforms like Databricks or Azure Synapse, you can spot shortages early, simulate impacts and initiate contingency plans.
A research-to-treatment automation fabric
True transformation comes when discovery and delivery are both orchestrated from end to end. Here’s what a real automation fabric looks like.
Forecasting clinical and manufacturing needs
Export enrollment or order data from SAP S/4HANA
Clean and enrich using SAP Datasphere
Run predictive models via Databricks, Azure ML or SageMaker
Feed outputs into SAP IBP for dynamic planning
Managing research and validation
Extract compound data from SAP R&D Management
Coordinate modeling jobs in Schrödinger Suite
Score and validate candidates in Databricks
Trigger SAP updates and notify research teams automatically
Controlling inventory and site logistics
Pull inventory positions from S/4HANA
Reconcile with forecasted site needs from SAP IBP and ML pipelines
Generate and dispatch replenishment orders
Publish everything in SAP Analytics Cloud for transparency
Keeping teams informed and aligned
Push alerts to supply, clinical or research leads based on process outcomes
Route structured datasets to reporting dashboards and compliance archives
Automate audit trails, approvals and next-step triggers
With every step validated, timestamped and secure thanks to RunMyJobs by Redwood, your data flows continuously, allowing you to be proactive instead of reactive.
Audit-ready AI depends on orchestrated data
The rise of AI in life sciences is helping to optimize molecule screening and clinical trial site selection and even personalize patient communications. With that power comes increasing scrutiny.
Regulators are watching closely. Health authorities in the United States, European Union and beyond are issuing new guidelines around AI in clinical decision-making, digital therapeutics and research applications. They want to know: Where did the data come from? Was it anonymized? Who validated it? And can you prove it?
If your data pipelines are fragmented, those answers may simply not exist. But orchestration changes that. When you automate data movement from SAP modules to Azure ML or from SAP Datasphere to regulatory systems, you also create a system of record. Every dataset has a timestamp, and every transformation is traceable. That is what strategically enables AI innovation.
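To show the kind of evidence such a system of record can capture, here’s a minimal sketch of a lineage entry written at each hop. The field names are assumptions about what an auditor typically asks for, not any specific product’s schema.

```python
# Sketch: the kind of lineage record an orchestrated pipeline can write at each hop.
# Field names are illustrative; adapt them to your own audit requirements.

from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import hashlib
import json


@dataclass
class LineageRecord:
    dataset: str         # what moved
    source: str          # where it came from
    destination: str     # where it went
    transformation: str  # what was done to it
    checksum: str        # proof the payload wasn't altered in transit
    recorded_at: str     # when it happened (UTC)


def record_hop(dataset: str, payload: bytes, source: str, destination: str, transformation: str) -> LineageRecord:
    return LineageRecord(
        dataset=dataset,
        source=source,
        destination=destination,
        transformation=transformation,
        checksum=hashlib.sha256(payload).hexdigest(),
        recorded_at=datetime.now(timezone.utc).isoformat(),
    )


hop = record_hop(
    dataset="trial_enrollment_daily",
    payload=b'{"site": "014", "enrolled": 42}',
    source="SAP Datasphere",
    destination="Azure ML workspace",
    transformation="anonymized patient identifiers",
)
print(json.dumps(asdict(hop), indent=2))
```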
The next wave of advancement will hinge on more than modeling accuracy; you’ll need to be able to explain how your model was built or prove the integrity of the data behind it. With the right orchestration solution, you don’t have to choose between speed and control. You can stay audit-ready and future-ready.
Develop a resilient nervous system
Think of your systems like organs. Each one serves a distinct purpose, but they communicate via signals that travel through connective tissue. These signals are orchestration in action!
Want to know more about orchestrating SAP data with RunMyJobs? Read more about using the SAP Analytics Cloud connector.
Across banking, insurance and asset management, financial institutions are realizing data orchestration will define their future competitiveness.
This is apparent in recent headlines. For example, JPMorgan Chase has ambitiously invested in AI, building a team of over 2,000 AI experts and developing proprietary models to improve everything from fraud detection to investment advice. But the story underneath the surface is just as important.
Bold bets can only be made from a solid foundation. Before any AI, analytics or digital transformation initiative can succeed, the data behind it must be clean, connected and controlled. Leading financial services firms recognize these initiatives can only deliver value when the data feeding them is complete, synchronized and auditable.
In an environment where transactions span mainframes, SAP systems, cloud platforms and best-of-breed specialty tools, orchestrating data flows rather than just integrating endpoints becomes the competitive differentiator. Instead of adding more tools, you need to build better pipelines. Your filings, financial statements and liquidity metrics are too critical to allow stale, inconsistent and siloed data to inform them.
The more orchestrated your data movement, the faster and safer your institution can move. Whether you manage $5 billion or $500 billion, orchestration supports financial close acceleration, real-time risk aggregation and ongoing compliance with evolving regulations.
And it’s achievable now.
The stakes are higher in finance
Whereas it would be a mere efficiency problem in some industries, data friction in financial services is a major business risk. When your systems operate in silos or on rigid schedules, you open the door to fines, missed cutoffs, extended close cycles, customer dissatisfaction and other negative outcomes.
Meanwhile, the AI and analytics platforms you’re investing in, from SAP Business Technology Platform (BTP) to Azure, Databricks and beyond, can’t deliver value if the pipelines feeding them are delayed, error-prone or unverifiable. Precision and timing are non-negotiable when the numbers you’re working with affect the lives and livelihoods of your stakeholders.
From static pipelines to dynamic orchestration
Over years of modernization efforts, many financial institutions have invested heavily in connecting systems via APIs, ETL pipelines or middleware. These integrations were a necessary step, as they enabled data movement between SAP S/4HANA, legacy mainframes, cloud data warehouses, CRMs and more. But whether data moves isn’t the question; it’s whether it moves correctly, completely and in sync with the events that drive your business.
Integration alone, however, doesn’t give you event-driven control, data validation checkpoints, dependency management or real-time recovery, among other key capabilities. An intelligent orchestration layer addresses these gaps, especially if, like most financial operations, yours operates across a hybrid mix:
SAP S/4HANA or SAP Central Finance
Legacy mainframes for core banking or policy systems
Cloud data warehouses and analytics platforms
CRMs like Salesforce
Risk engines, actuarial systems, customer applications and partner ecosystems
It’s important to have a living nervous system connecting it all. A foundation that can monitor, react and adapt automatically across SAP and non-SAP systems will help you meet the ballooning expectations brought about by AI, evolving regulations and other industry-specific factors.
True data pipeline enablement (sketched in simplified form after this list) requires the ability to:
Trigger workloads across SAP, cloud and legacy systems based on real events instead of static schedules
Validate and sequence data automatically — delaying or rerouting jobs until quality gates are cleared
Coordinate ML model execution tied directly to upstream data pipelines, whether scoring loans, recalculating provisions or updating liquidity forecasts
Automatically log, track and retry processes to maintain auditability and meet SLA commitments
Push structured, enriched datasets to SAP Analytics Cloud, Microsoft Power BI and other downstream consumers
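Here’s a minimal sketch of what the event-driven triggering, validation gates and retry behavior in that list can look like in plain Python. The event shape, validation rule and job body are hypothetical placeholders; a real orchestration platform would supply these primitives rather than a hand-rolled loop.

```python
# Minimal sketch of an event-driven job with a validation gate and retries.
# The event shape, validation rule and job body are hypothetical placeholders.

import time

MAX_RETRIES = 3


def passes_quality_gate(event: dict) -> bool:
    # Example rule: the upstream extract must be complete before anything downstream runs
    return event.get("rows_expected") == event.get("rows_received")


def run_job(event: dict):
    # Placeholder for the real workload (e.g., scoring loans or refreshing liquidity forecasts)
    pass


def on_event(event: dict):
    """Run the downstream job only when the triggering dataset clears its quality gate."""
    if not passes_quality_gate(event):
        print(f"Held: {event['dataset']} failed validation; downstream jobs not started")
        return
    for attempt in range(1, MAX_RETRIES + 1):
        try:
            run_job(event)
            print(f"Completed {event['dataset']} run (attempt {attempt})")
            return
        except RuntimeError as exc:
            print(f"Attempt {attempt} failed: {exc}")
            time.sleep(2 ** attempt)  # simple backoff before retrying
    print(f"Escalate: {event['dataset']} exceeded the retry budget; SLA at risk")


on_event({"dataset": "intraday_positions", "rows_expected": 10_000, "rows_received": 10_000})
```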
Orchestration makes this possible. It doesn’t replace your SAP platforms, APIs, data lakes or CRM systems. It connects and governs the financial data flowing between them, automatically and intelligently. Both AI readiness and compliance readiness depend on this orchestration.
Modernizing an SAP landscape at one of the world’s largest wealth managers
Multinational financial services firm UBS faced complex challenges integrating SAP systems with non-SAP core banking platforms. They needed faster financial reporting, lower operational risk and greater agility to respond to market demands.
By migrating to RunMyJobs by Redwood, they achieved real-time orchestration across hybrid systems, reducing the time required for financial data consolidation and strengthening SLA performance. These changes came alongside a 30% reduction in total cost of ownership (TCO) of the company’s IT process solutions.
Today, UBS runs mission-critical financial workloads reliably and scalably. Read the full story.
Building an efficient automation fabric around everyday financial processes
Your organization lives and dies by its ability to respond to change, and it all begins with having every dataset, account and rate positioned correctly from the outset. An automation fabric is the layer that connects and synchronizes your tools, data sources and processes across your IT environment, no matter how complex it is.
Setting your entire organization up for resilience begins with the first transaction of the day. Here’s what orchestrated start-of-day financial operations can look like with a secure, advanced workload automation platform as your control layer (a simplified sketch follows the steps below).
Ledger updates and overnight postings
Finalize overnight processes — interest accruals, FX revaluations, journal entries — using SAP Financial Accounting (FI) and SAP Treasury and Risk Management (TRM)
Validate completion of all wrap-up jobs
Check dependencies and prevent downstream jobs if failures are detected
Balance reconciliation
Trigger FF_5 to import bank statements
Run matching logic and update general ledger balances
Launch ML cash application processes in SAP Cash Application (Cash App)
Automatically alert stakeholders about missing files and manage escalation workflows
Opening balances and cash positioning
Refresh One Exposure hub with new data
Load memo records and run liquidity forecasts in SAP Cash Management
Pull FX rates, payment maturities and treasury forecasts from SAP TRM
Data loading for exchange rates and market data
Import daily FX rates and market indices into SAP tables
Validate values against prior-day data
Alert treasury and risk teams of major discrepancies that could impact valuations or cash forecasts
Risk checks and exposure updates
Run FX valuation jobs
Generate treasury dashboards in SAP Analytics Cloud (SAC)
Monitor for trading limit exceptions and notify teams automatically
System readiness and transaction processing enablement
Execute standing instructions and direct debits in SAP Banking Services
Generate payment proposals (e.g., F110, APM)
Route for approvals via SAP Bank Communication Management (BCM) and transmit to banks
Monitor acknowledgments and update One Exposure with outgoing flows
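To illustrate the dependency control running through those steps, here’s a hedged sketch of a start-of-day chain in which each step runs only if its predecessor succeeded. The step names mirror the outline above; the implementations are placeholders, not calls to any real SAP transaction or scheduling API.

```python
# Sketch: a start-of-day chain where each step runs only if its predecessor succeeded.
# Step names mirror the outline above; the lambdas are placeholder implementations.

START_OF_DAY_CHAIN = [
    ("Finalize overnight postings", lambda: True),
    ("Import and reconcile bank statements", lambda: True),
    ("Refresh cash position and liquidity forecast", lambda: True),
    ("Load FX rates and market data", lambda: True),
    ("Run risk checks and exposure updates", lambda: True),
    ("Release payment proposals", lambda: True),
]


def run_chain(chain):
    for name, step in chain:
        ok = step()
        print(f"{name}: {'OK' if ok else 'FAILED'}")
        if not ok:
            # Dependency control: halt downstream jobs and alert the owning team
            print(f"Halting chain; downstream steps blocked until '{name}' is resolved")
            return False
    return True


run_chain(START_OF_DAY_CHAIN)
```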
Every step is timestamped, validated and fully auditable, so you’re ready to operate at full speed from the first minute of the business day. Your firm can create resilient, auditable pipelines, reduce risk, enable AI and advanced analytics and scale cross-system processes without adding complexity.
RunMyJobs ensures readiness across SAP FI, TRM, BCM and external systems while automatically triggering ETL pipelines once jobs complete and feeding analytics platforms like Databricks, SAC, Tableau or Power BI.
Supplement your orchestration with Finance Automation by Redwood
High-performing institutions take automation even further. Choosing to complement your advanced workload automation platform with an end-to-end automation solution for financial close, reconciliations, journal entries and disclosures can help you achieve:
Continuous accounting and faster period-end close
Greater accuracy across income statements, balance sheets and cash flow statements
Stronger governance and full traceability from source systems to boardroom-ready reports
Harnessing the orchestrated advantage for hybrid environments
Financial institutions have long recognized the importance of data. However, the sheer volume, velocity and variety of financial data are exploding. Fueled by real-time event streams, the proliferation of APIs and embedded finance, plus an increasing reliance on AI-driven insights, the data landscape is becoming exponentially more complex.
The future demands a fundamentally different approach to managing this ever-growing tide. Intelligent automation and orchestration are essential for building a resilient foundation capable of handling the dynamic and interconnected nature of tomorrow’s financial operations.
To navigate an expanding hybrid data landscape effectively, you must build a robust orchestration layer that ensures data integrity, auditability and observability across all systems.
Read more about how to get your data out of the modern-day maze.
The month ends, the pressure mounts and the race to close the books begins. It’s a familiar cycle, often marked by a frantic push to hit deadlines, sometimes at the expense of accuracy. But what if we could fundamentally change this experience by moving beyond simply meeting the deadline and instead focusing on a smoother, more accurate and, ultimately, less stressful close?
Lately, I’ve been thinking about why the month-end close in so many organizations feels like a series of disconnected tasks, performed by teams working in silos with limited visibility into the bigger picture. Different individuals or teams own specific accounts or processes, diligently working on their piece of the puzzle. Yet, the connections between these pieces — the understanding of how one person’s output directly impacts the next stage and the final financial statements — often feel flimsy.
The problem with traditional close timelines
This situation is often exacerbated by a phenomenon known as Parkinson’s Law: the idea that work expands to fill the time available for its completion. If we allocate a set number of days or hours per month for the close, the work tends to stretch out to occupy that entire timeframe. This happens both consciously and unconsciously. Tasks we could complete efficiently become drawn out, the initial urgency dissipates and the result is a last-minute scramble.
It reminds me of a poorly orchestrated assembly line. Imagine a car factory where each worker focuses solely on their individual task, like installing a door or tightening a bolt, without any real-time feedback on the quality of their work or how it affects the subsequent steps. Compound this with the fact that each worker feels they have “all day” to complete their seemingly small task.
Then, picture the pressure intensifying. Leadership demands the finished product by a specific time, no excuses. The focus narrows to speed, potentially overshadowing the crucial element of quality. The car rolls off the line “on time,” a superficial victory. But when quality control steps in, the reality hits: misaligned parts, missing components — a fundamentally flawed product requiring significant and costly rework.
Sound familiar? When those month-end financials are delivered on schedule but later reveal discrepancies, incomplete documentation and overlooked details? That frantic, siloed approach, often fueled by the creeping influence of Parkinson’s Law, leads to precisely this outcome. We allow the work to expand to fill the available time and end up creating more work, and potentially more significant issues, down the line.
Assembly line reimagined: What automation makes possible
What if we could transform this disjointed process into a seamless, interconnected “accounting assembly line?” This is where automation comes into play, offering a direct antidote to the inefficiencies brought about by Parkinson’s Law.
Consider the impact of robotics and sophisticated systems in a modern car factory. These technologies not only accelerate production but also dramatically improve accuracy and consistency. Imagine automated systems flagging inconsistencies early in the process, preventing downstream errors. An automated accounting assembly line could perform complex tasks with unwavering precision, unaffected by the human tendency to let work fill the available time.
Automation offers the same potential for our month-end close, directly combating Parkinson’s Law by:
Imposing efficiency by design: Automation tools don’t succumb to the temptation to stretch out tasks. They execute processes in a standardized, efficient manner, completing them in their actual required time, regardless of the broader timeframe allocated for the close.
Shrinking task timelines and fostering focus: Automating repetitive and manual processes drastically reduces the time needed for these core closing activities. This inherently shortens the close timeline because it prevents work from expanding unnecessarily and forces a more focused approach.
Promoting timeliness and accountability: Automated workflows with reminders and escalation protocols inject a sense of urgency and ensure tasks are completed on schedule, directly counteracting the procrastination that Parkinson’s Law often encourages.
Enhancing accuracy from the start: Automation minimizes human error, leading to cleaner data and fewer discrepancies. There’s no longer a need for extensive investigations and rework at the tail end of the close. It essentially prevents the rework “penalty” of a rushed, Parkinson’s Law-influenced process.
Fostering integration and visibility: Automation can connect disparate systems and provide a holistic view of the closing process. It breaks down silos and demonstrates how each task contributes to the final outcome.
By understanding the subtle yet powerful influence of Parkinson’s Law on traditional close processes, we can better appreciate why simply allocating more time, adding more bodies, offshoring labor or purchasing siloed automation tools isn’t the solution. Embracing strategic automation isn’t just about closing faster; it’s about reclaiming our time, enhancing accuracy and creating a more streamlined and less stressful month-end close by actively preventing work from expanding to fill the available void.
It’s about building that high-quality “car” efficiently the first time, rather than constantly fixing a rushed and flawed product, then replicating the process to continue to produce that same quality of vehicle.
Why do finance automation adoption numbers lag behind a belief in its importance? See the latest industry stats in “The R2R automation playbook.”
Look for the signs
Think critically about where Parkinson’s Law might be subtly impacting your current close. It doesn’t exactly announce itself.
Ask yourself and your team:
Are we still relying on Excel spreadsheets for close task management?
Do we wait until the last 48 hours to reconcile bank statements or finalize accruals?
Are our ERP systems feeding real-time data into our close checklist, or are we still relying on someone to tick a box when a task is complete?
Are we discovering discrepancies too late and forcing rework that derails forecasting and decision-making?
If the answer to any of these is yes, it’s time to analyze how you might be unintentionally allowing Parkinson’s Law to creep in and shape your workflows.
Win back time and drive a predictable, quality close
Speed alone isn’t the goal. What moves the needle is a close process that doesn’t crumble under pressure.
Automation can empower your team to own a predictable, auditable and resilient close process. When every financial transaction, journal entry and general ledger update flows through a standardized, automated system and quality control is built right into the process, they’ll spend less time chasing manual steps and more time refining strategy.
You’re not removing people from the process; you’re allowing them to work smarter. Not only will automation eliminate the delays and stress that so often plague the month-end effort, but it will also help you with the practical stuff: identifying cash flow issues before they hit the balance sheet, validating metrics, ensuring data consistency and more. Automation is your lever against the inevitable.
If I had a dollar for every time I heard “AI” at SAP Sapphire 2025 …
AI was simply everywhere at this year’s events. From Christian Klein’s keynote to the show floor demos, it was the foundation of nearly every conversation. But beneath the buzzwords and bold visions, I noticed one question kept surfacing: How do you actually do it? How do you make AI actionable inside the day-to-day workings of an enterprise?
That’s the question we were thinking about at the Redwood Software booth and in our customer sessions and roundtables. It was fantastic to see the energy this year: standing-room-only demos, deep discussions with IT and business leaders and a steady stream of customers stopping by to share what they’re already doing with job scheduling, orchestration and workload automation (WLA). The excitement was real, but the deeper story was about who’s already rolling up their sleeves to deliver digital transformation that actually realizes the value of AI, rather than just dreaming about it.
Redwood was proud to be recognized for the second year in a row with the SAP Pinnacle Award in a category honoring innovative partners that provide economically relevant solutions, validating our ability to consistently drive high adoption and ROI for SAP customers. We also announced that RunMyJobs by Redwood is now an SAP Endorsed App, Premium certified — the highest level of SAP verification, indicating outstanding customer value.
The best part? We’re not talking in hypotheticals. These milestones are a testament to the real-world outcomes our customers achieve when integrating with the latest SAP technologies, maximizing the value of their SAP investment. We saw that in full color in sessions and roundtable discussions with RS Group and others, whose teams shared striking results they’ve achieved using RunMyJobs. They haven’t been waiting for the AI wave. Instead, they’ve been preparing for it by modernizing their WLA. And it’s paying off.
We’re making business AI real as we drive digital transformations that help customers thrive in an increasingly unpredictable world.
Klein’s sentiment rang true throughout the event, especially his keynote theme: To thrive in an AI-powered world, it’s not enough to modernize ERP. Foundational processes, especially the ones running behind the scenes, must be intelligent, agile and orchestrated. WLA platforms like RunMyJobs are already doing the work of preparing SAP landscapes for AI by coordinating processes end to end, orchestrating the tasks that drive efficient data pipelines and ensuring the reliability that AI output depends on.
Redwood customers leading the charge
SAP made it clear: the future isn’t about cobbling together best-of-breed tools. It’s about building a smart, cohesive suite. That suite extends beyond core ERP to include the applications and automation fabrics that make an entire business run. Redwood customers are already there.
RunMyJobs isn’t a standalone job scheduler. It’s the connective tissue for automation fabrics across SAP and non-SAP systems, delivering the kind of real-time orchestration that complex, data-intensive environments demand. Redwood’s shared product vision with SAP is helping customers optimize operations to scale with AI. That alignment is also what earned RunMyJobs its SAP Endorsed App status.
We spotlighted compelling Redwood customer stories at SAP Sapphire this year, including the following.
RS Group: Transforming global supply chain operations for a demanding market
As a global industrial distributor, RS Group faces an unforgiving supply chain environment. Before RunMyJobs, they couldn’t even run backorder processing (BOP) daily for all 26 markets they serve. The complexity was enormous. They had to stagger market runs, which put customer promises, such as delivery timelines, at risk.
Using RunMyJobs to re-engineer processes and workstreams and optimize job logic, they now run BOP for all 26 markets daily.
We now meet our promise to our business and customers.
Dharmesh Patel, Head of SAP Development & Services, RS Group
But that was only the beginning. Previously, RS Group faced issues with poor monitoring, alerting and visibility, leading to frequent Priority 1 (P1) and Priority 2 (P2) incidents in critical operations like order processing and warehouse management. With RunMyJobs, they introduced custom alerting, rebuilt job frameworks and created a governance model for continuous improvement.
The payoff of using RunMyJobs for RS Group: no P1 or P2 incidents in critical operations in over a year and job execution reliability above 99%
This isn’t just operational success. It’s setting the stage for AI readiness, because AI needs more than just access to data. It needs reliable, actionable data at the right time, integrated into the processes that power the business. RS Group is ready. When you run a global supply chain, “ready” isn’t a luxury.
Ready on day 1: How fit-to-suite automation prepares you for the AI future
The real takeaway from SAP Sapphire wasn’t that AI is coming. It’s that AI is already here, and the companies reaping the benefits are the ones that did the foundational work early. Redwood customers like RS Group have already modernized their WLA. They’re not bolting on AI. They’re ready for what’s happening now and what’s to come because their automation is fit-to-suite: deeply integrated, spanning SAP and non-SAP systems and built for scale and AI innovation.
RunMyJobs provides the automation fabrics enterprises need to orchestrate complex, cross-system workflows and support the data pipelines AI depends on. It connects SAP S/4HANA to the hybrid architectures, business process layers and related data AI needs to drive better, faster decisions and achieve business outcomes more efficiently.
When your business runs on well-managed, intelligent processes, you don’t just hope your AI strategy will work — you know it can.
An obvious and undeniable message of SAP Sapphire 2025? WLA modernization isn’t a side project. It’s a prerequisite. See how Redwood supports SAP customers in future-proofing their ecosystems.
A large United States-based manufacturer recently approached Redwood Software with a high-stakes decision to make: Renew their legacy workload automation (WLA) contract at five times the cost or modernize and move to the cloud. Their IT leadership had already committed to a cloud-first strategy aligned with their broader digital transformation goals. Renewing with their vendor would have meant staying tethered to costly on-premises infrastructure and putting off much-needed modernization.
The business case was clear for moving to a cloud-native WLA solution. But the clock was ticking. With just three months before their existing contract expired, the company needed to evaluate new platforms, prepare for migration and go live in that tight timeframe without disrupting critical business operations.
That’s when they turned to Redwood.
Our team of migration experts quickly mobilized, leaning on Redwood’s proven methodology, cloud-native platform and proprietary migration tools. We helped this company not only meet their deadline, migrating from a legacy platform in just 14 weeks, but also use the migration as a strategic opportunity to improve automation processes, retire technical debt and set the stage for long-term success in the cloud.
This isn’t an edge case. Whether you’re facing similar licensing deadlines, preparing for a RISE with SAP transformation or simply looking to modernize a fragmented automation landscape, you’re not alone — and you don’t have to start from scratch.
At Redwood, we understand that migration isn’t just a technical change. It’s your chance to rethink how automation supports your business and make sure you’re ready for what the future brings.
Speed is essential — but so is strategy
Time constraints are common in these scenarios. Redwood frequently works with organizations facing license renewals that force a go/no-go decision, RISE with SAP transitions that require cloud-readiness and/or internal mandates for tool consolidation and legacy system modernization.
These deadlines create urgency, but a rushed migration without strategy leads to risk. It can carry over inefficiencies and complications into your next-generation platform. Too often, we see companies fall into the trap of replatforming without rethinking.
In our experience, there are two primary mindsets when it comes to WLA migration:
Lift-and-shift first, optimize later: Move jobs as-is to meet tight deadlines, with plans to modernize after go-live.
Modernize as you move: Take the opportunity to streamline architecture, remove redundancies and improve process logic as you migrate.
Most organizations fall somewhere in between, and that’s exactly why Redwood approaches migration by tailoring it to your environment, not a one-size-fits-all script.
Migration as momentum: Essential considerations
What kind of change are you driving? Are you simply replicating jobs or using this transition to streamline, modernize and reduce complexity?
How will you optimize the new platform? Are you planning for better performance and improved reliability from the start?
Is your automation strategy aligned with broader goals? Will the migration support larger initiatives like cloud adoption, tool consolidation or SAP transformation?
Who needs to be involved? Are departments, service providers or external teams part of the process, and are they looped in early?
Redwood evaluates your:
Source platform and job volume
Critical business processes and dependencies
Timeline flexibility and go-live constraints
Appetite for technical debt cleanup
This ensures we don’t just recreate your existing environment but deliver a better one.
Rather than thinking of migration as a one-time event, consider it the start of a smarter operating model. Redwood’s Professional Services team brings decades of experience helping enterprises like yours transition from legacy WLA platforms to our modern, cloud-native solution, RunMyJobs by Redwood. Here’s what that means for your business.
IT infrastructure savings
Migrating off legacy systems sooner lets you decommission outdated infrastructure, eliminate those redundant support contracts and reduce operational overhead. This is especially important if you’re heading toward hybrid or full cloud adoption.
Business process improvements
We don’t just move your jobs; we evaluate them. During migration, we help you identify inefficiencies, unnecessary handoffs and outdated dependencies. This is your chance to streamline.
Operational efficiencies
Redwood provides pre-built templates, connectors and industry best practices to fast-track implementation. These accelerators and our unique testing frameworks help you get to production faster.
The groundwork for long-term gains
One of the most overlooked benefits of a well-executed migration is how quickly you can begin realizing value, and not just from the software itself. Value comes from removing friction, which is why you need a team with a track record of doing just that.
With Redwood, you begin seeing results almost immediately:
Noticeably stronger stability: Our migration process is designed to minimize disruption and deliver a stable production environment from day one. You don’t need weeks or months of post-migration troubleshooting to feel the benefits.
Improved visibility: Instead of toggling between tools and spreadsheets, you have a single source of truth for managing jobs enterprise-wide. That means fewer blind spots and better operational alignment.
Reduced manual effort: With intelligent automation and reusable templates, your teams spend less time on repetitive tasks and more time on process improvement.
Accelerated business outcomes: Faster financial closes, improved service availability … whatever you’re after, Redwood removes the bottlenecks and gets you there quickly.
Greater agility: Once you’re on a modern, cloud-native platform, you can scale, adapt and evolve your automation environment in lockstep with your business. Adding new systems or integrating third-party tools becomes significantly easier.
Modernize on your terms
Migrating to a new WLA solution involves much more than moving scripts or job chains. Your goal should be to enable a new level of orchestration across your enterprise. That’s why it pays to work with a partner who specializes in this exact domain.
Redwood’s Professional Services team is focused solely on successful automation implementations. We offer:
Proven methodologies for assessment, migration and rollout
Proprietary tools to streamline job mapping, testing and cutover
Flexibility to adjust your scope in real time
Risk mitigation with detailed validation and go-live readiness
Post-migration services to keep advancing your automation maturity
At Redwood, we don’t just bring technology. We also offer unmatched focus, tools and experience. Organizations across industries have trusted Redwood to help them leave behind legacy WLA platforms.
If you’re feeling the pressure of an expiring contract, a cloud deadline or a business that’s outgrown your current WLA solution, Redwood’s proven migration approach is here to move you forward with a clear vision.
Hear directly from Daniel Sivar, Technologist at American Water, about how Redwood guided the largest regulated water and wastewater utility company in the United States through “managed waves” to ensure a successful migration.