Operations & Strategy

AI Automation for Operations Leaders: A Practical Guide for 2026

The Boring First Method, the RAPID Framework, and a 90-day playbook — from 75+ SMB conversations and 13 years in the Navy.

By Anthony Pinto · 14 min read

Last updated: March 2026

I spent 13 years on nuclear submarines in the U.S. Navy.

On a sub, every system has a procedure. Every procedure has a checklist. Every checklist has been tested under the worst possible conditions — because when you are 400 feet underwater, "we usually just kind of wing it" is not an acceptable standard operating procedure.

Then I started working with small and mid-size businesses.

Honestly, the contrast was staggering. After 75+ conversations with SMBs through Veteran Vectors, I kept hearing the same thing: "We need AI." "We need automation." "We need to modernize."

But when I asked to see their process documentation, I got blank stares.

They didn't have a process problem. They had a documentation problem.

Most small businesses don't have an actual process. They have a memory of what worked last time. The founder carries it all in their head. The ops manager "just knows" how things get done. And that works — until it doesn't. Until someone leaves, or volume doubles, or you try to hand the task to a new hire and realize you can't explain what you do because you've never written it down.

You become the bottleneck.

This guide is the playbook I wish I could hand every operations leader before they spend a single dollar on AI tools. It is built on real engagements, real numbers, and two frameworks I use with every client at Veteran Vectors: the Boring First Method and the RAPID Framework.

The Three Camps (Which One Are You?)

Before we get into frameworks, I need you to be honest about where you are. In my experience, every operations leader falls into one of three camps:

Camp 1: "I haven't really looked into it." You know AI is a thing. You've seen the headlines. But your day is consumed by fires, and exploring AI feels like one more thing on a list that's already too long.

Camp 2: "I'm interested but don't know where to start." You've maybe tried ChatGPT. You've sat through a vendor demo or two. But nothing has clicked into an actual plan for your operations.

Camp 3: "We're already using it." You've deployed some tools. Maybe a chatbot, maybe some Zapier workflows. But it feels scattered. You're not sure if you're getting real ROI or just playing with new toys.

Honestly, it doesn't matter which camp you're in. This playbook works for all three. Because the starting point is the same: you have to document what you actually do before you can automate any of it.

The Boring First Method

This is the single most important framework in this entire guide. I call it the Boring First Method because everyone wants to skip to the exciting part — the AI, the automation, the shiny dashboard. But the exciting part fails without the boring part.

Here's the breakdown:

Step 1: Strip the software. Open a blank document. Not a project management tool. Not a flowchart app. A blank doc. Google Docs, Word, a legal pad — it doesn't matter.

Step 2: Write the boring version. Every step. Every decision. Every handoff. Who does what, when, and why. Write it like you're explaining the process to someone who has never worked at your company and never will again. No shortcuts, no "you'll figure it out," no assumed knowledge.

Step 3: THEN add technology. Only after you have the complete, boring, plain-English version of your process do you start thinking about which steps can be automated, which tools to use, and where AI fits.

I know this sounds almost insultingly simple. That's the point.

On a submarine, we called this "writing the procedure before building the system." You don't install a reactor cooling system and then figure out how to operate it. You write the operating procedure first. You validate every step. You test it under failure conditions. Then you build.

The result? When you finally deploy automation, it works. Because you're not automating chaos. You're automating a documented, validated, understood process.

I worked with an 8-figure manufacturing client last year. They came to me saying they needed AI. They had dashboards, integrations, the works. But their data was a mess. Honestly, they didn't need AI. They needed data they could actually use.

We applied the Boring First Method. Stripped everything back. Documented every process on paper. And in that process, we found $340K in duplicate inventory. Not with AI. With documentation. We also cut data entry by 18 hours per week — again, mostly by eliminating redundant steps we discovered during the documentation phase.

The automation came after. And it was 10x easier to build because we knew exactly what we were automating.

The RAPID Framework: Picking What to Automate

Once you've documented your processes using the Boring First Method, the next question is: which ones should I automate first?

This is where most operations leaders get stuck. They see 20 potential automations and don't know how to prioritize. Or they pick the wrong one — something complex and low-impact — and the project stalls.

I built the RAPID Framework to solve this. It's a simple scoring system.

Here's the breakdown:

  • R — Repetitive. Does this task happen on a predictable schedule? Daily, weekly, monthly? The more repetitive, the higher the automation ROI.
  • A — Accuracy-dependent. Does getting this wrong cause real problems? Financial errors, compliance issues, customer-facing mistakes? If accuracy matters, automated consistency beats human consistency every time.
  • P — Process-driven. Can you describe this task as a series of if/then steps? If yes, it's automatable. If it requires constant judgment calls and improvisation, it's not — at least not yet.
  • I — Input-heavy. Does this task involve entering, copying, or transforming data from one place to another? Data entry is the number one source of wasted operations time.
  • D — Digital-ready. Is the information already in a digital system, or is it on paper napkins and sticky notes? Digital-ready tasks can be automated immediately. Paper-based ones need a digitization step first.

If a task checks 3 or more of those boxes, it's a candidate for automation.

If it checks all 5, stop reading this article and go automate it right now. I'm serious.
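If you want to make the scoring mechanical, the five checks reduce to a few lines of code. This is my own sketch of the framework, not a Veteran Vectors tool; the 1 / 0.5 / 0 weights for Yes / Partial / No are an assumed encoding that matches the values in the table below.

```python
# Sketch of a RAPID scorer. Each criterion is worth 1.0 (yes),
# 0.5 (partial), or 0.0 (no) -- an assumed encoding of the framework.

CRITERIA = ("repetitive", "accuracy_dependent", "process_driven",
            "input_heavy", "digital_ready")

def rapid_score(task: dict) -> float:
    """Sum the five RAPID criteria for one task (missing keys count as 0)."""
    return sum(task.get(c, 0.0) for c in CRITERIA)

def is_candidate(task: dict) -> bool:
    """A task scoring 3 or more is an automation candidate."""
    return rapid_score(task) >= 3.0

data_sync = {c: 1.0 for c in CRITERIA}      # checks every box
strategy = {"digital_ready": 0.5}           # strategic planning

print(rapid_score(data_sync), is_candidate(data_sync))  # 5.0 True
print(rapid_score(strategy), is_candidate(strategy))    # 0.5 False
```

Any richer weighting (say, doubling the Repetitive score for daily tasks) is a judgment call; the point is just to make the prioritization explicit and repeatable.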

Here's how common operations tasks score:

| Task | R | A | P | I | D | Score |
| --- | --- | --- | --- | --- | --- | --- |
| Cross-system data sync | Yes | Yes | Yes | Yes | Yes | 5/5 |
| Weekly report generation | Yes | Yes | Yes | Yes | Yes | 5/5 |
| Invoice processing | Yes | Yes | Yes | Yes | Partial | 4.5/5 |
| Approval routing | Yes | Partial | Yes | Partial | Yes | 4/5 |
| Inventory tracking | Yes | Yes | Partial | Yes | Partial | 4/5 |
| Strategic planning | No | No | No | No | Partial | 0.5/5 |

See that last row? Strategic planning scores a 0.5. That's the kind of task AI can assist with, but should never own. This is exactly the distinction most companies miss, and a big reason why 95% of generative AI pilots fail: they try to automate the exciting, ambiguous stuff while the biggest ROI sits in the boring back-office work.

Andrew Ng's AI Transformation Playbook (And How I Adapt It)

Andrew Ng published an AI Transformation Playbook with 5 steps any company can take to become an AI-powered organization. I reference it constantly because the framework is sound. But I've adapted it for the SMBs and operations teams I actually work with at Veteran Vectors.

Here's the breakdown:

1. Execute pilot projects to gain momentum. Ng says start with small wins. I agree — but I add the Boring First Method on top. Don't pilot AI on an undocumented process. Document first. Pilot second.

2. Build an in-house AI team. For large enterprises, sure. For SMBs? You don't need a team. You need one person who understands your operations and a partner (like Veteran Vectors) who understands the technology. That's your AI team.

3. Provide broad AI training. This one I'm passionate about. Your ops team doesn't need a machine learning course. They need 2 hours of training on how the specific automations in their workflow function, what to do when something breaks, and when to escalate. Practical, not theoretical.

4. Develop an AI strategy. Ng puts this at step 4 deliberately — after you've had some wins. I've seen too many companies spend 6 months on an "AI strategy" before deploying a single automation. Get wins first. Strategy crystallizes from experience, not from a conference room whiteboard.

5. Develop internal and external communications. Tell your team what's happening and why. Tell your customers how it benefits them. Simple, but almost everyone skips it. Don't skip it.

Your 30/60/90 Day Implementation Timeline

I use this exact timeline with every client engagement. It's designed to deliver value fast and build momentum. No 6-month planning phases. No "transformation roadmaps" that collect dust in a shared drive.

Phase 1: Document & Deploy (Days 1–30). Milestones: Boring First audit of all processes (Weeks 1–2); RAPID scoring of every task (Week 2); first automation deployed on the highest-scoring process (Weeks 3–4). Outcome: one automation live, the team sees proof it works, and baseline metrics are captured.

Phase 2: Expand & Train (Days 31–60). Milestones: deploy automations 2 and 3 (Weeks 5–6); monitor and tune the first automation (Week 7); train the broader ops team (Week 8). Outcome: three automations in production, the team adapting to new workflows, and first ROI numbers ready for leadership.

Phase 3: Scale & Systematize (Days 61–90). Milestones: deploy remaining high-priority automations (Weeks 9–10); build internal SOPs for monitoring (Week 11); full ROI review with the executive team (Week 12). Outcome: 4–5 automations operational, 40+ hours/week recovered, and the executive team has hard data for the next phase.

The key insight: you don't wait 90 days to see results. The first automation is live within 30 days. Every deployment after that compounds the returns.

On a submarine, we had a saying: "The best time to test a system is before you need it." Same principle here. Deploy early, learn fast, scale what works.

Why 95% of AI Pilots Fail (And How to Be in the 5%)

That number — 95% of generative AI pilots fail — comes from MIT's 2025 State of AI in Business report, and it gets thrown around a lot. But most people don't dig into why.

Here's the breakdown from what I've seen across 75+ SMB engagements:

Failure #1: Automating an undocumented process. This is the big one. If you can't write it down, you can't automate it. Period. The Boring First Method exists specifically to prevent this.

Failure #2: Chasing shiny use cases. Companies want the AI chatbot that talks to customers. The AI that writes marketing copy. The AI that predicts revenue. Meanwhile, someone on their team is spending 18 hours a week on data entry. The biggest ROI in back-office automation is eliminating business process outsourcing, cutting external agency costs, and streamlining the boring stuff nobody wants to talk about.

Failure #3: No baseline metrics. If you don't measure the "before," you can't prove the "after." Every automation we deploy at Veteran Vectors starts with documented baselines: hours spent, error rates, cycle times, costs.
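A baseline doesn't require special tooling. A plain record captured before deployment is enough to prove the "after." A minimal sketch, where the field names and figures are illustrative examples rather than a Veteran Vectors template:

```python
# Record the "before" state of a process prior to deploying any
# automation, then compare it against post-deployment measurements.
# Field names and figures below are illustrative examples only.
baseline = {"hours_per_week": 15.0, "error_rate": 0.04, "cycle_time_days": 3.0}
after = {"hours_per_week": 4.5, "error_rate": 0.01, "cycle_time_days": 1.0}

for metric, before in baseline.items():
    reduction = (before - after[metric]) / before * 100
    print(f"{metric}: {before} -> {after[metric]} ({reduction:.0f}% reduction)")
```

The exact metrics vary by process; what matters is that the "before" numbers exist in writing before the first automation goes live.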

Failure #4: Tool-first thinking. "Let's use [insert trending AI tool]" is not a strategy. Define the workflow, the inputs, the outputs, and the success criteria first. Then pick the tool. Sometimes the right answer is a no-code platform. Sometimes it's a custom integration. The requirements drive the architecture, not the other way around.

Failure #5: Trying to do everything at once. Andrew Ng says start with pilot projects for a reason. One automation, validated, measured, and delivering value, beats five half-built projects every single time.

The Real Numbers: What This Looks Like in Practice

I'm a numbers person. In the Navy, everything got measured. So here are real numbers from real engagements.

8-figure manufacturing company. They came to us asking for AI dashboards. We ran the Boring First Method and discovered their data was riddled with duplicates and inconsistencies. We found $340,000 in duplicate inventory they didn't know they had. Cut data entry by 18 hours per week. Total investment in the documentation and cleanup phase: a fraction of what they would have spent on an AI tool that would have just visualized bad data faster.

The result? When we did deploy AI after the cleanup, it worked on the first try. Because the data was clean and the processes were documented.

Anthony Pinto's rule of thumb: For every dollar you plan to spend on AI tools, spend fifty cents on documentation first. You'll save three dollars on the back end in rework, failed pilots, and wasted licenses.

Putting It All Together: Your Action Plan

If you've read this far, you're not in Camp 1. You're either in Camp 2 (interested, don't know where to start) or Camp 3 (already using AI, want to do it better).

Either way, here's your action plan for this week — not this quarter, this week:

Monday: Pick one process your team does every single week. The more boring, the better.

Tuesday–Wednesday: Apply the Boring First Method. Open a blank doc. Write every step. Every decision. Every handoff. No software, no flowcharts, just words.

Thursday: Score it with RAPID. Is it Repetitive? Accuracy-dependent? Process-driven? Input-heavy? Digital-ready? If it scores 3 or higher, you've found your first automation candidate.

Friday: Calculate the ROI. Hours per week x loaded hourly rate x 52 weeks = annual cost of the manual process. Even a conservative 70% automation rate on a 15-hour-per-week process at $45/hour saves $24,570 per year.
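The Friday arithmetic, written out. The 15 hours, $45 loaded rate, and 70% automation share are the example numbers from the text:

```python
def annual_savings(hours_per_week: float, loaded_rate: float,
                   automation_share: float) -> float:
    """Annual cost of the manual process times the share you automate."""
    annual_cost = hours_per_week * loaded_rate * 52  # 52 weeks per year
    return annual_cost * automation_share

# Example from the text: 15 hrs/week at a $45/hr loaded rate, 70% automated.
print(annual_savings(15, 45, 0.70))  # 24570.0
```

Swap in your own hours and loaded rate; even rough inputs give you a defensible number to bring to leadership.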

That's it. One week. No consultants, no software purchases, no AI strategy deck. Just documentation, scoring, and math.

The result? You walk into your next leadership meeting with a specific process, a specific cost, and a specific savings number. That's how projects get funded.

The Bottom Line

After 75+ conversations with operations leaders at businesses of all sizes, the pattern is always the same. The companies that succeed with AI automation are not the ones with the biggest budgets or the fanciest tools. They're the ones who did the boring work first.

Document your processes. Score them with RAPID. Deploy one automation. Measure the result. Then do it again.

That's the entire playbook. No jargon. No 47-slide deck. Just the boring work that makes the exciting stuff actually work.

I learned this on submarines. I apply it at Veteran Vectors. And every operations leader I've worked with who follows this approach gets results.

For more on the financial side of automation, check out our breakdown of the real ROI of AI automation for small businesses. And if you want to see which of your processes are automation-ready, our guide on what business processes you can automate with AI goes deep on specific use cases.

Frequently Asked Questions

What is the Boring First Method for AI automation?

The Boring First Method is a three-step framework: (1) strip the software and open a blank document, (2) write the boring version of your process — every step, every decision, every handoff in plain English, and (3) only then add technology. It ensures you automate a clean, documented process rather than digitizing chaos. Anthony Pinto developed this approach at Veteran Vectors after seeing dozens of failed AI implementations that skipped documentation.

How do I know which tasks to automate with AI?

Use the RAPID Framework: score each task on five criteria — Repetitive, Accuracy-dependent, Process-driven, Input-heavy, and Digital-ready. If a task checks 3 or more boxes, it's a strong automation candidate. Tasks like cross-system data sync and report generation typically score 5/5. Strategic planning scores near zero. Start with the highest-scoring tasks for maximum ROI.

Why do 95% of AI automation pilots fail?

The primary reasons are automating undocumented processes, chasing flashy use cases instead of high-ROI back-office tasks, lacking baseline metrics to prove results, choosing tools before defining requirements, and trying to do everything at once. Companies that start with the Boring First Method — documenting processes before adding technology — dramatically increase their success rate.

How long does it take to implement AI automation for operations?

A realistic timeline follows a 30/60/90 day structure. Days 1–30: document processes and deploy your first automation. Days 31–60: expand to 2–3 automations and train the team. Days 61–90: scale across the organization and conduct a full ROI review. The first automation should deliver measurable value within 30 days — not 6 months.

What ROI can operations leaders expect from AI automation?

Results vary by process, but real-world outcomes include $340K in duplicate inventory found at an 8-figure manufacturing company, 18 hours per week cut from data entry, and 40+ hours per week recovered across operations teams within 90 days. A conservative estimate: automating a 15-hour-per-week process at $45/hour loaded rate saves over $24,000 annually, even at just 70% automation coverage.

Ready to Automate Your Operations?

Book a free discovery call and we'll identify the highest-impact automation opportunities for your ops team.

Book Your Free Discovery Call