
Automation TCO and Sensitivity Analysis

Part of the automation selection framework. See Automation Selection Scoring Matrix for the 8-step scoring methodology, criterion weights, and knockout criteria.

Full technology selection lifecycle, vendor landscape, scripted demo protocol, reference check structure, and commercial negotiation levers for logistics automation and WMS/WES/WCS procurement.


Every technology selection runs through twelve phases, numbered 0 through 11; skip one and you pay for it in implementation:

| Phase | Key activity |
|---|---|
| 0. Steering committee formation | Executive sponsor, operations, engineering, IT, finance, procurement, change management — all committed before Phase 1 begins |
| 1. Current state assessment | Walk every process flow; document use cases |
| 2. Requirements definition | Process flows → use cases → requirements (never requirements from a blank spreadsheet) |
| 3. RFI (optional) | Run when >12 potential vendors or ballpark pricing needed to calibrate budget; keep under 30 pages |
| 4. RFP development | 10-section structure; scripted demo included as attachment |
| 5. Vendor evaluation and scoring | 40/20/15/15/10 weighted scorecard |
| 6. Scripted demos | One day per vendor; your actual data; 2-dimension rubric |
| 7. Reference checks | 12 questions per reference; all references called |
| 8. Final shortlist and BAFO | Best and Final Offer from top 2–3 vendors |
| 9. Commercial negotiation | 12 levers negotiated before contract signing |
| 10. Contract execution | Governing document; final review before signature |
| 11. Kickoff and mobilization | Communication cadence, decision rights, escalation paths in week 1 |

Software-only (WMS, TMS, LMS): 3–6 months. Major automation: 6–12 months. Do not compress — compression means skipping requirements depth or accepting vendor boilerplate.

Steering committee requirement: Missing operations or IT from the steering committee is the most common structural failure. Include IT from the beginning — adding them late adds six months to implementation for any engagement involving WMS, WES, WCS, or robotics.


Correct order: (1) process flows, (2) use cases within each flow, (3) requirements derived from use cases. When requirements are derived from use cases, every line item has a business justification. When they are assembled from wish lists instead, every stakeholder pads the list.

Integration cost is 30–50% of total deal value in complex environments — the number most consistently underestimated on every selection. Ask every vendor for integration count to your specific ERP version and average integration cost at your scale.
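Integration-share arithmetic is easy to get wrong because the 30–50% is a share of the *total* deal value, not a markup on the software line. A quick sensitivity sketch; the base dollar figure is hypothetical:

```python
# Sensitivity of total deal value to the integration share (30-50% of total).
# Base figure below is hypothetical.
base = 1_200_000  # software + services, excluding integration

for share in (0.30, 0.40, 0.50):
    total = base / (1 - share)  # integration is `share` of the TOTAL, so total = base / (1 - share)
    integration = total - base
    print(f"integration share {share:.0%}: integration ${integration:,.0f}, total ${total:,.0f}")
```

At a 50% share, integration equals the rest of the deal combined, which is why the number dominates TCO sensitivity.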

MoSCoW prioritization: Must Have / Should Have / Could Have / Won’t Have this time. Limit Must Have to 20–30 requirements that are genuinely make-or-break. More than 30 Must Haves means the prioritization work has not been done.
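Tagging requirements with their MoSCoW priority makes the over-30 check mechanical. A minimal sketch with hypothetical requirement names:

```python
# Hypothetical requirements tagged with MoSCoW priorities.
requirements = [
    ("RF-directed putaway", "Must"),
    ("Wave planning", "Should"),
    ("Voice picking", "Could"),
    ("Yard management", "Wont"),
]

# Count Must Haves and flag when the list exceeds the 30-item limit.
must_count = sum(1 for _, priority in requirements if priority == "Must")
print(f"Must Haves: {must_count}")
if must_count > 30:
    print("Over the 30-item limit: prioritization work is not done")
```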


| Category | Default weight | Finance will push for… | Hold the line because… |
|---|---|---|---|
| Functional capability | 40% | Lower | A system that can’t do what operations needs will be ripped out in 3 years |
| Commercial / TCO | 20% | Higher | |
| Technical architecture and security | 15% | | |
| Vendor viability and references | 15% | | |
| Implementation approach | 10% | | |
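The 40/20/15/15/10 weighting reduces to a dot product over category scores. A minimal sketch, with hypothetical 1–5 category scores for an example vendor:

```python
# Default category weights from the scorecard above (must sum to 1.0).
WEIGHTS = {
    "functional": 0.40,
    "commercial_tco": 0.20,
    "technical": 0.15,
    "viability": 0.15,
    "implementation": 0.10,
}

def weighted_score(scores: dict[str, float]) -> float:
    """Combine 1-5 category scores into a single weighted score."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9
    return sum(WEIGHTS[cat] * scores[cat] for cat in WEIGHTS)

# Hypothetical evaluation results for one vendor.
vendor_a = {"functional": 4.2, "commercial_tco": 3.5, "technical": 4.0,
            "viability": 4.5, "implementation": 3.8}
print(round(weighted_score(vendor_a), 2))
```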

Unscripted demo = vendor sales pitch. Scripted demo forces demonstration against your reality: your order profiles, your exception scenarios, your edge cases.

One-day structure:

  • 8:00–8:30 introductions and scoring instructions
  • 8:30–12:00 scripted sections 1–4 (inbound, inventory management, outbound, labor management)
  • 12:00–1:00 working lunch
  • 1:00–4:30 scripted sections 5–7 (reporting, integration scenario, edge case)
  • 4:30–5:00 evaluator scoring with the vendor out of the room

Use real data.

Two-dimension scoring rubric (each step scored 1–5):

  • Capability (1 = cannot do it; 3 = significant configuration required; 5 = out-of-the-box, excellent UX) — weight 70%
  • Demonstration Quality (1 = punted to slides; 3 = functional but slow; 5 = confident, fast, live answers in the system) — weight 30%

Absolute rule: verbal assertions of a capability not demonstrated in the current product version score 2 or lower. “Our system can do that” without showing it = 2. No exceptions.
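The 70/30 rubric plus the cap-at-2 rule can be sketched as a single scoring function (function name and signature are illustrative, not from the source):

```python
def demo_step_score(capability: int, demo_quality: int, demonstrated: bool) -> float:
    """Score one scripted-demo step; each dimension is rated 1-5.

    Capability claims not shown live in the current product version
    are capped at 2 before weighting, per the absolute rule.
    """
    if not demonstrated:
        capability = min(capability, 2)
    return 0.70 * capability + 0.30 * demo_quality

# Claimed-but-not-shown capability caps at 2 regardless of the pitch.
print(round(demo_step_score(5, 4, demonstrated=False), 2))
```

Applying the cap before weighting keeps a polished sales narration from lifting a step the product never actually performed.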


Call every reference; take verbatim notes:

  1. What version did you implement vs. what you’re on now?
  2. Original vs. actual go-live date and cause of delay?
  3. Original vs. final implementation budget and cause of overages?
  4. Example of something that went wrong and how the vendor responded?
  5. Actual uptime vs. SLA?
  6. Realistic support response time on a critical issue?
  7. Rate quality of the on-site implementation team vs. what was presented in selection (1–10)?
  8. What integrations did you build and how difficult were they?
  9. What do you know now that you wish you’d known before selecting?
  10. Would you buy this system again?
  11. Would you work with this vendor again? (listen for the gap between 10 and 11)
  12. How has the vendor’s product roadmap actually delivered on what was promised during your selection?

Red flags: all references 5+ years old; references significantly different in scale; all references with the same SI (compensating for product gaps with custom code).


Negotiate all 12 before signing; they cost nothing at contract time and are worth everything when something goes wrong:

  1. Contract length for implementation discount — 3- or 5-year SaaS commitment for 15–25% off
  2. Most-Favored-Nation pricing clause
  3. Automatic SLA credits — 10–30% of monthly fees per percentage point below SLA
  4. Source code escrow (Iron Mountain) with triggers for bankruptcy, product discontinuation, acquisition
  5. Termination for convenience with 90–180 days’ notice
  6. UAT acceptance criteria tied to final payment
  7. Liquidated damages for automation hardware — 0.5–1% of contract value per week of delay, capped at 10–15%
  8. Performance bonds for automation contracts above $5M — bond cost 0.5–2%; passes through to contract price
  9. Change order governance requiring written authorization before out-of-scope work begins
  10. Specified training volume and post-go-live support period
  11. Data portability — all data exportable in standard format at your request and within 30 days of termination
  12. Integration sandbox environment upgraded before production during every major release
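The arithmetic behind levers 3 and 7 is worth seeing once; all inputs below are hypothetical:

```python
# Lever 3: automatic SLA credits (10-30% of monthly fees per point below SLA)
monthly_fee = 40_000
credit_per_point = 0.20   # negotiated within the 10-30% range
points_below = 1.0        # e.g. actual 98.9% uptime vs. a 99.9% SLA
sla_credit = monthly_fee * credit_per_point * points_below
print(f"SLA credit this month: ${sla_credit:,.0f}")

# Lever 7: liquidated damages (0.5-1% of contract value/week, capped at 10-15%)
contract_value = 8_000_000
weekly_rate, cap = 0.005, 0.10
weeks_late = 25
damages = min(contract_value * weekly_rate * weeks_late, contract_value * cap)
print(f"Liquidated damages: ${damages:,.0f}")
```

Note how the cap binds in the second example: 25 weeks at 0.5%/week would be 12.5% of contract value, so the 10% cap limits recovery to $800K.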

Perpetual license maintenance industry standard: 18–22% of original license value annually. Apply the rate against the discounted fee (not list price); cap annual escalation at 2–3%.
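A quick sketch of what those maintenance terms imply over five years; the fee figures are hypothetical:

```python
# 5-year maintenance stream for a perpetual license, per the terms above.
list_price = 1_000_000
discounted_fee = 650_000   # negotiated license fee; the rate applies here, not to list price
maintenance_rate = 0.20    # within the 18-22% industry-standard band
escalation_cap = 0.03      # negotiated annual escalation cap (2-3%)

annual = discounted_fee * maintenance_rate
total = 0.0
for year in range(1, 6):
    print(f"Year {year}: ${annual:,.0f}")
    total += annual
    annual *= 1 + escalation_cap  # escalates at the capped rate each year
print(f"5-year maintenance total: ${total:,.0f}")
```

Anchoring the rate to list price instead of the discounted fee would add roughly $70K per year in this example, which is why the base matters more than the headline percentage.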


Tier 1 Enterprise (5-year TCO $1.5M–$5M+; implementation 9–18 months):

| Vendor | Best fit | Key notes |
|---|---|---|
| Manhattan Active WM | Retail, e-commerce, 3PL | Cloud-native, continuously updated; strongest LM integration; on-prem unavailable — a knockout criterion if required |
| Blue Yonder WMS / Luminate | Grocery, retail, industrial (global) | Formerly JDA; Panasonic-owned; tightest native WMS-TMS integration |
| Oracle WMS Cloud | Existing Oracle ERP customers (Fusion/NetSuite) | Native integration removes a complexity layer |
| SAP Extended Warehouse Mgmt (EWM) | SAP S/4HANA environments | Deepest MHE integration; strong in Europe; requires specialized EWM integrators |

Tier 2 Mid-Market (5-year TCO $300K–$1.5M; implementation 4–9 months):

| Vendor | Notes |
|---|---|
| Infios (formerly Körber, rebranded March 2025) | Rare mid-market WMS + TMS (acquired MercuryGate 2024) |
| Tecsys | Strong in healthcare distribution and regulated environments; 2024 Nucleus WMS leader |
| Softeon | Highly configurable; strong OMS integration |
| Infor WMS | Industry-specific templates for food, beverage, distribution |
| Made4net | Cloud-native; strong 3PL billing |
| Logiwa | Cloud-native; fast implementation; high-velocity e-fulfillment |
| Extensiv (formerly 3PL Central, rebranded May 2022) | Purpose-built for small-to-mid 3PLs |

Integration costs are additive and often match or exceed software costs in Tier 1 implementations.


WES (orchestrates work across labor and automation; sits between WMS and WCS):

  • Honeywell Intelligrated Momentum WES — market-leading; cloud-native; supports robotics orchestration and AS/RS integration
  • SVT Robotics — integration platform for heterogeneous robotics environments
  • Lucas Systems — voice-directed picking with Dynamic Slotting

WCS (machine-level control: conveyor speeds, sort divert logic, AS/RS crane control):

  • Integrator-tied: Dematic iQ, Honeywell Intelligrated Momentum WCS, SSI Schaefer WAMAS, Vanderlande VISION/miControl, Daifuku/Wynright
  • Independent: Numina Group (mid-market), Cirro (cloud-based WCS/WES)
  • Non-negotiables: real-time latency (machine control cannot tolerate >100ms); equipment vendor certification for specific MHE model and firmware version; failover architecture

TMS (Gartner 2024 leaders): Blue Yonder, e2open, Manhattan Associates, Oracle TMS, SAP TM; Infios for mid-market. Enterprise: $500K–$2M+ Year 1; mid-market: $100K–$500K.

LMS: Blue Yonder LMS and Manhattan Active Labor for Tier 1. Typically drives 15–25% productivity improvement; payback within 12–24 months for 75+ direct labor headcount.
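A back-of-envelope payback check using the ranges above; headcount, loaded cost, and LMS price are hypothetical:

```python
# LMS payback sketch: productivity gain applied to direct labor cost.
headcount = 80                # direct labor (the LMS case starts around 75+)
loaded_cost_per_fte = 50_000  # fully loaded annual cost per FTE, illustrative
productivity_gain = 0.15      # low end of the 15-25% range
lms_cost_year1 = 750_000      # Year 1 license + implementation, illustrative

annual_savings = headcount * loaded_cost_per_fte * productivity_gain
payback_months = lms_cost_year1 / annual_savings * 12
print(f"Annual savings: ${annual_savings:,.0f}; payback: {payback_months:.1f} months")
```

With these inputs the payback lands at 15 months, inside the 12–24 month range quoted above; below roughly 75 heads the savings pool usually cannot carry the Year 1 cost.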

Slotting: Optricity/OptiSlot DC (Fortna-owned), Lucas Systems Dynamic Slotting, Optioryx Pulse.


AS/RS:

  • AutoStore — grid-based cube storage; 1,700+ installations; sold through certified integrators only; R5 Pro robot (NRF 2024) reduces bot count per grid ~15%
  • Exotec Skypod — modular goods-to-person; next-gen released February 2025
  • Attabotics / Lafayette Systems — 3D cubic storage; returned under Lafayette Systems branding April 2026 after 2025 restructuring — validate financial stability before specifying
  • Traditional crane AS/RS: Dematic, SSI Schaefer, Swisslog, Daifuku for pallet and case-level high-ceiling operations

AMR:

  • Locus Robotics — 7B+ picks; LocusONE platform; Origin and Vector bots
  • Geek+ — Asia-dominant; PopPick, MovePick; CarouselAI orchestration
  • 6 River Systems (Shopify) — Chuck AMR for mid-market e-commerce
  • Symbotic — AI-powered case-handling; 42+ Walmart DCs; Tier 1 scale only

Robotic piece picking:

  • Covariant — RFM-1 AI model; deployed at McKesson with KNAPP; acquired by Amazon 2024 — evaluate vendor independence risk
  • RightHand Robotics — piece-picking work cell partnered with Locus
  • Mujin — strong in automotive and logistics automation

Two selection principles:

  1. “Buy the application, not the technology” — evaluate which technologies serve the application at your operational parameters; don’t select on demo impressiveness
  2. AMR pilot programs (50–100 robots, one zone) are reasonable; for fixed automation (AS/RS, sortation) there is no meaningful pilot — get the exit ramp in the contract before you sign

Seven Common Mistakes in Technology Selection

  1. Demo charisma over substance — trust the scripted rubric; surface the tension between gut feel and scorecard rather than overriding the process quietly
  2. Gartner MQ over-reliance — MQ rewards scale and marketing investment, not implementation success; Niche Player vendors may be best fit
  3. Integration cost underestimation — 30–50% of total implementation value; ask every vendor for integration count to your specific ERP version and average cost at your scale
  4. No exit ramps in contracts — every contract needs termination for convenience, data portability, transition services, and definition of what happens to your data post-termination
  5. Single stakeholder veto — document voting rights, killer criteria, and decision rules at Phase 0
  6. Vendor financials not validated — privately held: request audited financials; public: R&D spend as % of revenue should be 12–20% for healthy enterprise software
  7. Change management cost ignored — budget 10–15% of total project cost; Gartner found only 32% of planners implementing new planning tools actually used the new tool — 68% of spend was wasted due to adoption failure
