Automation TCO and Sensitivity Analysis
Part of the automation selection framework. See Automation Selection Scoring Matrix for the 8-step scoring methodology, criterion weights, and knockout criteria.
This guide covers the full technology selection lifecycle, vendor landscape, scripted demo protocol, reference-check structure, and commercial negotiation levers for logistics automation and WMS/WES/WCS procurement.
Eleven-Phase Selection Lifecycle
Every technology selection runs through eleven phases; skip one and you pay for it in implementation:
| Phase | Key activity |
|---|---|
| 0. Steering committee formation | Executive sponsor, operations, engineering, IT, finance, procurement, change management — all committed before Phase 1 begins |
| 1. Current state assessment | Walk every process flow; document use cases |
| 2. Requirements definition | Process flows → use cases → requirements (never requirements from a blank spreadsheet) |
| 3. RFI (optional) | Run when >12 potential vendors or ballpark pricing needed to calibrate budget; keep under 30 pages |
| 4. RFP development | 10-section structure; scripted demo included as attachment |
| 5. Vendor evaluation and scoring | 40/20/15/15/10 weighted scorecard |
| 6. Scripted demos | One day per vendor; your actual data; 2-dimension rubric |
| 7. Reference checks | 12 questions per reference; all references called |
| 8. Final shortlist and BAFO | Best and Final Offer from top 2–3 vendors |
| 9. Commercial negotiation | 12 levers negotiated before contract signing |
| 10. Contract execution | Governing document; final review before signature |
| 11. Kickoff and mobilization | Communication cadence, decision rights, escalation paths in week 1 |
Software-only (WMS, TMS, LMS): 3–6 months. Major automation: 6–12 months. Do not compress — compression means skipping requirements depth or accepting vendor boilerplate.
Steering committee requirement: Missing operations or IT from the steering committee is the most common structural failure. Include IT from the beginning — adding them late adds six months to implementation for any engagement involving WMS, WES, WCS, or robotics.
Requirements: The Step Everyone Rushes
Correct order: (1) process flows, (2) use cases within each flow, (3) requirements derived from use cases. When requirements are derived from use cases, every line item has a business justification. Requirements assembled from wish lists instead carry padding from every stakeholder.
Integration cost is 30–50% of total deal value in complex environments — the number most consistently underestimated on every selection. Ask every vendor for integration count to your specific ERP version and average integration cost at your scale.
MoSCoW prioritization: Must Have / Should Have / Could Have / Won’t Have this time. Limit Must Have to 20–30 requirements that are genuinely make-or-break. More than 30 Must Haves means the prioritization work has not been done.
40/20/15/15/10 Weighted Scorecard
| Category | Default weight | Finance will push for… | Hold the line because… |
|---|---|---|---|
| Functional capability | 40% | Lower | A system that can’t do what operations needs will be ripped out in 3 years |
| Commercial / TCO | 20% | Higher | — |
| Technical architecture and security | 15% | — | — |
| Vendor viability and references | 15% | — | — |
| Implementation approach | 10% | — | — |
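The composite is a plain weighted average of 1–5 category scores. A minimal Python sketch using the 40/20/15/15/10 weights from the table (the category keys and vendor scores are illustrative assumptions, not real evaluation data):

```python
# Default category weights from the 40/20/15/15/10 scorecard.
WEIGHTS = {
    "functional": 0.40,
    "commercial_tco": 0.20,
    "architecture_security": 0.15,
    "viability_references": 0.15,
    "implementation": 0.10,
}

def weighted_score(scores: dict[str, float]) -> float:
    """Weighted average of 1-5 category scores -> a 1-5 composite."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # weights must sum to 100%
    return sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)

# Hypothetical vendor: strong functionally, weaker on TCO.
vendor_a = {"functional": 4.5, "commercial_tco": 3.0,
            "architecture_security": 4.0, "viability_references": 4.0,
            "implementation": 3.5}
print(round(weighted_score(vendor_a), 2))  # → 3.95
```

Shifting weight from functional to commercial (as finance will push for) can reorder vendors even when no individual score changes, which is why the 40% is worth holding.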
Scripted Demo Protocol
Unscripted demo = vendor sales pitch. Scripted demo forces demonstration against your reality: your order profiles, your exception scenarios, your edge cases.
One-day structure: 8:00–8:30 introductions and scoring instructions; 8:30–12:00 scripted sections 1–4 (inbound, inventory management, outbound, labor management); 12:00–1:00 working lunch; 1:00–4:30 scripted sections 5–7 (reporting, integration scenario, edge case); 4:30–5:00 evaluator scoring with vendor out of the room. Use real data.
Two-dimension scoring rubric (each step scored 1–5):
- Capability (1 = cannot do it; 3 = significant configuration required; 5 = out-of-the-box, excellent UX) — weight 70%
- Demonstration Quality (1 = punted to slides; 3 = functional but slow; 5 = confident, fast, live answers in the system) — weight 30%
Absolute rule: a capability asserted verbally but not demonstrated in the current product version scores 2 or lower. “Our system can do that” without showing it = 2. No exceptions.
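The two-dimension rubric, including the cap on undemonstrated claims, reduces to a few lines. A sketch in Python (function name and example inputs are assumptions):

```python
def demo_step_score(capability: float, quality: float,
                    demonstrated_live: bool) -> float:
    """Score one scripted-demo step: 70% capability, 30% demo quality.

    Per the absolute rule, a capability claimed verbally but not shown
    live in the current product version caps at 2.
    """
    if not demonstrated_live:
        capability = min(capability, 2.0)
    return 0.70 * capability + 0.30 * quality

print(round(demo_step_score(5, 4, demonstrated_live=True), 2))   # → 4.7
print(round(demo_step_score(5, 4, demonstrated_live=False), 2))  # → 2.6
```

The second call models “our system can do that” without a live demonstration: the same polished pitch drops from 4.7 to 2.6.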
Reference Check: 12 Questions Every Time
Call every reference; take verbatim notes:
- What version did you implement vs. what you’re on now?
- Original vs. actual go-live date and cause of delay?
- Original vs. final implementation budget and cause of overages?
- Example of something that went wrong and how the vendor responded?
- Actual uptime vs. SLA?
- Realistic support response time on a critical issue?
- Rate quality of the on-site implementation team vs. what was presented in selection (1–10)?
- What integrations did you build and how difficult were they?
- What do you know now that you wish you’d known before selecting?
- Would you buy this system again?
- Would you work with this vendor again? (listen for any gap between this answer and the “buy again” answer)
- How has the vendor’s product roadmap actually delivered on what was promised during your selection?
Red flags: all references 5+ years old; references significantly different in scale; all references with the same SI (compensating for product gaps with custom code).
12 Commercial Negotiation Levers
Negotiate all 12 before signing; they cost nothing at contract time and are worth everything when something goes wrong:
- Contract length for implementation discount — a 3- or 5-year SaaS commitment for 15–25% off
- Most-Favored-Nation pricing clause
- Automatic SLA credits — 10–30% of monthly fees per percentage point below SLA
- Source code escrow (Iron Mountain) with triggers for bankruptcy, product discontinuation, acquisition
- Termination for convenience with 90–180 days notice
- UAT acceptance criteria tied to final payment
- Liquidated damages for automation hardware — 0.5–1% of contract value per week of delay, capped at 10–15%
- Performance bonds for automation contracts above $5M — bond cost 0.5–2%; passes through to contract price
- Change order governance requiring written authorization before out-of-scope work begins
- Specified training volume and post-go-live support period
- Data portability — all data exportable in standard format at your request and within 30 days of termination
- Integration sandbox environment upgraded before production during every major release
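Two of the levers above are simple formulas worth modeling before the negotiation. A Python sketch using midpoint assumptions drawn from the ranges in the list (0.75%/week and a 12.5% cap for liquidated damages; 20% of monthly fee per SLA point; all dollar figures illustrative):

```python
def liquidated_damages(contract_value: float, weeks_late: int,
                       weekly_rate: float = 0.0075, cap: float = 0.125) -> float:
    """Delay damages: weekly_rate per week of delay, capped as a
    fraction of contract value (0.5-1%/week, 10-15% cap ranges)."""
    return min(weeks_late * weekly_rate, cap) * contract_value

def sla_credit(monthly_fee: float, sla_pct: float, actual_pct: float,
               credit_per_point: float = 0.20) -> float:
    """Automatic credit: a fraction of the monthly fee per percentage
    point of uptime below the SLA (10-30% range)."""
    shortfall = max(sla_pct - actual_pct, 0.0)
    return min(shortfall * credit_per_point, 1.0) * monthly_fee

# Assumed $8M automation contract, 10 weeks late → $600K in damages.
print(round(liquidated_damages(8_000_000, 10)))  # → 600000
# Assumed $50K/month fee, 99.9% SLA, 98.4% actual → $15K credit.
print(round(sla_credit(50_000, 99.9, 98.4)))     # → 15000
```

At 20 weeks late on the same contract the 12.5% cap binds, so damages stop at $1M — which is why the cap level is itself a negotiation point.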
Perpetual license maintenance industry standard: 18–22% of original license value annually. Apply rate against discounted fee (not list price); cap annual escalation at 2–3%.
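The difference between applying the maintenance rate to list price versus the discounted fee compounds over the contract. A sketch of the year-by-year schedule, assuming a 20% rate and the 3% escalation cap (midpoints of the ranges above; fee amounts illustrative):

```python
def maintenance_schedule(license_fee: float, years: int,
                         rate: float = 0.20,
                         escalation: float = 0.03) -> list[float]:
    """Annual maintenance fees: rate applied to the (ideally discounted)
    license fee, escalating at the negotiated cap each year."""
    base = license_fee * rate
    return [round(base * (1 + escalation) ** y, 2) for y in range(years)]

# $600K discounted fee vs. $1M list price over 5 years:
discounted = maintenance_schedule(600_000, 5)
list_price = maintenance_schedule(1_000_000, 5)
print(round(sum(list_price) - sum(discounted)))  # overpayment if rate hits list
```

Failing to anchor the rate to the discounted fee costs over $400K across five years in this example, before any escalation above the cap.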
WMS Vendor Landscape by Tier
Tier 1 Enterprise (5-year TCO $1.5M–$5M+; implementation 9–18 months):
| Vendor | Best fit | Key notes |
|---|---|---|
| Manhattan Active WM | Retail, e-commerce, 3PL | Cloud-native, continuously updated; strongest LM integration; on-prem unavailable — a killer criterion if on-prem is required |
| Blue Yonder WMS / Luminate | Grocery, retail, industrial (global) | Formerly JDA; Panasonic-owned; tightest native WMS-TMS integration |
| Oracle WMS Cloud | Existing Oracle ERP customers (Fusion/NetSuite) | Native integration removes a complexity layer |
| SAP Extended Warehouse Mgmt (EWM) | SAP S/4HANA environments | Deepest MHE integration; strong Europe; requires specialized EWM integrators |
Tier 2 Mid-Market (5-year TCO $300K–$1.5M; implementation 4–9 months):
| Vendor | Notes |
|---|---|
| Infios (formerly Körber, rebranded March 2025) | Rare mid-market WMS + TMS (acquired MercuryGate 2024) |
| Tecsys | Strong in healthcare distribution and regulated environments; 2024 Nucleus WMS leader |
| Softeon | Highly configurable; strong OMS integration |
| Infor WMS | Industry-specific templates for food, beverage, distribution |
| Made4net | Cloud-native; strong 3PL billing |
| Logiwa | Cloud-native; fast implementation; high-velocity e-fulfillment |
| Extensiv (formerly 3PL Central, rebranded May 2022) | Purpose-built small-to-mid 3PLs |
Integration costs are additive and often match or exceed software costs in Tier 1 implementations.
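Because integration runs 30–50% of total deal value, it is the first variable to sweep in a TCO sensitivity analysis. A back-of-envelope Python sketch (all dollar inputs are illustrative assumptions; change management is added at 12%, inside the 10–15% budget range covered later in this guide):

```python
def five_year_tco(software_annual: float, implementation: float,
                  integration_share: float,
                  change_mgmt_rate: float = 0.12) -> int:
    """Back-of-envelope 5-year TCO. Integration is modeled as a share
    of total deal value (the 30-50% figure), so the base cost is
    grossed up: total = base / (1 - integration_share). Change
    management is then added on top."""
    base = software_annual * 5 + implementation
    total = base / (1 - integration_share)
    return round(total * (1 + change_mgmt_rate))

# Sensitivity sweep over the integration share:
for share in (0.30, 0.40, 0.50):
    print(f"{share:.0%}: ${five_year_tco(250_000, 750_000, share):,}")
```

Moving the integration share from 30% to 50% of deal value adds roughly 40% to TCO on the same software quote, which is why the vendor-specific integration questions above matter.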
WES, WCS, TMS, LMS, and Slotting Vendors
WES (orchestrates work across labor and automation; sits between WMS and WCS):
- Honeywell Intelligrated Momentum WES — market-leading; cloud-native; supports robotics orchestration and AS/RS integration
- SVT Robotics — integration platform for heterogeneous robotics environments
- Lucas Systems — voice-directed picking with Dynamic Slotting
WCS (machine-level control: conveyor speeds, sort divert logic, AS/RS crane control):
- Integrator-tied: Dematic iQ, Honeywell Intelligrated Momentum WCS, SSI Schaefer WAMAS, Vanderlande VISION/miControl, Daifuku/Wynright
- Independent: Numina Group (mid-market), Cirro (cloud-based WCS/WES)
- Non-negotiables: real-time latency (machine control cannot tolerate >100ms); equipment vendor certification for specific MHE model and firmware version; failover architecture
TMS (Gartner 2024 leaders): Blue Yonder, e2open, Manhattan Associates, Oracle TMS, SAP TM; Infios for mid-market. Enterprise: $500K–$2M+ Year 1; mid-market: $100K–$500K.
LMS: Blue Yonder LMS and Manhattan Active Labor for Tier 1. Typically drives 15–25% productivity improvement; payback within 12–24 months for 75+ direct labor headcount.
Slotting: Optricity/OptiSlot DC (Fortna-owned), Lucas Systems Dynamic Slotting, Optioryx Pulse.
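The LMS payback claim (15–25% productivity gain, 12–24 month payback at 75+ direct-labor FTEs) is easy to sanity-check. A sketch with illustrative headcount and cost assumptions:

```python
def lms_payback_months(headcount: int, loaded_cost_per_fte: float,
                       lms_total_cost: float,
                       productivity_gain: float = 0.20) -> float:
    """Months to recover LMS cost from labor productivity savings.
    Gain defaults to 20%, the midpoint of the 15-25% range."""
    annual_savings = headcount * loaded_cost_per_fte * productivity_gain
    return round(lms_total_cost / (annual_savings / 12), 1)

# 75 FTEs at an assumed $50K loaded cost, $900K all-in LMS cost,
# conservative 15% gain:
print(lms_payback_months(75, 50_000, 900_000, productivity_gain=0.15))  # → 19.2
```

At the conservative end of the gain range the payback lands inside the 12–24 month window; at 25% gain the same deployment pays back in under a year.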
Automation Hardware Vendors by Category
AS/RS:
- AutoStore — grid-based cube storage; 1,700+ installations; sold through certified integrators only; R5 Pro robot (NRF 2024) reduces bot count per grid ~15%
- Exotec Skypod — modular goods-to-person; next-gen released February 2025
- Attabotics / Lafayette Systems — 3D cubic storage; returned under Lafayette Systems branding April 2026 after 2025 restructuring — validate financial stability before specifying
- Traditional crane AS/RS: Dematic, SSI Schaefer, Swisslog, Daifuku for pallet and case-level high-ceiling operations
AMR:
- Locus Robotics — 7B+ picks; LocusONE platform; Origin and Vector bots
- Geek+ — Asia-dominant; PopPick, MovePick; CarouselAI orchestration
- 6 River Systems (Shopify) — Chuck AMR for mid-market e-commerce
- Symbotic — AI-powered case-handling; 42+ Walmart DCs; Tier 1 scale only
Robotic piece picking:
- Covariant — RFM-1 AI model; deployed at McKesson with KNAPP; acquired by Amazon 2024 — evaluate vendor independence risk
- RightHand Robotics — piece-picking work cell partnered with Locus
- Mujin — strong in automotive and logistics automation
Two selection principles:
- “Buy the application, not the technology” — evaluate which technologies serve the application at your operational parameters; don’t select on demo impressiveness
- AMR pilot programs (50–100 robots, one zone) are reasonable; for fixed automation (AS/RS, sortation) there is no meaningful pilot — get the exit ramp in the contract before you sign
Seven Common Mistakes in Technology Selection
- Demo charisma over substance — trust the scripted rubric; surface the tension between gut feel and scorecard rather than overriding the process quietly
- Gartner MQ over-reliance — MQ rewards scale and marketing investment, not implementation success; Niche Player vendors may be best fit
- Integration cost underestimation — 30–50% of total implementation value; ask every vendor for integration count to your specific ERP version and average cost at your scale
- No exit ramps in contracts — every contract needs termination for convenience, data portability, transition services, and definition of what happens to your data post-termination
- Single stakeholder veto — document voting rights, killer criteria, and decision rules at Phase 0
- Vendor financials not validated — privately held: request audited financials; public: R&D spend as % of revenue should be 12–20% for healthy enterprise software
- Change management cost ignored — budget 10–15% of total project cost; Gartner found only 32% of planners implementing new planning tools actually used the new tool — 68% of spend was wasted due to adoption failure