
Concept Design Report

The Concept Design Report is a 40-100 page structured engineering document — not a slide deck — produced during Phase 1 (pre-award) of an automation project. It is simultaneously an engineering document and a commercial document. The integrator’s assumptions embedded in it are already priced. Wrong assumptions become change orders or margin erosion; there is no third option.

Every bad automation project starts as a bad concept phase. Not bad execution — bad concept.

Section 1 — Executive Summary: Recommended concept, ROM budget range, ROI headline, key assumptions and risks, recommended next steps.

Section 2 — Current State Analysis: Facility overview, operational profile, order profile analysis, SKU velocity analysis, peak period characterization, current-state constraints.

Section 3 — Operational Requirements: Design throughput targets (average, peak, peak surge), accuracy requirements, system availability target, physical constraints, product specification envelope, integration requirements, workforce targets.

Section 4 — Technology Screening and Alternatives Analysis: Technologies evaluated and rationale, screening matrix, 2-3 short-list alternatives developed to comparison depth, technology risk assessment.

Section 5 — Recommended Concept Description: System overview, equipment list at concept level, block-level floor layout, high-level controls architecture, DES simulation summary, staffing model.

Section 6 — ROM Budget Estimate: Total range (±30-50%), breakdown by category: equipment / software / civil-structural / electrical / installation / commissioning / engineering / contingency.

Section 7 — Project Schedule (Conceptual): Phase milestones only. Long-lead procurement items and customer-owned milestones (building readiness, WMS project, IT resources) called out explicitly.

Section 8 — ROI and Payback: Labor savings, space savings, throughput capacity value, accuracy improvement value, payback (simple + discounted), NPV at 10 years, sensitivity analysis at ±20% volume.

Section 9 — Risk Register: 8-12 risks with likelihood, impact, and mitigation. Open assumptions requiring customer confirmation.

Section 10 — Appendices: Order profile data, SKU velocity detail, simulation model documentation, preliminary CAD floor plan, vendor cut sheets, reference projects.

The data foundation: a minimum of 12 months of transaction-level data. Two years is better (it captures two seasonal cycles).

What to build:

  • Orders per day distribution — not just the average. The 95th and 99th percentile days. The duration of peak (consecutive days at peak). This determines whether you design for absolute peak or 95th percentile with operational mitigation.
  • Lines per order distribution — single-line orders behave completely differently than 10-line orders through sortation and packing.
  • Unit cube and weight distribution — determines which technologies are physically feasible.
  • SKU velocity Pareto (A/B/C/D) — top 20% of SKUs typically drive 80% of unit volume. This determines slotting: A-items in GTP or high-access storage, B-items in semi-automated zones, C/D in manual overflow.
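A minimal sketch of these profiles, using synthetic uniform data in place of a real transaction extract. Every field name, SKU count, and distribution below is illustrative; note that uniform demand lands the top-20% SKU share near 20%, whereas real demand is skewed enough to produce the typical 80%.

```python
import random
from collections import Counter

random.seed(7)

# Synthetic stand-in for a 12-month transaction extract.
# One record per order line: (day, order_id, sku, units).
lines = []
for day in range(365):
    for o in range(random.randint(80, 400)):        # orders/day varies
        order_id = (day, o)
        for _ in range(random.randint(1, 6)):       # lines per order varies
            sku = f"SKU{random.randint(1, 200):03d}"
            lines.append((day, order_id, sku, random.randint(1, 4)))

# Orders-per-day distribution: design to p95/p99, not the mean.
orders_per_day = Counter(oid[0] for oid in {rec[1] for rec in lines})
daily = sorted(orders_per_day.values())
p50 = daily[len(daily) // 2]
p95 = daily[int(0.95 * (len(daily) - 1))]
p99 = daily[int(0.99 * (len(daily) - 1))]

# Lines-per-order: the single-line share drives sortation and packing design.
lines_per_order = Counter(rec[1] for rec in lines)
single_line_share = sum(1 for n in lines_per_order.values() if n == 1) / len(lines_per_order)

# SKU velocity Pareto: unit share of the top 20% of SKUs (the A-item cut).
# With uniform synthetic demand this is ~20%; real data is far more skewed.
units_by_sku = Counter()
for _, _, sku, units in lines:
    units_by_sku[sku] += units
ranked = units_by_sku.most_common()
a_share = sum(u for _, u in ranked[: max(1, len(ranked) // 5)]) / sum(units_by_sku.values())

print(f"orders/day p50={p50} p95={p95} p99={p99}")
print(f"single-line orders: {single_line_share:.0%}; top-20% SKU unit share: {a_share:.0%}")
```

The gap between p95 and p99 is exactly the number the "absolute peak vs. 95th percentile" design decision turns on.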

ROM budget at concept phase carries ±30-50% accuracy. This is not a hedge — it is an honest statement of what is knowable. Contingency at concept phase should be 15% or higher. Not 5%. Integrators who price at 5% contingency at concept phase are either wrong about accuracy or setting up change orders.

ROM budget breakdown for a $20-40M system:

  • Equipment: 50-60% of total
  • Software/WCS: 8-12%
  • Civil/structural/electrical: 10-15%
  • Installation: 10-15%
  • Engineering: 5-8%
  • Contingency: 15%+
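The arithmetic behind the breakdown can be sketched as follows, using a hypothetical $28M point estimate and midpoint shares (the midpoints plus 15% contingency sum to slightly over 100%, so the sketch normalizes them):

```python
# Hypothetical $28M ROM point estimate; shares are midpoints of the ranges above.
total = 28_000_000
shares = {
    "equipment": 0.55,
    "software_wcs": 0.10,
    "civil_structural_electrical": 0.125,
    "installation": 0.125,
    "engineering": 0.065,
    "contingency": 0.15,
}
scale = sum(shares.values())                    # midpoints sum to 1.115; normalize
breakdown = {k: total * v / scale for k, v in shares.items()}

# ROM accuracy band on the total: ±30% at best, ±50% at worst.
band_30 = (total * 0.70, total * 1.30)
band_50 = (total * 0.50, total * 1.50)

for k, v in breakdown.items():
    print(f"{k:28s} ${v / 1e6:5.2f}M ({shares[k] / scale:.0%})")
print(f"±30% band: ${band_30[0] / 1e6:.1f}M-${band_30[1] / 1e6:.1f}M; "
      f"±50% band: ${band_50[0] / 1e6:.1f}M-${band_50[1] / 1e6:.1f}M")
```

The width of the ±50% band (a $14M-$42M spread on $28M) is why a 5% contingency at concept phase is indefensible.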

Unknown at concept phase: final equipment quantities, exact installation labor hours, WMS interface complexity, building modification scope, supply chain pricing at time of purchase.

Screen each technology against: throughput capability, scalability and flexibility, capital cost (ROM), operational complexity, implementation risk, maintenance burden, floor space efficiency. Score against weighted criteria. Weights are negotiated with the customer — making priorities explicit prevents the “it depends on what you value” conversation from surviving into steering committee.
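A weighted screening matrix reduces to a few lines of arithmetic. The weights and 1-5 scores below are purely illustrative (weights come out of the negotiation with the customer, and the technology names are hypothetical short-list entries); scores are oriented so that 5 is always favorable, i.e., low capital cost or low implementation risk scores high.

```python
# Weights negotiated with the customer; must sum to 100%.
criteria = {
    "throughput": 0.25, "scalability_flexibility": 0.15, "capital_cost": 0.20,
    "operational_complexity": 0.10, "implementation_risk": 0.15,
    "maintenance_burden": 0.05, "floor_space_efficiency": 0.10,
}
assert abs(sum(criteria.values()) - 1.0) < 1e-9

scores = {  # 1 (poor) to 5 (strong); illustrative numbers only
    "shuttle_asrs":          {"throughput": 5, "scalability_flexibility": 3,
                              "capital_cost": 2, "operational_complexity": 2,
                              "implementation_risk": 3, "maintenance_burden": 2,
                              "floor_space_efficiency": 5},
    "cube_storage_gtp":      {"throughput": 4, "scalability_flexibility": 4,
                              "capital_cost": 3, "operational_complexity": 3,
                              "implementation_risk": 3, "maintenance_burden": 3,
                              "floor_space_efficiency": 5},
    "conventional_plus_amr": {"throughput": 2, "scalability_flexibility": 5,
                              "capital_cost": 5, "operational_complexity": 4,
                              "implementation_risk": 4, "maintenance_burden": 4,
                              "floor_space_efficiency": 2},
}
weighted = {t: sum(criteria[c] * s[c] for c in criteria) for t, s in scores.items()}
for tech, score in sorted(weighted.items(), key=lambda kv: -kv[1]):
    print(f"{tech:22s} {score:.2f}")
```

The value is not the final score; it is that changing a weight in the steering committee meeting immediately shows which alternative wins under which priorities.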

Standard DES tools: Arena, Simio, and proprietary integrator platforms (e.g., Dematic’s own simulation environment).

A concept-phase DES is not the same as a detailed-design DES. What it can do: show whether the proposed system architecture can theoretically achieve design throughput, and identify where bottlenecks are.

Simulation assumptions that must be documented explicitly:

  • System availability (95-98% assumed)
  • Product conformance rate (often assumed 98-99%; frequently 92% in reality)
  • WMS release logic (when and in what sequence orders are released)
  • Maintenance windows (are scheduled breaks included?)
  • Surge duration (1 hour vs. 7-hour sustained peak produces the same throughput answer but completely different system sizing)
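The compounding effect of the first two assumptions is worth making explicit. A sketch with hypothetical figures, pairing an in-band availability assumption with the 92% conformance rate noted above:

```python
# Effective throughput = nominal design rate degraded by availability and
# product conformance. All figures hypothetical.
nominal_rate = 1200                      # units/hour, design rate
availability = 0.96                      # inside the assumed 95-98% band
conformance = 0.92                       # the "reality" figure, not 98-99%

effective = nominal_rate * availability * conformance
# To actually deliver 1200 u/h, the system must be sized for the inverse:
required_nominal = 1200 / (availability * conformance)

print(f"effective: {effective:.0f} u/h; "
      f"required nominal for 1200 u/h: {required_nominal:.0f} u/h")
```

A 12% shortfall hides in two assumptions that each look small on their own, which is exactly why they must be documented, not implied.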

If you documented the assumptions and the actual operation runs outside them, you have a productive conversation. If you didn’t document them, you have a dispute.

The five questions a steering committee will always ask:

  1. “Why ±30-50%?” The range will narrow to ±10-15% at firm design completion, which requires X weeks and Y investment. ±30-50% is honest; an integrator who quotes ±10% at concept phase is either wrong or concealing the uncertainty.

  2. “Show me the data behind the throughput claim.” Know the peak throughput, duration, conformance rate, and availability factor behind every number you present.

  3. “What if volume grows 20% faster?” Present a scalability section: how much headroom is designed in, and what investment it takes to reach the next capacity increment.

  4. “What are the top 3 risks?” WMS integration complexity, facility readiness timeline, long-lead controls hardware supply chain — with mitigation strategies for each.

  5. “What happens to payback at 80% of forecast?” Run sensitivity before the meeting. Show the payback curve at ±30% volume. If the business case only works above forecast, the committee needs to know that proactively.
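A minimal sensitivity sketch for question 5, with hypothetical capex, annual savings, and discount rate (the report's actual figures would replace them). Savings are assumed to scale linearly with volume, which is itself an assumption worth stating in the appendix.

```python
# Simple payback and 10-year NPV across a ±30% volume band.
capex = 30_000_000                       # hypothetical system cost
annual_savings_at_forecast = 5_000_000   # labor + space + accuracy value
discount_rate = 0.08                     # hypothetical hurdle rate
horizon_years = 10                       # NPV horizon from the report

def npv(annual_savings):
    """Discounted savings over the horizon, net of capex."""
    discounted = sum(annual_savings / (1 + discount_rate) ** t
                     for t in range(1, horizon_years + 1))
    return discounted - capex

for factor in (0.7, 0.8, 1.0, 1.2, 1.3):
    savings = annual_savings_at_forecast * factor  # savings assumed ~ volume
    print(f"volume {factor:4.0%}: simple payback {capex / savings:4.1f} yr, "
          f"10-yr NPV ${npv(savings) / 1e6:+6.1f}M")
```

With these illustrative inputs the NPV is positive at forecast but negative at 80% of forecast, which is precisely the finding a committee needs to see before, not after, approval.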

Common concept-phase mistakes:

  • Designing to average throughput, not peak
  • Accepting customer data without validation (it may exclude cancelled orders, return processing, or reflect last year’s volume in a business that grew 30%)
  • Scoping the WMS interface as “standard” (there is no standard WMS interface)
  • Understating the brownfield complexity premium (add 15-25% contingency vs. greenfield)
  • Presenting a single concept (the committee should see at least two alternatives at different cost/risk points)
  • Leaving the project schedule as a Gantt chart without a critical path

Source: 2.6-advanced-automation-design
