On a complex project, I see submittal volume run into the hundreds across the lifecycle. Shop drawings for structural steel connections. Product data for air handling units. Brick samples for the exterior envelope. Test reports for concrete compressive strength. Closeout manuals thick enough to fill a bookshelf.
Each construction submittal type carries different preparation requirements, different review standards, and different failure modes. But they all flow through the same overtaxed review chain.
Project teams struggle here not because they lack competence, but because the manual cross-checking work across submittal types does not scale.
I will walk through what various construction submittal types demand, where completeness and compliance break down, and what changes when AI agents cross-check the package before it moves down the line.
The Submittal Review Chain: Who's Involved and What's at Stake
Every submittal, regardless of type, follows the same contractual review chain established by AIA A201. The subcontractor or supplier prepares, the general contractor pre-reviews and stamps, the architect and discipline-specific consulting engineers review for conformance with design intent, and a disposition comes back.
Review parties typically include the subcontractor, the GC project engineer, the architect, discipline-specific engineers, and the GC team tracking dispositions. Some delivery methods, such as CM-at-risk or multiple-prime projects, add the owner's representative to the chain.
The stakes are both operational and contractual. Under AIA A201 §3.10.2, a contractor who fails to maintain the submittal schedule takes on material risk to any claim for additional time or money tied to review timing.
The reality is that contractor schedules often contain only onsite activities, without fully accounting for submittal review, procurement, and delivery workflows. That means resubmittal cycles and review delays can accumulate off-schedule until they surface as critical-path problems.
12 Construction Submittal Types and Where Each One Breaks
Submittals are classified under CSI MasterFormat into two meta-categories: action submittals, which require an architect's response, and informational submittals, which are filed for record. Twelve types fall under those two categories.
Here is the practitioner taxonomy, with the failure modes project teams keep seeing:
Shop Drawings (SD-02). Shop drawings are custom-prepared drawings, diagrams, and schedules illustrating how the contractor proposes to build. Structural steel connection details. Curtainwall mullion profiles. Mechanical ductwork coordination drawings. These are high-stakes action submittals. Where it breaks: Deviations from contract requirements go undetected during GC pre-review and surface only at the architect's desk, or worse, after fabrication.
Product Data (SD-03). Standard manufacturer literature marked to show applicable items. HVAC cut sheets. Roofing membrane data sheets. Luminaire photometrics. Where it breaks: Unmarked catalog pages submitted without highlighting the specified model, leading to automatic rejection.
Samples (SD-04). Physical specimens (e.g., brick color chips, carpet swatches, hardware finishes). Where it breaks: Rejected samples typically require sourcing and shipping a new specimen, extending procurement lead times in ways no digital workflow can compress.
Mock-Ups (01 43 00). Mock-ups are full-scale assemblies built on-site or off-site for visual and performance evaluation (e.g., curtainwall mock-ups, masonry panel assemblies). Where it breaks: Like samples, a rejected mock-up can mean rebuilding, but at far greater cost and schedule impact.
Preconstruction Submittals (SD-01). Preconstruction submittals include CPM schedules, submittal registers, safety plans, and quality control plans. These are administrative rather than product-specific. Where it breaks: The submittal schedule itself can affect the contractor's position on delay-related claims.
Design Data (SD-05). Design data includes engineering calculations for contractor-designed elements such as formwork, cold-formed steel framing, and seismic anchorage. Where it breaks: These often require engineering review beyond a basic completeness check, and errors in contractor-designed elements carry structural and liability risk.
Test Reports (SD-06). Test reports include concrete cylinder breaks, soil compaction results, and air balancing reports. These are informational submittals that do not require architect approval. Where it breaks: Values that fall outside specified ranges can trigger project-stopping conversations.
Certificates (SD-07). Certificates include manufacturer compliance certificates, installer qualifications, and fire-resistance ratings. Where it breaks: Missing or expired certifications discovered during closeout rather than during installation.
Manufacturer's Instructions (SD-08). Instructions cover installation procedures provided by the manufacturer. Where it breaks: Missing or incomplete instructions can void warranty coverage and leave no documented basis for compliant installation.
Field Reports (SD-09). Field reports cover field inspection documentation. Where it breaks: Incomplete field documentation undermines the record of compliant installation, creating exposure during warranty claims or disputes.
O&M Data (SD-10). Equipment manuals, BAS programming documentation, maintenance schedules. Federal requirements set these due within a specified number of calendar days of equipment delivery. Where it breaks: Routinely submitted incomplete at closeout, forcing teams to reconstruct documentation long after installation.
Closeout Submittals (SD-11). Closeout items include as-built drawings, final approved shop drawings, O&M manuals, warranty management plan and tags, and spare parts data. Per AIA A201 §3.11, the contractor must maintain a complete annotated set of Contract Documents reflecting field changes and turn these over to the Architect for submittal to the Owner. Where it breaks: Closeout packages arrive incomplete because data was never systematically collected during construction.
Sustainable Design / LEED Submittals sit alongside this taxonomy as a cross-cutting category. LEED records include recycled content documentation, waste diversion forms, and regional material sourcing records. Where it breaks: These duplicate other submittal content reformatted specifically for sustainability credit tracking, so missing or inconsistent data in the source submittals cascades directly into incomplete LEED documentation.
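For teams building a submittal register, the taxonomy above can be expressed as structured data so that routing logic can tell action submittals from informational ones. A minimal sketch in Python: the SD type codes come from the taxonomy above, but the exact category assignment for each code is illustrative and should be confirmed against the project's spec sections.

```python
from dataclasses import dataclass
from enum import Enum

class Category(Enum):
    ACTION = "action"                # requires an architect response
    INFORMATIONAL = "informational"  # filed for record

@dataclass(frozen=True)
class SubmittalType:
    code: str
    name: str
    category: Category

# Taxonomy from the article. Category assignments are illustrative:
# the article confirms test reports (SD-06) are informational; the
# rest should be verified against the governing spec sections.
SUBMITTAL_TYPES = {
    "SD-01": SubmittalType("SD-01", "Preconstruction Submittals", Category.ACTION),
    "SD-02": SubmittalType("SD-02", "Shop Drawings", Category.ACTION),
    "SD-03": SubmittalType("SD-03", "Product Data", Category.ACTION),
    "SD-04": SubmittalType("SD-04", "Samples", Category.ACTION),
    "SD-05": SubmittalType("SD-05", "Design Data", Category.ACTION),
    "SD-06": SubmittalType("SD-06", "Test Reports", Category.INFORMATIONAL),
    "SD-07": SubmittalType("SD-07", "Certificates", Category.INFORMATIONAL),
    "SD-08": SubmittalType("SD-08", "Manufacturer's Instructions", Category.INFORMATIONAL),
    "SD-09": SubmittalType("SD-09", "Field Reports", Category.INFORMATIONAL),
    "SD-10": SubmittalType("SD-10", "O&M Data", Category.INFORMATIONAL),
    "SD-11": SubmittalType("SD-11", "Closeout Submittals", Category.ACTION),
}

def requires_architect_response(code: str) -> bool:
    """True if the submittal type is an action submittal."""
    return SUBMITTAL_TYPES[code].category is Category.ACTION
```

A register built on a structure like this can then drive disposition tracking: action submittals stay open until the architect responds, while informational submittals close on filing.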
Where Manual Workflows Fail Across Submittal Types
This is where the workflow breaks under volume. Each submittal type fails differently:
Shop drawings fail on deviations from contract requirements.
Product data fails on completeness (e.g., unmarked cut sheets, wrong model numbers highlighted, outdated literature).
Closeout submittals fail because O&M data was never collected during construction and must be reconstructed after the fact.
Wrong spec-section references and similar avoidable errors still slip through, often uncaught until a rejection comes back.
Taken together, audit data and practitioner sources point to the same operational reality: manual workflows create recurring risk of incomplete, late, or non-compliant submittals, and the human review chain catches only what it can.
AI Agents Shift Non-Compliance Detection Upstream
GC pre-review is where non-compliance is supposed to get caught. It is also where volume makes that difficult.
In the traditional workflow, non-compliance often reaches the architect's desk only after the GC has already invested review time and after days or weeks have elapsed. Datagrid's AI agents move that detection point upstream by cross-checking submittals against specs, drawings, schedules, and other project files before they leave the GC's hands.
The cross-referencing work varies by submittal type. One package needs dimensional conformance checked. Another needs model numbers verified. A third needs ASTM designations matched. The Summary Spec Submittal Agent detects this kind of structured compliance issue. CMAA research notes that the tasks most prone to human error in submittal workflows are also among the tasks most critical for contract compliance. That is the gap AI agents address.
How AI Agents Handle Submittal Cross-Checking
Datagrid's AI agents cross-check each submittal package against specs, drawings, and connected project data to flag non-compliance before it leaves the GC's desk.
Validate completeness against CSI section requirements (e.g., unmarked catalog sheets, missing certifications, incomplete O&M data)
Compare documents across platforms to catch discrepancies before handover
Route flagged exceptions by discipline and urgency so project engineers focus on judgment calls rather than data assembly
The clearest fit is digital submittals, including shop drawings, product data, test reports, certificates, and O&M documentation. Physical samples and mock-ups still involve review and replacement cycles that digital workflows do not compress.
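The pre-review steps above can be sketched as rule checks run per package, with flagged exceptions routed by discipline and urgency. This is a minimal illustration of the pattern, not Datagrid's actual implementation; the package fields (`marked_model`, `model_number`, `spec_model_number`) and the routing fields are hypothetical names chosen for the example.

```python
def check_product_data(pkg: dict) -> list[str]:
    """Completeness rules for an SD-03 product data package (illustrative)."""
    issues = []
    # Unmarked catalog pages are a common cause of automatic rejection.
    if not pkg.get("marked_model"):
        issues.append("catalog sheet not marked with the specified model")
    # The highlighted model must match what the spec section calls for.
    if pkg.get("model_number") != pkg.get("spec_model_number"):
        issues.append("model number does not match the spec section")
    return issues

def route(issues: list[str], discipline: str) -> dict:
    """Route flagged exceptions so engineers see judgment calls first."""
    return {
        "discipline": discipline,
        "urgency": "high" if issues else "routine",
        "issues": issues,
    }

# A clean package flows through as routine; a flagged one is escalated.
pkg = {
    "marked_model": True,
    "model_number": "AHU-12",
    "spec_model_number": "AHU-12",
}
result = route(check_product_data(pkg), discipline="mechanical")
```

In a real system each submittal type would carry its own rule set (dimensional conformance for shop drawings, ASTM designations for test reports, expiry dates for certificates), but the shape of the workflow is the same: check, flag, route.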
What Project Teams Are Seeing with AI Submittal Review
"With Datagrid we are able to review submittals much faster than our prior manual workflow." — Jacob Freitas, Project Executive, Level 10 Construction
Moving the compliance check earlier in the review chain means the submittals that reach the architect's desk are already cross-referenced against the spec. Project teams stay focused on exceptions and coordination. AI agents cross-check the work between decisions.