Navisworks Clash Detective:
The Complete Field Guide
From setting up targeted clash tests with Search Sets to grouping results, assigning clashes, and exporting reports the design team can actually act on.
Clash Detective is where BIM coordination either works or falls apart. Run it wrong and you get thousands of irrelevant results that nobody acts on. Run it right — with targeted Search Sets, clear tolerances, and structured output — and you get a list of real problems with real owners.
PART 01 Clash Types — What You're Actually Detecting
Before setting up any test, you need to choose the right clash type. The wrong type produces either too many results or misses what matters.
| Type | What it detects | Tolerance | Best used for |
|---|---|---|---|
| Hard | Physical intersection — elements are literally overlapping in 3D space | 0mm | Structural vs MEP, pipe vs slab penetrations |
| Hard (Conservative) | Reports potential as well as confirmed intersections, using a more aggressive method that returns more results than Hard | 0mm | Quick first-pass check, complex geometry |
| Clearance | Elements are within a defined distance of each other — not touching but too close | 50–200mm | Maintenance access, insulation clearance, fire rating gaps |
| Duplicate | Identical elements occupying the same space — usually a modelling error | 0mm | QA checks, model cleanup before coordination |
💡 Field recommendation: Always run Hard clashes first — these are the non-negotiables that must be resolved before construction. Run Clearance checks only after Hard clashes are resolved, using discipline-specific tolerances (MEP maintenance access typically needs 600mm minimum clearance).
PART 02 Setting Up a Clash Test — Step by Step
The difference between a useful clash test and a noise-generating one is almost entirely in how you define the input sets. Entire File vs Entire File on a large federated model can produce 50,000+ results. Search Set vs Search Set on the same model produces 200 results — all relevant.
Go to Home tab → Tools panel → Clash Detective. The Clash Detective panel opens — keep it docked for the session.
Click Add Test. A new test row appears with a default name. Rename it immediately using a consistent convention — e.g. STR vs MEP-HVAC · L3 · Hard. Good names save enormous confusion when you have 20+ tests running.
In the test configuration, find Selection A. Change the dropdown from Entire File to Sets. Your saved Search Sets appear in the list. Select the appropriate set — e.g. STR - Level 3 - All Structural.
Repeat for Selection B. Select the opposing discipline set — e.g. MEP - HVAC - All Ductwork. The two sets must not overlap — if the same element appears in both sets, it will always clash with itself.
Select the Type (Hard, Clearance, Duplicate). For Clearance, set the Tolerance value in mm. For Hard clashes, leave tolerance at 0mm — or set a small positive value (e.g. 5mm) to filter out negligible model misalignments.
Click Run Test (or Run All to execute all configured tests at once). Navisworks processes the geometry intersection and populates the Results tab. Large models may take 30–120 seconds per test.
⚠️ Self-clash trap: If a Search Set contains elements from both disciplines being tested, those elements will clash against themselves and flood your results with false positives. Always verify your Search Sets don't overlap before running a test.
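If you can export each set's element identifiers (e.g. as GUIDs), that overlap check can be done mechanically before you ever hit Run Test. A minimal sketch, assuming invented IDs and set names:

```python
# Hypothetical pre-flight check: Selection A and Selection B must share no
# elements, or every shared element will "clash" with itself.

def overlapping_elements(set_a, set_b):
    """Return IDs present in both Search Sets (should be empty)."""
    return sorted(set(set_a) & set(set_b))

str_level3 = {"guid-001", "guid-002", "guid-003"}  # STR - Level 3 - All Structural
hvac_ducts = {"guid-101", "guid-102", "guid-003"}  # MEP - HVAC - All Ductwork

shared = overlapping_elements(str_level3, hvac_ducts)
if shared:
    print(f"Self-clash risk: {len(shared)} element(s) in both sets: {shared}")
```

An empty return value means the two sets are safe to use as Selection A and B.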
PART 03 Recommended Test Configuration Matrix
Here's the test matrix I use as a starting point on multi-discipline projects. Adapt the Search Set names to your project's naming convention.
// ── HARD CLASH TESTS (Priority 1) ──────────────────────
"STR vs MEP-HVAC · All Levels · Hard"
A: STR - All - Structural Model
B: MEP - HVAC - All Ductwork
Type: Hard · Tolerance: 0mm
"STR vs MEP-PIPE · All Levels · Hard"
A: STR - All - Structural Model
B: MEP - PIPE - All Piping
Type: Hard · Tolerance: 0mm
"STR vs MEP-ELEC · All Levels · Hard"
A: STR - All - Structural Model
B: MEP - ELEC - Cable Trays
Type: Hard · Tolerance: 0mm
"ARCH vs MEP · All Levels · Hard"
A: ARCH - All - Architecture Model
B: MEP - All - MEP Combined
Type: Hard · Tolerance: 0mm
// ── CLEARANCE TESTS (Priority 2) ───────────────────────
"MEP-HVAC vs STR · Clearance 50mm"
A: MEP - HVAC - All Ductwork
B: STR - All - Structural Model
Type: Clearance · Tolerance: 50mm
"MEP-PIPE vs ARCH · Maintenance 600mm"
A: MEP - PIPE - Over 150mm Diameter
B: ARCH - Core - Walls and Partitions
Type: Clearance · Tolerance: 600mm
// ── QA / DUPLICATE CHECKS (Priority 3) ─────────────────
"STR - Duplicate Element Check"
A: STR - All - Structural Model
B: STR - All - Structural Model
Type: Duplicate · Tolerance: 0mm
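The matrix above can also be held as plain data, version-controlled alongside the model, and sanity-checked before anyone configures a test by hand. A sketch, reusing the hypothetical Search Set names from this post:

```python
# Illustrative test matrix as data. Set names mirror this post's examples;
# adapt them to your project's naming convention.

TEST_MATRIX = [
    {"name": "STR vs MEP-HVAC · All Levels · Hard",
     "a": "STR - All - Structural Model", "b": "MEP - HVAC - All Ductwork",
     "type": "Hard", "tolerance_mm": 0},
    {"name": "MEP-PIPE vs ARCH · Maintenance 600mm",
     "a": "MEP - PIPE - Over 150mm Diameter", "b": "ARCH - Core - Walls and Partitions",
     "type": "Clearance", "tolerance_mm": 600},
    {"name": "STR - Duplicate Element Check",
     "a": "STR - All - Structural Model", "b": "STR - All - Structural Model",
     "type": "Duplicate", "tolerance_mm": 0},
]

def validate(matrix):
    """Flag configurations that would produce noise or false positives."""
    errors = []
    for t in matrix:
        # Only Duplicate tests may legitimately use the same set twice.
        if t["type"] != "Duplicate" and t["a"] == t["b"]:
            errors.append(f'{t["name"]}: A and B are the same set')
        if t["type"] == "Clearance" and t["tolerance_mm"] <= 0:
            errors.append(f'{t["name"]}: Clearance test needs a positive tolerance')
    return errors

print(validate(TEST_MATRIX))  # an empty list means the matrix is consistent
```

This keeps the self-clash and zero-tolerance mistakes out of the matrix before it ever reaches Navisworks.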
PART 04 Managing Clash Results
Raw clash results are just a number. What matters is turning that number into an actionable list — grouped, assigned, and tracked through to resolution.
Clash Status — What Each Means
| Status | Meaning |
|---|---|
| New | Found for the first time in the latest run |
| Active | Found in a previous run and still unresolved |
| Reviewed | Seen and reviewed, but not yet fixed in the model |
| Resolved | Fixed — the clash should no longer appear once the model is updated and the test re-run |
After running, click the Results tab in Clash Detective. Each row is one clash instance. Click any row — Navisworks zooms to that clash in the viewport and highlights both conflicting elements.
Select multiple clash rows (Ctrl+click) → right-click → Group. Give the group a meaningful name: e.g. Duct run D-03 vs Beam Grid B/3-4. Grouping consolidates related issues and makes the report readable.
Select a clash or group → in the right panel, set Assigned To (discipline responsible) and add a Comment describing the issue and required action. This information is exported with the report.
Change the clash status as the resolution progresses: New → Active → Reviewed → Resolved. After the model is updated, re-run the test — resolved clashes should disappear from the New/Active count.
🎯 Coordination meeting workflow: Before each coordination meeting, run all tests → filter results to show only New and Active clashes → group related issues → assign to responsible disciplines. The meeting then focuses on resolution decisions, not on sorting through raw data.
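That pre-meeting triage is easy to automate once clash data is out of Navisworks. A sketch on invented records; the field names are illustrative, not the exact export schema:

```python
from collections import Counter

# Invented clash records, as you might hold them after parsing a report.
clashes = [
    {"name": "Clash1", "status": "New",      "assigned_to": "MEP"},
    {"name": "Clash2", "status": "Resolved", "assigned_to": "STR"},
    {"name": "Clash3", "status": "Active",   "assigned_to": "MEP"},
    {"name": "Clash4", "status": "Reviewed", "assigned_to": "ARCH"},
]

# Keep only what the meeting needs to discuss.
open_clashes = [c for c in clashes if c["status"] in ("New", "Active")]
per_discipline = Counter(c["assigned_to"] for c in open_clashes)
print(dict(per_discipline))  # open clash count per responsible discipline
```

The per-discipline counts give each team a concrete number to walk into the meeting with.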
PART 05 Exporting Clash Reports
Clash reports let you share findings with team members who don't have Navisworks — architects, engineers, or clients who need to understand what needs to be fixed.
In Clash Detective, click the Report tab (next to Results). This is where you configure what gets included in the export.
Check the fields to include: Clash Name, Status, Description, Assigned To, Comments, Element IDs, Viewpoint snapshot. Viewpoint snapshots (images) are essential — they show exactly where the clash is without needing Navisworks.
Select format — HTML for human-readable sharing, XML for integration with other BIM tools or issue trackers. Click Write Report and choose a save location.
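The XML route is what makes downstream automation possible. As a sketch of that idea, the snippet below parses a simplified stand-in document rather than the verbatim Navisworks schema, which varies by version; verify element and attribute names against your own export before relying on them:

```python
import xml.etree.ElementTree as ET

# Simplified stand-in for an exported clash report (not the exact schema).
xml_report = """<exchange>
  <clashtest name="STR vs MEP-HVAC - All Levels - Hard">
    <clashresult name="Clash1" status="new"/>
    <clashresult name="Clash2" status="active"/>
    <clashresult name="Clash3" status="resolved"/>
  </clashtest>
</exchange>"""

root = ET.fromstring(xml_report)
counts = {}
for result in root.iter("clashresult"):
    status = result.get("status")
    counts[status] = counts.get(status, 0) + 1
print(counts)  # clash count per status
```

The same loop feeds an issue tracker or a spreadsheet just as easily as a print statement.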
💡 Naming the report file: Include the test name, date, and status filter in the filename:
ClashReport_STR-vs-MEP_2026-04-06_ActiveOnly.html
This makes it immediately clear what the report covers and when it was generated — critical when managing multiple revision cycles.
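A small helper can enforce that convention so nobody improvises filenames mid-project. The function name and its fields are invented for illustration:

```python
from datetime import date

def report_filename(test_name: str, status_filter: str, on: date, ext: str = "html") -> str:
    """Build a report filename from test name, ISO date, and status filter."""
    safe = test_name.replace(" ", "").replace("·", "-")
    return f"ClashReport_{safe}_{on.isoformat()}_{status_filter}.{ext}"

print(report_filename("STR-vs-MEP", "ActiveOnly", date(2026, 4, 6)))
# → ClashReport_STR-vs-MEP_2026-04-06_ActiveOnly.html
```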
PART 06 Real-World Use Cases
Run MEP vs Structural Slab hard clash tests before structural shop drawings are issued. Catch missing sleeve locations before concrete is poured — not after.
Use clearance tests to verify ductwork, piping, and cable trays all fit within the ceiling plenum with required maintenance access — before suspended ceiling heights are fixed.
Run Duplicate checks on each discipline model before it's appended to the federated model. Eliminate internal model errors before they become coordination clashes.
Re-run the same tests after each model update. Track the New / Active / Resolved counts over time — a falling Active count means coordination is working.
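Tracking that trend can be as simple as a few lines once per-run counts are recorded; the numbers below are invented for illustration:

```python
# One row per coordination cycle, taken from Clash Detective's summary.
runs = [
    {"date": "2026-03-02", "new": 180, "active": 0},
    {"date": "2026-03-09", "new": 22,  "active": 130},
    {"date": "2026-03-16", "new": 8,   "active": 61},
]

open_counts = [r["new"] + r["active"] for r in runs]
improving = all(a > b for a, b in zip(open_counts, open_counts[1:]))
print(open_counts, "improving" if improving else "stalled")
```

A strictly falling open count is the simplest coordination health indicator you can show a client.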
The goal isn't zero clashes in Navisworks.
It's zero surprises on the construction site.
PART 07 Clash Detective + Search Sets — The Full Workflow
Putting it all together — here's the complete coordination cycle that connects the Search Set XML workflow from the previous post directly into Clash Detective:
Load the team's master SearchSets_ProjectName_v1.xml into the Sets window. Everyone on the team starts from the same filter definitions.
Build your test matrix in Clash Detective using the imported Search Sets as inputs. Save the NWF file — the test configuration is saved with it.
Click Run All. Review new results, group related clashes, assign to responsible disciplines, add comments.
Export as HTML with viewpoint snapshots. Share with all disciplines before the meeting — they can review clashes assigned to them in advance.
When disciplines update their models, re-run the full test suite. Verified resolved clashes drop off the Active list. New issues surface automatically.
🎯 Key Takeaways
Clash Detective is only as good as its inputs. Search Sets are the foundation — targeted inputs produce targeted results. Entire File vs Entire File produces noise.
Set up your test matrix once at the start of the project. Run it consistently before every coordination meeting. Group results, assign them, track status through to resolution. The clash count going down over time is your coordination health indicator.
This completes the three-part Navisworks coordination series: Selection Sets → Search Sets + XML → Clash Detective. In the next post, I'll cover Viewpoints and Saved Views — how to build a navigation system that makes any federated model easy to review, even for team members who didn't build it.