FlowBuilder + SNORT is the production-proven path for domains that can express their work as file-level units — FlowBuilder supports both Python and Tcl flows. Chopper is the sibling tool for everything else — domains whose Tcl libraries can’t be cleanly split, where trimming has to happen inside shared files. Same goal, different shape: JSON in, trimmed domain out, with a full audit bundle.
by Rajesh Kudipudi — FEV Global / PESG · DDI
Inspired by FlowBuilder by Stelian and SNORT by Mike McCurdy
Chopper is the per-domain trimming tool in the CTH R2G Tool Flow Manager, designed for domains that can’t be reorganized as a file-based flow. Each domain ships with the full mainline feature set; Chopper takes a JSON specification of what a project actually needs and produces a clean, minimal, customer-specific domain on disk — reproducibly, with a full audit bundle.
- FlowBuilder — builds the master flow run files.
- SNORT — analyzes the flow to find the required files only.
- Chopper — fills the proc-level gap for domains that can't reorganize, and generates <stage>.tcl run scripts (F3) so a Chopper-only domain still has a runnable flow.

[Diagram: the FlowBuilder + SNORT path (builds the master flow → analyzes & finds deps → filter master → REPO TFM → customer target; three tools · battle-tested · file-level trimming) beside the Chopper path (<stage>.tcl + .chopper/ audit bundle; one tool · one pass · file and proc level).]
| Capability | FlowBuilder | SNORT | Chopper |
|---|---|---|---|
| Build run scripts (<stage>.tcl) | ✓ | n/a | ✓ |
| Static dependency discovery | n/a | ✓ | ✓ |
| Whole-file include / exclude | n/a | ✓ (via P4 filter) | ✓ |
| Proc-level surgical trim | out of scope | out of scope | ✓ new capability |
| Best fit when… | domain is naturally file-based | domain is naturally file-based | domain can't reorganize into files |
| Reproducible audit bundle | — | — | ✓ .chopper/ |
<domain>_backup/ stays untouched; re-trim is one command. .chopper/ holds the manifest, dependency graph, diagnostics, and trim report. Trimming Domain A, Domain B, and Domain C is one pipeline — all three capabilities ride through it. F1/F2 decisions are made in P3 Compile; F3 stage emission happens in P5 Build.
Discover domain state, load JSONs, validate schema, parse Tcl into proc/namespace facts.
Apply R1 merge rules: explicit include always wins. Decide each file: KEEP / PROC_TRIM / DROP.
Breadth-first walk of proc → callees. Reporting only, never copies extra files.
Rewrite files on disk, generate <stage>.tcl, post-validate, write .chopper/ bundle.
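The per-file KEEP / PROC_TRIM / DROP decision described above can be modeled in a few lines of Python. This is an illustrative sketch, not Chopper's implementation: `decide_fate` is a hypothetical helper, and its `fnmatch`-based matching does not implement the full `**` depth semantics of real pattern sets.

```python
from fnmatch import fnmatch

def decide_fate(path, include, exclude, proc_trim_files):
    # Hypothetical model of the R1 merge rule from the text:
    # a literal entry in files.include always wins over any exclude glob.
    if path in include:
        return "PROC_TRIM" if path in proc_trim_files else "KEEP"
    # Otherwise excludes are honored before glob includes.
    if any(fnmatch(path, g) for g in exclude):
        return "DROP"
    if any(fnmatch(path, g) for g in include):
        return "PROC_TRIM" if path in proc_trim_files else "KEEP"
    return "DROP"

# A literal include survives a matching exclude glob (R1).
print(decide_fate("setup.tcl", ["setup.tcl"], ["*.tcl"], set()))  # -> KEEP
```

The only non-obvious branch is the first one: explicit includes are checked before excludes, which is exactly the R1 "explicit include always wins" rule.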
Whole-file include / exclude using literal paths or globs. The unit is the file — keep all of it or drop all of it.
| Pattern | Matches |
|---|---|
| procs/*.tcl | .tcl directly under procs/ |
| rule?.fm.tcl | rule1.fm.tcl, rule2.fm.tcl |
| reports/** | any depth under reports/ |
| rules/**/*.fm.tcl | any .fm.tcl under rules/ |
A path listed literally in files.include survives even if a glob in files.exclude would match it.
{
"$schema": "base-v1",
"domain": "fev_formality",
"files": {
"include": [
"setup.tcl",
"vars.tcl",
"procs/**/*.tcl",
"rules/fm_basic.tcl"
],
"exclude": [
"procs/legacy/*.tcl",
"rules/experimental/**"
]
}
}
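As a sanity check on the pattern table's semantics — `*` and `?` stay within one path segment while `**` crosses segments — here is a minimal matcher sketch. `glob_to_re` is a hypothetical helper written for this page, not Chopper's actual matcher.

```python
import re

def glob_to_re(pattern):
    # Translate the glob dialect from the table into a regex:
    # '*' / '?' match within one path segment, '**' matches any depth.
    out, i = [], 0
    while i < len(pattern):
        if pattern[i:i + 3] == "**/":
            out.append(r"(?:[^/]+/)*")   # zero or more whole segments
            i += 3
        elif pattern[i:i + 2] == "**":
            out.append(r".*")            # trailing '**': any depth
            i += 2
        elif pattern[i] == "*":
            out.append(r"[^/]*")         # stays within one segment
            i += 1
        elif pattern[i] == "?":
            out.append(r"[^/]")          # exactly one non-slash char
            i += 1
        else:
            out.append(re.escape(pattern[i]))
            i += 1
    return re.compile("".join(out) + r"\Z")

assert glob_to_re("procs/*.tcl").match("procs/a.tcl")
assert not glob_to_re("procs/*.tcl").match("procs/sub/a.tcl")
assert glob_to_re("rule?.fm.tcl").match("rule1.fm.tcl")
assert glob_to_re("rules/**/*.fm.tcl").match("rules/x/y/r.fm.tcl")
```

The in-block assertions mirror the four rows of the pattern table above.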
Keep the file, surgically delete unwanted Tcl proc definitions. Best for
*_procs.tcl shared libraries — you keep utility A, drop utility B, the file remains.
"procedures": {
  "include": [
    { "file": "procs/core_procs.tcl",
      "procs": [ "run_setup", "load_design", "verify_netlist" ] }
  ],
  "exclude": [
    { "file": "procs/core_procs.tcl",
      "procs": [ "debug_dump", "old_verify" ] }
  ]
}
Chopper removes the proc definition plus its full braced body. The compile phase marks the file PROC_TRIM and tags the doomed procs. An empty procs: [] list is a hard error (VE-03) — use files.include
if you want to keep a whole file.
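The surgical delete can be sketched with simple brace counting, assuming well-formed Tcl where proc bodies use balanced braces. Chopper's real parser works from proc/namespace facts; `drop_procs` below is only an illustration of the idea.

```python
def drop_procs(tcl_text, doomed):
    # Remove 'proc <name> ...' definitions plus their full braced bodies.
    # Sketch only: assumes balanced braces (no odd braces in comments).
    out, lines, i = [], tcl_text.splitlines(), 0
    while i < len(lines):
        words = lines[i].split()
        if len(words) >= 2 and words[0] == "proc" and words[1] in doomed:
            depth = lines[i].count("{") - lines[i].count("}")
            while depth > 0:                      # consume the braced body
                i += 1
                depth += lines[i].count("{") - lines[i].count("}")
            i += 1                                # skip the closing line too
            continue
        out.append(lines[i])
        i += 1
    return "\n".join(out)

src = """proc keep_me {} {
    puts hi
}
proc debug_dump {args} {
    puts bye
}
"""
print(drop_procs(src, {"debug_dump"}))  # keep_me survives, debug_dump is gone
```

Note the file itself remains on disk: only the doomed definitions' spans are cut, which is the PROC_TRIM treatment.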
If, in procs/core_procs.tcl, foo calls bar and only foo is in procedures.include, only foo survives.
bar shows up in dependency_graph.json & TW-* diagnostics —
you decide whether to add it. No silent expansion.
Example — keep only verify_netlist:

"procedures": {
  "include": [
    { "file": "procs/core_procs.tcl",
      "procs": [ "verify_netlist" ] }
  ]
}

The .chopper/ bundle then records what verify_netlist reaches:

// dependency_graph.json (excerpt)
{
  "verify_netlist": {
    "file": "procs/core_procs.tcl",
    "status": "PI",
    "callees": [ "check_design", "report_summary", "log_msg" ]
  },
  "log_msg": { "status": "unresolved", "diag": "TW-01" }
}
Call-graph cycles are safe: TW-04 records the cycle.
Callees sorted lexicographically before traversal — reproducible.
PI = explicitly kept · PT = traced-only (reporting) · unres = unresolved (TW-01)
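Under those rules, the reporting-only walk can be modeled as a breadth-first traversal with lexicographically sorted callees. `trace` and the tiny call graph below are hypothetical; the statuses mirror the PI / PT / unres legend above.

```python
from collections import deque

def trace(roots, callgraph):
    # Breadth-first walk of proc -> callees, reporting only.
    # Callees are visited in sorted order, so output is reproducible.
    status = {r: "PI" for r in roots}          # PI = explicitly kept
    queue = deque(sorted(roots))
    while queue:
        p = queue.popleft()
        for callee in sorted(callgraph.get(p, [])):
            if callee in status:               # already seen (handles cycles)
                continue
            # PT = traced-only; unres = no definition found (TW-01 case)
            status[callee] = "PT" if callee in callgraph else "unres"
            queue.append(callee)
    return status

graph = {"verify_netlist": ["check_design", "log_msg"],
         "check_design": []}
print(trace(["verify_netlist"], graph))
```

Because visited procs are skipped, a cycle terminates the walk naturally instead of looping.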
F3 generates <stage>.tcl run scripts directly from JSON stage definitions —
eliminating manually authored run files. Optional: also emit <stage>.stack
files for the scheduler.
"stages": [
  { "name": "setup",
    "load_from": "",
    "command": "-xt vw Imy_shell -B BLOCK -T setup",
    "exit_codes": [0],
    "steps": [ "source setup.tcl", "source vars.tcl", "run_setup" ] },
  { "name": "main",
    "load_from": "setup",
    "command": "-xt vw Imy_shell -B BLOCK -T main",
    "exit_codes": [0, 3],
    "dependencies": ["setup"],
    "steps": [ "load_design", "verify_netlist", "report_summary" ] }
]
command field is the scheduler-side tool launcher —
emitted verbatim into <stage>.stack as the J line.
Vendor / tool agnostic: pt_shell, fc_shell, fm_shell,
innovus -nowin, tempus, calibre — Chopper does not interpret it.
The --tool-commands FILE CLI flag
feeds the parser/trace tool-command pool. Listed names emit TI-01 known-tool-command
instead of TW-02 unresolved-proc-call, keeping the dependency graph clean for
vendor builtins like read_db, create_clock, setup_design.
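The effect of the pool is a simple classification step. `classify_unresolved` is a hypothetical helper; only the diagnostic codes (TI-01, TW-02) come from the text.

```python
def classify_unresolved(callee, defined_procs, tool_commands):
    # Resolved proc: no diagnostic at all.
    if callee in defined_procs:
        return None
    # Known vendor builtin -> TI-01 (informational), otherwise TW-02.
    return "TI-01" if callee in tool_commands else "TW-02"

pool = {"read_db", "create_clock", "setup_design"}
print(classify_unresolved("create_clock", {"run_setup"}, pool))   # TI-01
print(classify_unresolved("mystery_proc", {"run_setup"}, pool))   # TW-02
```
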
main.tcl is always generated; main.stack is opt-in.
Same JSON, same Chopper. The command string carries whatever tool invocation
your scheduler needs — pt_shell, fc_shell, innovus -nowin, …
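A minimal sketch of what F3 emission could look like for one stage, assuming the steps list becomes the script body, one Tcl command per line. The real generator's header and layout may differ; `render_stage_tcl` is hypothetical.

```python
def render_stage_tcl(stage):
    # Turn one JSON stage spec into a <stage>.tcl body (sketch only).
    lines = [f"# generated stage: {stage['name']}"]
    if stage.get("load_from"):
        lines.append(f"# resumes from: {stage['load_from']}")
    lines += stage["steps"]                  # each step is one Tcl command
    return "\n".join(lines) + "\n"

stage = {"name": "main", "load_from": "setup",
         "steps": ["load_design", "verify_netlist", "report_summary"]}
print(render_stage_tcl(stage))
```

Note the command field is deliberately absent here: per the text it is scheduler-side and goes verbatim into <stage>.stack, not into the Tcl script.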
Drop entire Tcl files from the domain — globs, includes & excludes reconciled by R1.
Surgically delete proc definitions inside files you keep — rewrite in place, re-validate after.
Synthesize <stage>.tcl orchestrators from declarative stage specs and flow-actions.
Every run, every domain, in order — no parallelism, no surprises.
A purpose-built VS Code Copilot agent. Discovers your domain, authors JSONs, runs validate & dry-run, explains every diagnostic.
Domain owners have complete freedom. Each base/feature JSON declares any subset of the three sections; at least one must be present, and they cleanly compose.
F2 trims *_procs.tcl in place. Knobs and flags:
- options.cross_validate — F3 step targets must exist in the surviving F1/F2 set (warn, not fail)
- options.generate_stack — emit <stage>.stack alongside <stage>.tcl
- --dry-run — full trim simulation, zero disk writes
- --strict — exit non-zero on any warning (CI gate)
- depends_on — declares feature ordering for F3

| JSON | Where it lives | Required? | Purpose |
|---|---|---|---|
| base.json | <domain>/jsons/base.json | YES | Universal files / procs / stages every project in this domain needs |
| *.feature.json | <domain>/jsons/features/<name>.feature.json | optional | Adds files / procs / stage modifications for one optional capability |
| project.json | Anywhere — committed recipe | optional | Names one base + ordered list of features → single --project flag |
{
"$schema":"base-v1",
"domain": "my_domain",
"vendor": "synopsys",
"tool": "primetime",
"files": { ... },
"procedures": { ... },
"stages": [ ... ]
}
{
"$schema":"feature-v1",
"name": "dft",
"depends_on":[],
"files": { ... },
"procedures": { ... },
"flow_actions": [
{ "action":"add_stage_after",
"reference":"main", ... }
]
}
{
"$schema":"project-v1",
"project":"PROJECT_ABC",
"domain": "my_domain",
"base": "jsons/base.json",
"features":[
"jsons/features/dft.feature.json",
"jsons/features/power.feature.json"
]
}
You can also pass --base and --features
directly. The project JSON is just a committed, named recipe.
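Resolving a recipe is mechanical: a project JSON names one base plus an ordered feature list, equivalent to passing --base/--features by hand. A sketch, assuming only the fields shown above (`load_recipe` is hypothetical):

```python
import json

def load_recipe(project_json_text):
    # A project recipe resolves to the same (base, ordered features)
    # pair you could pass via --base / --features. No I/O beyond parsing.
    recipe = json.loads(project_json_text)
    return recipe["base"], list(recipe["features"])

text = """{
  "$schema": "project-v1",
  "project": "PROJECT_ABC",
  "domain": "my_domain",
  "base": "jsons/base.json",
  "features": ["jsons/features/dft.feature.json",
               "jsons/features/power.feature.json"]
}"""
base, features = load_recipe(text)
print(base, features)
```

Feature order is preserved as listed, which matters because later features overlay earlier ones.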
{
"$schema":"base-v1",
"domain":"fev_formality",
"vendor":"synopsys",
"tool":"formality",
"files":{
"include":[
"setup.tcl",
"vars.tcl",
"rules/fm_basic.tcl",
"procs/fev_procs.tcl"
],
"exclude":[
"rules/experimental/**"
]
},
"procedures":{
"include":[
{ "file":"procs/fev_procs.tcl",
"procs":["setup_fev","match","verify"] }
]
},
"stages":[
{ "name":"fev",
"load_from":"",
"command":"fm_shell -64 -f main.tcl",
"exit_codes":[0],
"steps":[
"source setup.tcl","setup_fev",
"match","verify"
]
}
]
}
{
"$schema":"base-v1",
"domain":"sta_pt",
"vendor":"synopsys",
"tool":"primetime",
"files":{
"include":[
"setup.tcl","vars.tcl",
"procs/**/*.tcl",
"sdc/*.sdc"
]
},
"procedures":{
"include":[
{ "file":"procs/sta_procs.tcl",
"procs":["read_design","apply_constraints",
"update_timing","report_qor"] }
],
"exclude":[
{ "file":"procs/sta_procs.tcl",
"procs":["debug_paths","old_report"] }
]
},
"stages":[
{ "name":"sta",
"load_from":"",
"command":"pt_shell -64 -f main.tcl",
"exit_codes":[0,3],
"steps":[
"read_design","apply_constraints",
"update_timing","report_qor"
]
}
]
}
{
"$schema":"base-v1",
"domain":"power",
"vendor":"synopsys",
"tool":"primepower",
"options":{ "cross_validate":true,
"generate_stack":true },
"files":{
"include":[
"setup.tcl","vars.tcl",
"procs/power_procs.tcl"
]
},
"procedures":{
"include":[
{ "file":"procs/power_procs.tcl",
"procs":["read_vcd","compute_power",
"report_power"] }
]
},
"stages":[
{ "name":"power",
"load_from":"sta",
"command":"pt_shell -64 -ppower -f main.tcl",
"exit_codes":[0],
"dependencies":["sta"],
"steps":[
"read_vcd","compute_power","report_power"
]
}
]
}
Vendor and tool agnostic: Cadence (innovus, tempus, joules),
Synopsys, and Mentor. Vendor and tool are just metadata plus a command string.
Every trim writes a .chopper/ audit bundle.

# validate — read-only; never touches disk
$ chopper validate --project project.json

# trim — backs up & rebuilds <domain>/ (--dry-run to simulate)
$ chopper trim --project project.json

# loc — read-only LOC report; no .chopper/, no rewrites (2.6.0+)
$ chopper loc --project project.json

# cleanup — remove <domain>_backup/ when happy
$ chopper cleanup --confirm

# mcp-serve — read-only MCP stdio server (0.4.0+)
$ chopper mcp-serve
| Flag | Purpose |
|---|---|
| --base / --features | bypass project.json |
| --project | committed recipe (mutually exclusive with the above) |
| --strict | non-zero exit on any warning (CI gate) |
| --dry-run | authoring iteration loop |
| --tool-commands FILE | vendor builtins → TI-01, not TW-02 |
| --no-color | plain-text output for logs |
<domain>/.chopper/
├── compiled_manifest.json    # every file's fate
├── dependency_graph.json     # BFS proc call tree
├── trim_report.txt / .json   # human + machine summary
├── diagnostics.json          # every VE/VW/PE/PW/TW
├── run_result.json           # exit code, durations
└── inputs/                   # exact base + features
trim renames <domain>/ → <domain>_backup/ and rebuilds the trimmed copy; cleanup requires --confirm. mcp-serve is stdio-only and exposes exactly three tools:
chopper.validate, chopper.explain_diagnostic, chopper.read_audit.
Destructive tools (trim, cleanup) are never exposed over MCP — protocol errors emit PE-04.
chopper loc — size a trim before you commit to it (2.6.0+). It runs validate (P0–P4 + manifest-only P6) plus the F3 stage generator in no-write mode, reporting what chopper trim would produce. No .chopper/, no rename, no rewrites; safe to run on read-only checkouts. Exit codes match validate: 0 clean · 1 errors (or warnings under --strict) · 2 CLI · 3 internal.

$ chopper loc [--domain PATH]
(--base PATH [--features PATHS] | --project PATH)
| Flag | Purpose |
|---|---|
| --domain PATH | domain root (defaults to cwd) |
| --base PATH | base JSON; required unless --project |
| --features PATHS | comma-separated feature JSONs; order matters |
| --project PATH | project recipe (mutually exclusive) |
| --strict | any warning → exit 1 |
| --tool-commands | vendor builtin pool (repeatable) |
chopper loc: read-only LOC report

files.before: 412
files.after: 187
files.delta: -225
files.reduction_pct: 54.61%
lines.before: 38214
lines.after: 12907
lines.delta: -25307
lines.reduction_pct: 66.22%
sloc.before: 28903
sloc.after: 9651
sloc.delta: -19252
sloc.reduction_pct: 66.61%
treatment.FULL_COPY.files: 92
treatment.PROC_TRIM.files: 95
treatment.PROC_TRIM.lines_before: 30901
treatment.PROC_TRIM.lines_after: 7621
treatment.REMOVE.files: 225
treatment.GENERATED.files: 0
| Treatment | Before | After |
|---|---|---|
| FULL_COPY | source lines + SLOC | unchanged |
| PROC_TRIM | source lines + SLOC | minus dropped-proc spans (incl. DPA + comment block) |
| REMOVE | source lines + SLOC | 0 |
| GENERATED | 0 | rendered stage .tcl |
key: value per line — grep, awk, or capture in CI to track LOC reduction over time as features are added.
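For a CI gate that needs more than grep, a few lines of Python can turn the report into numbers. `parse_loc_report` is a hypothetical helper that only assumes the key: value shape shown above.

```python
def parse_loc_report(text):
    # Parse 'key: value' lines; coerce numbers, strip trailing '%'.
    out = {}
    for line in text.splitlines():
        if ": " not in line:
            continue                       # skip blank / banner lines
        key, _, val = line.partition(": ")
        stripped = val.rstrip("%")
        try:
            out[key.strip()] = float(stripped) if "." in stripped else int(stripped)
        except ValueError:
            out[key.strip()] = val         # non-numeric values kept verbatim
    return out

report = """files.before: 412
files.after: 187
lines.reduction_pct: 66.22%"""
stats = parse_loc_report(report)
print(stats["files.before"] - stats["files.after"])  # -> 225
```

Tracked per commit, the reduction_pct keys make a simple trend line as features are added.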
A purpose-built VS Code Copilot Chat agent at
.github/agents/chopper-agent.agent.md — the single user-facing agent
for anything Chopper-related, from a convoluted Tcl codebase to a validated, trimmed output.
The agent:
- authors base.json, *.feature.json, project.json
- runs validate + trim --dry-run and explains the results
- walks .chopper/ artifacts and tells you what to fix
- files bug reports via schemas/scripts/file_bug_report.py

Operating modes:
- JSON authoring & review — no CLI calls, no disk writes.
- Analysis — analyze + validate + dry-run + audit walk.
- Live trim — only on explicit direction.
Prompt library in .github/prompts/:
- Generate starter JSONs, then run schema and overlay-validation checks.
- Walk .chopper/ outputs and pinpoint why files/procs were removed.
- Find the breaking feature and verify no unintended surviving-set delta.
- Draft a complete report and package logs/artifacts for upload.
chopper mcp-serve exposes three stdio tools:
chopper.validate, chopper.explain_diagnostic, chopper.read_audit.
Destructive tools (trim, cleanup) are CLI-only.
Getting started:
1. Read user_docs/01_OVERVIEW.md + user_docs/02_CLI_GUIDE.md (≈45 min).
2. Pick the examples/ folder for your shape (F1 / F2 / F3 / combo).
3. Copy its jsons/ into your domain root, replace placeholders.
4. Run python schemas/scripts/validate_jsons.py <domain>/.
5. File issues with schemas/scripts/file_bug_report.py.

User docs:
- user_docs/README.md — landing + reading order
- user_docs/01_OVERVIEW.md — problem, F1/F2/F3, JSON, BKMs
- user_docs/02_CLI_GUIDE.md — every flag, deep examples
- user_docs/03_HOW_CHOPPER_WORKS.md — pipeline, FAQ

Designed to ramp from this deck in 60–90 min.
Technical docs:
- technical_docs/ARCHITECTURE.md — authoritative spec
- technical_docs/JSON_AUTHORING_GUIDE.md — every JSON field
- technical_docs/CLI_REFERENCE.md — full CLI surface
- technical_docs/DIAGNOSTIC_CODES.md — every code
- technical_docs/ENGINEERING.md — module & service catalog
- technical_docs/IMPLEMENTATION.md — parser internals + pitfalls

Examples:
- examples/01_base_files_only/
- examples/02_base_procs_only/
- examples/07_base_full/
- examples/09_base_plus_multiple_features/

Schemas:
- schemas/base-v1.schema.json
- schemas/feature-v1.schema.json
- schemas/project-v1.schema.json