CTH R2G · Tool Flow Manager
For domains that can’t reorganize as a file-based flow.

FlowBuilder + SNORT is the production-proven path for domains that can express their work as file-level units — FlowBuilder supports both Python and Tcl flows. Chopper is the sibling tool for everything else — domains whose Tcl libraries can’t be cleanly split, where trimming has to happen inside shared files. Same goal, different shape: JSON in, trimmed domain out, with a full audit bundle.

by Rajesh Kudipudi — FEV Global / PESG · DDI

Inspired by FlowBuilder by Stelian and SNORT by Mike McCurdy

3 · Capabilities — F1 · F2 · F3
8 · Pipeline phases P0–P7
5 · CLI subcommands
2026.06 · CTH global beta release
chopper · onboarding brochure · Rajesh Kudipudi (FEV Global — PESG/DDI)

What is Chopper?

02

Chopper is the per-domain trimming tool in the CTH R2G Tool Flow Manager, designed for domains that can’t be reorganized as a file-based flow. Each domain ships with the full mainline feature set; Chopper takes a JSON specification of what a project actually needs and produces a clean, minimal, customer-specific domain on disk — reproducibly, with a full audit bundle.

FlowBuilder, SNORT, and Chopper — siblings, not rivals

1

FlowBuilder

Builds the master flow run files.

  • Production-proven, file-based TFM builder
  • Supports both Python and Tcl flows
  • Generates the full mainline TFM, run scripts & stack files
  • The right tool when the domain can be expressed as files
2

SNORT

Analyzes the flow to find required files only.

  • Production-proven static dependency discovery
  • Output drives P4 templates that filter the master REPO TFM into a customer mini-TFM
  • Excellent for file-based trimming — this is its job, and it does it well
3

Chopper

Fills the proc-level gap for domains that can’t reorganize.

  • Surgical Tcl proc-level trimming (F2) — the new capability
  • Plus selective file include / exclude (F1) when useful
  • Generates <stage>.tcl run scripts (F3) so a Chopper-only domain still has a runnable flow
  • Auto-traces dependencies & call graph; writes a full audit bundle
Pick the right tool for the domain. If your domain is naturally file-based, FlowBuilder + SNORT remains the path of least resistance — it’s tested, proven, and already in production. Chopper exists for domains where that decomposition isn’t feasible: shared Tcl libraries, branchy proc bodies, customer-specific behavior tangled inside common files.
chopper · what is it · 02 / 17

Two paths to a trimmed domain

03

File-based path: FlowBuilder → SNORT → P4 filter (production, proven)

FlowBuilder (builds the master flow) → SNORT (analyzes & finds deps) → P4 templates (filter master REPO TFM) → Mini TFM (customer target)

Three tools · battle-tested · file-level trimming

Proc-based path: Chopper, in one JSON-driven run (when files can’t be reorganized)

📜 F3 Run-files · generates <stage>.tcl
🔍 Auto-trace · deps + call graph
📂 F1 File trim · selective include
✂️ F2 Proc trim ★ · surgical, unique

Trimmed domain

+ .chopper/
audit bundle

One tool · one pass · file and proc level

Capability matrix — where each tool fits

Capability | FlowBuilder | SNORT | Chopper
Build run scripts (<stage>.tcl) | ✓ | n/a | ✓ (F3)
Static dependency discovery | n/a | ✓ | ✓ (auto-trace)
Whole-file include / exclude | n/a | ✓ (via P4 filter) | ✓ (F1)
Proc-level surgical trim | out of scope | out of scope | ✓ new capability
Best fit when… | domain is naturally file-based | domain is naturally file-based | domain can’t reorganize into files
Reproducible audit bundle | n/a | n/a | ✓ .chopper/
chopper · vs FlowBuilder + SNORT · 03 / 17

The problem we are solving

04

Today — manual trimming

  • Every domain ships with generalized flow code, optional behavior, legacy support, customer-specific feature logic.
  • Owners hand-edit Tcl, delete files, prune procs, fix dangling references — by eye.
  • Risk: broken sources, missed cleanup, inconsistent domains across teams.
  • Audit trail = git diff. Reproducibility = "trust me".

Tomorrow — Chopper

  • JSON declares files, procs, stages the project needs.
  • Chopper parses Tcl, compiles selections, traces the call graph, rewrites the domain on disk.
  • Backup-and-rebuild: <domain>_backup/ stays untouched; re-trim is one command.
  • Every run writes .chopper/ — manifest, dependency graph, diagnostics, trim report.

Project Branch Lifecycle

main branch
full TFM · all domains · all features
git branch project_ABC · 2-week trim window
Owner 1 · trims Domain A
Owner 2 · trims Domain B
Owner 3 · trims Domain C
project branch
trimmed domains, audit-bundled
chopper · problem · 04 / 17

The 8-phase pipeline (P0–P7)

05

One pipeline; all three capabilities ride through it. F1/F2 decisions are made in P3 Compile; F3 stage emission happens in P5 Build.

P0 Domain state → P1 Config + pre-validate → P2 Parse Tcl → P3 Compile (F1+F2) → P4 Trace BFS → P5 Build output (F3) → P6 Post-validate → P7 Audit

P0 → P2

Discover domain state, load JSONs, validate schema, parse Tcl into proc/namespace facts.

P3 Compile

Apply R1 merge rules: explicit include always wins. Decide each file: KEEP / PROC_TRIM / DROP.

P4 Trace

Breadth-first walk of proc → callees. Reporting only, never copies extra files.

P5 → P7

Rewrite files on disk, generate <stage>.tcl, post-validate, write .chopper/ bundle.

Determinism: all expansions are normalized, deduplicated, and sorted lexicographically. Same JSONs + same domain ⇒ byte-identical output every run.
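That guarantee reduces to one habit applied at every expansion point: never emit a raw expansion. A minimal Python sketch of the rule (illustrative only; `normalize` is a hypothetical name, not Chopper's code):

```python
def normalize(expansions):
    # Deterministic-output rule: deduplicate, then sort lexicographically,
    # so the same inputs always yield the same ordered list.
    # (Illustrative sketch, not Chopper's implementation.)
    return sorted(set(expansions))

# Two discovery passes that differ only in order and duplicates
# collapse to the same canonical list.
assert normalize(["vars.tcl", "setup.tcl", "vars.tcl"]) \
    == normalize(["setup.tcl", "vars.tcl"]) \
    == ["setup.tcl", "vars.tcl"]
```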
chopper · pipeline · 05 / 17

F1   File-level trimming

06

What it does

Whole-file include / exclude using literal paths or globs. The unit is the file — keep all of it or drop all of it.

Glob patterns

Pattern | Matches
procs/*.tcl | .tcl files directly under procs/
rule?.fm.tcl | rule1.fm.tcl, rule2.fm.tcl
reports/** | any depth under reports/
rules/**/*.fm.tcl | any .fm.tcl under rules/
R1 — explicit include always wins. A literal path in files.include survives even if a glob in files.exclude would match it.

JSON

{
  "$schema": "base-v1",
  "domain": "fev_formality",
  "files": {
    "include": [
      "setup.tcl",
      "vars.tcl",
      "procs/**/*.tcl",
      "rules/fm_basic.tcl"
    ],
    "exclude": [
      "procs/legacy/*.tcl",
      "rules/experimental/**"
    ]
  }
}
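A compact way to see R1 in action is a keep/drop decision function. The sketch below is illustrative only: `decide` is a hypothetical name, not Chopper's API, and Python's `fnmatch` is a stand-in whose `*` also crosses directory separators, unlike the glob table above.

```python
from fnmatch import fnmatch

def decide(path, includes, excludes):
    # R1: a literal path listed in includes always wins,
    # even if an exclude glob matches the same file.
    if path in includes:
        return "KEEP"
    # Otherwise exclusion is checked before glob-based inclusion.
    if any(fnmatch(path, g) for g in excludes):
        return "DROP"
    if any(fnmatch(path, g) for g in includes):
        return "KEEP"
    return "DROP"

# Literal include survives a matching exclude glob (R1).
assert decide("rules/fm_basic.tcl",
              ["rules/fm_basic.tcl"], ["rules/*"]) == "KEEP"
# Glob include loses to a matching exclude glob.
assert decide("procs/legacy/old_a.tcl",
              ["procs/*"], ["procs/legacy/*.tcl"]) == "DROP"
```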

Live: before → after

Domain on disk · 9 files

setup.tcl
vars.tcl
procs/core_procs.tcl
procs/legacy/old_a.tcl
procs/legacy/old_b.tcl
rules/fm_basic.tcl
rules/experimental/wip.tcl
milestone.tcl
debug_dump.tcl

After chopper trim · 4 files

setup.tcl
vars.tcl
procs/core_procs.tcl
rules/fm_basic.tcl

chopper · F1 file trimming · 06 / 17

F2   Proc-level trimming

07

Keep the file, surgically delete unwanted Tcl proc definitions. Best for *_procs.tcl shared libraries — you keep utility A, drop utility B, the file remains.

JSON

"procedures": {
  "include": [
    {
      "file": "procs/core_procs.tcl",
      "procs": [
        "run_setup",
        "load_design",
        "verify_netlist"
      ]
    }
  ],
  "exclude": [
    {
      "file": "procs/core_procs.tcl",
      "procs": [ "debug_dump", "old_verify" ]
    }
  ]
}

How it works

  • P2 parser locates each proc definition + its full braced body
  • P3 marks the file as PROC_TRIM and tags the doomed procs
  • P5 rewrites the file in-place, deleting only the marked proc bodies
  • P6 re-parses, verifies brace balance + namespace consistency
Empty procs: [] is a hard error (VE-03). Use files.include if you want to keep a whole file.
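The P2/P5 steps above can be approximated with a brace-counting pass over the file. This is a deliberately naive sketch (`trim_procs` is a hypothetical helper, not Chopper's parser; real Tcl parsing must also handle comments, quoted strings, and escaped braces):

```python
def trim_procs(source, doomed):
    # Walk the file line by line. When a `proc <name> ...` definition whose
    # name is marked for removal starts, track brace depth until its body
    # closes and drop the whole span; every other line is copied through.
    out, i = [], 0
    lines = source.splitlines(keepends=True)
    while i < len(lines):
        line = lines[i]
        parts = line.split()
        if len(parts) >= 2 and parts[0] == "proc" and parts[1] in doomed:
            depth = line.count("{") - line.count("}")
            i += 1
            while depth > 0 and i < len(lines):   # consume the braced body
                depth += lines[i].count("{") - lines[i].count("}")
                i += 1
        else:
            out.append(line)
            i += 1
    return "".join(out)
```

Single-line procs close on the same line (depth never goes positive), so they are dropped in one step; multi-line bodies are consumed until the brace count returns to zero.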

Before → after — procs/core_procs.tcl

Before · 5 procs

# core_procs.tcl
proc run_setup {} { puts "setup..." }
proc load_design {db} { read_db $db }
proc verify_netlist {} { check_design -all }
proc debug_dump {} { puts "... internal debug ..." }
proc old_verify {} { legacy_check }

After

# core_procs.tcl
proc run_setup {} { puts "setup..." }
proc load_design {db} { read_db $db }
proc verify_netlist {} { check_design -all }
chopper · F2 proc trimming · 07 / 17

Proc tracing — the safety net

08
Trace is reporting-only — never copies. If foo calls bar and only foo is in procedures.include, only foo survives. bar shows up in dependency_graph.json & TW-* diagnostics — you decide whether to add it. No silent expansion.

JSON says: keep verify_netlist

"procedures": {
  "include": [
    { "file":"procs/core_procs.tcl",
      "procs":[ "verify_netlist" ] }
  ]
}

BFS walk — starting from verify_netlist

verify_netlist · root
check_design · callee
log_msg · callee
report_summary · callee
format_table · callee

Legend: Frontier (sorted) · Visited · Diagnostics

What you see in .chopper/

// dependency_graph.json (excerpt)
{
  "verify_netlist": {
    "file":   "procs/core_procs.tcl",
    "status": "PI",
    "callees":[ "check_design", "report_summary", "log_msg" ]
  },
  "log_msg": {
    "status": "unresolved",
    "diag":   "TW-01"
  }
}
Cycles are safe. BFS visited-set terminates; TW-04 records the cycle. Callees sorted lexicographically before traversal — reproducible.
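The walk above is ordinary breadth-first search with a visited set. The sketch below (illustrative; `trace` is a hypothetical helper, not Chopper's code) shows why cycles terminate and why sorting the callees makes the report order reproducible:

```python
from collections import deque

def trace(roots, callgraph):
    # Reporting-only BFS: walk proc -> callees with a visited set
    # (cycle-safe), sorting callees lexicographically for determinism.
    # Unknown names are recorded, never expanded (TW-01-style).
    visited, unresolved = [], []
    queue = deque(sorted(roots))
    seen = set(queue)
    while queue:
        name = queue.popleft()
        if name not in callgraph:
            unresolved.append(name)
            continue
        visited.append(name)
        for callee in sorted(callgraph[name]):
            if callee not in seen:
                seen.add(callee)
                queue.append(callee)
    return visited, unresolved

graph = {
    "verify_netlist": ["report_summary", "check_design", "log_msg"],
    "report_summary": ["format_table", "verify_netlist"],  # cycle back
    "check_design": [],
    "format_table": [],
}
kept, unresolved = trace(["verify_netlist"], graph)
assert kept[0] == "verify_netlist"
assert unresolved == ["log_msg"]   # surfaced for you to decide on
```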

PI = explicitly kept · PT = traced-only (reporting) · unres = unresolved (TW-01)

chopper · proc tracing · 08 / 17

F3   Run-file generation & tool-command injection

09

F3 generates <stage>.tcl run scripts directly from JSON stage definitions — eliminating manually authored run files. Optional: also emit <stage>.stack files for the scheduler.

JSON — stages & tool injection

"stages": [
  {
    "name":      "setup",
    "load_from": "",
    "command":   "-xt vw Imy_shell -B BLOCK -T setup",
    "exit_codes":[0],
    "steps": [
      "source setup.tcl",
      "source vars.tcl",
      "run_setup"
    ]
  },
  {
    "name":      "main",
    "load_from": "setup",
    "command":   "-xt vw Imy_shell -B BLOCK -T main",
    "exit_codes":[0,3],
    "dependencies":["setup"],
    "steps": [
      "load_design",
      "verify_netlist",
      "report_summary"
    ]
  }
]
The stage command field is the scheduler-side tool launcher — emitted verbatim into <stage>.stack as the J line. Vendor / tool agnostic: pt_shell, fc_shell, fm_shell, innovus -nowin, tempus, calibre — Chopper does not interpret it.
Different injection point: the --tool-commands FILE CLI flag feeds the parser/trace tool-command pool. Listed names emit TI-01 known-tool-command instead of TW-02 unresolved-proc-call, keeping the dependency graph clean for vendor builtins like read_db, create_clock, setup_design.

Generated main.tcl

# Chopper-generated: main.tcl
# load_from: setup
# command: -xt vw Imy_shell -B BLOCK -T main
load_design
verify_netlist
report_summary

Generated main.stack (opt-in)

# Chopper-generated stack: main
N main
J -xt vw Imy_shell -B BLOCK -T main
L 0 3
D setup
R serial
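Both generated files follow mechanically from the stage JSON. A minimal Python sketch of that rendering (illustrative; `render_stage` is a hypothetical helper, with field names taken from the JSON above and output shapes copied from the generated examples, not from Chopper's emitter):

```python
def render_stage(stage):
    # <stage>.tcl: provenance comments, then the steps verbatim.
    tcl = "\n".join(
        [f"# Chopper-generated: {stage['name']}.tcl",
         f"# load_from: {stage.get('load_from', '')}",
         f"# command: {stage['command']}"]
        + list(stage["steps"])) + "\n"
    # <stage>.stack: the command string is emitted verbatim as the J line.
    stack = "\n".join(
        [f"# Chopper-generated stack: {stage['name']}",
         f"N {stage['name']}",
         f"J {stage['command']}",
         "L " + " ".join(str(c) for c in stage["exit_codes"])]
        + [f"D {d}" for d in stage.get("dependencies", [])]
        + ["R serial"]) + "\n"
    return tcl, stack

main = {
    "name": "main", "load_from": "setup",
    "command": "-xt vw Imy_shell -B BLOCK -T main",
    "exit_codes": [0, 3], "dependencies": ["setup"],
    "steps": ["load_design", "verify_netlist", "report_summary"],
}
tcl, stack = render_stage(main)
assert "J -xt vw Imy_shell -B BLOCK -T main" in stack
assert "L 0 3" in stack and "D setup" in stack
```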

Vendor & tool agnostic

SNPS primetime · SNPS fc · CDNS innovus · CDNS tempus · Mentor calibre

Same JSON, same Chopper. The command string carries whatever tool invocation your scheduler needs — pt_shell, fc_shell, innovus -nowin, …

chopper · F3 run-file generation · 09 / 17

One pipeline. Everything you need.

10
Capability 01
F1

File-level trim

Drop entire Tcl files from the domain — globs, includes & excludes reconciled by R1.

Capability 02
F2

Proc-level trim

Surgically delete proc definitions inside files you keep — rewrite in place, re-validate after.

Capability 03
F3

Run-file generation

Synthesize <stage>.tcl orchestrators from declarative stage specs and flow-actions.

The pipeline

Eight phases.
One deterministic pass.

Every run, every domain, in order — no parallelism, no surprises.

P0 P1 P2 P3 P4 P5 P6 P7
state · validate · parse · compile · trace · trim · re-validate · audit
In-repo agent

Chopper Agent

A purpose-built VS Code Copilot agent. Discovers your domain, authors JSONs, runs validate & dry-run, explains every diagnostic.

analyze-only · full-loop · bisect · prove-safe
Toolset

Three commands.
One audit bundle.

validate · Schema & structure gate
trim · Run the 8-phase pass
cleanup · Remove <domain>_backup/
BFS trace · Reporting-only graph
mcp-serve · Read-only stdio
VE / TW / PW · Diagnostic registry
chopper · at a glance · 10 / 17

F1 · F2 · F3 — pick any combination

11

Domain owners have complete freedom. Each base/feature JSON declares any subset of the three sections; at least one must be present, and they cleanly compose.

F1 only · File-only domains, configs, hooks
F2 only · Trim a shared *_procs.tcl in place
F3 only · Generate run scripts; nothing else
F1 + F2 · Drop legacy files and prune procs
F1 + F3 · Trim files; emit run scripts
F2 + F3 · Proc-level + run-file gen
All three · Full domain trim

Optional switches that work everywhere

  • options.cross_validate — F3 step targets must exist in surviving F1/F2 set (warn, not fail)
  • options.generate_stack — emit <stage>.stack alongside <stage>.tcl
  • --dry-run — full trim simulation, zero disk writes
  • --strict — exit non-zero on any warning (CI gate)

Feature composition

  • Base JSON + 0..N feature JSONs (additive)
  • depends_on declares feature ordering for F3
  • Explicit include from any source always wins
  • A feature's exclude can only prune its own contributions — never base, never another feature
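That scoping rule can be expressed directly: apply each contributor's excludes to its own includes before taking the union. A literal-paths-only sketch (illustrative; `compose` is a hypothetical name, and real merging also handles globs, procedures, and stages):

```python
def compose(base, features):
    # Each contributor resolves its own include/exclude first; then the
    # merge is a pure union, so a feature's exclude can never remove a
    # base file or another feature's contribution.
    surviving = set(base.get("include", [])) - set(base.get("exclude", []))
    for feat in features:
        contrib = set(feat.get("include", [])) - set(feat.get("exclude", []))
        surviving |= contrib   # additive: only adds, never prunes others
    return sorted(surviving)

base = {"include": ["setup.tcl", "vars.tcl"]}
dft = {"include": ["dft.tcl", "dft_debug.tcl"],
       "exclude": ["dft_debug.tcl", "setup.tcl"]}   # setup.tcl is not dft's to prune
out = compose(base, [dft])
assert "setup.tcl" in out          # base survives a feature's exclude
assert "dft_debug.tcl" not in out  # feature pruned its own contribution
```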
chopper · combinations · 11 / 17

The three JSON files

12
JSON | Where it lives | Required? | Purpose
base.json | <domain>/jsons/base.json | YES | Universal files / procs / stages every project in this domain needs
*.feature.json | <domain>/jsons/features/<name>.feature.json | optional | Adds files / procs / stage modifications for one optional capability
project.json | anywhere — committed recipe | optional | Names one base + ordered list of features → single --project flag

base.json (skeleton)

{
  "$schema":"base-v1",
  "domain": "my_domain",
  "vendor": "synopsys",
  "tool":   "primetime",
  "files":      { ... },
  "procedures": { ... },
  "stages":     [ ... ]
}

feature.json

{
  "$schema":"feature-v1",
  "name":   "dft",
  "depends_on":[],
  "files":         { ... },
  "procedures":    { ... },
  "flow_actions":  [
    { "action":"add_stage_after",
      "reference":"main", ... }
  ]
}

project.json

{
  "$schema":"project-v1",
  "project":"PROJECT_ABC",
  "domain": "my_domain",
  "base":   "jsons/base.json",
  "features":[
    "jsons/features/dft.feature.json",
    "jsons/features/power.feature.json"
  ]
}
You do not need a project JSON. Pass --base and --features directly. The project JSON is just a committed, named recipe.
chopper · JSON structure · 12 / 17

Real-domain examples — FEV · Timing · Power

13

SNPS · fev_formality

{
  "$schema":"base-v1",
  "domain":"fev_formality",
  "vendor":"synopsys",
  "tool":"formality",
  "files":{
    "include":[
      "setup.tcl",
      "vars.tcl",
      "rules/fm_basic.tcl",
      "procs/fev_procs.tcl"
    ],
    "exclude":[
      "rules/experimental/**"
    ]
  },
  "procedures":{
    "include":[
      { "file":"procs/fev_procs.tcl",
        "procs":["setup_fev","match","verify"] }
    ]
  },
  "stages":[
    { "name":"fev",
      "load_from":"",
      "command":"fm_shell -64 -f main.tcl",
      "exit_codes":[0],
      "steps":[
        "source setup.tcl","setup_fev",
        "match","verify"
      ]
    }
  ]
}

SNPS · sta_pt (Timing)

{
  "$schema":"base-v1",
  "domain":"sta_pt",
  "vendor":"synopsys",
  "tool":"primetime",
  "files":{
    "include":[
      "setup.tcl","vars.tcl",
      "procs/**/*.tcl",
      "sdc/*.sdc"
    ]
  },
  "procedures":{
    "include":[
      { "file":"procs/sta_procs.tcl",
        "procs":["read_design","apply_constraints",
                  "update_timing","report_qor"] }
    ],
    "exclude":[
      { "file":"procs/sta_procs.tcl",
        "procs":["debug_paths","old_report"] }
    ]
  },
  "stages":[
    { "name":"sta",
      "load_from":"",
      "command":"pt_shell -64 -f main.tcl",
      "exit_codes":[0,3],
      "steps":[
        "read_design","apply_constraints",
        "update_timing","report_qor"
      ]
    }
  ]
}

SNPS · power

{
  "$schema":"base-v1",
  "domain":"power",
  "vendor":"synopsys",
  "tool":"primepower",
  "options":{ "cross_validate":true,
                "generate_stack":true },
  "files":{
    "include":[
      "setup.tcl","vars.tcl",
      "procs/power_procs.tcl"
    ]
  },
  "procedures":{
    "include":[
      { "file":"procs/power_procs.tcl",
        "procs":["read_vcd","compute_power",
                  "report_power"] }
    ]
  },
  "stages":[
    { "name":"power",
      "load_from":"sta",
      "command":"pt_shell -64 -ppower -f main.tcl",
      "exit_codes":[0],
      "dependencies":["sta"],
      "steps":[
        "read_vcd","compute_power","report_power"
      ]
    }
  ]
}
Each example uses F1 + F2 + F3 together. Same JSON shape works for Cadence (innovus, tempus, joules), Synopsys, and Mentor. Vendor and tool are just metadata + a command string.
chopper · real examples · 13 / 17

CLI workflow & .chopper/ audit bundle

14

Five subcommands

# validate — read-only; never touches disk
$ chopper validate --project project.json

# trim — backs up & rebuilds <domain>/ (--dry-run to simulate)
$ chopper trim --project project.json

# loc — read-only LOC report; no .chopper/, no rewrites (2.6.0+)
$ chopper loc --project project.json

# cleanup — remove <domain>_backup/ when happy
$ chopper cleanup --confirm

# mcp-serve — read-only MCP stdio server (0.4.0+)
$ chopper mcp-serve

Useful flags

--base / --features · bypass project.json
--project · committed recipe (exclusive w/ above)
--strict · non-zero exit on any warning (CI gate)
--dry-run · authoring iteration loop
--tool-commands FILE · vendor builtins → TI-01 not TW-02
--no-color · plain-text output for logs

Every run writes .chopper/

<domain>/.chopper/
├── compiled_manifest.json   # every file's fate
├── dependency_graph.json    # BFS proc call tree
├── trim_report.txt / .json  # human + machine summary
├── diagnostics.json         # every VE/VW/PE/PW/TW
├── run_result.json          # exit code, durations
└── inputs/                  # exact base + features

Backup-and-rebuild safety

  • First trim: <domain>/<domain>_backup/, rebuild trimmed copy
  • Re-trim rebuilds from backup; backup never touched
  • Failure mid-run: backup intact, next run rebuilds cleanly
  • Cleanup is explicit; --confirm required
MCP read-only surface (0.4.0). mcp-serve is stdio-only and exposes exactly three tools: chopper.validate, chopper.explain_diagnostic, chopper.read_audit. Destructive tools (trim, cleanup) are never exposed over MCP — protocol errors emit PE-04.
chopper · CLI & audit · 14 / 17

chopper loc — size a trim before you commit to it (2.6.0+)

15

What it does

  • Runs the same front half as validate (P0–P4 + manifest-only P6) plus the F3 stage generator in no-write mode.
  • Prints a line-oriented LOC report comparing the source domain against what chopper trim would produce.
  • Writes nothing — no .chopper/, no rename, no rewrites. Safe to run on read-only checkouts.
  • Same exit codes as validate: 0 clean · 1 errors (or warnings under --strict) · 2 CLI · 3 internal.

Invocation & flags

$ chopper loc [--domain PATH]
              (--base PATH [--features PATHS] | --project PATH)
--domain PATH · domain root (defaults to cwd)
--base PATH · base JSON; required unless --project
--features PATHS · comma-separated feature JSONs; order matters
--project PATH · project recipe (mutually exclusive)
--strict · any warning → exit 1
--tool-commands · vendor builtin pool (repeatable)

Sample output

chopper loc: read-only LOC report
files.before: 412
files.after: 187
files.delta: -225
files.reduction_pct: 54.61%
lines.before: 38214
lines.after: 12907
lines.delta: -25307
lines.reduction_pct: 66.22%
sloc.before: 28903
sloc.after: 9651
sloc.delta: -19252
sloc.reduction_pct: 66.61%
treatment.FULL_COPY.files: 92
treatment.PROC_TRIM.files: 95
treatment.PROC_TRIM.lines_before: 30901
treatment.PROC_TRIM.lines_after: 7621
treatment.REMOVE.files: 225
treatment.GENERATED.files: 0

How metrics are computed

Treatment | Before | After
FULL_COPY | source lines + SLOC | unchanged
PROC_TRIM | source lines + SLOC | minus dropped-proc spans (incl. DPA + comment block)
REMOVE | source lines + SLOC | 0
GENERATED | 0 | rendered stage .tcl
Pipe-friendly. key: value per line — grep, awk, or capture in CI to track LOC reduction over time as features are added.
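The percentages in the sample report are plain before/after arithmetic. A sketch that reproduces the files numbers above (illustrative; `loc_summary` is a hypothetical helper, not Chopper's reporting code):

```python
def loc_summary(before, after):
    # delta is signed (negative = reduction); reduction_pct is the
    # percentage of the original that was removed.
    delta = after - before
    pct = (100.0 * -delta / before) if before else 0.0
    return {"before": before, "after": after,
            "delta": delta, "reduction_pct": round(pct, 2)}

files = loc_summary(412, 187)
assert files["delta"] == -225
assert files["reduction_pct"] == 54.61   # matches the sample report
```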
chopper · loc report · 15 / 17

CHOPPER AGENT · in-repo agent

16

A purpose-built VS Code Copilot Chat agent at .github/agents/chopper-agent.agent.md — the single user-facing agent for anything Chopper-related, from a convoluted Tcl codebase to a validated, trimmed output.

What it does

  • Q1–Q5 discovery protocol on an unfamiliar codebase
  • Authors base.json, *.feature.json, project.json
  • Runs validate + trim --dry-run, explains results
  • Reads .chopper/ artifacts and tells you what to fix
  • Explains any diagnostic against the registry
  • Files structured bug reports via schemas/scripts/file_bug_report.py

Two operating modes

ANALYZE-ONLY

JSON authoring & review.
No CLI calls, no disk writes.

FULL-LOOP

analyze + validate + dry-run + audit walk.
Live trim only on explicit direction.

Named playbooks

  • Bisect — find which feature broke trim
  • Compare — diff two runs' audit bundles
  • Prove-safe — verify no surviving-set delta

Prompt library — .github/prompts/

bootstrap-domain + validate-my-jsons

Generate starter JSONs, then run schema and overlay-validation checks.

explain-last-run + why-was-dropped

Walk .chopper/ outputs and pinpoint why files/procs were removed.

bisect-feature-breakage + prove-safe

Find the breaking feature and verify no unintended surviving-set delta.

report-chopper-bug + package-bug-artifacts

Draft a complete report and package logs/artifacts for upload.

Plus — MCP read-only surface. chopper mcp-serve exposes three stdio tools: chopper.validate, chopper.explain_diagnostic, chopper.read_audit. Destructive tools (trim, cleanup) are CLI-only.
chopper · companion agent · 16 / 17

Next steps

17

Status

  • Currently in alpha testing — internal validation across pilot domains
  • 2026.06 · beta release targeted

What domain owners do today

  1. Read user_docs/01_OVERVIEW.md + user_docs/02_CLI_GUIDE.md (≈45 min)
  2. Pick the closest examples/ folder for your shape (F1 / F2 / F3 / combo)
  3. Copy jsons/ into your domain root, replace placeholders
  4. Run python schemas/scripts/validate_jsons.py <domain>/
  5. Use the CHOPPER AGENT agent in VS Code Copilot Chat for help authoring
  6. File feedback / bug reports via schemas/scripts/file_bug_report.py

📘 User docs — start here

  • user_docs/README.md — landing + reading order
  • user_docs/01_OVERVIEW.md — problem, F1/F2/F3, JSON, BKMs
  • user_docs/02_CLI_GUIDE.md — every flag, deep examples
  • user_docs/03_HOW_CHOPPER_WORKS.md — pipeline, FAQ

Designed to ramp from this deck in 60–90 min.

🗃️ Technical docs — deep dive

  • technical_docs/ARCHITECTURE.md — authoritative spec
  • technical_docs/JSON_AUTHORING_GUIDE.md — every JSON field
  • technical_docs/CLI_REFERENCE.md — full CLI surface
  • technical_docs/DIAGNOSTIC_CODES.md — every code
  • technical_docs/ENGINEERING.md — module & service catalog
  • technical_docs/IMPLEMENTATION.md — parser internals + pitfalls

📂 Examples & schemas

  • examples/01_base_files_only/
  • examples/02_base_procs_only/
  • examples/07_base_full/
  • examples/09_base_plus_multiple_features/
  • schemas/base-v1.schema.json
  • schemas/feature-v1.schema.json
  • schemas/project-v1.schema.json
Questions? Issues? Authoring help?
Open VS Code Copilot Chat → CHOPPER AGENT.
Follow-up office hours will be scheduled after this kickoff.
chopper · thank you · 17 / 17