
Feature Manifest

Every Mistflow project carries a feature manifest at .mistflow/manifest.json. It’s the bookkeeping layer that turns “a plan” into a verifiable set of promises, and turns “I think it’s done” into “here is what’s still unverified.”

Each acceptance criterion moves through three states:

State        Meaning
planned      The backend LLM generated this criterion when the plan was created. Nothing has been built for it yet.
implemented  The AI has written code it believes satisfies the criterion. Files have been touched; mist_implement marked the step complete.
verified     Evidence exists. QA ran against the live app, or a human confirmed it. Only verified criteria count as delivered.

The gap between implemented and verified is where most regressions live. The manifest makes that gap visible.
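The lifecycle is strictly forward-moving: a criterion never slides back from verified. A minimal sketch of the transition rules (the guard function is illustrative, not Mistflow's actual implementation):

```python
# Illustrative sketch of the criterion lifecycle: planned -> implemented -> verified.
# The transition table mirrors the states described above; the helper is hypothetical.
ALLOWED = {
    "planned": {"implemented"},
    "implemented": {"verified"},
    "verified": set(),  # terminal: verified criteria stay verified
}

def advance(current: str, target: str) -> str:
    """Return the new state, or raise if the transition is not allowed."""
    if target not in ALLOWED[current]:
        raise ValueError(f"cannot move {current} -> {target}")
    return target
```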

As of MCP 0.5.0, the backend LLM generates acceptance criteria as part of the plan. You don’t write them; they come out of mist_plan.

For a habit tracker, the plan might produce something like:

{
  "features": [
    {
      "id": "habit-crud",
      "name": "Habit CRUD",
      "criteria": [
        { "id": "hc-1", "text": "A signed-in user can create a habit with name, target days per week, and color." },
        { "id": "hc-2", "text": "Editing an existing habit updates the row without creating a duplicate." },
        { "id": "hc-3", "text": "Deleting a habit removes it from the list view within 1s." }
      ]
    },
    {
      "id": "streaks",
      "name": "Streak Tracking",
      "criteria": [
        { "id": "sk-1", "text": "Marking a habit complete for today increments its streak by 1." },
        { "id": "sk-2", "text": "Missing a day resets the streak to 0 the next time the habit is loaded." }
      ]
    }
  ]
}

That’s two features, five criteria. Every one starts in planned.

planned → implemented: When mist_implement finishes a step that maps to a feature, the criteria for that feature flip to implemented. The step → feature mapping comes from the plan’s files_affected and feature_id fields.

implemented → verified: When mist_qa drives the live app through the plan’s acceptance criteria, any criterion whose probe passes flips to verified. Criteria can also be marked verified manually via mist_project action=get --mark-verified <criterion-id> if you’ve confirmed them by hand.
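Both transitions are small mutations of the manifest JSON. A hedged sketch, assuming each criterion carries a status field alongside id and text (the helper names are hypothetical, not part of the MCP surface):

```python
# Hypothetical helpers for the two transitions described above.
# mark_implemented: a completed step flips its feature's planned criteria.
# mark_verified: a passing QA probe (or manual confirmation) flips one criterion.

def mark_implemented(manifest: dict, feature_id: str) -> None:
    """Flip every still-planned criterion of one feature to implemented."""
    for feature in manifest["features"]:
        if feature["id"] == feature_id:
            for c in feature["criteria"]:
                if c.get("status", "planned") == "planned":
                    c["status"] = "implemented"

def mark_verified(manifest: dict, criterion_id: str) -> None:
    """Flip a single criterion to verified, wherever it lives."""
    for feature in manifest["features"]:
        for c in feature["criteria"]:
            if c["id"] == criterion_id:
                c["status"] = "verified"
```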

Show the current manifest state:

Terminal window
mist_project action=get

Sample output for the habit tracker mid-build:

Project: habits-app

Feature: Habit CRUD (3 criteria)
  [x] hc-1 A signed-in user can create a habit... (verified)
  [x] hc-2 Editing an existing habit... (implemented)
  [ ] hc-3 Deleting a habit removes it from... (planned)
Feature: Streak Tracking (2 criteria)
  [ ] sk-1 Marking a habit complete... (planned)
  [ ] sk-2 Missing a day resets the streak... (planned)

Progress: 1 verified · 1 implemented · 3 planned

Use --json for a machine-readable version the editor can consume.
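Tooling that consumes the --json output can tally the progress line itself. A sketch, assuming the features/criteria shape shown earlier plus a per-criterion status field:

```python
# Tally criterion statuses across all features, defaulting to "planned"
# for criteria that have no status yet. Shape assumed from the docs above.
from collections import Counter

def progress(manifest: dict) -> Counter:
    return Counter(
        c.get("status", "planned")
        for feature in manifest["features"]
        for c in feature["criteria"]
    )
```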

Projects created before the manifest landed don’t have .mistflow/manifest.json. Seed it from the existing plan:

Terminal window
mist_project action=get --init-from-plan

This reads the plan, extracts features and criteria, and writes a fresh manifest with everything set to planned. Subsequent mist_project action=get calls don't re-initialize; they just show current state.
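Conceptually, seeding is a straight copy of the plan's features with a status stamped on every criterion. An illustrative sketch (the seed_manifest helper is hypothetical, not part of the CLI):

```python
# Hypothetical seeding step: copy the plan's features and criteria,
# marking every criterion as "planned". The real tool also writes the
# result to .mistflow/manifest.json and skips seeding if one exists.
def seed_manifest(plan: dict) -> dict:
    return {
        "features": [
            {
                "id": f["id"],
                "name": f["name"],
                "criteria": [
                    {"id": c["id"], "text": c["text"], "status": "planned"}
                    for c in f["criteria"]
                ],
            }
            for f in plan["features"]
        ]
    }
```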

Show every criterion that is not verified, grouped by feature:

Terminal window
mist_debug

This is the “what’s left to do” list. The editor uses it to pick what to work on next, and QA uses it to know what still needs to be probed. Criteria drop off the list as soon as they’re verified.

Sample output mid-build:

Unverified criteria (4):

Habit CRUD
  [ ] hc-2 Editing an existing habit updates the row without creating a duplicate.
      status: implemented, needs QA verification
  [ ] hc-3 Deleting a habit removes it from the list view within 1s.
      status: planned, not yet implemented
Streak Tracking
  [ ] sk-1 Marking a habit complete for today increments its streak by 1.
      status: planned, not yet implemented
  [ ] sk-2 Missing a day resets the streak to 0 the next time the habit is loaded.
      status: planned, not yet implemented
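The grouping in that report is easy to reproduce from the manifest itself. A sketch, again assuming a per-criterion status field:

```python
# Group every non-verified criterion under its feature name, skipping
# features whose criteria are all verified. Shape assumed from the docs above.
def unverified(manifest: dict) -> dict:
    pending_by_feature = {}
    for feature in manifest["features"]:
        pending = [
            c for c in feature["criteria"]
            if c.get("status", "planned") != "verified"
        ]
        if pending:
            pending_by_feature[feature["name"]] = pending
    return pending_by_feature
```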

Without a manifest, “done” is whatever the AI last said. With one, done means something specific: the criteria are in .mistflow/manifest.json and they’re flipped to verified. Users can look at their own project and see, concretely, which pieces of the original plan were actually delivered and which are still in flight.