L3 Academy

Module 2: The Issue Lifecycle

Master the 7-state lifecycle and label system that powers all L3 work.

Estimated time: 15 minutes

The 7 States

Every issue in the L3 ecosystem follows a strict lifecycle. States cannot be skipped, except for P1 emergencies.

Backlog → Triaged → To Do → In Progress → In Review → Deployed in Dev → Done
| State | What Happens | Time Limit |
| --- | --- | --- |
| Backlog | Issue just landed. Claude auto-triage picks it up within seconds. | Max 24 hours |
| Triaged | Claude has processed it. Waits for the Thursday weekly review, where Ops + Tech Lead decide what enters the next cycle. | Issues not accepted within 7 days are auto-archived Friday |
| To Do | Accepted into the cycle. Assigned to an engineer. | Engineer must pick it up within 48 hours |
| In Progress | Engineer is actively working. A GitHub branch must be linked. | |
| In Review | PR is open, review requested. PR must reference "Fixes #N". | Reviewers have 48 hours |
| Deployed in Dev | Code merged and deployed to the dev environment. This is a mandatory gate; you cannot skip it. QA or the engineer validates in dev before promoting to production. | |
| Done | Deployed to production and verified live. This is the terminal state. | |
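The strict lifecycle above can be sketched as a transition map. This is an illustrative model, not the platform's actual code; the `can_transition` helper and the dictionary structure are assumptions, while the state names come straight from the table.

```python
# The 7-state lifecycle as an allowed-transition map (normal path only;
# the P1 fast-track exception is covered later in this module).
ALLOWED_TRANSITIONS = {
    "Backlog": ["Triaged"],
    "Triaged": ["To Do"],
    "To Do": ["In Progress"],
    "In Progress": ["In Review"],
    "In Review": ["Deployed in Dev"],
    "Deployed in Dev": ["Done"],
    "Done": [],  # terminal state: nothing comes after Done
}

def can_transition(current: str, target: str) -> bool:
    """Return True only if current -> target follows the strict lifecycle."""
    return target in ALLOWED_TRANSITIONS.get(current, [])
```

Note that `can_transition("In Review", "Done")` is false: a merged PR still has to pass the Deployed in Dev gate.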

Key Rule: Merged ≠ Done

A common mistake: marking an issue as Done when the PR is merged. Merged is not Done. The change must be deployed to production and confirmed working. That's why "Deployed in Dev" exists as a mandatory gate.
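The rule can be stated as a simple guard. This is a hedged sketch; the function name and its three boolean inputs are invented for illustration, not fields in the real system.

```python
def can_mark_done(pr_merged: bool, deployed_to_prod: bool, verified_live: bool) -> bool:
    """Merged alone never qualifies as Done; the change must also be
    deployed to production and confirmed working."""
    return pr_merged and deployed_to_prod and verified_live
```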

P1 Fast-Track

For urgent issues (production down, data loss, critical security), the Triaged and To Do states are skipped: the issue goes directly to In Progress. It still must pass through In Review and Deployed in Dev. Ops gets pinged in Slack immediately.

P1 criteria:

  • Active production outage
  • Data corruption in progress
  • Critical security vulnerability being exploited
  • Customer-facing outage with revenue/SLA impact
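The fast-track can be expressed as a variant of the lifecycle path. A minimal sketch, assuming a `lifecycle_path` helper (not a real API): P1 issues enter at In Progress, but In Review and Deployed in Dev remain mandatory.

```python
NORMAL_PATH = ["Backlog", "Triaged", "To Do", "In Progress",
               "In Review", "Deployed in Dev", "Done"]

def lifecycle_path(is_p1: bool) -> list[str]:
    """Return the states an issue will pass through."""
    if is_p1:
        # P1 fast-track: skip straight to In Progress. Review and the
        # dev gate are still required before Done.
        return NORMAL_PATH[3:]
    return NORMAL_PATH
```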

The Label System

Labels are how the platform classifies and routes work. There are four categories:

Type Labels

| Label | When to Use |
| --- | --- |
| Bug | Defect in existing functionality |
| Feature | New capability or enhancement |
| Improvement | Enhancement to existing functionality |
| Spike | Time-boxed investigation; no deliverable code expected |

Platform Labels

| Label | Scope |
| --- | --- |
| FE | Frontend: web client, mobile, UI, CSS, layout |
| BE | Backend: server, API, database, infrastructure |
| Design | UI/UX, visual design, design systems |

Important rule: If a bug shows wrong, missing, or duplicate data — even if it appears in the UI — classify it as BE. Layout, styling, interactions, and responsiveness are FE. When in doubt between FE and BE, lean BE.
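The routing rule above can be sketched as a small heuristic. The keyword lists and function name here are invented for illustration; the platform's actual Claude-based classifier is far richer. What the sketch preserves is the decision order: data signals win even when the symptom is in the UI, and ties fall to BE.

```python
# Invented signal lists -- NOT the platform's real classifier.
DATA_SIGNALS = ("wrong data", "missing data", "duplicate", "stale")
FE_SIGNALS = ("layout", "styling", "css", "responsive", "interaction")

def platform_label(description: str) -> str:
    """Classify an issue description as FE or BE per the routing rule."""
    text = description.lower()
    if any(s in text for s in DATA_SIGNALS):
        return "BE"  # data problems are BE even if they appear in the UI
    if any(s in text for s in FE_SIGNALS):
        return "FE"
    return "BE"  # when in doubt, lean BE
```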

Complexity Labels

Set automatically by the ingestion service's Claude classification:

| Label | Definition |
| --- | --- |
| complexity:low | Single file, mechanical change (rename, config, style tweak, copy fix) |
| complexity:medium | Multiple files, clear scope but requires investigation (new endpoint, multi-file bug fix) |
| complexity:high | Unclear scope; requires design decisions, touches architecture or multiple systems |

AI Workflow Labels

These are the most important labels for understanding automation:

| Label | Meaning | What Happens |
| --- | --- | --- |
| needs-review | Claude's classification confidence was below 0.7. A human must review before any work begins. | Blocks Claude Code automation |
| ai-plan | A human has decided Claude should create an implementation plan (no code). | Triggers claude-code-plan.yml → creates a draft PR with a plan document |
| ai-triaged | A human has decided Claude should fully implement this issue. | Triggers claude-code.yml → Claude implements and opens a PR |
| human-only | Requires human judgment. Claude should not auto-triage or implement. | No automation triggers |

Priority (Not a Label)

Priority uses Linear's built-in Priority field, not labels:

  • Urgent (P1): Production down, data loss, critical security
  • High (P2): Blocks work, client regression, significant friction
  • Medium (P3): Planned work, normal priority (default)
  • Low (P4): Nice-to-have, future consideration
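Since priority lives in Linear's built-in field rather than a label, it arrives as a number. Linear's API encodes priority as 1 = Urgent through 4 = Low (0 means no priority set); the helper below is an illustrative assumption, not platform code.

```python
# Mapping Linear's numeric priority field to the P-levels above.
PRIORITY_NAMES = {
    1: "Urgent (P1)",
    2: "High (P2)",
    3: "Medium (P3)",  # the default for planned work
    4: "Low (P4)",
}

def is_fast_track(priority: int) -> bool:
    """Only Urgent (P1) issues qualify for the fast-track lifecycle."""
    return priority == 1
```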

Check Your Understanding

1. When an issue is first ingested via webhook, what AI workflow label does it receive?
2. Which of these are AI workflow labels? (Select all that apply)
   • needs-review
   • ai-plan
   • complexity:high
   • ai-triaged
   • human-only
3. An issue has been merged to main. What state should it be in?

Checkpoints

  • I can name all 7 lifecycle states in order
  • I've looked at 3 real issues in Linear and identified their current states
  • I know the difference between ai-plan, ai-triaged, needs-review, and human-only

Module Assessment

1. How many lifecycle states does the L3 issue system have?

2. What does the 'ai-plan' label trigger?

3. A bug report shows incorrect data in a table UI. What platform label should it get?

4. What happens to issues in Triaged state that aren't accepted within 7 days?

5. Which label blocks Claude Code automation on comment triggers?