# Workflow Basics
This chapter covers the fundamentals of creating Prodigy workflows. You’ll learn about workflow structure, basic commands, and configuration options.
## Overview
Prodigy workflows are YAML files that define a sequence of commands to execute. They can be as simple as a list of shell commands or as complex as parallel MapReduce jobs.
**Two Main Workflow Types:**
- Standard Workflows: Sequential command execution (covered here)
- MapReduce Workflows: Parallel processing with map/reduce phases (see MapReduce chapter)
## Simple Workflows
The simplest workflow is just an array of commands:
```yaml
# Simple array format - just list your commands
- shell: "echo 'Starting workflow...'"
- claude: "/prodigy-analyze"
- shell: "cargo test"
```
This executes each command sequentially. No additional configuration needed.
## Full Workflow Structure
For more complex workflows, use the full format with explicit configuration:
```yaml
# Full format with environment and merge configuration
commands:
  - shell: "cargo build"
  - claude: "/prodigy-test"

# Global environment variables (available to all commands)
env:
  NODE_ENV: production
  API_URL: https://api.example.com

# Secret environment variables (masked in logs)
secrets:
  API_KEY: "${env:SECRET_API_KEY}"

# Environment files to load (.env format)
env_files:
  - .env.production

# Environment profiles (switch contexts easily)
profiles:
  development:
    NODE_ENV: development
    DEBUG: "true"

# Custom merge workflow (for worktree integration)
merge:
  commands:
    - shell: "git fetch origin"
    - claude: "/merge-worktree ${merge.source_branch}"
  timeout: 600  # Optional timeout in seconds
```
## Available Fields
Standard workflows support these top-level fields:
| Field | Type | Required | Description |
|-------|------|----------|-------------|
| `commands` | Array | Yes* | List of commands to execute sequentially |
| `env` | Map | No | Global environment variables |
| `secrets` | Map | No | Secret environment variables (masked in logs) |
| `env_files` | Array | No | Paths to `.env` files to load |
| `profiles` | Map | No | Named environment profiles |
| `merge` | Object | No | Custom merge workflow for worktree integration |

*Note: `commands` is only required in the full format. The simple array format doesn't use the `commands` key.
## Command Types
Prodigy supports several types of commands in workflows:
### Core Commands
**`shell:`** - Execute shell commands

```yaml
- shell: "cargo build --release"
- shell: "npm install"
```

**`claude:`** - Invoke Claude Code commands

```yaml
- claude: "/prodigy-lint"
- claude: "/analyze codebase"
```
### Advanced Commands
- **`goal_seek:`** - Goal-seeking operations with validation (see Advanced Features)
- **`foreach:`** - Iterate over lists with nested commands (see Advanced Features)
- **`validate:`** - Validation steps with configurable thresholds (see Commands)

**Deprecated:**

- **`test:`** - Deprecated in favor of `shell:` with `on_failure:` handlers
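Migrating from the deprecated `test:` command is mechanical: run the test under `shell:` and attach the recovery step as an `on_failure:` handler. A minimal sketch (the `/prodigy-debug` command name here is hypothetical):

```yaml
# Equivalent of the old test: command
- shell: "cargo test"
  on_failure:
    claude: "/prodigy-debug"  # hypothetical debugging command
```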
For detailed information on each command type and their fields, see the Command Types chapter.
## Command-Level Options
All command types support additional fields for advanced control:
### Basic Options
- shell: "cargo test"
id: "run-tests" # Step identifier for output referencing
commit_required: true # Expect git commit after this step
timeout: 300 # Timeout in seconds
### Conditional Execution
Run commands based on conditions:
- shell: "deploy.sh"
when: "${branch} == 'main'" # Only run on main branch
### Error Handling
Handle failures gracefully:
- shell: "risky-command"
on_failure:
shell: "cleanup.sh" # Run on failure
on_success:
shell: "notify.sh" # Run on success
### Output Capture
Capture command output to variables:
- shell: "git rev-parse HEAD"
id: "get-commit"
capture: "commit_hash" # Capture to variable
capture_format: "string" # Format: string|json|lines|number|boolean
For comprehensive coverage of these options, see:
- Advanced Features - Conditional execution, output capture, timeouts
- Error Handling - on_failure and on_success handlers
- Variables - Variable interpolation and capture formats
## Environment Configuration
Environment variables can be configured at multiple levels:
### Global Environment Variables
```yaml
env:
  NODE_ENV: production
  DATABASE_URL: postgres://localhost/mydb
```
### Secret Variables
Secret variables are masked in logs for security:
```yaml
secrets:
  API_KEY: "${env:SECRET_API_KEY}"
  DB_PASSWORD: "${env:DATABASE_PASSWORD}"
```
### Environment Files
Load variables from .env files:
```yaml
env_files:
  - .env
  - .env.production
```
### Environment Profiles
Switch between different environment contexts:
```yaml
profiles:
  development:
    NODE_ENV: development
    DEBUG: "true"
    API_URL: http://localhost:3000
  production:
    NODE_ENV: production
    DEBUG: "false"
    API_URL: https://api.example.com
```
Activate a profile with `prodigy run --profile development`.
For more details, see the Environment Variables chapter.
## Merge Workflows
Merge workflows execute when merging worktree changes back to the main branch. This feature enables custom validation, testing, and conflict resolution before integrating changes.
**When to use merge workflows:**
- Run tests before merging
- Validate code quality
- Handle merge conflicts automatically
- Sync with upstream changes
```yaml
merge:
  commands:
    - shell: "git fetch origin"
    - shell: "git merge origin/main"
    - shell: "cargo test"
    - claude: "/prodigy-merge-worktree ${merge.source_branch}"
  timeout: 600  # Optional: overall timeout for merge workflow
```
**Available merge variables:**

- `${merge.worktree}` - Worktree name (e.g., `prodigy-session-abc123`)
- `${merge.source_branch}` - Source branch (worktree branch)
- `${merge.target_branch}` - Target branch (usually main/master)
- `${merge.session_id}` - Session ID for correlation
These variables are only available within the merge workflow context.
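These variables let the merge steps name the branches involved instead of hard-coding them. A sketch (the exact git strategy is project-specific):

```yaml
merge:
  commands:
    - shell: "git fetch origin"
    # Bring the worktree branch up to date with the target before testing
    - shell: "git merge origin/${merge.target_branch}"
    - shell: "cargo test"
```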
## Complete Example
Here’s a complete workflow combining multiple features:
```yaml
# Environment configuration
env:
  RUST_BACKTRACE: 1

env_files:
  - .env

profiles:
  ci:
    CI: "true"
    VERBOSE: "true"

# Workflow commands
commands:
  - shell: "cargo fmt --check"
  - shell: "cargo clippy -- -D warnings"
  - shell: "cargo test --all"
  - claude: "/prodigy-lint"

# Custom merge workflow
merge:
  commands:
    - shell: "cargo test"
    - claude: "/prodigy-merge-worktree ${merge.source_branch}"
  timeout: 300
```
## Next Steps
Now that you understand basic workflows, explore these topics:
- Command Reference - Detailed guide to all command types and options
- Environment Variables - Advanced environment configuration
- Error Handling - Handle failures gracefully
- MapReduce Workflows - Parallel processing for large-scale tasks
- Conditional Execution - Run commands based on conditions
- Output Capture - Capture and use command outputs