Zero dependencies. Language agnostic. Self-contained.
Docs | Quick Start | Features | Installation | Community
Dagu solves the problem of complex workflow orchestration without requiring a dedicated infrastructure team. Unlike traditional workflow engines that demand databases, message queues, and careful operational overhead, Dagu runs as a single binary with zero external dependencies.
After managing hundreds of cron jobs across multiple servers, I built Dagu to bring sanity to workflow automation. It handles scheduling, dependencies, error recovery, and monitoring - everything you need for production workflows, without the complexity.
- Local‑first. Workflows should run offline on laptops, air‑gapped servers, or the cloud—your choice.
- Zero footprint. One static binary; no databases, brokers, or sidecars.
- Bring‑your‑own language. Bash, Python, Go, or anything else just works.
- DAG definition - Express complex dependencies in readable YAML
- Scheduling - Cron expressions with timezone support
- Queueing - Control concurrency with named queues
- Error handling - Retries, failure handlers, cleanup hooks
- Conditional execution - Run steps based on conditions
- Parallel execution - Control concurrent step execution
- Variables & Parameters - Pass data between steps, parameterize workflows
- Docker support - Run steps in containers
- SSH executor - Execute commands on remote hosts
- HTTP requests - Integrate with APIs
- Email notifications - SMTP integration
- Hierarchical workflows - Nest DAGs to any depth
- Authentication - Basic auth, API tokens, TLS
- Web UI - Real-time monitoring and control
- REST API - Full programmatic access
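Several of these features compose naturally in a single workflow file. The sketch below combines scheduling, retries, output variables, and preconditions, using only fields that appear elsewhere in this README; the script names are hypothetical:

```yaml
name: nightly-report
schedule: "0 3 * * *"         # cron expression, timezone-aware
steps:
  - name: fetch
    command: ./fetch_data.sh   # hypothetical script
    output: REPORT_INPUT       # capture stdout for later steps
    retryPolicy:
      limit: 3                 # retry transient failures
  - name: render
    command: ./render.sh ${REPORT_INPUT}
    preconditions:
      - condition: "`date +%u`"
        expected: "re:[1-5]"   # weekdays only
```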
# Install
curl -L https://raw.githubusercontent.com/dagu-org/dagu/main/scripts/installer.sh | bash
# Create your first workflow
mkdir -p ~/.config/dagu/dags
cat > ~/.config/dagu/dags/hello.yaml << 'EOF'
steps:
  - name: hello
    command: echo "Hello from Dagu!"
  - name: world
    command: echo "Running step 2"
EOF
# Execute it
dagu start hello
# Check the status
dagu status hello
# Start the web UI
dagu start-all
# Visit http://localhost:8080
# Latest
curl -L https://raw.githubusercontent.com/dagu-org/dagu/main/scripts/installer.sh | bash
# Specific version
curl -L https://raw.githubusercontent.com/dagu-org/dagu/main/scripts/installer.sh | bash -s -- --version v1.17.0
# Install to a specific directory
curl -L https://raw.githubusercontent.com/dagu-org/dagu/main/scripts/installer.sh | bash -s -- --prefix /path/to/install
# Homebrew
brew install dagu-org/brew/dagu
docker run -d \
  --name dagu \
  -p 8080:8080 \
  -v ~/.dagu:/var/lib/dagu \
  ghcr.io/dagu-org/dagu:latest dagu start-all
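If you prefer Compose, the same container can be expressed as a minimal `docker-compose.yml`. This is a sketch assuming the same image, port, and volume mapping as the `docker run` command above:

```yaml
services:
  dagu:
    image: ghcr.io/dagu-org/dagu:latest
    command: dagu start-all
    ports:
      - "8080:8080"
    volumes:
      - ~/.dagu:/var/lib/dagu
    restart: unless-stopped
```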
Download the binary from the releases page and add it to your PATH.
- Getting Started - Tutorial and first steps
- Core Concepts - Architecture and design
- Writing Workflows - Complete authoring guide
- CLI Reference - Command-line usage
- API Reference - REST API documentation
- Configuration - Configuration options
- Changelog - Release notes and version history
Find more in our examples documentation.
name: daily-etl
schedule: "0 2 * * *"
steps:
  - name: extract
    command: python extract.py
    output: DATA_FILE
  - name: validate
    command: python validate.py ${DATA_FILE}
  - name: transform
    command: python transform.py ${DATA_FILE}
    retryPolicy:
      limit: 3
  - name: load
    command: python load.py ${DATA_FILE}
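The `output: DATA_FILE` field captures a step's stdout into a variable that later steps reference as `${DATA_FILE}`. Conceptually it behaves like shell command substitution; the sketch below is illustrative only, and the file path is hypothetical:

```shell
# A step with `output: DATA_FILE` roughly does this: run the command,
# capture its stdout, and expose the value to later steps.
DATA_FILE=$(echo "/tmp/extracted.csv")   # stands in for `python extract.py`
echo "validate: $DATA_FILE"              # prints: validate: /tmp/extracted.csv
```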
steps:
  - name: data-pipeline
    run: etl
    params: "ENV=prod REGION=us-west-2"
  - name: parallel-jobs
    run: batch
    parallel:
      items: ["job1", "job2", "job3"]
      maxConcurrency: 2
    params: "JOB=${ITEM}"

---
name: etl
params:
  - ENV
  - REGION
steps:
  - name: process
    command: python etl.py --env ${ENV} --region ${REGION}

---
name: batch
params:
  - JOB
steps:
  - name: process
    command: python process.py --job ${JOB}
name: ml-pipeline
steps:
  - name: prepare-data
    executor:
      type: docker
      config:
        image: python:3.11
        autoRemove: true
        volumes:
          - /data:/data
    command: python prepare.py
  - name: train-model
    executor:
      type: docker
      config:
        image: tensorflow/tensorflow:latest-gpu
    command: python train.py
  - name: deploy
    command: kubectl apply -f model-deployment.yaml
    preconditions:
      - condition: "`date +%u`"
        expected: "re:[1-5]" # Weekdays only
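The deploy step's precondition runs `date +%u` (ISO weekday: 1 = Monday through 7 = Sunday) and matches its output against the regex after the `re:` prefix. The check can be sketched in plain shell; the helper function here is illustrative, not Dagu's implementation:

```shell
# Illustrative only: evaluate a Dagu-style precondition by matching
# a command's output against an anchored regex.
check_precondition() {
  output="$1"; regex="$2"
  echo "$output" | grep -Eq "^(${regex})$" && echo met || echo unmet
}
check_precondition "3" '[1-5]'   # Wednesday: prints "met"
check_precondition "6" '[1-5]'   # Saturday: prints "unmet"
```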
- Data Engineering - ETL pipelines, data validation, warehouse loading
- Machine Learning - Training pipelines, model deployment, experiment tracking
- DevOps - CI/CD workflows, infrastructure automation, deployment orchestration
- Media Processing - Video transcoding, image manipulation, content pipelines
- Business Automation - Report generation, data synchronization, scheduled tasks
- Run steps in a DAG across multiple machines (distributed execution)
- Track artifacts by dropping files in $DAGU_ARTIFACTS
- Pause executions for webhooks, approvals, or any event (human-in-the-loop, event-driven workflows)
- Integrate with AI agents via MCP (Model Context Protocol)
Prerequisites: Go 1.24+, Node.js, pnpm
git clone https://github.com/dagu-org/dagu.git && cd dagu
make build
make run
Contributions are welcome. See our documentation for development setup.
GNU GPLv3 - See LICENSE
If you find Dagu useful, please ⭐ star this repository