Track progress, catch silent failures, and get alerts for your Celery tasks, scheduled jobs, and long-running processes.
```python
import taskbadger as tb

task = tb.Task.create("data-export")
task.update(value=75, status="processing")
```
TaskBadger was a huge help for Scriv.ai
Being able to get visibility into the long-running indexing jobs and get notified when things broke saved me hours of time and stress.
Cory Zue
Founder, Scriv.ai
Three ways to start monitoring. Pick whichever fits your stack.
Install the package:

```shell
uv add taskbadger
```
Initialize in your app:

```python
import taskbadger
from taskbadger.systems import CelerySystemIntegration

taskbadger.init(
    organization_slug="my-org",
    project_slug="my-project",
    token="***",
    systems=[CelerySystemIntegration()],
    tags={"environment": "production"},
)
```
Or track tasks manually with the SDK:

```python
import taskbadger as tb

task = tb.Task.create("data-export")
task.update(value=50, status="processing")
task.update(value=100, status="success")
```
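For longer jobs, the progress value can be computed from work completed. A minimal stdlib-only sketch of that pattern: the chunking helper and the stub `report` callable below are hypothetical illustrations, not part of the SDK; in real code the callback would be the `task.update` method shown above.

```python
def export_in_chunks(rows, chunk_size, report):
    """Process rows in chunks, reporting integer percent progress
    through the report callback after each chunk."""
    total = len(rows)
    done = 0
    for start in range(0, total, chunk_size):
        chunk = rows[start:start + chunk_size]
        # ... write this chunk to the export file ...
        done += len(chunk)
        report(value=int(100 * done / total), status="processing")
    report(value=100, status="success")

# Stub standing in for task.update: just record the calls.
updates = []
export_in_chunks(list(range(250)), 100, lambda **kw: updates.append(kw))
# updates → 40%, 80%, 100% while processing, then a final success update
```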
Run directly with uvx:

```shell
uvx taskbadger run "nightly-backup" -- ./backup.sh
```
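Because the CLI wrapper needs no code changes, it pairs naturally with cron. A hypothetical crontab entry (assuming `uvx` is on cron's PATH and `backup.sh` is in the cron user's working directory):

```
# m h dom mon dow  command
0 2 * * * uvx taskbadger run "nightly-backup" -- ./backup.sh
```

Each scheduled run is then tracked like any other task.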
That's it. Tasks appear in your dashboard automatically.
Know immediately when tasks fail, stall, or never start.
See where long-running jobs stand, from anywhere.
Trends over time, not just what's running right now.
See how it compares to the tools you might already be using.
| | Flower | Sentry | Logs | Task Badger |
|---|---|---|---|---|
| Task progress tracking | | | | |
| Silent failure detection | | | | |
| Persistent history & trends | | | | |
| Configurable alerts | | | | |
| Not just errors | — | | | |
| No infra to manage | | | | |
More integrations coming soon.
3,307,000+
tasks monitored
No credit card required to get started.
Install the SDK, add two lines of config, and see your tasks in the dashboard.