Tokscale: Track AI Token Usage Across Multiple Platforms – CLI Tool
In an era where AI‑assisted coding is everywhere, tokens have become the new currency of productivity. Every prompt you send to Claude, Gemini, Codex, or any other LLM‑powered assistant consumes tokens, and those tokens cost real money. Developers who don't track usage quickly find themselves staring at unexpected bills.
Enter Tokscale – a lightweight, open‑source command‑line utility that automatically aggregates token usage from a wide array of tools: OpenCode, Claude Code, Codex CLI, Gemini CLI, Cursor IDE, AmpCode, Factory Droid, and OpenClaw. With a single command, Tokscale pulls data from your local session folders, queries LiteLLM’s pricing database, and produces tidy reports and beautiful visualizations.
✅ Why Tokscale?
- 20+ supporters on GitHub
- 508 ⭐
- 7‑day views + real‑time pricing from LiteLLM
- Interactive terminal UI (OpenTUI)
- Leaderboard & social platform
- Native Rust core (10× faster parsing)
- Export to JSON for dashboards
Getting started
Tokscale is distributed via Bun – a fast JavaScript/TypeScript runtime that comes with a built‑in package manager. If you’re not on Bun yet, quickly install it with:
curl -fsSL https://bun.sh/install | bash
1. Install Tokscale
bunx tokscale@latest
⚡ Tip: bunx runs the package directly without adding it to your project – perfect for one‑off usage.
2. Run the interactive TUI
tokscale
You’ll be greeted with an eye‑catching dashboard featuring:
- Overview: total tokens, cost breakdown, and a quick‑view of the most‑used models.
- Models: cost and token counts per model.
- Daily: a day‑by‑day breakdown.
- Stats: a familiar GitHub‑style contribution graph.
Use the keyboard (1–4 to jump tabs, arrow keys to navigate) or mouse (clickable tabs). For a lightweight, non‑interactive version, add the --light flag:
tokscale --light
3. Filtering and date ranges
Tokscale shines when you want to slice your data:
# Show only Claude Code usage from the last week
tokscale --claude --week
# Restrict to a custom date range
tokscale --since 2024-01-01 --until 2024-12-31
# Export this month's per-model stats as JSON
tokscale models --month --github --json > monthly.json
The CLI accepts short flags for each data source (--opencode, --codex, --cursor, etc.), making it trivial to compare a single platform or a mix.
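The JSON export is handy for post‑processing in scripts. Here is a minimal Python sketch that totals tokens and cost from an exported file; the field names (`model`, `tokens`, `cost`) are assumptions for illustration and may differ from Tokscale's actual schema:

```python
import json

# Sample data mimicking a hypothetical `tokscale models --json` export;
# the real schema may use different keys.
sample = '''
[
  {"model": "claude-3-5-sonnet", "tokens": 120000, "cost": 1.85},
  {"model": "gpt-4o", "tokens": 45000, "cost": 0.60}
]
'''

entries = json.loads(sample)

# Aggregate totals across all models in the export.
total_tokens = sum(e["tokens"] for e in entries)
total_cost = sum(e["cost"] for e in entries)

print(f"{total_tokens} tokens, ${total_cost:.2f}")  # 165000 tokens, $2.45
```

In practice you would replace the embedded sample with `json.load(open("monthly.json"))` and feed the totals into whatever dashboard you use.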
4. Real‑time pricing lookup
tokscale pricing "claude-3-5-sonnet"
Tokscale uses LiteLLM as its primary source and falls back to OpenRouter for any new or niche models. The output shows input, output, and cached token pricing in USD.
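Per‑token prices make the cost arithmetic straightforward. The sketch below shows how a session's cost follows from the prices a lookup like this returns; the dollar figures are illustrative, not actual LiteLLM values:

```python
# Illustrative per-token prices in USD (not actual LiteLLM values).
input_price = 3.00 / 1_000_000    # $3 per million input tokens
output_price = 15.00 / 1_000_000  # $15 per million output tokens

def session_cost(input_tokens: int, output_tokens: int) -> float:
    """Cost of one session at the prices above."""
    return input_tokens * input_price + output_tokens * output_price

# A session with 10k input and 2k output tokens:
print(f"${session_cost(10_000, 2_000):.4f}")  # $0.0600
```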
5. Social features – Leaderboard & Profile
Tokscale isn't just a local statistics tool. It includes a web‑based platform that lets you:
- Login via GitHub OAuth.
- Submit your aggregated data to a global leaderboard.
- Browse other developers’ profiles, compare token usage, and see trends.
- View custom “Wrapped” year‑in‑review images.
tokscale login # Opens browser for GitHub auth
tokscale submit # Pushes data to the leaderboard
tokscale wrapped # Generates the Spotify‑inspired image
Once you’re on the platform, you can also archive old data or fine‑tune which data points to submit.
Integrating with CI/CD
Tokscale’s headless mode captures token usage from tools that output JSON to stdout—great for continuous‑integration pipelines. For example:
- name: Run AI check
  run: |
    mkdir -p ~/.config/tokscale/headless/codex
    codex exec --json "review code changes" > ~/.config/tokscale/headless/codex/pr-123.jsonl
- name: Collect usage
  run: tokscale --json > report.json
The JSON file can be fed into metrics dashboards or CI badges.
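One simple use of that report is a budget gate that fails the pipeline when a PR burns too many dollars. A sketch, assuming a hypothetical report shape with a `total_cost` field (the real keys may differ):

```python
import json

# Hypothetical shape of a `tokscale --json` report; real keys may differ.
# In CI you would load this with json.load(open("report.json")).
report = json.loads('{"total_tokens": 250000, "total_cost": 3.10}')

# Fail the pipeline if this run exceeded the allotted token budget.
BUDGET_USD = 5.00
over_budget = report["total_cost"] > BUDGET_USD

print("over budget" if over_budget else "within budget")  # within budget
```

Exiting non‑zero when `over_budget` is true turns this into a CI check; the same numbers can feed a badge or a metrics dashboard.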
Contributing
Tokscale is open to contributions—whether you want to add support for a new AI tool, improve the Rust core, or tweak the UI. The repo follows standard GitHub workflows:
- Fork the repo.
- Create a feature branch.
- Run the tests (bun run test:all).
- Submit a pull request.
The community welcomes bug reports and feature requests—just open an issue.
TL;DR
- What? CLI to track token usage from many LLM assistants.
- Why? Tokens = cost; see where they go in real time.
- How? Install with Bun, run tokscale, filter or export.
- Beyond CLI – Leaderboard, graphs, CI integration.
- Open source – MIT license, active community.
Final Thoughts
Tokscale is more than a metrics tool; it's a way to stay conscious of AI costs while coding, and to benchmark productivity across services. With its native Rust back‑end and interactive TUI, it feels like a productivity IDE extension in terminal form. If you're on the AI frontier, give Tokscale a whirl and share your usage on the leaderboard.