Adding Umami analytics to the OpenClaw morning brief
An agent skill that fetches traffic data from Umami, and how it fits into a daily automated briefing.
I run OpenClaw as a personal assistant on a VPS. Every morning it sends a briefing to Discord: weather, calendar events, issues in review. I wrote about the calendar sync setup previously.
One thing that was missing: traffic data. I run Umami for analytics on my sites, and checking how they’re doing meant opening Umami separately. Not a big deal, but it’s the kind of thing that belongs in a morning brief.
So I built an agent skill for it.
The skill
The umami skill fetches analytics from any Umami instance — self-hosted or Umami Cloud. It wraps the Umami API into a shell script that Claude Code or any agent can call, and formats the output as a summary table.
Install it with:
```
npx skills add martinhjartmyr/skills --skill umami
```
Configuration
Set these environment variables:
| Variable | Required | Description |
|---|---|---|
| `UMAMI_API_URL` | Yes | Base URL of your Umami instance (e.g. https://umami.example.com) |
| `UMAMI_API_KEY` | One of these | API key for Umami Cloud |
| `UMAMI_USERNAME` | One of these | Username for self-hosted login |
| `UMAMI_PASSWORD` | With username | Password for self-hosted login |
If UMAMI_API_KEY is set, it takes precedence over username/password.
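The precedence rule can be sketched as a small helper. This is a hypothetical illustration, not the skill's actual code; the real script would exchange the username/password for a session token via the Umami API.

```shell
#!/usr/bin/env sh
# Hypothetical helper showing the credential precedence: an API key wins,
# username/password is the fallback, anything else is an error.
auth_mode() {
  if [ -n "${UMAMI_API_KEY:-}" ]; then
    echo "api-key"
  elif [ -n "${UMAMI_USERNAME:-}" ] && [ -n "${UMAMI_PASSWORD:-}" ]; then
    echo "login"
  else
    echo "error: set UMAMI_API_KEY or UMAMI_USERNAME/UMAMI_PASSWORD" >&2
    return 1
  fi
}

# Even with a username/password present, the API key takes precedence:
export UMAMI_API_KEY=key123 UMAMI_USERNAME=admin UMAMI_PASSWORD=secret
auth_mode   # prints "api-key"
```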
Two modes
The default mode fetches a full summary comparing today’s stats against the previous period:
| Website | Domain | Pageviews | Visitors | Visits | Bounces | Avg Time | Active |
|---|---|---|---|---|---|---|---|
| My Blog | blog.example.com | 1,234 (980) | 567 (510) | 890 (801) | 123 (110) | 45s (38s) | 3 |
| Portfolio | example.com | 456 (390) | 210 (180) | 320 (290) | 89 (75) | 32s (28s) | 1 |
Numbers in parentheses are from the previous period.
There’s also --active-only for a quick check of who’s on the site right now, skipping the heavier stats API calls.
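The "previous period" comparison implies two adjacent time windows. A sketch of computing them, assuming 24-hour windows and epoch milliseconds (the unit Umami's stats endpoints take for `startAt`/`endAt`); this is illustrative, not the skill's actual code:

```shell
#!/usr/bin/env sh
# Compute the current and previous 24h windows in epoch milliseconds.
now_ms=$(( $(date +%s) * 1000 ))
day_ms=$(( 24 * 60 * 60 * 1000 ))

start_at=$(( now_ms - day_ms ))       # current window: last 24 hours
prev_start=$(( start_at - day_ms ))   # previous window: the 24h before that
prev_end=$start_at

echo "current:  startAt=$start_at endAt=$now_ms"
echo "previous: startAt=$prev_start endAt=$prev_end"
```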
Adding it to the morning brief
The morning brief is a cron job that runs OpenClaw with a prompt describing what to include. OpenClaw gathers data from each source, assembles it, and posts the result to Discord.
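For context, the schedule itself is a single crontab line. This is a hypothetical sketch; the actual OpenClaw command, flags, and paths will differ:

```cron
# 07:00 every day; stdout/stderr logged for debugging (placeholder paths)
0 7 * * * /usr/local/bin/openclaw ~/briefs/morning.txt >> ~/logs/brief.log 2>&1
```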
Here’s the structure with all four sections:
```yaml
sections:
  weather:
    source: Open-Meteo API
    format: Today's forecast with temperature range and conditions
  calendar:
    source: node bin/cal-query.mjs --days 2
    format: Events for today and tomorrow
  issues:
    source: Overvy API
    format: Issues currently in review, sanitized titles
  traffic:
    source: umami skill (umami-summary.sh)
    format: Pageviews, visitors, visits with previous-period comparison
```
The actual cron prompt looks something like this (credentials replaced):
```
Run the morning brief. Gather and format these sections:

1. Weather: Fetch forecast from Open-Meteo for lat=XX.XX&lon=XX.XX.
   Show today's high/low and conditions.
2. Calendar: Run `node bin/cal-query.mjs --days 2`.
   Show today and tomorrow's events.
3. Issues: Fetch issues in review from Overvy API at OVERVY_API_URL.
   List titles only, no descriptions.
4. Traffic: Run `bash ~/skills/umami/scripts/umami-summary.sh`.
   Format as a markdown table with previous-period comparison.

Post the assembled brief to Discord channel CHANNEL_ID.
Keep it compact. No commentary, just the data.
```
OpenClaw calls each source, formats the results, and sends a single Discord message.
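The assembly step can be pictured as plain string concatenation. This is a sketch under assumptions, not OpenClaw's actual mechanism; the section contents are placeholders, and OpenClaw handles the Discord delivery itself:

```shell
#!/usr/bin/env sh
# Sketch: each section's formatted text lands in a variable, and one
# combined message is built before posting.
weather="Partly cloudy, 2 to 7C."
calendar="09:00 - 09:30 Standup"
traffic="blog: 1,842 pageviews (prev 1,650)"

brief=$(printf '**Weather**\n%s\n\n**Calendar**\n%s\n\n**Traffic (24h)**\n%s' \
  "$weather" "$calendar" "$traffic")
echo "$brief"
```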
Results
Before, the morning brief covered weather, calendar, and issues. Checking site traffic was a separate task.
Now it’s part of the same message:
Good morning. Here's your brief for Friday, Feb 28.
Weather
Partly cloudy, 2 to 7C. Light wind from the west.
Calendar
09:00 - 09:30 Standup
14:00 - 15:00 Sprint review
Tomorrow: No events
Issues in Review
FE-421 Update nav component
BE-198 Rate limiting middleware
Traffic (24h)
| Site | Pageviews | Visitors | Bounces | Avg Time |
|---|---|---|---|---|
| blog | 1,842 (1,650) | 623 (580) | 312 (290) | 52s (47s) |
| portfolio | 456 (390) | 210 (180) | 89 (75) | 32s (28s) |
Source code
The skill is open source at github.com/martinhjartmyr/skills. You can install individual skills from the repo:
```
npx skills add martinhjartmyr/skills --skill umami
```