Blog Watcher
Monitor blogs and RSS/Atom feeds for updates using the blogwatcher CLI.
BlogWatcher is a lightweight Go CLI for tracking blog updates and RSS/Atom feeds. It's designed for people who want to stay on top of their favorite blogs without using a full RSS reader app. Add blogs, scan for new posts, and manage read/unread status — all from the terminal.
What makes BlogWatcher smart is its dual-source approach: it tries RSS/Atom feeds first, then falls back to HTML scraping if no feed is available. It can auto-discover feed URLs from blog homepages, and you can provide custom CSS selectors for blogs that don't have feeds at all. This means it works with virtually any blog, not just those with proper RSS support.
For OpenClaw users, BlogWatcher is perfect for automated monitoring workflows. Combine it with cron to scan blogs periodically and get notified when your favorite authors publish new content. The read/unread tracking prevents duplicate notifications, and the colored CLI output makes it easy to spot new articles.
Built by Hyaxia, it's a focused tool that does one thing well. No AI summaries, no recommendation algorithms — just clean feed monitoring. If you want AI-powered summaries on top, pair it with the summarize skill.
Best suited for: developers following tech blogs, content curators tracking multiple sources, anyone who prefers CLI RSS over apps like Feedly or NetNewsWire.
Tags: rss, blog, monitoring, feeds
Category: Monitoring
Use Cases
- Monitor competitor blogs for new content
- Track tech blogs and get notified of new posts via OpenClaw
- Automated content curation pipeline: scan → summarize → share
- Keep up with open-source project changelogs
- Morning briefing: check for new articles from tracked blogs
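The curation pipeline above (scan → summarize → share) might be wired up like this. Only the `blogwatcher` commands come from the tool itself; `summarize` and `share` are placeholders for whatever summarization and publishing tools you pair with:

```shell
#!/bin/sh
# Sketch of a scan -> summarize -> share pipeline.
# `summarize` and `share` are hypothetical stand-ins, not real commands.
blogwatcher scan
blogwatcher articles | summarize | share
```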
Tips
- Use `blogwatcher scan` in an OpenClaw cron job for automated monitoring
- For blogs without RSS, use `--scrape-selector` with a CSS selector targeting article links
- Pair with the summarize skill to get AI summaries of new articles
- Use `blogwatcher articles --blog 'Name'` to filter by specific blog
- Run `blogwatcher read-all --blog 'Name' --yes` to batch mark articles as read
- Check `blogwatcher scan` output for feed discovery issues before assuming no new posts
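The cron tip above can be sketched as a crontab entry; the two-hour interval and the log path are illustrative choices, not requirements:

```shell
# Crontab sketch: scan every 2 hours, appending output to a log so feed
# discovery issues can be reviewed later. The log location is arbitrary.
0 */2 * * * blogwatcher scan >> "$HOME/blogwatcher.log" 2>&1
```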
Known Issues & Gotchas
- HTML scraping requires specifying the correct CSS selector — may break if blog layout changes
- No built-in notification system — you need to pair with cron or scripts for alerts
- Database is local — no cloud sync between machines
- Auto-discovery of RSS feeds doesn't work for all blogs
- No built-in filtering or keyword matching — you get all articles from tracked blogs
- Still early-stage (v0.0.3) — expect API changes
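Since there is no built-in keyword matching, a `grep` stage can approximate it. The sketch below uses `printf` to stand in for `blogwatcher articles` output; the one-title-per-line format is an assumption, so adjust the pattern to the real output:

```shell
# Simulated article titles; in practice, replace the printf with
# `blogwatcher articles` (output format assumed: one title per line).
printf '%s\n' "Profiling Go programs" "Rust async deep dive" "Weekly links" \
  | grep -i "rust"
```

This keeps only lines matching the keyword, case-insensitively, so downstream steps (notification, summarization) see just the articles you care about.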
Alternatives
- Miniflux
- Newsboat
- Feedly
- Tiny Tiny RSS (tt-rss)
Community Feedback
A Go CLI tool to track blog articles, detect new posts, and manage read/unread status. Supports both RSS/Atom feeds and HTML scraping as fallback.
— GitHub
The blogwatcher skill enables AI agents like Claude and ChatGPT to monitor blogs and RSS/Atom feeds for updates using the blogwatcher CLI.
— SkillRegistry
Configuration Examples
Add blogs with different methods
# Auto-discover RSS
blogwatcher add "Simon Willison" https://simonwillison.net
# Explicit feed URL
blogwatcher add "Hacker News" https://news.ycombinator.com --feed-url https://news.ycombinator.com/rss
# HTML scraping fallback
blogwatcher add "No-RSS Blog" https://example.com --scrape-selector "article h2 a"
Automated scan with OpenClaw cron
# Check for new articles every 2 hours
blogwatcher scan && blogwatcher articles
Installation
go install github.com/Hyaxia/blogwatcher/cmd/blogwatcher@latest
Homepage: https://github.com/Hyaxia/blogwatcher
Source: bundled