Your blog, working for you every day

Fetches your blog, picks a post, and generates a shareable insight

What you will receive

Daily Insight from Your Blog

just now

Today's insight from your blog:

"The best code is the code you don't write"

From: Why Less is More in Software Architecture
Posted: 3 days ago

Share this: The hidden cost of every feature isn't the code—it's the maintenance, testing, and cognitive load it adds forever.

Ready to share →

How it works

  1. Humrun fetches your blog or RSS feed on your schedule
  2. It picks a recent or random post and extracts key content (sketched below)
  3. AI generates a shareable insight you can post anywhere
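
Steps 1 and 2 come down to trying the configured URL as a feed first and picking one of the newest entries. A minimal sketch, assuming the blog exposes RSS or Atom (the URL is a placeholder; the full script further down adds a homepage-scraping fallback):

import random
import feedparser

# Placeholder feed URL; substitute your own
feed = feedparser.parse("https://yourblog.com/feed.xml")

if feed.entries:
    recent = feed.entries[:10]    # the ten newest posts
    post = random.choice(recent)  # "recent or random" selection from step 2
    print(post.get("title", "Untitled"), post.get("link", ""))
else:
    print("No feed entries found; the URL is likely a plain homepage, not a feed.")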

You configure

Blog URL: https://yourblog.com or https://yourblog.com/feed.xml
Your blog homepage or RSS feed URL

Tone: professional, casual, thought-provoking
How should the insight sound?

OpenAI API key: sk-...
For generating insights
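
Each field reaches the script as an environment variable. The names below match the full script that follows; the fallback values are only illustrative assumptions for local testing:

import os

# Names match the script below; defaults are illustrative only
BLOG_URL = os.environ.get("BLOG_URL", "https://yourblog.com")
TONE = os.environ.get("TONE", "professional and concise")
OPENAI_API_KEY = os.environ.get("OPENAI_API_KEY")  # required for the insight step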

Python code
import requests
from bs4 import BeautifulSoup
import feedparser
import random
import os
from urllib.parse import urljoin

BLOG_URL = os.environ.get("BLOG_URL")
TONE = os.environ.get("TONE", "professional and concise")
OPENAI_API_KEY = os.environ.get("OPENAI_API_KEY")

# Try to parse as RSS first
feed = feedparser.parse(BLOG_URL)

if feed.entries:
    # Pick from recent posts
    entries = feed.entries[:10]
    post = random.choice(entries)
    title = post.get("title", "Untitled")
    content = post.get("summary", post.get("description", ""))
    link = post.get("link", BLOG_URL)
else:
    # Fallback: scrape the blog homepage
    response = requests.get(BLOG_URL, headers={"User-Agent": "Mozilla/5.0"})
    soup = BeautifulSoup(response.text, "html.parser")

    # Try to find article links
    articles = soup.select("article a, .post a, .entry a, h2 a, h3 a")
    if articles:
        article = random.choice(articles[:10])
        link = article.get("href", BLOG_URL)
        title = article.get_text(strip=True)

        # Fetch the article content
        # Resolve relative links (e.g. /posts/slug) against the blog URL
        link = urljoin(BLOG_URL, link)
        article_resp = requests.get(link, headers={"User-Agent": "Mozilla/5.0"})
        article_soup = BeautifulSoup(article_resp.text, "html.parser")
        content = article_soup.get_text(strip=True, separator=" ")[:2000]
    else:
        title = "Your Blog"
        content = soup.get_text(strip=True, separator=" ")[:2000]
        link = BLOG_URL

# Generate insight with AI
prompt = f"""Based on this blog post, create a shareable insight.

Title: {title}
Content: {content[:1500]}

Tone: {TONE}

Format:
- One memorable quote or key insight (1-2 sentences)
- A brief explanation of why this matters (1 sentence)

Keep it under 50 words total. No hashtags."""

response = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers={"Authorization": f"Bearer {OPENAI_API_KEY}"},
    json={
        "model": "gpt-4o-mini",
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 150
    }
)
response.raise_for_status()  # surface API errors instead of a confusing KeyError

insight = response.json()["choices"][0]["message"]["content"]

print(f"From: {title}")
print(f"Link: {link}")
print(f"\n{insight}")
Suggested schedule: Every day at 9 AM
Notifications: After every run