Feb 25, 2026 · SEO · AI · Side Project · Entrepreneurship

How I Got ChatGPT to Send Me Traffic: Building an AI-Optimized Affiliate Site

At my day job, I build enterprise software. APIs, microservices, Angular apps — the kind of work where the users are internal and the traffic comes from a VPN. But I've always been curious about the other side of the web: the side where you build something, put it out into the world, and see if anyone shows up. So I built an affiliate site from scratch to find out.

The site is called GLP-1 After Denial. It's a health and wellness resource for people whose insurance denied coverage for GLP-1 weight loss medications like Ozempic and Zepbound. I chose this niche because I had personal experience with the topic, the search demand was massive and growing, and there was a real gap in trustworthy, non-spammy content. Most of what existed was either pharma marketing or Reddit threads. I wanted to build something in between — honest, well-researched, and actually helpful.

On the technical side, the site is built with Hugo, a static site generator written in Go. I chose Hugo because it's blazing fast, generates lightweight HTML, and gives me complete control over the site structure without the overhead of a CMS or database. The theme is Congo, which I customized with my own partials, color scheme, and layout overrides. The site is deployed on Netlify with automatic builds on every push to the repo.
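For context, a Netlify deploy for a Hugo site needs only a few lines of configuration. This is an illustrative sketch, not the site's actual file; the Hugo version pin and build flags are assumptions:

```text
# netlify.toml — minimal Hugo build config (values are illustrative)
[build]
  command = "hugo --minify"   # build minified static HTML
  publish = "public"          # Hugo's default output directory

[build.environment]
  HUGO_VERSION = "0.133.0"    # pin the Hugo version Netlify installs
```

With this in the repo root, every push triggers a build and deploy automatically.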

The content strategy was straightforward: identify high-intent, lower-competition keywords that real people are searching for, and write comprehensive articles that genuinely answer their questions. Things like "compounded semaglutide reviews," "telehealth semaglutide no insurance," and "how to reconstitute semaglutide." Each article targets a specific search intent and includes comparison tables, methodology sections, and clear affiliate disclosures. I wasn't trying to trick anyone — I was trying to be the most useful result.

But here's where it gets interesting. Traditional SEO focuses on Google: optimize your title tags, build backlinks, improve page speed, and wait for the algorithm to reward you. I did all of that. But I also optimized for something most people aren't thinking about yet: AI search engines. ChatGPT, Claude, Perplexity — these tools are increasingly how people find information, and they don't use Google's index. They crawl the web themselves.

So I took a different approach. First, I created a custom robots.txt that explicitly welcomes AI crawlers. Most sites either block them or don't address them at all. Mine says: GPTBot, allowed. ChatGPT-User, allowed. ClaudeBot, allowed. PerplexityBot, allowed. I wanted every AI assistant to know my content was fair game.
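The resulting file is short. This is a representative version (the sitemap URL and exact directive set are illustrative):

```text
# robots.txt — explicitly welcome AI crawlers
User-agent: GPTBot
Allow: /

User-agent: ChatGPT-User
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

# Everyone else is welcome too
User-agent: *
Allow: /

Sitemap: https://glp1afterdenial.com/sitemap.xml
```

The explicit per-bot blocks matter because some crawlers treat an unmentioned user agent conservatively; naming them removes any ambiguity.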

Second, I created an llms.txt file — a relatively new convention that gives AI models structured context about your site. Think of it as a README for robots. It tells AI crawlers what the site is about, what topics it covers, what the key pages are, and what facts are safe to cite. I linked it in my HTML head and referenced it in my robots.txt. The idea is simple: if an AI is going to summarize my content or recommend my site, I want to make it as easy as possible for it to understand what I offer.
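The llms.txt convention is plain markdown: an H1 with the site name, a blockquote summary, then H2 sections of annotated links. A sketch of what such a file might look like for this site (the paths and descriptions here are hypothetical, not the live file):

```markdown
# GLP-1 After Denial

> A health and wellness resource for people whose insurance denied
> coverage for GLP-1 weight loss medications like Ozempic and Zepbound.
> Independent, research-backed, with clear affiliate disclosures.

## Key pages

- [Compounded Semaglutide Reviews](https://glp1afterdenial.com/reviews/): vetted
  telehealth providers compared on price, dosing, and support
- [Methodology](https://glp1afterdenial.com/methodology/): how providers
  are evaluated and how affiliate relationships are disclosed
```

Because the file is just markdown at a well-known path, there's nothing to configure; an AI crawler that honors the convention can ingest the site's purpose in one request.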

Third, I focused on content structure. AI models parse content differently than Google's traditional crawler. They favor well-organized, clearly written content with explicit headings, comparison tables, and factual statements that are easy to extract and cite. Every article has a clear hierarchy, a methodology section explaining how I evaluated things, and specific data points that an AI can confidently reference.
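In practice that means every article follows roughly the same skeleton. A simplified example of the structure (headings and table contents are illustrative):

```markdown
# Telehealth Semaglutide Without Insurance: 2026 Comparison

## How we evaluated these providers
We compared pricing, dosing flexibility, and prescriber access.

## Quick comparison

| Provider   | Monthly cost | Prescription included |
|------------|--------------|-----------------------|
| Provider A | $249         | Yes                   |
| Provider B | $299         | Yes                   |

## Frequently asked questions
### Do I need a prior denial letter?
No. These providers operate outside insurance entirely.
```

Explicit headings, one claim per sentence, and tables with unambiguous column labels are exactly the units an LLM can lift and cite without misreading them.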

The results surprised me. Within two months of launching — with zero advertising spend, zero paid backlinks, and zero social media promotion — I started seeing referral traffic from ChatGPT show up in Google Analytics. People were asking ChatGPT questions about GLP-1 medications, and it was recommending my site. Not because I gamed anything, but because I made my content accessible, structured, and genuinely useful for both humans and AI.

On the monetization side, I joined affiliate programs through RevOffers and direct partnerships with telehealth platforms. The site earns commissions when visitors click through and sign up for GLP-1 prescriptions. It's not life-changing money yet, but it validated the model: build useful content, drive traffic through both traditional and AI search, and monetize through relevant affiliate partnerships. The whole point was to understand how this ecosystem works, and now I do.

What I learned from this project goes way beyond SEO. I learned how to think about content as a product. I learned how AI models discover and recommend web content. I learned how affiliate networks track conversions, how to A/B test calls to action, and how to read analytics data to make decisions. I learned that the same skills I use as a software engineer — building systems, analyzing data, iterating based on feedback — apply directly to building an online business.

The biggest takeaway? AI is changing how people find information on the internet, and most content creators haven't caught up yet. If you're a developer with a side project, a blog, or any web presence, start thinking about how AI models interact with your content. Add an llms.txt file. Update your robots.txt. Structure your content so it's easy for both humans and machines to understand. The sites that adapt to AI-driven search early are going to have a massive advantage.

I also integrated the Google Analytics Data API to pull live traffic and AI referral data directly from GA4 using a service account. A scheduled GitHub Action fetches the data daily and writes it to a JSON file, which powers a live analytics dashboard right here on my portfolio site. It's a real-time view of visitors, page views, top pages, and which AI platforms are sending traffic — all rendered statically at build time with no client-side API calls.
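The shape of that daily job is simple: pull session counts per referral source from GA4, bucket the AI platforms, and dump JSON for the static build to render. Here's a minimal sketch; the source list, file layout, and function names are my illustration, not the actual repo's code, and the GA4 call itself is shown in comments since it needs service-account credentials:

```python
import json

# Referral hostnames treated as AI-assistant traffic (assumed list).
AI_SOURCES = {
    "chatgpt.com", "chat.openai.com",
    "perplexity.ai", "claude.ai", "gemini.google.com",
}

def summarize(rows):
    """Bucket (source, sessions) rows into AI referrals vs. everything else."""
    summary = {"ai_referrals": {}, "other_sessions": 0}
    for source, sessions in rows:
        if source in AI_SOURCES:
            bucket = summary["ai_referrals"]
            bucket[source] = bucket.get(source, 0) + sessions
        else:
            summary["other_sessions"] += sessions
    return summary

if __name__ == "__main__":
    # In the scheduled job, rows come from the GA4 Data API, roughly:
    #   from google.analytics.data_v1beta import BetaAnalyticsDataClient
    #   from google.analytics.data_v1beta.types import (
    #       RunReportRequest, Dimension, Metric, DateRange)
    #   client = BetaAnalyticsDataClient()  # reads GOOGLE_APPLICATION_CREDENTIALS
    #   report = client.run_report(RunReportRequest(
    #       property="properties/<GA4_PROPERTY_ID>",
    #       dimensions=[Dimension(name="sessionSource")],
    #       metrics=[Metric(name="sessions")],
    #       date_ranges=[DateRange(start_date="7daysAgo", end_date="today")]))
    #   rows = [(r.dimension_values[0].value, int(r.metric_values[0].value))
    #           for r in report.rows]
    rows = [("chatgpt.com", 42), ("google", 300)]  # stand-in data
    # The GitHub Action writes this JSON to the repo for the build to consume.
    print(json.dumps(summarize(rows), indent=2))
```

Keeping the aggregation pure makes it trivial to test, and emitting a static JSON file means the dashboard page never makes a client-side API call.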

GLP-1 After Denial started as an experiment to understand affiliate marketing. It turned into a crash course in AI-era SEO, content strategy, and digital entrepreneurship. And honestly? It's made me a better engineer. Understanding how users find products — whether through Google, ChatGPT, or a direct link — gives me a perspective that most backend developers don't have. It's the kind of cross-functional thinking that I bring to every project I work on.

If you're curious about the site, check it out at glp1afterdenial.com. And if you want to talk about AI-optimized SEO, affiliate strategy, or how to build a side project that actually makes money, reach out — I love talking about this stuff.