
The Internet's Landlord Problem

On March 30th, I received an email:

You've reached 100% usage

Your bandwidth on your account (Berry House) has reached 100% of the current limit for March, 2026.

If usage goes over the limit before the end of the month, all projects will be suspended until the first day of the next month to ensure we never bill you for overages. Alternatively, you can upgrade your account to Starter to allow additional bandwidth and other usage.

I looked at the number sitting in my Netlify analytics dashboard. March: 106.8 GB. I had blown past the free tier's 100 GB limit without even noticing. I write, I publish, I push to GitLab, and I mostly don't look at what happens next.

I launched brennan.day on December 10th, 2025, and in less than four months it has generated enough traffic to exceed the bandwidth allocation of a free hosting tier. Don't get me wrong: this is a good problem to have, and I'm grateful for the audience I find myself building.

But it would be naïve for me to think that all of this bandwidth usage was from my audience. That isn't the case. I know all too well how many "non-browser requests" my website gets. In other words, bots.

The Crawler Problem

According to Imperva's 2025 Bad Bot Report, automated traffic surpassed human-generated activity in 2024, accounting for 51% of all web traffic—meaning bots now constitute the majority of the internet's traffic. That 51% splits into two categories: good bots (14%), like search engine crawlers that actually index and send traffic back to you, and bad bots (37%), built to scrape data, commit fraud, credential-stuff login forms, or simply overwhelm servers. That 37% bad-bot figure represents a six-consecutive-year streak of growth, driven largely by the rise of genAI tools making the deployment of bots faster, cheaper, and more accessible to people with minimal technical skill.

AI crawlers—the kind harvesting content to train and power large language models—operate at an extractive crawl-to-referral ratio. Anthropic's Claude crawler peaked at a ratio of approximately 500,000:1, meaning for every 500,000 pages it crawled, it sent back roughly one visitor. These crawlers are consuming bandwidth I'm paying for, returning nothing, and doing so while 13.26% of AI bot requests actively ignored robots.txt directives in Q2 2025. The arms race is underway, and independent publishers are caught in the crossfire.
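To put those figures in terms of my own bill, here's a back-of-envelope sketch. Applying Imperva's aggregate split to my March number assumes my traffic mirrors the global average, and the 1 MB average page weight is a made-up round figure—both are illustrative assumptions, not measurements:

```python
# Hypothetical breakdown: apply Imperva's aggregate 2024 traffic split
# to my March bandwidth. My actual human/bot mix is unknown.
march_bandwidth_gb = 106.8  # from the Netlify dashboard

bot_share, good_bot_share, bad_bot_share = 0.51, 0.14, 0.37
print(f"Automated traffic: {march_bandwidth_gb * bot_share:.1f} GB")
print(f"  good bots:       {march_bandwidth_gb * good_bot_share:.1f} GB")
print(f"  bad bots:        {march_bandwidth_gb * bad_bot_share:.1f} GB")

# Anthropic's peak crawl-to-referral ratio: ~500,000 pages fetched per
# visitor referred. Assuming a 1 MB average page weight (a guess), that
# peak ratio implies hundreds of gigabytes crawled per referred visitor.
crawled_pages_per_visitor = 500_000
avg_page_weight_mb = 1.0  # assumed, for illustration only
gb_per_referred_visitor = crawled_pages_per_visitor * avg_page_weight_mb / 1024
print(f"Bandwidth per referred visitor at peak ratio: ~{gb_per_referred_visitor:.0f} GB")
```

Even if my real mix is nothing like the global average, the shape of the problem is clear: a single extractive crawler can plausibly consume more bandwidth than every human reader combined.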

What's the solution to this? As I've mentioned previously, I don't have any scripts in place to block genAI bots from crawling and scraping my site. Blocking is an uphill battle, and the worst offenders don't act in good faith to honour robots.txt anyway. I don't believe blocking alone is a viable solution to the full scale of what I'm looking at.
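For context, opting out of AI crawlers via robots.txt looks like the sketch below. The user agents listed are a few commonly cited ones (GPTBot, ClaudeBot, CCBot); the real set changes constantly, and these directives are requests, not enforcement—which is exactly why they don't solve the problem:

```txt
# robots.txt — asks (does not force) AI crawlers to stay away
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: CCBot
Disallow: /
```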

There are a few real answers, thankfully, and I realize at this point that I need to start thinking about my site's security and DDoS protection as an independent citizen journalist speaking up on issues like my local city's library having a genAI artist residency, or the Pretendian problem with Thomas King, or the ongoing genocide in Palestine. I understand I have a civic duty to use my voice and platform to proactively advocate for marginalized groups and speak truth to power while remaining fully independent and uncompromising.

But the most obvious answer is the one I have the most concerns about.

Cloudflare.

A Quarter of the Entire Internet

Cloudflare is an American technology company headquartered in San Francisco whose content delivery network currently fronts a quarter of all Internet sites.

Cloudflare's ethical history is not uncomplicated. Matthew Prince, CEO and co-founder, describes himself as "almost a free-speech absolutist" and has spent years articulating a neutrality position that is, above all, practically convenient.

As it stands, a quarter of the entire Internet runs their websites through a platform that has, at various points, provided security services to the Daily Stormer, 8chan, and Kiwi Farms—each dropped only after escalating public pressure or real-world catastrophe. The pattern reveals Cloudflare's institutional character.

In 2017, Cloudflare terminated services to the Daily Stormer, a neo-Nazi website, after the Charlottesville attack. CEO Matthew Prince's account of the decision was, in his own words, that he woke up that morning in a bad mood and decided to kick them off—an encapsulation of how arbitrary infrastructure-level content decisions are. He expressed discomfort with the precedent immediately after making it. In 2019, following the El Paso mass shooting, Cloudflare dropped 8chan—whose content had helped radicalize the shooter. Again, Prince framed the decision as an uncomfortable departure from Cloudflare's stated policy of infrastructure neutrality. Then in September 2022, facing a sustained pressure campaign led by trans Twitch streamer Clara Sorrenti (known as Keffals), who had been doxed, swatted, and driven into hiding by Kiwi Farms users, Cloudflare finally terminated services to that platform, citing an "imminent and emergency threat to human life." Prince had spent the preceding days publishing 2,600-word blog posts defending the decision to continue protecting the site, comparing Cloudflare to a telephone company that doesn't terminate your line for saying awful things. He reversed course within 72 hours.

After Cloudflare dropped these platforms, authoritarian governments started citing those decisions as justification for pressuring Cloudflare to drop human rights organizations. This is the bind that comes from holding enormous unilateral power over the internet's plumbing. Every decision becomes precedent. Every refusal is a policy. And the only way to avoid the bind would have been to not accumulate the power in the first place.

And this is the landlord we're dealing with. An infrastructure provider whose decisions about who gets to exist on the Internet are ultimately subject to the mood and media cycle of a CEO. The Internet Society has noted how this kind of consolidation is eroding Internet resilience. The fragility was demonstrated on November 18th, 2025, when a bug in Cloudflare's Bot Management system triggered a global outage. Roughly one in five webpages were affected at the height of the incident, with a third of the world's 10,000 most popular websites down—X, ChatGPT, Spotify, Canva, and even Downdetector, the outage-tracking site people normally reach for in these moments. Seventeen days later, a second major outage struck, this time caused by a change to Cloudflare's own Web Application Firewall, made while attempting to patch an industry-wide React Server Components vulnerability.

None of this is to even mention the privacy nightmare. Cloudflare acts as a man-in-the-middle for HTTPS traffic, meaning they can see the plaintext of traffic passing through their network. It sits between you and a substantial fraction of every website you visit, decrypting HTTPS, inspecting packets, and re-encrypting on the other side. This is what it means to be a reverse proxy.
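To make that man-in-the-middle position concrete, here is a minimal reverse-proxy sketch. The upstream address and ports are hypothetical, and this toy handles plain HTTP only—Cloudflare's real edge additionally terminates TLS, which is precisely what puts request plaintext in its hands before it re-encrypts traffic toward the origin:

```python
# Minimal reverse-proxy sketch (plain HTTP, GET only). Hypothetical
# addresses; for illustration, not a production proxy.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

UPSTREAM = "http://127.0.0.1:9001"  # hypothetical origin server


class ReverseProxy(BaseHTTPRequestHandler):
    def do_GET(self):
        # Here the proxy holds the full request in the clear: it can read,
        # log, rewrite, or drop it. This is the position Cloudflare
        # occupies for HTTPS traffic on the sites it fronts.
        with urlopen(UPSTREAM + self.path) as upstream:
            status = upstream.status
            body = upstream.read()
        self.send_response(status)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # silence per-request logging


# To run: HTTPServer(("127.0.0.1", 9000), ReverseProxy).serve_forever()
```

Every byte of every response passes through that handler. Scale it to a quarter of the Internet and you have the visibility problem in miniature.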

The Static

Look, I don't need to tell you how antithetical this is to the ideals of the Internet. A single platform cannot be responsible for the infrastructure of everyone else. A single private company cannot have the power to arbitrarily and unilaterally decide who gets to remain on the Internet, and who doesn't.

But I also understand how tempting it is to use the platform and service, don't get me wrong. Cloudflare Tunnel allows you to easily point a domain to a homeserver's IP address without any port forwarding. Cloudflare Pages allows 500 builds per month with 100 custom domains per project, unlimited bandwidth and static requests. In January 2025, the company acquired Astro, the open-source framework that has become the preferred tool for content-driven static websites. And just this week, they announced the adorably named EmDash CMS, an open-source content management system built on Astro and positioned as the spiritual successor to WordPress, written entirely in TypeScript, serverless, with sandboxed plugins that structurally address the security nightmare that has plagued WordPress for years.

As a JAMstack developer who has been making static websites for over a decade, I concede that was an exciting development. But I've seen this acquisition pattern before. When Netlify acquired Gatsby in February 2023 with public commitments to be good stewards of the open-source project, the engineering team was gutted within months and Gatsby Cloud was shuttered by August of that year. The incremental build features that had been Gatsby's main competitive advantage—the actual reason developers chose it over alternatives—were never ported to Netlify as promised. The framework entered what Smashing Magazine documented in early 2024 as dependency hell: unable to upgrade its dependencies without introducing cascading breaking changes, essentially abandoned in place.

Cloudflare's stewardship of Astro will be different, or it won't, and we won't know for a few years. I remain cautiously hopeful, because the incentive structure seems different—Cloudflare wants Astro to power EmDash, which they want to power their Workers platform, which is their core business. Netlify wanted Gatsby's enterprise customer list. I do not think you need to throw the static baby out with the DDoS-protection bathwater.

Regardless, if not Cloudflare, then what?

Thankfully, there are real alternatives. Bunny.net is a Slovenian CDN provider that has been steadily building a reputation as the conscience-compatible choice among developers who've concluded the same things I'm concluding here. It's EU-headquartered, GDPR-native by design, with pay-as-you-go pricing starting at $0.01 per gigabyte and no tiered plans that punish you for growing. It includes DDoS protection, a CDN, and edge scripting. It doesn't have Cloudflare's scale, but nobody does. The question is whether what you're giving up is worth what you're getting back—and if what you're getting back is a meaningful stake in the decentralization of the infrastructure of public discourse, the math works out against Cloudflare.

But that isn't the solution I took.

A Canadian Answer

Deflect is not a household name, but I think it should be.

Founded in 2011 by digital security expert Dmitri Vitaliev and Canadian internet entrepreneur David Mason, Deflect predates both Google's Project Shield and Cloudflare's own Project Galileo—the programs those companies eventually stood up to offer free protection to civil society organizations. Deflect was the original, built in response to an influential Berkman Center report documenting how DDoS attacks had become a standard tool of political repression, used by governments and bad actors to silence independent media and human rights groups.

It is a project of eQualitie, a Canadian social enterprise based in Montréal, committed to privacy, resilience, and self-determination. The commercial revenue from paid customers directly subsidizes free protection for qualifying non-profits, human rights defenders, and independent media. Since 2011, Deflect has protected thousands of civil society organizations around the world, including, at various points: the Black Lives Matter website during the Ferguson protests, Rohingya news organizations during the Rakhine State violence, human rights organizations in Gaza under active DDoS attack, and Uzbek activists targeted by a persistent state-backed cyber offensive.

The infrastructure has withstood malicious traffic in excess of 100 Gbps. It uses Apache Traffic Server, seeks datacenters powered by renewable energy, and publishes its software as open source. Deflect will never sell your data. It has never refused service to a qualifying organization because that organization was attracting too many attacks. It does not have a CEO who describes himself as "almost a free-speech absolutist."

A Promise to Shine Light

I'm so grateful to be running my site through Deflect now. I'm independent media. No institution, no grants, no corporate ownership. I am a Queer Red River Métis writing about colonial contradiction from within, and I'm doing this in public, under my own name, on my own domain. I'm Canadian. I was born and raised here, I live and work here, just as my ancestors have for thousands of years.

Deflect's eligibility criteria ask whether you defend human rights, run a civil society organization, or produce independent media. They ask whether your work contravenes the Universal Declaration of Human Rights.

I'll be honest: I wasn't sure I qualified. Maybe this is just impostor syndrome, but Deflect is designed for organizations under threat—journalists in hostile countries and activists under state surveillance. But being accepted means I feel far more emboldened to try. To try to dismantle the systems of oppression. To shine light on darkness and to amplify unheard voices. To liberate all my brothers and sisters and siblings into freedom and joy. I have nothing to be afraid of anymore.


If you run a site that does work worth protecting—independent media, civil society, advocacy, anything that could make someone with power uncomfortable—and you've been defaulting to the easy answer because it's the only one everyone recommends: there is another answer.
