The Rot is Structural

I have a confession. When I get a DM from a stranger these days, my first instinct isn’t curiosity. It isn’t even mild suspicion. It’s a kind of weary, bone-deep certainty: here we go again.

It doesn’t matter what platform. LinkedIn, X, Instagram, Telegram, Discord: the format changes, but the grift stays the same. Someone found my profile. They want to collaborate, invest, connect, share an opportunity, or introduce me to their “CEO contact” who’s been watching my work. They are typing right now, probably from a server farm somewhere, and they will send me something that looks almost human. Almost.

I’ve been in the blockchain space since 2014. I’ve built things, lost things, documented things. I’ve watched retail investors get ruined in real time and filmed it for a documentary. I have seen more rug pulls, honeypots, fake influencer campaigns, and “guaranteed yield” schemes than any one person should be subjected to without hazard pay. And still — still — the volume increases every year.

Cory Doctorow coined the term ‘enshittification’ for the slow collapse of platforms. What I’m describing is its natural endpoint: a social web where every surface is coated in fraud, and trust has become a liability.

The evolution has been remarkable, in the worst possible way. Early scams were obvious: bad grammar, implausible claims, urgency signals a child could spot. Then they got better. They learned to mirror your industry’s vocabulary. They started using real company names, legitimate-looking profiles, plausible backstories.

Some of them now write better English than my actual colleagues. They have profile pictures generated by the same AI tools I use for creative work. They have LinkedIn endorsements from other fake accounts endorsing each other in an elaborate mutual legitimacy performance.


Incident log — recent sample

  • [SCAM] “Love your work. Our fund wants to invest in Web3 projects like yours.”
  • [SCAM] Instagram DM: model account, crypto trading “mentor”, 4,200 followers, all bots.
  • [SUSPECT] Telegram: “Are you the founder of [project]? We’d like to list you on our exchange.”
  • [SCAM] LinkedIn: HR at a company that doesn’t exist offering a creative director role. Requires wallet deposit to “process your equipment.”
  • [SCAM] Email: Docusign-styled phishing for “contract review” on a deal I never initiated.
  • [SUSPECT] X: verified-looking account DM. Project looks real. Team page: stock photos.

The worst part isn’t the volume. It’s what it does to your judgment. I’ve started applying a kind of reverse prior to every new introduction: assume fraud, prove otherwise. That’s not a healthy way to operate a professional network. But it’s what the environment has trained me to do. And I know I’m not alone; I hear versions of this story constantly from founders, filmmakers, creators, anyone with a visible online presence and something worth stealing.

I’ve also witnessed the collateral damage up close. For Degen Generation, the documentary I’ve spent years making, I’ve followed real people, first-time DeFi users, through a full market cycle. Some of the most painful stories weren’t about bad trades. They were about trust: someone reached out, seemed legitimate, knew the right language, and walked off with someone’s savings. The scam didn’t announce itself. It arrived in the shape of an opportunity.

Platforms have largely washed their hands of this. The moderation infrastructure that exists is porous by design, enforcement is slow, reporting is frustrating, and the economic incentive to keep engagement metrics high outweighs the incentive to remove bad actors. Spam is engagement too, apparently. Bot accounts inflate follower counts, which inflate ad valuations, which keeps the numbers looking healthy for investors. The rot is structural.

At this point I can usually tell within two sentences. The timing of the follow. The phrasing of the opener. The way the compliment is slightly too generic to be real. I’ve developed a sixth sense for it. Nobody asked me to. The internet just made it necessary.

Somewhere along the way I became fluent in a language nobody should have to speak. I know the signals. The too-perfect grammar with a single odd idiom. The profile created six months ago with fifteen connections and a stock-photo headshot. The investment opportunity that “can’t wait.” The “CEO” who contacts you directly instead of through anyone else. The unsolicited contract. The emotional manipulation dressed as opportunity. I’ve learned to read all of it without thinking about it, the way you learn to read traffic before crossing a street in a city that’s trying to kill you.
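That gut check is really just a weighted checklist. As a toy illustration only (the signal names, weights, and thresholds are invented for this sketch; real fraud detection is far messier), the reflex might look something like this:

```python
# Toy scam-likelihood gut check. Every signal and weight here is
# invented for illustration -- this is a sketch of a reflex, not a detector.

SIGNALS = {
    "account_age_under_6_months": 2,  # freshly created profile
    "few_connections": 1,             # e.g. fifteen connections total
    "stock_photo_headshot": 2,
    "generic_compliment": 1,          # "love your work"
    "urgency": 2,                     # the opportunity that "can't wait"
    "ceo_contacts_directly": 2,       # no intermediary, ever
    "unsolicited_contract": 3,
}


def scam_score(observed: set) -> int:
    """Sum the weights of every red-flag signal present in a DM."""
    return sum(weight for signal, weight in SIGNALS.items() if signal in observed)


def verdict(observed: set) -> str:
    """Map a score onto the labels from the incident log above."""
    score = scam_score(observed)
    if score >= 5:
        return "SCAM"
    if score >= 2:
        return "SUSPECT"
    return "maybe human"
```

A generic compliment plus urgency plus a stock-photo headshot already tips the scale: `verdict({"generic_compliment", "urgency", "stock_photo_headshot"})` comes back `"SCAM"`. The point isn’t the arithmetic; it’s that after enough exposure, the scoring runs without you noticing.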

Here’s the thing: that knowledge is worth something. Not just to me, and not just to warn friends. It’s the kind of pattern recognition that fraud departments at banks, platforms, and law enforcement agencies spend significant budget trying to build into systems. I’ve had it trained into me for free, by the internet itself, over a decade of daily exposure.


At this point, I figure some local authority should probably just hire me. Give me a badge. A modest stipend. A direct line to whichever department handles this. I’ll save them the training budget; I’ve already done the coursework.