When Cheap Content Changes Who Gets Heard
core-model | 2026-04-09 | economyforeveryone
When AI makes content cheap, power shifts toward ranking, discovery, and trust signals, and ordinary people inherit the sorting burden.
One small action: Audit one gate you rely on this week and ask what trust signal is really doing the work.
A school librarian is working through a cart of new books.
One is a children’s biography. The cover looks fine. The author name is unfamiliar, but that’s normal. She searches for the author and finds almost nothing. No website. No real publishing trail. Inside, she spots a date that doesn’t match anything she can verify.
She flags it.
But she has forty books on the cart and forty minutes in the hour. She can’t do that level of checking for all of them.
The problem isn’t just that AI can make more content. It’s that it can make more content than ordinary human judgment can realistically sort.
That’s where this post starts. This isn’t really a story about “too much content.” It’s a story about what happens when content gets cheap and trust doesn’t.
What’s happening
AI has pushed the cost of producing passable content way down. The issue isn’t just abundance. It’s the way abundance raises the importance of ranking systems, discovery systems, and trust signals.
That changes where power sits.
When production gets cheap, the choke point is no longer who can make something. The choke point becomes who gets found, who gets trusted, and who gets surfaced first.
That’s the gate shift.
The old gate was production. The new gate is discovery, ranking, and trust. And that new gate is mostly private.
Why it’s happening
The shift is straightforward: output got cheap. Judgment didn’t.
When something becomes fast, cheap, and scalable, pressure shifts to the next scarce thing. In content markets, that scarce thing is attention, trust, and gatekeeping.
First, scale without adjudication. The system can scale creation faster than it can scale verification. The books keep arriving. The articles keep appearing. The feeds keep filling. But the human time needed to check quality, source, intent, and accuracy doesn’t scale the same way.
Second, gate shift. Once production gets cheap, distribution becomes the real choke point. Gatekeeping doesn’t disappear. It moves downstream into ranking, storefront placement, search, feeds, and trust signals.
This shows up in two directions at once.
For readers, it becomes a trust problem. Can you tell what you’re looking at? Was it checked? Is the author real? Is the outlet what it claims to be?
For creators, it becomes a market problem. Cheap production does not automatically mean a freer market if getting found still depends on a few opaque systems. A platform can say it opened the door to more creators while still tightening its grip on who actually gets seen.
Most people don’t review everything directly. They rely on shortcuts:
- search results
- recommendations
- top sellers
- reviews
- author bios
- outlet names
- credential signals
Those shortcuts are now doing more work than before. Which means the systems that shape those shortcuts are doing more governing than before too.
And when someone tries to contest a gate decision - why was my book buried, why was my content demoted, what do I do - the response is usually polite and empty. The platform says “proprietary.” The frontline support person can’t override. The buyer says “that’s just the marketplace.” Everyone has a script. No one has responsibility.
The deeper issue isn’t volume. It’s counterfeit trust signals.
The flood itself is a problem. But the bigger problem is that low-cost content can imitate the signals people use to decide what’s worth trusting:
- a decent cover
- a plausible author page
- a convincing outlet name
- a review profile
- a search result that looks official
- a result placed high enough that most people assume someone already checked it
The flood adds pressure. But the more damaging mechanism is the breakdown of trust signals that let bad content masquerade as good.
That is why the local news part of this story carries so much weight. AI did not create the collapse of local news. That collapse was already underway. What AI does is make it cheaper to fill the vacuum with content that looks local enough to pass a quick glance. The danger is not that AI replaced journalism overnight. The danger is that communities already living with a weaker local information system now have an even harder time telling what is real.
And once people are left guessing, two bad things get easier: manipulation and withdrawal. Some people get pulled into nonsense. Some people stop believing anything enough to act. Neither one helps shared reality.
Who benefits, and who carries the risk
Who benefits?
Platforms benefit from more inventory, more engagement opportunities, and more leverage over discovery. Cheap production can also help some small creators enter markets that used to be harder to enter. That gain is real.
But the stronger point is what happens after entry. In books, for example, the market doesn’t just split into “old incumbents” and “new creators.” It also starts to squeeze the middle. Platform-favored content and high-volume operators can do fine. A handful of new entrants can break through. The people in trouble are often the mid-tier creators who compete on care, voice, and consistency, but now depend even more on ranking systems they cannot see or challenge.
Think about the difference between a name-brand author with a marketing machine behind them and a working writer with a loyal but finite audience. The first person may still get the homepage placement, the newsletter slot, the podcast booking, the pre-order push, the “you might also like” shelf.
The second person is often the one making something careful and distinctive, then watching it disappear into a much noisier field where ranking matters more than craft.
Another way to say it: creators get turned into the pool. A platform can use a few recognizable names to attract attention, then let everyone else compete inside a much noisier market where ranking is opaque, payouts are thin, and volume keeps rising. The platform grows. A few anchor names still do well. The middle gets less stable.
A similar split shows up in local news. The national brand with an established audience can absorb some discovery shock. The small local outlet, or the startup trying to replace the paper that already died, is the one living or dying on traffic it does not control.
Who carries the risk?
- mid-tier creators who compete on care rather than volume
- parents trying to sort trustworthy from untrustworthy material
- librarians and teachers asked to curate at impossible scale
- local communities already living with weakened newsrooms
- readers who can’t tell what’s real, who’s real, or why something is being shown to them
That split is the point. Once again, the system that gains speed isn’t the same as the person who absorbs the uncertainty.
And there is a second layer here: the platforms running these gates increasingly function like private regulators. They make ranking, visibility, and traffic decisions that shape livelihoods and public understanding, but without the kind of notice, explanation, recordkeeping, or appeal we would expect if the same power were exercised by a public body. They get to act like the gate without taking on the duties of being the gate.
What good looks like
The answer isn’t “ban AI content.” That’s too blunt, and it misses the mechanism. The better question is: what should the minimum floor be once the gate has moved?
The minimum floor here is specific: consumer-facing labels at the point of discovery, real contestability for takedowns and ranking penalties, calibrated friction that slows flood operations without locking out legitimate creators, and enough structural room for the middle to survive.
- people should be able to see what they’re looking at
- creators should be able to challenge decisions that affect whether they can be found
- platforms shouldn’t be allowed to become private regulators without basic accountability
- cheap production shouldn’t automatically mean that discovery power, and the money that comes with it, get captured upstream
When the gate shifts to distribution and ranking, platforms exercise real power over creator livelihoods. That power should come with accountability.
If content is becoming abundant, trust should get easier, not harder.
What to do
Pick one gate you rely on all the time and audit it:
- recommendations
- search
- a review site
- a social feed
- a newsletter platform
Check five things it surfaced and ask:
- Who made this?
- Can I verify the source?
- What trust signal is doing the work here?
- Would a normal person know what they’re looking at?
- What is this system rewarding right now?
That’s a repeatable habit. It turns the gate from invisible background into something you can actually see.
At the community or policy level, the practical issue is simple:
if a platform is acting like the gate, it needs duties that fit being the gate.
How to talk about it
I wouldn’t lead with “AI is ruining culture.” That turns the whole thing into a vibes fight.
I would say this instead: the issue isn’t whether people can make more content. It’s whether normal people can still tell what they’re looking at, and whether the systems deciding what gets seen have any meaningful accountability.
Or even shorter: cheap content isn’t the whole problem. Ungoverned gates are.
One steady action to take this week
Do one gate audit this week using the questions above.
Pick one platform and check five things it surfaced. Do that once, and the gate starts to come into view.
Action ladder
Short term
Readers and users: Audit one gate you rely on this week - search, recommendations, reviews, or a feed - and ask what trust signal is really doing the work.
Teachers, librarians, and curators: Slow down on one selection decision and check the source trail, not just the polish of the artifact.
Medium term
Communities and institutions: Ask the platforms, schools, libraries, and marketplaces you rely on what trust checks and review standards they actually use.
Creators and local outlets: Strengthen owned trust signals - direct relationships, transparent sourcing, named authorship, and visible editorial process - so you are not relying entirely on opaque ranking.
Long term
Policymakers and regulators: Push for visible labeling, contestable ranking, and accountability for private gates that now shape public understanding.
Communities, educators, and civic groups: Treat discovery and trust as infrastructure. If the gate governs what gets believed, it cannot stay unaccountable by default.