Why Spotify, Netflix, and YouTube Recommendations Feel Worse Than They Used To
You open YouTube, Spotify, or Netflix expecting something good.
After all, these platforms know you.
What you watch.
What you skip.
What you replay at 2AM.
And yet somehow…
You spend more time scrolling — and less time actually enjoying anything.
If you've been wondering why recommendations feel worse than they used to, you're not imagining it.
Something has changed.
What Happened to Recommendations?
A few years ago, recommendation systems felt almost magical.
You'd open an app and immediately find:
- a song you loved
- a show you binged
- a video that actually stayed with you
It felt like discovery.
Now it often feels like repetition.
The same creators.
The same formats.
The same safe, predictable content.
So what changed?
The Shift: From Taste to Engagement
At their core, recommendation systems are built on machine learning.
Originally, the goal was simple:
Show you more of what you like.
But over time, that goal evolved into something else:
Show you what keeps you watching.
That might sound similar — but it's not.
- What you like is personal
- What keeps you engaged is often universal
And universal content tends to be:
- faster
- louder
- more repetitive
- easier to consume
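The difference between the two goals can be made concrete with a toy ranking sketch. Everything here is invented for illustration (the candidate titles and the scores are made-up numbers, not real platform data): the same three videos come out in a different order depending on whether you sort by predicted enjoyment or by predicted time spent.

```python
# Hypothetical sketch: the same candidates, ranked two different ways.
# Titles and scores are illustrative assumptions, not real platform data.

candidates = [
    # (title, predicted_enjoyment, predicted_watch_minutes)
    ("niche documentary", 0.9, 12),
    ("loud reaction clip", 0.4, 30),
    ("familiar rerun", 0.6, 25),
]

# "Show you more of what you like": rank by predicted enjoyment.
by_taste = sorted(candidates, key=lambda c: c[1], reverse=True)

# "Show you what keeps you watching": rank by predicted time spent.
by_engagement = sorted(candidates, key=lambda c: c[2], reverse=True)

print([c[0] for c in by_taste])       # the documentary ranks first
print([c[0] for c in by_engagement])  # the reaction clip ranks first
```

Same inputs, same model quality; only the objective changed, and the feed leads with a different kind of content.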
Why Everything Starts to Feel the Same
When platforms optimize for engagement, they naturally drift toward patterns that work on the largest number of people.
This creates a loop:
- You watch something that grabs your attention
- The system recommends more of it
- You keep watching (even if you don't love it)
- Your feed becomes narrower over time
Eventually, you're not discovering anymore.
You're being fed variations of the same thing.
The Hidden Problem: Over-Optimization
Here's the paradox:
The better recommendations get at predicting your behavior…
The worse they get at surprising you.
This is sometimes described as a "filter bubble" — a concept popularized by Eli Pariser.
Instead of expanding your taste, algorithms start reinforcing it.
You see:
- fewer risks
- fewer unexpected finds
- fewer "how did I even find this?" moments
And that's exactly what makes content feel stale.
Why You End Up Scrolling More (and Enjoying Less)
If recommendations were truly perfect, you'd:
- click something
- enjoy it
- leave satisfied
But that's not what happens.
Instead, you:
- scroll longer
- hesitate more
- open multiple tabs
- abandon things halfway through
Because the system isn't optimizing for satisfaction.
It's optimizing for time spent.
And those are very different outcomes.
What People Actually Want From Recommendations
Most people don't want infinite suggestions.
They want:
- fewer, better options
- content that matches their current mood
- a sense of discovery, not repetition
- some level of human curation
In other words:
They want recommendations that feel intentional, not automatic.
How to Fix Your Recommendations (Practical Tips)
If your feeds feel worse lately, here's how to reset them:
- Break the pattern: search for something completely different than usual. This disrupts your recommendation loop.
- Don't rely only on the homepage: actively search instead of passively scrolling.
- Use "not interested" aggressively: train the system; most people don't.
- Rotate platforms or sources: different inputs = different outputs.
- Limit passive consumption: the more you scroll without choosing, the worse recommendations get.
A Different Approach: Intentional Discovery
There's an alternative to algorithm-heavy feeds.
Instead of asking:
"What should I watch next?"
You start with:
"What do I feel like right now?"
That's the idea behind platforms like vibeastral.app.
Instead of endless recommendations:
- you choose a mood
- you explore a finite set of content
- you rely on human signals, not just predictions
It's a slower experience.
But it often leads to something better:
Content you actually remember.
The Bigger Shift Happening Right Now
More people are starting to notice:
- recommendations feel repetitive
- content feels disposable
- time spent doesn't equal satisfaction
That's why interest in:
- "quitting social media"
- "reducing screen time"
- "better content discovery"
is growing.
Not because people want less content.
Because they want better filtering.
Final Thought
Recommendations didn't get worse by accident.
They got better at the wrong thing.
And once you see that…
It's hard to unsee.
The future of content discovery might not be smarter algorithms.
It might be giving people control again.