- Think Well Together
The Algorithmic Everywhere
It's not just social media. It's the sorting system beneath it.
Most debates about "social media harms" start with the same assumption: the problem lives inside a few apps.
If we could just fix those platforms, ban those feeds, or regulate those companies, we would be fine.
But social media did not invent algorithmic influence. It normalized it. Research on feeds and ranking shows how recommendation systems now sit at the core of how we see news, search results, and public information online.
And now, the same recommendation logic that shapes your Instagram feed is quietly sorting what you see almost everywhere else.
• The "For You" page is not just TikTok.
• The feed is not just Twitter.
• The algorithm is not just Meta.
It's news apps deciding which headlines you see first, based on what people like you tend to click and linger on.
It's search engines ranking what counts as "relevant," pushing some sources up and others down through complex scoring systems.
It's streaming platforms steering your next watch before you choose it, using models that predict what will keep you watching the longest.
It's shopping sites arranging what you "might like," what's "popular," and what's "limited-time," all tuned to maximize clicks, carts, and conversions.
In other words: algorithmic influence is becoming the default feature of everyday life, not a problem confined to social media.
Why this matters: We're training for the wrong opponent
When the conversation stays stuck on "screen time," we miss the system underneath. Algorithms don't just show content.
They quietly shape:
• Attention (what you notice)
• Emotion (what you feel)
• Belief (what seems true or common)
• Behavior (what you buy, click, watch, share)
They are curation machines.
And because they learn from our clicks, they tend to optimize for what reliably keeps us engaged: novelty, outrage, fear, status, certainty, and identity signals. Studies of platforms like YouTube and TikTok show how engagement-driven systems can steadily amplify emotionally charged and even toxic content because it keeps people interacting longer.
That is why "just spend less time on social media" is not enough. Even if you never open Instagram again, you are still swimming in algorithmic currents: in your news, your entertainment, your shopping, even your search results.
Twella: Digital cognitive fitness, not digital avoidance
In a recent post, I wrote about the limits of bans and heavy restrictions. Boundaries can help, and fewer pings can improve sleep and focus.
But even if bans āwork,ā they donāt actually prepare people for the world they will still graduate into: a life shaped by recommendations, targeted content, and invisible sorting systems.
Researchers talk about this as "algorithmic media use" and "algorithmic literacy": understanding how these systems shape what we see and how we act.
What we need is something more durable: digital cognitive fitness. Not a one-time assembly. Not a guilt trip. Not a "delete the apps" moral stance.
A skill set you build, like reading, math, or scientific reasoning, where the goal is agency and cognitive autonomy.
Twella Toolkit: Don't just ask "Is it true?"
In another Twella newsletter post I introduced a simple, impactful move: Name the Claim.
Before reacting, sharing, or spiraling, ask: what kind of claim is this?
• A fact claim (something happened)
• A causal claim (this caused that)
• A vibe claim (this is good/bad/cringe)
• An order claim (you must do this)
Algorithms blur these together at high speed. Naming the claim slows the blur just enough to restore choice. This kind of micro-classification is a core move in critical thinking and information literacy.
Ultimately, the tiny habits we cultivate make the difference. They build what we say we want for young people: autonomy, resilience, and the ability to thrive in attention-grabbing digital spaces.
Takeaway
Remember, it's not just about social media. It's about the systems that decide what you see, in what order, and with what emotional pressure.
Once we notice that, a different goal appears.
Not "how do we escape the feed?"
But: how do we build the skills to live well inside an algorithmic world?
Connect & Share!
If you're a leader in a middle or high school, college, or university and want to help students build real digital cognitive fitness, I'd love to partner. Just reply to this newsletter or email me at [email protected].
You can also get the 2-Step Critical Thinking Guide to immediately start helping students build those digital cognitive fitness muscles through short, targeted practices they actually enjoy.
And if someone in your world is already doing this work, an educator, counselor, or leader you admire, please consider forwarding this their way.
Small circles of people who care can always make a big difference.
Here's to helping students think well in our digital world!
James