The Algorithmic Everywhere šŸ”

It’s not just social media. It’s the sorting system beneath it.

Most debates about ā€œsocial media harmsā€ start with the same assumption: the problem lives inside a few apps.

If we could just fix those platforms, ban those feeds, or regulate those companies, we would be fine.

But social media did not invent algorithmic influence. It normalized it. Research on feeds and ranking shows how recommendation systems now sit at the core of how we see news, search results, and public information online.

And now, the same recommendation logic that shapes your Instagram feed is quietly sorting what you see almost everywhere else.
• The ā€œFor Youā€ page is not just TikTok.
• The feed is not just Twitter.
• The algorithm is not just Meta.

It’s news apps deciding which headlines you see first, based on what people like you tend to click and linger on.

It’s search engines ranking what counts as ā€œrelevant,ā€ pushing some sources up and others down through complex scoring systems.

It’s streaming platforms steering your next watch before you choose it, using models that predict what will keep you watching the longest.

It’s shopping sites arranging what you ā€œmight like,ā€ what’s ā€œpopular,ā€ and what’s ā€œlimited-time,ā€ all tuned to maximize clicks, carts, and conversions.

In other words: algorithmic influence is becoming the default feature of everyday life, not a problem confined to social media.

Why this matters: We’re training for the wrong opponent šŸ„Š

When the conversation stays stuck on ā€œscreen time,ā€ we miss the system underneath. Algorithms don’t just show content.

They quietly shape:
• Attention (what you notice) šŸ‘€
• Emotion (what you feel) šŸ’„
• Belief (what seems true or common) šŸ’­
• Behavior (what you buy, click, watch, share) 🧠

They are curation machines.

And because they learn from our clicks, they tend to optimize for what reliably keeps us engaged: novelty, outrage, fear, status, certainty, and identity signals. Studies of platforms like YouTube and TikTok show how engagement-driven systems can steadily amplify emotionally charged and even toxic content because it keeps people interacting longer.
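To make the logic concrete, here is a deliberately tiny, hypothetical sketch (not any real platform’s code): a feed that ranks posts by nothing but predicted engagement. The post names, numbers, and scoring weights are all invented for illustration.

```python
# Toy illustration: a feed ranked purely on predicted engagement.
# All data and weights here are made up for the example.
posts = [
    {"title": "Calm local news update", "predicted_watch_sec": 12, "predicted_shares": 1},
    {"title": "Outrage-bait hot take", "predicted_watch_sec": 45, "predicted_shares": 9},
    {"title": "Nuanced explainer", "predicted_watch_sec": 20, "predicted_shares": 2},
]

def engagement_score(post):
    # The only signal is "keeps people interacting" -- nothing about
    # accuracy, usefulness, or how the viewer feels afterward.
    return post["predicted_watch_sec"] + 5 * post["predicted_shares"]

feed = sorted(posts, key=engagement_score, reverse=True)
for post in feed:
    print(post["title"])
# The emotionally charged post lands at the top of the feed.
```

Notice what never appears in `engagement_score`: truth, nuance, or well-being. That omission, not any villainous intent, is the whole mechanism this section describes.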

That is why ā€œjust spend less time on social mediaā€ is not enough. Even if you never open Instagram again, you are still swimming in algorithmic currents—in your news, your entertainment, your shopping, even your search results.

Twella: Digital cognitive fitness šŸ’Ŗ, not digital avoidance

In a recent post, I wrote about the limits of bans and heavy restrictions. Boundaries can help, and fewer pings can improve sleep and focus.

But even if bans ā€œwork,ā€ they don’t actually prepare people for the world they will still graduate into: a life shaped by recommendations, targeted content, and invisible sorting systems.

Researchers talk about this as ā€œalgorithmic media useā€ and ā€œalgorithmic literacyā€: understanding how these systems shape what we see and how we act.

What we need is something more durable: digital cognitive fitness. Not a one-time assembly. Not a guilt trip. Not a ā€œdelete the appsā€ moral stance.

A skill set you build—like reading, math, or scientific reasoning—where the goal is agency and cognitive autonomy.

🧰 Twella Toolkit: Don’t just ask ā€œIs it true?ā€

In another Twella newsletter post I introduced a simple, impactful move: Name the Claim.

Before reacting, sharing, or spiraling, ask: what kind of claim is this?
• A fact claim (something happened) šŸ“…
• A causal claim (this caused that) āž”ļø
• A vibe claim (this is good/bad/cringe) 😬
• An order claim (you must do this) šŸ“£

Algorithms blur these together at high speed. Naming the claim slows the blur just enough to restore choice. This kind of micro-classification is a core move in critical thinking and information literacy.

Ultimately, it is the tiny habits we cultivate that make the difference. This actually builds what we say we want for young people: autonomy, resilience, and the ability to thrive in attention-grabbing digital spaces. šŸ’ŖšŸ“±

Takeaway

Remember, it’s not just about social media. It’s about the systems that decide what you see, in what order, and with what emotional pressure.

Once we notice that, a different goal appears.

Not ā€œhow do we escape the feed?ā€

But: how do we build the skills to live well inside an algorithmic world?

Connect & Share!

If you’re a leader in a middle or high school, college, or university and want to help students build real digital cognitive fitness, I’d love to partner. Just reply to this newsletter or email me at [email protected].

You can also get the 2‑Step Critical Thinking Guide to immediately start helping students build those digital cognitive fitness muscles through short, targeted practices they actually enjoy. šŸ’ŖšŸ“²

And if someone in your world is already doing this work—an educator, counselor, or leader you admire—please consider forwarding this their way.

Small circles of people who care can always make a big difference.

Here’s to helping students think well in our digital world!
James