When Algorithms Know You Better Than You Know Yourself

Last week I was scrolling through TikTok when I noticed something weird. Every third video was about journaling and mental health apps. It felt like the app had somehow figured out I was stressed about my upcoming exams. The thing is, I hadn’t searched for anything related to stress or mental health. I’d just been watching videos late at night, probably scrolling faster than usual, maybe rewatching certain clips. Somehow, TikTok knew I needed those wellness videos before I did.

This is the strange new world we live in. Social media platforms collect so much information about us that they can predict what we want or need with scary accuracy. Sometimes this feels helpful. Other times it feels like we’re living in a science fiction movie where computers control our thoughts.

Let me be clear about something upfront. Targeted advertising does have some benefits. If you’re a small business owner, you can find customers who actually want your products without spending a fortune. If you’re shopping for something specific, you might see ads for exactly what you need. Parents discover useful products for their kids. Travelers find good deals on hotels.

But after looking at how these systems really work, I think the harm they cause is much bigger than any benefits. We’re trading our privacy and freedom for slightly more relevant ads, and that’s a terrible deal.

The Surveillance Machine

Most people have no idea how much data social media companies collect. It’s not just about what posts you like or what accounts you follow. These platforms watch everything you do, and I mean everything. Facebook collects thousands of different data points about each person using their platform. They track how long you look at each post, whether you start typing a comment and then delete it, how quickly you scroll past certain topics. They even track your mouse movements on the screen.

TikTok goes even further with data collection. They monitor your typing rhythms, which can identify you personally. They collect voice data and facial recognition information. They know what other apps you have on your phone and which websites you visit outside their app.

All this data gets fed into machine learning systems that build a profile of who you are. Not just your interests, but your personality, your weaknesses, your habits, and your likely future behavior. These profiles are detailed enough that researchers have found they can predict major life events before they happen.

Exploiting Psychological Vulnerabilities

Here’s where things get really concerning. These platforms don’t just use data to show you ads for things you might want to buy. They use it to figure out when you’re most likely to make impulsive decisions. Facebook’s own research showed they could identify when teenagers feel insecure or worthless based on their activity patterns. Think about that for a second. A massive corporation knows when kids are feeling bad about themselves and can target them with specific advertisements during those vulnerable moments.

TikTok has gotten really good at what I think of as emotional manipulation loops. They notice when someone watches videos about a particular insecurity, then show more content about that topic, which makes the person more anxious, and then serve ads for products that promise to fix the problem. Someone worried about their appearance gets more beauty content, becomes more self-conscious, then sees ads for cosmetic procedures. Studies have shown that platforms can detect depression, anxiety, and other mental health issues with frightening accuracy. Instead of using this information to help people, they use it to sell products.
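The loop I'm describing is simple enough to sketch in code. Here's a toy simulation I wrote to make the point concrete. It is my own illustration, not any platform's actual code, and the topic names and numbers are invented. A recommender that rewards nothing but watch time will lock onto whatever a user lingers on, and if lingering on that topic also feeds the underlying anxiety, the feed narrows fast:

```python
# Toy model of an engagement-driven feedback loop (illustrative only;
# not based on any real platform's code). The "appearance" topic holds
# an insecure viewer's attention slightly longer, the recommender
# boosts whatever was watched longest, and watching that content makes
# the insecurity (and therefore the watch time) grow.

def feed_loop(rounds=20):
    topics = ["cooking", "sports", "appearance"]
    weights = {t: 0.0 for t in topics}   # recommender's score per topic
    insecurity = 0.1                     # grows with each exposure
    history = []
    for i in range(rounds):
        if i < len(topics):
            topic = topics[i]            # cold start: show each topic once
        else:
            topic = max(weights, key=weights.get)  # then exploit the top scorer
        # Simulated engagement: insecure topics hold attention longer
        seconds = 10 + 20 * insecurity if topic == "appearance" else 10
        weights[topic] += seconds        # reward whatever was watched
        if topic == "appearance":
            insecurity = min(1.0, insecurity + 0.1)  # content feeds the anxiety
        history.append(topic)
    return history, weights

history, weights = feed_loop()
print(history[-5:])   # → ['appearance'] * 5: the feed has converged
```

After one pass through the catalog, every remaining slot goes to the anxiety topic, because it is the only one whose "engagement" keeps climbing. Nothing in the loop is malicious line by line; the harm is an emergent property of optimizing for watch time alone.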

The Responsibility Trap

What really bothers me is how these companies pretend they're giving us control. They offer screen time limits and ad preference settings, acting like we're responsible for managing our own experience. But this is like a casino handing you a pamphlet about gambling addiction while designing the entire floor, with no clocks and no windows, to keep you playing.

The whole system is designed to override your self-control. Platforms use techniques from behavioral psychology to keep you hooked. Variable reward schedules make checking for likes as addictive as gambling. Infinite scroll removes natural stopping points. Push notifications create anxiety about missing out.

Then when people develop problems with social media use, the companies say it’s the user’s fault for not having better habits. They’ve created a trap and then blame us for falling into it.

Digital Autonomy Under Attack

Going back to my TikTok experience, I keep wondering: did I actually want to learn about journaling, or did the algorithm decide I should want it? When platforms can predict and influence our behavior this accurately, how much of what we think we want is really our own choice?

Experts who study surveillance capitalism, like Shoshana Zuboff, who coined the term, warn that we're losing our ability to make free choices. Our future behaviors become predictable and controllable. Companies don't just sell us products anymore. They sell the ability to modify our behavior to whoever pays them.

This isn't just about seeing ads. It's about losing control over our own minds. When algorithms know what we'll want before we do, when they can trigger our insecurities and offer solutions, when they can make us believe our programmed desires are our own choices, we stop being fully human. We become predictable machines ourselves.

The benefits of targeted advertising are real but small. The costs to our privacy, mental health, and freedom are enormous. We need to demand better from these platforms. We need laws that protect our data and our minds. Most importantly, we need to understand that when something is free, we're the product being sold.