Author: soa23yl

  • Is Social Media Making Us Sick?

    Last month, a friend told me she deleted Instagram after realizing she’d spent four hours scrolling through strangers’ vacation photos instead of studying for her finals. “I just wanted a quick break,” she said, “but somehow the afternoon disappeared.” The debate about social media’s impact is usually framed as a balance of benefits and drawbacks. And yes, social media offers real benefits: it connects distant friends, helps activists organize, and provides communities for isolated people. During the pandemic, these platforms became lifelines for maintaining relationships.

    But here’s the thing: when you look closely at how these platforms actually work (their design, their business models, and the behaviors they encourage), the harm outweighs the good. Social media platforms are carefully engineered systems designed to exploit our psychology for profit, and this comes at a real cost to our well-being.

    The Architecture of Addiction

    Let’s start with the most basic feature of modern social media: the infinite scroll. This isn’t a neutral design choice. It’s a deliberate decision to remove natural stopping points that might prompt you to close the app. Research on platform design shows that features like pull-to-refresh, variable reward schedules for likes and comments, and push notifications all work together to create what psychologists recognize as addictive patterns.

    Think about slot machines for a moment. They’re designed with variable reward schedules because that’s what keeps people pulling the lever. You never know when you’ll hit the jackpot, so you keep trying. Social media works exactly the same way. You never know when you’ll get that dopamine hit from a popular post or an interesting video, so you keep scrolling.
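    To make the analogy concrete, here’s a toy simulation of a variable-ratio reward schedule. Every number in it is invented for illustration; no platform publishes its real parameters. The point is simply that when rewards arrive unpredictably, a single hit resets your urge to quit, so sessions stretch far beyond what you intended.

```python
import random

def got_reward(reward_probability=0.1):
    """One 'pull': a like, a funny clip, a notification. Rewards are random."""
    return random.random() < reward_probability

def scroll_session(max_posts=500, patience=30):
    """Keep scrolling until `patience` posts go by without a reward.
    On a variable-ratio schedule, hits keep resetting the counter,
    so sessions run long. All parameters are invented for illustration."""
    posts_seen, dry_streak = 0, 0
    while posts_seen < max_posts and dry_streak < patience:
        posts_seen += 1
        if got_reward():
            dry_streak = 0   # one unpredictable hit and you keep going
        else:
            dry_streak += 1
    return posts_seen

random.seed(0)
sessions = [scroll_session() for _ in range(1000)]
print(sum(sessions) / len(sessions))  # average posts per session
```

    Running this, the average session is far longer than the 30-post “patience” threshold, which is the whole mechanism in miniature: intermittent rewards defeat a fixed intention to stop.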

    The platforms also use sophisticated algorithms to learn exactly what keeps you engaged. They track everything: what you click, how long you watch, what makes you comment or share. Then they use this data to serve up an endless stream of content perfectly calibrated to keep you on the platform. It’s not about showing you what’s good for you or even what you really want to see. It’s about showing you whatever will keep you scrolling.
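    As a rough sketch of what “optimizing for engagement” means, imagine ranking posts by a weighted sum of predicted engagement signals. The field names and weights below are invented, and real recommender systems use learned models over thousands of signals, but the objective is the same. Notice that nothing in the score asks whether a post is true, kind, or good for you.

```python
# Deliberately simplified sketch of engagement-based feed ranking.
# All fields and weights are invented for illustration.

def engagement_score(post):
    """Score a post purely by predicted engagement signals."""
    return (
        2.0 * post["predicted_watch_seconds"]
        + 5.0 * post["predicted_comment_prob"]
        + 3.0 * post["predicted_share_prob"]
    )

def rank_feed(posts):
    """Order the feed so the most engaging content comes first."""
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    {"id": "calm-tutorial", "predicted_watch_seconds": 8,
     "predicted_comment_prob": 0.01, "predicted_share_prob": 0.02},
    {"id": "outrage-clip", "predicted_watch_seconds": 25,
     "predicted_comment_prob": 0.20, "predicted_share_prob": 0.15},
])
print([p["id"] for p in feed])  # the inflammatory clip outranks the calm one
```

    Even in this two-line toy, the inflammatory clip wins, because the only thing being measured is how long it holds your attention.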

    Following the Money

    To understand why platforms are designed this way, you need to follow the money. Facebook’s parent company Meta made nearly $40 billion in ad revenue in a single quarter of 2024. They don’t make money when you have meaningful connections with friends. They make money when you stay on the platform looking at ads.

    This creates a fundamental misalignment between platform interests and user well-being. The longer you scroll, the more data they collect, the more ads they show, the more money they make. Your anxiety, your FOMO (fear of missing out), your compulsive checking – these aren’t unfortunate side effects. They’re the business model.

    The platforms have become incredibly sophisticated at what researchers call “surveillance capitalism.” They’re not just tracking what you do on their apps. They’re building detailed psychological profiles to predict and influence your behavior. This data isn’t just used to show you ads for products. It’s used to shape your emotions, exploit your insecurities, and keep you coming back.

    The Content Problem

    Beyond the addictive design and exploitative business model, there’s the issue of what these platforms actually show us. The algorithms don’t optimize for truth, kindness, or well-being. They optimize for engagement, and unfortunately, the content that drives the most engagement is often the most inflammatory, divisive, or anxiety-inducing.

    Studies consistently show that false information spreads faster and wider than true information on social media. Outrage drives more clicks than nuance. Fear keeps people scrolling longer than joy. The platforms know this, and their algorithms amplify it.

    This creates what researchers call “toxic content spirals.” The more you engage with negative content, the more of it the algorithm shows you. Before long, your feed becomes a stream of arguments, bad news, and content designed to make you angry or afraid. Even when you try to curate a positive feed, the algorithm fights back, always pushing whatever will keep you engaged, regardless of how it makes you feel.
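    A back-of-the-envelope model shows how quickly such a spiral can move. Suppose the feed’s share of negative content drifts a fixed fraction of the way toward whatever you last engaged with. The 0.2 “learning rate” and the starting mix are invented numbers, not anything measured from a real platform.

```python
# Toy model of a toxic content spiral: the feed's mix drifts toward
# whatever the user engages with. All parameters are invented.

def next_negative_share(current_share, engaged_negative, learning_rate=0.2):
    """Nudge the feed mix toward the kind of content the user engaged with."""
    target = 1.0 if engaged_negative else 0.0
    return current_share + learning_rate * (target - current_share)

share = 0.10  # feed starts 10% negative
for day in range(10):
    # negative content reliably provokes engagement, so the user bites
    share = next_negative_share(share, engaged_negative=True)
print(round(share, 2))  # → 0.9
```

    In ten steps a feed that was 10% negative becomes roughly 90% negative. The real dynamics are messier, but the direction of drift is the same.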

    The Illusion of Control

    When faced with criticism about these harms, platforms offer “digital well-being” tools and tell users to practice better self-control. Instagram has time limits. TikTok offers break reminders. But this is like a casino offering pamphlets about responsible gambling while keeping the lights flashing and the drinks flowing.

    These tools represent what scholars call neoliberal responsibilization – making individuals responsible for managing problems created by systems designed to override their self-control. The message becomes: if social media is harming you, it’s your fault for not using our tools properly.

    But the tools themselves are often buried in settings menus, designed to be forgettable, and easy to override. They’re not serious attempts to protect users. They’re PR exercises designed to shift blame from the platforms to the people they’re harming.

    The Bottom Line

    When I think about my friend who lost an afternoon to Instagram, I don’t blame her for lacking self-control. I see someone caught in a trap designed by some of the smartest people in the world, backed by billions of dollars, and refined by constant experimentation on billions of users. Yes, social media can connect us. Yes, it can spread information and enable organizing. But these benefits could exist without the predatory design, the surveillance, the algorithmic manipulation, and the relentless push for engagement at any cost. The fact that platforms provide some value doesn’t excuse the harm they cause.

    The evidence is clear: social media platforms as they currently exist are net negative for human well-being. They’re designed to be addictive, funded by surveillance and manipulation, and filled with content that makes us anxious, angry, and isolated. Until we fundamentally restructure how these platforms work, they will continue to extract far more from us than they give back.

  • When Algorithms Know You Better Than You Know Yourself

    Last week I was scrolling through TikTok when I noticed something weird. Every third video was about journaling and mental health apps. It felt like the app had somehow figured out I was stressed about my upcoming exams. The thing is, I hadn’t searched for anything related to stress or mental health. I’d just been watching videos late at night, probably scrolling faster than usual, maybe rewatching certain clips. Somehow, TikTok knew I needed those wellness videos before I did.

    This is the strange new world we live in. Social media platforms collect so much information about us that they can predict what we want or need with scary accuracy. Sometimes this feels helpful. Other times it feels like we’re living in a science fiction movie where computers control our thoughts.

    Let me be clear about something upfront. Targeted advertising does have some benefits. If you’re a small business owner, you can find customers who actually want your products without spending a fortune. If you’re shopping for something specific, you might see ads for exactly what you need. Parents discover useful products for their kids. Travelers find good deals on hotels.

    But after looking at how these systems really work, I think the harm they cause is much bigger than any benefits. We’re trading our privacy and freedom for slightly more relevant ads, and that’s a terrible deal.

    The Surveillance Machine

    Most people have no idea how much data social media companies collect. It’s not just about what posts you like or what accounts you follow. These platforms watch everything you do, and I mean everything. Facebook collects thousands of different data points about each person using their platform. They track how long you look at each post, whether you start typing a comment and then delete it, how quickly you scroll past certain topics. They even track your mouse movements on the screen.

    TikTok goes even further with data collection. They monitor your typing rhythms, which can identify you personally. They collect voice data and facial recognition information. They know what other apps you have on your phone and which websites you visit outside their app.

    All this data gets fed into complex computer systems that create a profile of who you are. Not just your interests, but your personality, your weaknesses, your habits, and your likely future behavior. These profiles are so detailed that researchers have found they can predict major life events before they happen.

    Exploiting Psychological Vulnerabilities

    Here’s where things get really concerning. These platforms don’t just use data to show you ads for things you might want to buy. They use it to figure out when you’re most likely to make impulsive decisions. Facebook’s own research showed they could identify when teenagers feel insecure or worthless based on their activity patterns. Think about that for a second. A massive corporation knows when kids are feeling bad about themselves and can target them with specific advertisements during those vulnerable moments.

    TikTok has gotten really good at what I think of as emotional manipulation loops. They notice when someone watches videos about a particular insecurity, then show more content about that topic, which makes the person more anxious, and then serve ads for products that promise to fix the problem. Someone worried about their appearance gets more beauty content, becomes more self-conscious, then sees ads for cosmetic procedures. Studies have shown that platforms can detect depression, anxiety, and other mental health issues with frightening accuracy. Instead of using this information to help people, they use it to sell products.

    The Responsibility Trap

    What really bothers me is how these companies pretend they’re giving us control. They offer screen time limits and ad preferences settings, acting like we’re responsible for managing our own experience. But this is like a casino giving you a pamphlet about gambling addiction while they pump oxygen into the air to keep you awake and playing.

    The whole system is designed to override your self-control. Platforms use techniques from behavioral psychology to keep you hooked. Variable reward schedules make checking for likes as addictive as gambling. Infinite scroll removes natural stopping points. Push notifications create anxiety about missing out.

    Then when people develop problems with social media use, the companies say it’s the user’s fault for not having better habits. They’ve created a trap and then blame us for falling into it.

    Digital Autonomy Under Attack

    Going back to my TikTok experience, I keep wondering: did I actually want to learn about journaling, or did the algorithm decide I should want it? When platforms can predict and influence our behavior this accurately, how much of what we think we want is really our own choice?

    Experts who study surveillance capitalism warn that we’re losing our ability to make free choices. Our future behaviors become predictable and controllable. Companies don’t just sell us products anymore. They sell the ability to modify our behavior to whoever pays them.

    This isn’t just about seeing ads. It’s about losing control over our own minds. When algorithms know what we’ll want before we do, when they can trigger our insecurities and offer solutions, when they can make us believe our programmed desires are our own choices, we stop being fully human. We become predictable machines ourselves.

    The benefits of targeted advertising are real but small. The costs to our privacy, mental health, and freedom are enormous. We need to demand better from these platforms. We need laws that protect our data and our minds. Most importantly, we need to understand that when something is free, we’re the product being sold.

  • How Hashtags Saved My Local Library

    My local library branch closed last year. The building where I spent countless afternoons as a teenager, discovering books that changed how I saw the world, now sits empty. The council said budget cuts made it necessary. What they didn’t say was that they’d barely consulted the community or considered how many people depended on that space for internet access, job applications, and human connection.

    This experience turned me into an unlikely activist. I’m now part of a growing movement to protect and expand public libraries as essential community infrastructure. Here’s how I’d use digital media to build this movement, drawing on what our own campaign has already tried.

    Starting with Stories, Not Statistics

    While libraries face funding cuts across many countries, throwing numbers at people rarely inspires action. Instead, I’d begin by collecting and sharing personal stories through social media. The #MeToo movement succeeded partly because it made abstract issues concrete through individual experiences. Similarly, our campaign would invite people to share what libraries mean to them using #MyLibraryStory.

    These stories would highlight libraries as more than book repositories. They’re where elderly people combat isolation, where kids without home internet do homework, where job seekers print resumes, where new immigrants take language classes. By flooding social media with these diverse narratives, we’d reframe the conversation from “budget line items” to “community lifelines.”

    Building Networks, Not Just Audiences

    Traditional activism often relies on formal organizations with clear hierarchies. But research on connective action shows that digital movements can succeed through loose networks where participants personalize their engagement. Rather than creating one central “Save Libraries” organization, I’d encourage local groups to form their own campaigns while sharing resources and amplifying each other’s messages.

    We’d create shared Google Drives with template social media posts, fact sheets, and graphics that local groups could adapt. A simple website would map all participating groups, making it easy for people to find and join efforts in their area. This distributed approach makes the movement resilient. If one group faces burnout or opposition, others continue the work.

    From Awareness to Action

    Digital activism often faces criticism for encouraging “slacktivism,” where people feel they’ve contributed by merely liking or sharing posts. To avoid this trap, every piece of content would include specific actions supporters could take. This might include signing petitions, attending council meetings, volunteering at library events, or donating to library support organizations.

    The Women’s March demonstrated how Facebook events could coordinate massive offline action. We’d use similar tactics, creating Facebook events for library read-ins, community meetings, and peaceful protests. The digital tools serve as organizing infrastructure, not endpoints.

    We’d also use digital platforms for coordinating practical support. WhatsApp groups could organize carpools to council meetings. Slack channels could coordinate volunteer schedules. Google Forms could match people’s skills with movement needs. The goal is making real-world participation as frictionless as possible.

    Creating Content That Spreads

    Getting people to share your message online is tricky. After months of trial and error, I learned what actually works. People share things that make them feel something. A dry post about library funding statistics? Nobody cares. But a photo of an 80-year-old woman teaching herself to use email at the library computer? That gets hundreds of shares.

    We started making simple videos with our phones. Nothing fancy. Just real people talking about real experiences. One video showed a teenager explaining how the library was the only quiet place to study because his house was too crowded. Another featured a single mom who used library computers to apply for jobs. These weren’t polished productions, but that made them feel genuine.

    Visual content is often reported to get around 94% more views than text-only posts, so we focused on images and short clips. We’d photograph the empty shelves where books used to be. We’d film the locked doors of closed branches. Timing matters too. We learned to post when people were actually online. Early morning for parents. Lunch breaks for workers. Evenings for everyone else.

    Measuring What Matters

    It’s tempting to get excited about viral posts and trending hashtags. But likes don’t reopen libraries. We needed to track what actually mattered: real change in our communities. We kept it simple. Every month, we’d count: How many people showed up to council meetings? How many signed our petitions? How many libraries stayed open? We used a basic spreadsheet that anyone could update. No fancy analytics tools needed.
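    For anyone who wants to copy this approach, the tracking really can be that simple. Here’s a minimal sketch of the monthly log as code rather than a spreadsheet. The column names and numbers are made up for illustration; the idea is just to count outcomes, not likes.

```python
import csv
import io

# A minimal stand-in for the campaign's monthly tracking sheet.
# Column names and figures are invented for illustration.
MONTHLY_LOG = """month,council_attendees,petition_signatures,branches_open
2024-01,12,340,3
2024-02,25,910,3
2024-03,41,1675,4
"""

def month_over_month(rows, field):
    """Change in a metric from the first to the last recorded month."""
    values = [int(row[field]) for row in rows]
    return values[-1] - values[0]

rows = list(csv.DictReader(io.StringIO(MONTHLY_LOG)))
print(month_over_month(rows, "council_attendees"))    # → 29
print(month_over_month(rows, "petition_signatures"))  # → 1335
```

    A shared spreadsheet does the same job with no code at all; the point is that the metrics worth trending are attendance, signatures, and branches open, not impressions.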

    The Ice Bucket Challenge raised $115 million because it connected online activity to actual donations. We borrowed that idea. Every social media campaign linked to concrete asks: “Call your councillor today” or “Join us at Tuesday’s protest.” Then we’d follow up to see who actually did it.

    Local victories mattered most. When volunteers saved the mobile library service, we made sure everyone knew. These wins showed people their efforts weren’t wasted. Success bred more success. We also tracked our failures. Which tactics flopped? What messages fell flat? Learning from mistakes helped us improve.

    So, What Next?

    Building a movement online takes patience. There’s no magic formula. What worked for us might not work for you. But some basics apply everywhere. Start where you are. You don’t need thousands of followers or a fancy website. Our library campaign started with five people in a WhatsApp group. Small beginnings can lead to big changes.

    Focus on people, not platforms. Social media sites come and go. What lasts are the relationships you build. We made sure people exchanged phone numbers at events. We created email lists as backups. When one platform changed its algorithm, we weren’t starting from scratch.

    Keep it sustainable. Burnout kills movements faster than opposition does. We rotated responsibilities. Nobody had to post every day or attend every meeting. When someone needed a break, others stepped up. This isn’t a sprint. It’s a marathon.