Thursday, March 28, 2024

Algorithms: I See You, but You Don't See Me

When I think of algorithms, I sometimes have a tough time conceptualizing what social media sites actually know about me. I have a growing understanding of how algorithms work - in short, algorithms collect data on social media users, then use this data to deliver personalized content to each user. Since I cannot possibly watch every single YouTube video or read every single Facebook post, this catered content is like being served food I'm more likely to enjoy, making me a loyal customer at these diners of social media. After all, I wouldn't return to a restaurant with icky food.

Image Source: tenor.com

Having said that, I have limited say (and oftentimes, zero say) in the data that these social media sites collect about me. Additionally, these social media algorithms are subject to bias, with even emojis playing a role in misinformation. Developing a more nuanced understanding of how algorithms work is like taking a trip into the kitchen of that restaurant and seeing what's going on behind the scenes - inner workings that social media sites don't necessarily want users to know. Not everyone may want to take a trip into the kitchen. Even I have asked, why should I care about the data they collect? As long as the food's good, what does it matter? I care because it's not about the food, but about the tab. If I'm paying these restaurants with my data, I want to know exactly what I'm paying. Since I work with kids and teens as a KidSpace Library Associate at a public library, it's also important to know the tab so I can better empower students to make informed choices. Many kids at the library are avid social media users, and the library has its own social media presence. In other words, my organization and the families it serves are directly impacted by algorithms.

On the subject of libraries, I discovered another algorithm metaphor that doesn't relate to food (I may or may not have been hungry while writing this blog). In Dorcas Adisa's Sprout Social article, Everything you need to know about social media algorithms, algorithms are described as "librarians, sorting and connecting users with their preferences. This prevents overwhelming users with endless content and helps them find what they like faster. Algorithms enable users to uncover valuable posts, connect with like-minded individuals and explore their interests." In other words, algorithms can be useful by providing relevant information to users. For example, when Facebook serves wildlife photos in my newsfeed, I'll devour that content.
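For the curious, here's a toy sketch of that librarian idea in Python. Everything in it is made up for illustration - the interest tags, the weights, the posts - and real recommendation systems are vastly more complex, but the core "sort content by how well it matches what you've shown interest in" step looks something like this:

```python
from collections import Counter

# Hypothetical interest profile, as if built from a user's past activity.
# All tags and weights here are invented for illustration.
user_interests = Counter({"wildlife": 5, "photography": 3, "cooking": 1})

# Candidate posts, each tagged by topic.
posts = [
    {"title": "Snow leopard sighting", "tags": ["wildlife", "photography"]},
    {"title": "Quarterly earnings recap", "tags": ["finance"]},
    {"title": "One-pan dinner ideas", "tags": ["cooking"]},
]

def relevance(post):
    # Score a post by summing the user's weight for each matching tag.
    return sum(user_interests[tag] for tag in post["tags"])

# The "librarian" step: sort the feed so the best matches come first.
feed = sorted(posts, key=relevance, reverse=True)
print([p["title"] for p in feed])
```

With this profile, the wildlife photo wins by a wide margin - which is exactly why my newsfeed keeps feeding me more of what I already like.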

Image Source: tenor.com

I was intrigued by the librarian metaphor because librarians help identify misinformation. Algorithms can likewise flag and filter misinformation; alternatively, they can bolster and spread it. Per Joe Lazer's Contently article, Facebook’s Algorithm Has Unprecedented Power. Here’s How We Need to Respond, Facebook came under scrutiny for "its role in spreading fake news leading up to the election, and the filter bubbles it helped create among people with similar political ideologies." The chart below visualizes this role.

Image Source: Contently Article

Librarians, too, have this power of opening or restricting access to information and misinformation. Can we trust social media algorithms to use this power wisely? According to a Pew Research Center survey, as discussed in the article Mixed views about social media companies using algorithms to find false information, "Fully 72% of Americans have little or no confidence that social media companies will use computer programs appropriately to determine which information on their sites is false, including three-in-ten who have no confidence at all." I was not surprised by this lack of confidence in algorithms' ability to determine false information. In sharp contrast to the lack of information filtering, I was surprised by the abundance of information gathering, especially when it came to Facebook.

Why Facebook?

As a regular Facebook user, I was unnerved by Caitlin Dewey's Washington Post article 98 personal data points that Facebook uses to target ads to you. I knew that Facebook had the power to collect a lot of data, but 98 data points, including credit card data, car data, and home data? That's not nearly as innocuous as favorite TV shows, which is what I had presumed Facebook tracked before doing my research. Users can indicate ad preferences, but users can't actually opt out of Facebook's tracking methods. As discussed in the aforementioned article, "The preferences manager, for instance, lets users tell Facebook they don’t have certain interests that the site has associated with them or their behavior, but there’s no way to tell Facebook that you don’t want it to track your interests, at all." In summary, the only way to opt out of data tracking is to opt out of Facebook entirely. This understanding made me think differently about algorithms, since I hadn't fully grasped how the lack of consent erodes privacy, particularly when there's a lack of transparency regarding the full scope of the data that Facebook collects.

Will I change my Facebook ways?

To be determined! At the very least, I'll pay closer attention to the information I choose to share with Facebook. I'd also like to cultivate more resources for teaching kids about the hidden digital footprint that algorithms leave behind. Lastly, I've gained a curiosity about which ads social media sites choose to display.

To satiate this curiosity, I ran an impromptu data experiment on another favorite social media site - YouTube. In the spirit of the article on Facebook's 98 data points, I created a spreadsheet categorizing my last 98 YouTube ads. I learned how to access my YouTube ad history through this YouTube video. I included my top five categories, as well as an "Other" category for the remainder of the ads.

Image Source: Chart made on Google Sheets by Aron Ryan

Mint Mobile and Canva are both services that I use regularly, so it's no surprise that they led the way with my YouTube advertisements. It's also not surprising that therapy ads were so common, considering I watch a lot of content on the topics of psychology, mental health, and neurodiversity. I was surprised that so many deodorant ads had played, since I've never bought deodorant online. I hope that YouTube advertises deodorant because I presumably use deodorant, not because I need more deodorant so as not to offend the algorithm. Lastly, I found the topic of refugees (Gaza refugee camps) to be interesting, since these ads are geared toward social activism. Much of my YouTube viewership relates to social activism topics, even if I haven't watched content on refugee activism.
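For anyone who wants to try this with their own ad history, the tallying behind my chart can be sketched in a few lines of Python. The counts below are invented stand-ins for my real spreadsheet (only the total of 98 and the five category names come from my experiment); the point is the top-five-plus-Other approach:

```python
from collections import Counter

# Hypothetical ad log: one category label per ad seen.
# Counts are invented; my real spreadsheet covered 98 ads.
ad_categories = (
    ["Mint Mobile"] * 14 + ["Canva"] * 12 + ["Therapy"] * 10
    + ["Deodorant"] * 8 + ["Refugees"] * 6
    + [f"one-off ad {i}" for i in range(48)]  # 48 ads that appeared once each
)

counts = Counter(ad_categories)

# Keep the five most common categories; lump the rest into "Other".
top_five = counts.most_common(5)
other = sum(counts.values()) - sum(n for _, n in top_five)

for category, n in top_five:
    print(f"{category}: {n}")
print(f"Other: {other}")
```

From there, the category totals drop straight into a Google Sheets chart like the one above.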

Now that I've discussed my ad history, I'm curious: which ads do the algorithms curate for you? How do you feel about the role of algorithms on social media sites? I'd love to hear your insights!

4 comments:

  1. Hello,
    I absolutely love the food analogy! I am one of those people who appreciate seeing the kitchen before I eat any meal. However, we all know this is not always possible. Even knowing what is going on in the background is not always an assurance that you gain the entire picture. As you stated, there is a lot of misinformation due to the use of emojis alone. One person's interpretation and use of content may not be interpreted the same way by someone else. I agree with you that students need to be taught about the hidden digital footprints. I myself will change some of my ways of utilizing social media.

    1. Hello, Anthony! Great point that interpretation varies between one person and another. Thanks very much for reading my blog and for commenting. I'm glad you enjoyed the food analogy! Best of luck with your own journey changing up the ways you utilize social media.

  2. Aron, your posts never disappoint!

