Be literate and inquisitive about your social media
by Meghan Wenzel – Wenzel offers a recap of Data 4 Good’s December 14 event
The Data Literacy working group of Los Angeles nonprofit Data 4 Good recently hosted another event in its ongoing series How Data Impacts My Life. It was a thought-provoking evening that explored how social media platforms impact our lives, how we can all better protect our data, and how we might move forward as a society. Jeanne Holm (City of Los Angeles) moderated the panel with Marie Smith (Data360) and Varoon Bashyakarla (TuneIn).
To kick off the discussion, Marie reflected on the evolution of social media over the years and where we might go next: “Over the last 15 years or so, as social media is involved, I’ve just seen the need for its maturity. How to go from this area of experimentation and openness into this area of responsibility and other topics like justice, equality, equity. How do we mature this infrastructure to the next level?”
You’re in a transaction - you’re not just hanging out with your friends
Varoon explained that social media companies are so incredibly profitable because they monetize all of our data. “Even though these companies often present themselves as ‘uncompanies’, they’re actually some of the biggest corporations in the world. And they’ve obtained that status because they track, collect, mine, and monetize all of our data on their platform - every click, browse, even the most minute details, in ways that can be quite invasive.”
Marie agreed and shared that “the number one thing I want to emphasize is it is a business and the business is advertising. When you look at it through that lens, everything else is sort of secondary.” She continued:
“You’re dealing with some of the largest corporations in the world. They’re not designed to be of public benefit, they’re designed to make profit. You have to think about your information as one of many commodities on the platform, and you have to really think about how you want that information to be recorded, used, disseminated, bought, and sold. You have to really become aware. This next phase is an awareness that you’re in a commercial environment - you’re in a transaction - you’re not just hanging out with your friends. This is all about managing your transactional life.”
“What you buy, what you read, what you share online, who you associate with, what your mood is, where you work, what you do, what your health situation is, where you've donated, what clothing styles you like, what car models you buy, your favorite Cola brand, your favorite phone brand – all of that information is available to those with the budget to buy it and the algorithms to aggregate and sift through it. This is where big data is changing the face of American election politics.”
–David Gewirtz, Election 2016: The big data trail to our next president
Varoon shared some examples of how companies are monetizing our data in new and disturbing ways. An entire ecosystem has sprung up around aggregating and selling data on our interests, purchases, preferences, voting records, and more to the highest bidder.
He explained how politicians in both parties use A/B testing to determine the most engaging and effective images, slogans, and calls to action. The Trump campaign ran A/B tests on an unprecedented scale, allowing it to collect incredibly granular insights and hone its messaging to elicit more support and donations.
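Mechanically, this kind of message testing boils down to comparing response rates between variants. The campaign's actual tooling isn't described here, so the sketch below is only a hypothetical illustration, with made-up impression and click counts, of how one ad variant might be judged to outperform another:

```python
from math import sqrt

def two_proportion_z(clicks_a, n_a, clicks_b, n_b):
    """Z-statistic for the difference between two response rates."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Made-up numbers: two ad variants shown 10,000 times each,
# drawing 380 vs. 310 donation clicks.
z = two_proportion_z(clicks_a=380, n_a=10_000, clicks_b=310, n_b=10_000)
print(f"z = {z:.2f}")  # |z| > 1.96 suggests a real difference at the 95% level
```

Run across thousands of creative variants and audience segments, even small differences in response rate become detectable, and exploitable.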
Your physical location – be it a gay bar, a hospital, or a place of worship – reveals a lot about your values and beliefs. During the protests over George Floyd’s murder, Field Team 6, a Democratic voter registration organization, used geofencing to target the physical locations of protests and harvest the mobile IDs of people protesting. Field Team 6 could pull data on those mobile IDs, find people who weren’t registered to vote, and serve them voter registration ads. People at the protests likely didn’t know their location would be used in this way, sparking questions around data ethics, unbridled capitalism, manipulation, and the levels of surveillance we’ve grown accustomed to.
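The geofencing step itself is conceptually simple: check whether a device's location ping falls inside a boundary drawn around the event, and keep the associated mobile ID if it does. Field Team 6's actual pipeline isn't detailed here, so the snippet below is a hypothetical sketch with invented coordinates and IDs:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

# Hypothetical fence: a 500 m radius around a protest site in downtown LA.
PROTEST_SITE = (34.0522, -118.2437)
RADIUS_KM = 0.5

# Hypothetical ad-platform location pings: a mobile ad ID plus coordinates.
pings = [
    {"mobile_id": "id-001", "lat": 34.0530, "lon": -118.2440},
    {"mobile_id": "id-002", "lat": 34.1000, "lon": -118.3000},
]

harvested = {
    p["mobile_id"]
    for p in pings
    if haversine_km(p["lat"], p["lon"], *PROTEST_SITE) <= RADIUS_KM
}
print(harvested)  # {'id-001'}
```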
Jeanne summarized, noting that “The reason why data is valuable is because it tells those who want to use it about potential audiences, potential consumers, potential opportunities.” In the attention economy, people and companies compete to garner attention as fast as possible.
When things go wrong… who is it that really bears the brunt of that?
Varoon explained, “Companies are beholden to their shareholders at the end of the day, who want a return on their investment. This very quickly comes down to what are the business models of these companies - how are they generating revenue?” While the tech sector prides itself on being innovative, Varoon noted that it has failed to innovate on its business models.
Their current business models value engagement above all else, yet there have been clear societal ramifications as a result, from a failure to respond to the coronavirus pandemic to rampant misinformation to mental health concerns among young people.
Despite these negative effects, social media companies are still raking in record profits. Varoon noted that, like Big Oil, Big Tobacco, and other bad actors before them, social media companies are quick to internalize the benefits when things work out and quick to externalize the costs when they don’t.
Varoon posed a revealing question:
“Companies profit from our data massively, but when things go wrong, after a leak, a breach, or a hack, which seems to happen more and more often, who is it that really bears the brunt of that?”
Overall, Varoon noted that large-scale problems such as these demand a level of public awareness and literacy before we can act. Looking ahead, he expects the next generation of tech products will allow you to pay a premium for increased privacy and control over your data. However, he noted, “I have misgivings about that as well, because I don’t want to live in a world in which only rich people have access to privacy. [But] in the absence of some ambitious and far reaching legislation, it is one of the few viable paths forward.”
Marie agreed that requiring people to pay a premium for privacy will likely exacerbate inequality. She reflected that it’s important for us to continue the dialogue around “who is going to be the arbiter of truth, or how many arbiters of truth will there be for these organizations? And how will you know who you can trust?” Overall, is the internet a utility or is it a right?
Marie shared that we’re already seeing a range of reactions. There has been some decentralization and specialization, and Marie has personally seen clients pay to build their own private platforms, since they can’t shape existing companies into being more responsible.
She shared that Facebook, Twitter, and Google know their days are numbered. “These centralized platforms - their days are numbered because it is impossible to meet the needs of a billion people at the same time. People are going to splinter off and form their own communities. People are comfortable with their own tribes… and they’re going to set their own rules.”
What can we do?
Varoon shared that voicing your concerns is a great place to start: “Let your team know it’s something you’re thinking about.” He also suggested compartmentalizing and/or reducing your digital footprint. For example, you could use Firefox for personal things and Chrome for work, add a VPN you trust, or shrink your footprint further by electing to leave your phone at home sometimes.
Varoon shared The Data Detox Kit from Tactical Tech, a user-friendly guide to taking control of your digital footprint in a way that aligns with your own values. He noted that it provides techniques, strategies, and suggestions for reducing or compartmentalizing your digital footprint.
Marie shared that DuckDuckGo is a great privacy-focused browser that offers a quick, easy fix, and added that VPNs are great for privacy. She suggested having a plan for kids, teaching them what to look for online, and being proactive in how you use online platforms. She also noted that phones and devices have added features to help you be more mindful, such as shutting off at a certain time.
Marie explained that data practitioners have to understand the process behind what they’re doing with the data. “You really have to have a framework, and if you’re looking to change something, a theory of change that you really rely on that’s clear and rigorous.” She noted that social media data comes from people and requires a ton of cleaning. For example, people might write “Hollyweird”, “La La Land”, or the “City of Angels”, all of which mean Los Angeles.
“You have to really understand the process of what you’re doing, what you’re looking to get out of it, what has to be cleaned up, where is the source coming from. Is the source true or authoritative? What is the level of truth? How do you define truth in your analysis? How is it effective?”
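As a toy illustration of the alias problem Marie describes, normalizing place names before analysis can start with something as simple as a lookup table; the entries and code below are illustrative, not her actual process:

```python
# Toy alias table mapping informal place names to one canonical form.
ALIASES = {
    "hollyweird": "Los Angeles",
    "la la land": "Los Angeles",
    "city of angels": "Los Angeles",
}

def normalize_place(mention: str) -> str:
    """Return the canonical place name for a mention, if it's a known alias."""
    return ALIASES.get(mention.strip().lower(), mention.strip())

posts = ["Hollyweird", "La La Land", "city of angels", "Chicago"]
print([normalize_place(p) for p in posts])
# ['Los Angeles', 'Los Angeles', 'Los Angeles', 'Chicago']
```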
Marie noted that the main part of “data science” is science, meaning “we need to look at the rigors and structure of data science, instead of making it a data playground.” She also urged everyone to be responsible for what they put out there, because it impacts other people’s lives. We have an amazing opportunity to shape the world, but we need to be thoughtful about what we’re doing and why, and diligent about the responsibility involved.
All in all, it was an evening full of insightful questions and engaging conversation. As Varoon and Marie both mentioned, this is a complex and challenging problem, and it will require collective awareness, engagement, and action. As a society we need to be literate and inquisitive, and we need to have serious and ongoing conversations around the impact and ramifications of various technologies. Only then can we understand the problems and take collective and corrective action.
Additional resources from the speakers include: The Ad-Hoc Group of Activists and Academics Convening a “Real Facebook Oversight Board” (The New Yorker), The Big Business of Ad Tech (The Privacy Issue), The Costs of Connection: How Data is Colonizing Human Life and Appropriating it for Capitalism (Nick Couldry and Ulises A. Mejias), and A Data Day, Data and Politics Project, and Your Data, Our Democracy (Tactical Tech).
And watch the YouTube video of the event!
About the author and participants
Meghan Wenzel is a user researcher, strategist and writer.
Host Data 4 Good is a nonprofit collective of community leaders and organizations striving to use data and technology to do better and give back to the greater community.
Panelist Marie Smith, Chief Information Officer at Data360, has been a founding contributor to nearly 50 media, tech and wellness companies and projects. She has served as a pro bono technology trainer for social and human justice nonprofits and government initiatives such as My Brother’s Keeper, StepUp, WiSTEM, and others.
Panelist Varoon Bashyakarla is Principal Data Scientist at the online radio company TuneIn. He led the Data & Politics project at the Tactical Technology Collective, a Berlin-based NGO that works at the intersection of activism and technology.
Moderator Jeanne Holm is the Deputy Mayor for Budget and Innovation of the City of Los Angeles, and has deep expertise in data science, knowledge management, and civic innovation from work with the White House, data.gov, the World Bank, NASA, Time’s Up, UCLA, and more.