Social Media.
If written today, what would DeLuca and Peeples say about Facebook? About fake news or disinformation? What about the thousands upon thousands of bots that patrol Twitter, populating threads with disinformation or with images from unrelated events, parading them as contextually accurate? I don't think DeLuca and Peeples were necessarily optimistic about these "new conditions for...participatory democracy," but they refer to technologies that many people don't participate in anymore, like watching television for the news or reading newspapers (126). I do, however, believe that DeLuca and Peeples lay the foundation for us to discuss this new public sphere. First, they warn that "we cannot simply adopt the term 'public sphere' and all it entails" because that form relies on speech and text, not screens. Little did they know how important "screens" would become 18 years later. They were optimistic, however, about the potential of these screens to influence grassroots movements. Movements like Black Lives Matter.
Some sources say that BLM is the biggest movement in American history. It's also a movement built online. So DeLuca and Peeples were correct in stating that this new arena of discourse gives those who are disenfranchised, voiceless, and marginalized a platform to participate in democracy. It's difficult to disagree with this statement. Hypermediacy allows us to consume information in a plethora of different ways and on different screens. This exposes us to the evil that corporations are hiding from us or to the murderers who put on a uniform and claim to protect us. As DeLuca and Peeples state, this sort of exposure can force big corporations into bending the knee or bring justice to Black folks murdered by the police. But there's another side to this coin, and it has the power to destroy democracy--if it hasn't already (sorry for being so cynical, Mike).
On any given day we might see a tweet that directs us to a trending topic. We then read from that source. If maximum virality is achieved, then we might see this same story mediated through other mediums (memes, images, gifs, TikTok videos, etc.). There is simply too much to consume on any given day. And most of it is meant to enrage us, nudging us to sharpen our pitchforks. Alas, we sit furiously behind our computer screens, tablets, or cellphones, tweeting and posting and sharing. DeLuca and Peeples sort of address this onslaught of information by saying it's simply impossible to keep up with all of it (135). To them, it isn't a bad thing--and is, perhaps, even a good one--that we choose what to consume and what not to consume.
But DeLuca and Peeples don't talk about algorithms. They don't talk about the rich white dudes in Silicon Valley forcibly colonizing the entire planet with their toxic social media platforms (see this article). How could they when Zuckerberg didn't start Facebook until 2004? Since then, many scholars have discussed this same dissolution of the public sphere. Instead of the media cherry-picking images of violence, it's artificial intelligence, and it knows all about you. It knows which gifs you'll watch or likely skip, it knows when you are near your crush, and it's fighting for your attention. Though this new frontier of public discourse is promising, allowing movements like BLM to take hold and gain traction, the flip side of the coin is much, much darker. On this side of the coin, truth is bent and shaped into a grotesque monster, and information, regardless of credibility, is used advantageously, often against the common good.
DeLuca and Peeples have many important and valuable things to say that we in 2020 should be paying attention to. But, like the rich white boys in Silicon Valley, they couldn't anticipate how fucked these screens and devices would make us.
Keith,
You bring up a lot of interesting points. I am actually anticipating writing my conference paper for our course on how TikTok has influenced the cultural discourse around the BLM movement. I think platforms like TikTok, Twitter, etc., are creating a really interesting space right now, as a lot of Gen Z are using these apps for news when they were designed for things of far less weight. What are your thoughts on this? Do you think this is helpful, or is it becoming a new threat to examine?
I think DeLuca and Peeples would explain our current trends as a double-edged sword. As you state, these platforms DO give a voice and a space for important civil rights discussions to take place. But they also give a space to groups like QAnon and other white supremacist organizations. For me, it comes down to moderation: how are we going to moderate these spaces to make them safe for everyone?
I think trying to moderate digital spaces is a lesson in futility. Big Tech would never go for it, and most Americans would see it as a restriction on their freedoms. Current research really only offers teaching media literacy as an alternative way to combat these issues. While I do think that giving individuals the tools they need to separate truth from falsehood could be effective, we have to realize it won't have an immediate effect, particularly because we cannot send older generations who are just beginning to engage with the digital world back to school. Though I am optimistically open to suggestions.
If you moderate with human beings, it's a big ask, sure. There's just too much work there. But Big Tech likes algorithms. They can surely create an algorithm sophisticated enough to moderate. I'll follow up and say that implementing this ethically would be challenging and expensive. A.I.-driven moderation AND literacy together would make an effective pair.
Keith,
You bring up a good point with algorithms. While social media can be a great tool for unifying individuals and giving them a voice, it must always be used with caution. I think too many people rely on social media without realizing how controlled what they are viewing is. They are unaware of the bias that algorithms create. It's honestly a little bit scary how much "ignorance is bliss" when it comes to social media, as people form opinions without seeing how much they've been manipulated. I'm not sure there's a way to solve this while the power of algorithms remains in unsafe hands.
I think Brynn is on it here. To Keith's point in the comment section about a double-edged sword, I think that duality is exactly what makes what Brynn is describing possible. Yes, social media has its evils, and if that were all it had, it would be easier to see action taken. But the other side is that it can do great things, or at least has the potential to, so that potential can always be held up as a shield whenever scrutiny is presented. Whatever impact we make will have to be gradual, but if we want it to be permanent, then maybe that makes some sense, however frustrating.