The conference shed light on how these issues have been allowed to proliferate within the environment of major social media platforms such as Facebook/Meta, X/Twitter, Snapchat, Reddit, and others - often circumventing existing legal frameworks. The insights I gained have been invaluable in grasping the complex interplay between technology, regulation, and societal impact in today's digital landscape.
The following article compiles and expands upon the notes that I took during the conference, including useful resource links and references to thought leaders in the space of information technology, some of whom were in attendance and others who have notably published their opinions in the press.
Reclaiming the Digital Commons: Confronting the Human Cost of the Unregulated and Unchallenged "Technopoly"
"Technology is not separate from society; it reflects and reproduces existing power relations. It's a tool that can either entrench injustice or help dismantle it." — Ruha Benjamin - Sociologist and author of Race After Technology
In the digital age, the term "data privacy" often fails to capture the profound human consequences of unregulated technology.
David Jurgens of the University of Michigan emphasizes that framing these issues as "advanced surveillance profiling," which adversely affects consumers and inflates the cost of goods and services, resonates more deeply with the public.
This perspective shifts the narrative from abstract concerns to tangible impacts on individuals' lives.
As misinformation proliferates across online platforms, finding a bipartisan solution to digital harm is no longer optional; it's essential. At the heart of this challenge lies tech reform, and a key focal point is the increasingly outdated language of Section 230 of the Communications Decency Act.
Section 230, originally enacted in 1996 to balance free speech with minimal liability for platforms, now stands as a legal shield that enables tech companies to evade accountability, particularly regarding the harm inflicted on youth and vulnerable users through cyberbullying, harassment, and the spread of disinformation.
"Search algorithms are not neutral; they are embedded with systemic bias that reinforces existing social inequalities." —
Safiya Umoja Noble - Author of Algorithms of Oppression (UCLA
Professor and researcher on bias in algorithms)
Critics rightly warn of the dangers of censorship and the slippery slope of infringing upon free expression. Yet the urgency of revisiting and revising Section 230 is driven by the growing number of victims, across all ages, genders, and political identities, who have suffered due to online abuse and manipulation. The very fabric of democracy is fraying under the weight of unregulated digital influence, and this conference was focused on addressing these crises head-on.
Any serious response must confront the immense entanglement of these Technopolies within American industry and politics. Major tech firms have embedded themselves deeply into our institutions through funding, lobbying, and influence, making meaningful reform a daunting but necessary task.
Meanwhile, under the current U.S. administration, essential government bodies like the Cybersecurity and Infrastructure Security Agency (CISA) and portions of the FBI's cyber divisions are facing cuts or restructuring. This raises a critical question: How do we resist the erosion of regulatory power in a time of digital vulnerability?
Among the many powerful voices present at The Future of Social Media conference at Notre Dame was Kristin Bride, a mother turned tireless advocate after the heartbreaking loss of her son, Carson, to suicide as a result of relentless cyberbullying. In the face of unimaginable grief, Kristin chose not to remain silent.
Instead, she rose to the challenge, founding Carson's Fund to raise awareness, demand accountability, and push for tangible policy changes around the prevention of online harm and abuse. Her presence at the conference was both emotional and galvanizing, reminding professionals working at the intersection of information and technology that behind every data point or "user" are real lives impacted by the dark underbelly of digital spaces.
The just-world fallacy, the belief that people get what they deserve, often leads to misplaced blame in cases of abuse, as seen in the tragic story of Kristin Bride and her son Carson. After Carson's death due to relentless cyberbullying, Kristin herself became a target of online harassment, with strangers suggesting she was negligent or somehow responsible. This cruel tendency to assign blame rather than confront systemic issues reveals a deeper danger in how we approach online privacy. Many assume that only "bad people" need to worry about the misuse of their data, but as Kristin's experience shows, anyone can become a target. The idea that privacy is only essential for those with something to hide ignores how easily victimization, judgment, and exploitation can affect even the most well-intentioned and vulnerable individuals.
Kristin's testimony, delivered alongside former Congressman Dick Gephardt, warned of the growing threat of what many are calling a "Technopoly" - a term highlighting how tech companies portray themselves as both untouchable innovators and benevolent hosts of digital discourse, while simultaneously evading accountability for the harm proliferating on their platforms. From her personal story to her national advocacy work, Kristin Bride stands as a critical voice for justice and reform, urging lawmakers, tech leaders, and civil society to stop viewing online harm as an inevitable side effect and instead treat it as the urgent crisis that it is.
…