Degradation, Legitimacy Threats, and Isolation
New research on census, youth, mental health; a recent talk and an upcoming one
tl;dr:
1. New paper with Janet Vertesi: "The Resource Bind: System Failure and Legitimacy Threats in Sociotechnical Organizations"
2. "Techno-legal Solutionism: Regulating Children's Online Safety in the United States" (my paper with María Angel) was officially published as part of the ACM CS+Law symposium.
3. Crisis Text Line's report on what youth need to be more resilient is haunting but important
4. Watch Tressie McMillan Cottom, Janet Vertesi, and me riff on tech & society issues
5. Come hear me speak in DC on April 10!
We’ve come a long way to get back to trodden terrain…
Ten years ago - on March 17, 2014 to be precise - Data & Society hosted its first event: The Social, Cultural & Ethical Dimensions of "Big Data." At that event, we brought together people from academia, civil society, government, industry, and beyond to start grappling with emergent challenges in a data-soaked world. It's utterly surreal to realize that was 10 years ago. I went back and read the primers we created for that event and just smiled. The debates we elevated then are still with us today. I also can't thank enough all of those who helped make that event possible - in effect, they helped write Data & Society into being.
(Side note: Data & Society is going to have many 10-year celebrations this year. Make sure to stay tuned to everything folks there are planning. And if you have the means to donate, that would be mighty nice. I continue to be in awe of all that D&S is doing!)
I've been thinking a lot about how far we've come in those ten years - and how many steps backwards we've also taken. On one hand, folks are much more willing to see the complexities and nuances of technology's interactions with society. On the other, the techlash tends to be just as deterministic as the tech sector itself. And then there's the tendency for policymakers to engage in techno-legal solutionism, which just makes me want to facepalm. (Congratulations to my co-author María Angel for an awesome presentation of our paper at the ACM CS+Law Conference last week!)
More and more, what's going through my mind these days has to do with degradation. What happens when sociotechnical systems - and the organizational arrangements that rely on them - start to crumble? Not simply break or meltdown in a fatal sense. But, rather, just become shittier and shittier. Cory Doctorow has a term to describe this phenomenon in the context of technology platforms: enshittification (which, you have to admit, is just a damn good term). But the degradation and shittiness go so far beyond platforms. For example, so many workers' lives are becoming so much crappier. And this isn't simply a story of AI. It's a story of greed and oppression. Technology and law are just the tools that aid and abet this configuration.
What's worse is that degradation is sometimes the goal. Janet Vertesi and I just published a comparative ethnography paper this week in a fabulous Sociologica special issue on failure. Throughout the organizational sociology literature, there are case studies of how technical failures lead to legitimacy crises. And that's for sure true. But in examining how resources (e.g., time and money) are constrained in public-sector organizations like NASA and the Census Bureau, we noticed something else going on. We started to see how a resource bind can be manufactured to trigger a legitimacy crisis, one that can push sociotechnical projects to the brink of survival. To get at this, we examined how money was contorted inside NASA alongside the political dramas of manipulating time during the 2020 census. So check out our paper: "The Resource Bind: System Failure and Legitimacy Threats in Sociotechnical Organizations."
(Also, if you're reading this and you don't know who Janet Vertesi is, you should. In addition to being an amazing ethnographer of NASA, she's constantly engaging in opt-out experiments, which are kinda like breaching experiments to protect privacy in a surveillance society. Hell, you should see the lengths she went to to evade Disney's data collection regime. And yes, I was the friend who was convinced she'd hate Disney. Challenge accepted, right?)
Of course, it's not just sociotechnical systems that are degrading. So too is our collective social fabric. And, with it, the mental health of young people. Last month, Crisis Text Line published some of its latest data about depression and suicide alongside what CTL is hearing from young people about what they need to thrive. (Hint: banning technology is not their top priority.) Young people are literally dying due to a lack of opportunities for social connection. This should break your heart. Teens are feeling isolated and alone. (My research consistently showed that this is why they turn to technology in the first place.) It's also scary to see the lack of access to community resources. Communities are degrading. And there's no quick technical fix.
These issues were all on my mind when Tressie McMillan Cottom, Janet Vertesi, and I sat down for a "fireside chat" at the Knight Foundation's Informed conference. We kinda evaded the instructions we were given and, instead, decided to draw on the collective knowledge of our disciplines to offer theoretical insights that can help people think more holistically about tech and society. Along the way, we talked about how systems are degrading, how technical fixes can be harmful, and how we owe it to the future to address social issues in a more ecological fashion.
If you happen to be in DC on Wednesday, April 10th, I will be offering up a new lecture that connects some of these issues to the public conversations we're having about AI. This will be part of Georgetown's Tech and Society week. (I'm giving the distinguished lecture; details are forthcoming on the schedule.) I hope you can join me there!
always love yer emails... :)