Gender, Power and AI: Wrestling for the soul of the network, again

Stanford’s Clayman Institute ran a virtual panel this morning called “Gender, Power, and Artificial Intelligence,” with Safiya Noble (UCLA), Catherine D’Ignazio (MIT), Angèle Christin (Stanford), and moderator Genevieve Smith, a Clayman Institute Postdoctoral Fellow. The panel applied principles from feminist tech studies to the current moment: how gender norms get encoded in data and reproduced by AI systems, and whether the technology has any real capacity for equitable design and implementation at scale.

Noble’s argument throughout was that the governance conversation has gotten too high-level and universalizing while the actual outputs of these systems have profound day-to-day consequences for specific people right now. She named the role of AI in the recent gerrymandering of Louisiana and Indiana as examples, and called for tripling down on long-term social science research into AI’s impacts. She also pointed out that philanthropy is retreating from feminist academic and organizational work, because that work comes out of the same critical tradition that scrutinizes philanthropy itself, precisely when this research is most needed. A lot of money is moving in AI, and very little of it is funding the people best positioned to study how it affects everyone downstream.

D’Ignazio was asked directly whether feminist generative AI at scale is possible. Her answer was no, with caveats, given who owns the technology today and the profit motives driving it. She suggested it is more important to consider how we organize around our relationship to technology: questions of profit and ownership, policy and decision-making, and data and tech governance.

She walked us through a reasonable use case from her Data + Feminism Lab, documented at length in her recent book “Counting Feminicide: Data Feminism in Action.” Her team partnered with activists who scour news reports to document the gender-related killing of women and girls, including cisgender and transgender women. The lab built a very lightweight AI-based approach that streamlines the scanning and flagging of news stories as possible cases for the project, setting the activists up to supercharge their work (note: very similar to how the NYT uses AI to analyze data for reporting). In this example, the AI’s job is task-scoped, democratically co-determined with the people who use it, and small. Smith picked this up: baked into the current LLM moment is the idea that AI must scale to be marketable, and the alternative is purpose-built models right-sized against a body of work.

Christin spoke at length about embodiment as one of the primary concerns of feminist theory: AI perpetuates the “disembodied” illusion of technology, and that dynamic shows up in everything from marketing to UX to user comprehension. This spoke to my thoughts on how the single-interface design of LLM chat reproduces Haraway’s “god trick”: knowledge that presents as universal while concealing the specific, situated position it comes from.

The parallel I kept returning to, listening to this, is one I think about often with my own cohort of early bloggers, women who grew up alongside the rise of the internet — and then the rise of ad tech. The internet of the late 1990s and early 2000s was being shaped by several camps: writers, students, information architects, and user-centric researchers who saw it as an information access network and a space of possibility; entrepreneurs and opportunists who saw it as a channel for marketing, monetization and extraction; and a smaller boycott camp that wanted to limit and refuse the whole personal computing and digital revolution altogether.

It was generally considered weird to be a girl on a computer or a woman on the internet — so weird that many of our peers didn’t recognize us at all — and we were there anyway, making stuff, witnessing, learning, advocating, producing, influencing. So when I watch some of my old peers, many of whom are professional writers and academics today, treat LLMs as a question of refusal rather than a condition to engage with critically, I worry we are abdicating a responsibility at precisely the moment when our technical and rhetorical expertise applies. Their refusal has good logic: user-centric researchers and communities engaged extensively with the early internet and the extractive camp won anyway, so why expect a different outcome here?

But Noble’s work on algorithmic bias attributes that failure not to engagement, but to the institutional and financial disadvantages that user-centric approaches operated under relative to gargantuan commercial interests. David and Goliath. That gap does not close through abstention. Understanding the trade-offs around tech, and producing knowledge and analysis that does not depend on investors and marketers to frame the platform and the questions, requires presence. Refusal cedes so much ground.

Overall, the recommendations from the panel were practical. Noble called for people with capital (and the political will to spend it) to consider how to put money toward socially responsible research and development. D’Ignazio called for alternative funding infrastructure outside of venture capital logic, and pointed at European digital sovereignty models as worth considering here. She also gestured at the popular AI Skeptics reading group as one current example of mad-and-commiserating-as-organizing that is creating safe psychological space for people to talk about AI and its tradeoffs. Christin’s recommendation was community organizing, on the grounds that LLMs are unpopular with a lot of people who feel there is no space to say so, and that finding those spaces is itself worthwhile because it provides shared language and awareness of others’ knowledge and experiences.

Personally, it was refreshing to hear reflections on the work (and the feelings) of being inside institutions that are being reshaped by AI, and being responsible for some of how that reshaping gets communicated and absorbed. I’m thinking about the incredible value of interdisciplinary governance, and how the commitment to governance is a specific position, and all the margins to consider.

Further reading:

Catherine D’Ignazio and Lauren Klein, Data Feminism. The foundational text on applying intersectional feminist thinking to data science practice.

Catherine D’Ignazio, Counting Feminicide: Data Feminism in Action. Extended case study of the grassroots data activism project D’Ignazio described on the panel.

D’Ignazio et al., “Feminicide and Counterdata Production.” Research paper on the counterdata methodology behind the femicide tracking project.

D’Ignazio et al., “Data Feminism for AI.” Conference paper extending the data feminism framework to questions specific to AI systems.

Safiya Noble, Algorithms of Oppression. Noble’s study of how commercial search engines reinforce racism and sexism through their ranking systems.

Donna Haraway, “Situated Knowledges: The Science Question in Feminism and the Privilege of Partial Perspective” (1988). The original essay where Haraway introduces the god trick and the case for situated, embodied knowledge against the view from nowhere.

How early blogging tools like Movable Type made reverse-chronological posting the path of least resistance, killing the curated, librarian-style personal homepage and locking the whole web into the chronostream.

More on personal websites as digital gardens.

Did poptimism kill rock n’ roll?

“Two things can be independently true about social media. First, there is no evidence that using these platforms is rewiring children’s brains or driving an epidemic of mental illness. Second, considerable reforms to these platforms are required, given how much time young people spend on them.”

For those who love to post, a guide to the Internet of the future.

Garbage Day on whether Bluesky was a net negative for left politics. I’d argue the most important thing Bluesky did/does is technical, by producing the AT protocol that underlies it, thereby building out the federated web. Distribution channels come and go, so consider posting at your own domain.

A look at the national decline in student test scores. It’s important to consider this alongside global trends in education as well.

Some contemporary thoughts from Ana Rodrigues on the perennial question of being a “woman in tech.”

Added a Wayback link preserver plugin to the site. This lightweight code detects broken external links on the blog and shows clickable Wayback Machine fallbacks so readers can still access the content. Handy on the indie web, where links are fragile.
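A minimal sketch of the pattern, assuming the Internet Archive’s public Availability API and Node 18+ for `fetch` (the helper names here are my own, not the plugin’s actual code):

```javascript
// Sketch of a Wayback fallback helper (assumed names; not the plugin's code).

// Pure helper: the Wayback Machine resolves this URL form to the nearest snapshot.
function waybackUrl(url) {
  return `https://web.archive.org/web/${url}`;
}

// Confirm an archived snapshot exists via the Internet Archive's Availability API.
// Works in Node 18+ (global fetch) or the browser.
async function findSnapshot(url) {
  const api = `https://archive.org/wayback/available?url=${encodeURIComponent(url)}`;
  const res = await fetch(api);
  const data = await res.json();
  const snap = data.archived_snapshots && data.archived_snapshots.closest;
  return snap && snap.available ? snap.url : null;
}
```

In the browser, a plugin like this would walk `document.querySelectorAll('a[href^="http"]')` and attach the fallback next to links that fail a liveness check; in practice CORS usually forces that check to happen server-side or at build time.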

Finished reading: The Bluest Eye by Toni Morrison 📚

Reflections on teaching fiction writing in the age of AI, from a professor with ten years of classroom experience teaching writing at MIT.

Read this alongside The Bluest Eye. Finished reading: On Morrison by Namwali Serpell 📚

Reading Timothy Chester’s reflections on last week’s Canvas incident. As institutions increasingly outsource the technical load to cloud service teams, IT incident response becomes about communication.

Why I'm broadly skeptical of device bans in K-12

Watching a local phone ban policy snake through the community this week. Like others have already said, my concern is mostly enforcement, which we know lands unevenly across school populations, often at the expense of Black students. But also, my experience is that the local school system does not communicate in a way that supports the two-way communication students and parents want, especially given the logistics of kids who need transportation and after-school arrangements.

When I was young, there was a pay phone on every corner and a central landline in every home. We don’t live in that world anymore – in our world, phones and other personal devices are part of our daily processes for school, work and family logistics and communication with friends, family and the broader world. Same for kids with devices.

None of this means screens are neutral or that Jon Haidt is entirely wrong about attention and comparison dynamics. But Haidt et al. diagnose a genuine social problem, locate the cause in technology design, and then arrive at solutions driven by individual consumer behavior. Ultimately he does not call for taxing and regulating algorithmic platforms, regulating algorithmic amplification of distress or of marketing, reducing economic precarity, or doing anything about tech-linked climate impacts. All of the behavioral changes he recommends (device bans, Faraday bags) set up fights between kids and adults over phone access at the individual level, fights whose consequences generally land harder on the same students enforcement already burdens.

While this issue roils locally, my kiddo’s locker was broken for two months this year, and while waiting for the repair, she got dinged for having her device in her pocket in class when the locker wasn’t a secure option. She had shoes stolen from the broken locker in the meantime, which rather proves the point.

I spoke with my kiddo at length to get her thoughts. Her takeaway as an 8th grader is that kids have second and third secret devices that they hide from parents and teachers already – often, mom and dad’s old devices slipped from a junk drawer and connected to wi-fi. She suggested we adults don’t fully appreciate the kids’ ingenuity around their devices, and how they view their phones and tablets as the means to get and stay connected with one another.

While talking, I was reminded of the dance between students and the school system’s IT department during the COVID-19 shutdown. In our community, the kids were in remote learning for a full year and a half, and the IT department chased them around their approved digital tools like a game of whack-a-mole, shutting down access to chat and collaboration. In the meantime, almost no socialization happened between students that wasn’t directly observed by teachers, on camera. By 2021, the kids were engaged in secret digital note-passing, chatting inside Google Docs and slide decks to avoid teacher surveillance. Where there is a will, there is a way.

It’s like the phrase “turtles all the way down,” except the turtles are marketing.

A new group is attempting to map influence in the AI industry, with the goal to “produce a structured, shareable, and dynamic resource that identifies who is working on what, where the gaps are, and which partnerships might form across ideological and organizational lines.”

A new study suggests that people who use AI for writing are better at detecting AI writing than automated scanner tools are. My current LLM pet peeve is how they use language like “load-bearing,” “structural,” and “legible” to describe most ideas.

A quarter of the way in and already exhausted/confounded. Currently reading: Adult Braces by Lindy West 📚

“Amazon has launched a new feature that uses AI to generate a short, podcast-like audio segment where two AI ‘hosts’ discuss the merits and reviews of a specific product.”