A Feminist Touch

Christine Meinders uses feminist theory to directly inform her body of artificial intelligence work. The result is a collection of projects, called Feminist.ai, that prioritize inclusivity, nature, culture, and the power of the collective.

Where did your inspiration for Feminist.ai come from?

I moved around a lot growing up, and in college, I studied international economics and clinical psychology. This exposed me to different communities and perspectives. Feminist.ai grew out of my thesis year at ArtCenter College of Design. I started looking at how AI could be used to predict patterns and information, and I noticed that the narrative was very male-dominated—the way people were talking about it felt so exclusive. I started looking for women in the space.

At a basic level, what does it mean to take a feminist approach to learning and thinking?

Everybody defines their own approach to feminism. One of the main takeaways is striving to achieve equity in systems that are, at the very core, structured inequitably. It’s more than just that, though. The relationship between human bodies and nature is very prevalent in feminist theory. It centers the environment, the human, and nature altogether in one system.

How can that closeness to nature then be used to inform AI design?

One of my favorite examples is the first project we did, Intelligent Protests. We were looking at the act of protesting trees being removed from the city of Alhambra. We created a virtual protest site, and whenever anybody logged in, their avatar would be present for 24 hours. The longer you’re logged in, the more interconnected your tree roots become with others. For the machine learning component, we trained an algorithm to use your own facial movements, inspired by the animal kingdom—raising eyebrows, puffing cheeks—to create very specific sounds.

And those sounds were drawn from nature itself. When we did the sound design, we recorded audio of activists watering the trees at the protest site. Using mics on the trees, we designed with the same materials we were designing for. Trees are quite prevalent in feminist approaches in literature and thinking. It’s this whole interconnected system.

Photo by Christine Meinders.
An example of a “posthuman plugin” prototype developed with the Feminist.ai community. The prototype moves beyond gendered approaches to voice design by incorporating natural sounds.

Much of the criticism around “sameness” in the AI field is a failure to consider diverse perspectives when creating algorithms. Can you tell me about a project you’re working on that specifically focuses on inclusivity?

Our Contextual Normalcy initiative reimagines mental health from different perspectives around the world. We’re using artificial intelligence and crowdsourced data to create alternative versions of “normalcy,” and thinking about how we then use this data to recreate traditional mental health frameworks. We’ll then try to build our own systems for classifying, diagnosing, and treating mental illness.

What kind of data are you collecting?

We’ve created donation boxes where different people all over the world can donate their data. Right now, we ask them four questions. For example, what do they do when they feel joy? People respond, and we look at localized language. One of the preliminary findings is that some people view joy as very active. Others see it as inward and quiet. So how do we think about these ideas from different perspectives?

Who’s someone who inspires you as it relates to your work?

Dr. Safiya Umoja Noble had been talking about fairness in AI before people really became aware of algorithmic bias. I love her work. Many of our projects are specifically responding to her book Algorithms of Oppression. It was inspired by a moment when her stepdaughter was looking for something to do, and she typed “black girls” into a search engine. All these things populated, and none of them were positive. Dr. Noble found this quite shocking and started to dive into why this happened. I consider her one of the leading voices on algorithmic bias. I defer to work done by individuals who have that lived experience.

You lead a feminist organization, but your work often focuses on co-creation with men. How do you approach allyship?

Feminism is for everyone. That’s how we define it. Some of our biggest advocates have been men; we’re not trying to demonize them in any way. But we need to make sure not to drown out the voices that have never had a voice to begin with. Having male-identified individuals help us is very important, as long as it’s done in such a way that doesn’t remove power from other individuals. I would never, ever turn anyone away for wanting to learn about AI. It’s just such an exciting place to be.