Safiya Noble [ Machine Learning 16:48 ] One of my earliest memories is being about two or three years old, putting my hand on top of my mother’s hand, and being really sad – not understanding why they didn’t match.
You see, my Mom was white and, as you can see, I am not.
This was a real source of tension for me – and confusion.
I couldn’t make sense of what it meant that we didn’t match. I was born at a time when a cultural revolution was sweeping across the United States. In 1970, the civil rights movement was in full swing, the Black Power movement was creating incredible change in the way that African Americans could see themselves, and my mom was very clear that she had to curate – make sense of, provide information about – what it meant to me that we didn’t match.
Much to my chagrin, she provided a lot of African American dolls and books and information for me and sometimes I didn’t want it because I wanted the white Barbie – like everybody else.
It’s not fair.
My mom was aware of studies that had been done by Kenneth and Mamie Clark.
The Clarks, in the 1940s, conducted a series of experiments with African American children where they gave them two dolls.
The first doll they gave them was white with blonde hair and the second doll they gave them had brown skin and black hair. And they asked African American children a series of questions about which doll was nicer, which one was prettier, which was the better doll?
Which one do you like?
Every time, black children chose the white dolls.
So my Mom was aware of this and she knew that if she didn’t do something to intervene on all of the other messages – the stereotypes, the negative ideas about African Americans in the United States – that I might be one of those children.
That experience of growing up was the inspiration for the research that I currently do.
I research the internet.
I look at the values, the ethics, the influences that impact the way that we design technologies.
I don’t actually design technology myself, I’m a social scientist. I study them. I study the technologies that we inherit and I try to collaborate with people who want to make a difference in improving them.
Let me tell you a little bit about what I found when I started my research in 2009. I started collecting searches in major commercial search engines and in 2011 this was the first hit for black girls.
You got a chance to see that first slide, and I was very concerned about it, so I contacted women’s magazines and tried to convince them that this was something we should care about.
The first taker was Bitch Magazine.
This wasn’t easy to convince them of because, you know, we’re so used to seeing women pornified and hypersexualized on the web that they weren’t sure this was really story-worthy.
So, after about the fifth round of emails, I said: I’d like you to do a search for “women’s magazines” and tell me if you find Bitch Magazine in the first five pages.
The first five pages is important.
Actually, the first page is the most important.
Because the majority of people who use commercial search engines do not go beyond the first page.
In fact, Pew Research did an amazing study on search engine use, and they found that the majority of people who use search engines believe the information they find is reliable and credible.
People have incredible trust in search engines.
And listen, I’ll tell you: if I’m looking for the best latte in town, I’m starting with Google, because I know millions of people are sharing information about where the best latte in town is.
I use that all the time when I travel.
But what happens when we start looking for other things?
Once Bitch Magazine did that search – and they couldn’t find their feminist magazine linked to women’s magazines – I got their attention.
What does it mean that feminism as a concept has been divorced from women?
If you don’t know what feminism is, or you think that the only thing that matters about feminism is whether Beyonce is one, that’s a problem.
I first wrote about the impact of search engine results and how they misrepresent women and girls in 2011, and shortly thereafter – I’d say about three months later – the algorithm changed for black girls.
Black girls and pornography are no longer the first page of information in Google.
Not so for Asian girls. Not so for Latinas.
So we have work to do.
Search engines are amazing tools for us but what we don’t see are the ethical dilemmas – the biases – that are inherently built into them.
The ways in which money influences the kinds of results that we get.
It’s not just about popularity.
It’s about credibility too.
So what happens when we look for more specific information?
Information that can help us – tell us something about other people?
When I do searches on “why are X types of people…”, these are the kinds of results that I get.
Why are black people so loud, athletic, lazy?
We don’t see words like intelligent, prolific, amazing.
A girlfriend of mine – Diane, about a year ago, posted to Facebook and she said, “What’s happened? Did Hitler just take over Google?”
She was doing a search for something beautiful –because she wanted some scenes of nature for her desktop wallpaper.
When she did a search for beauty this is what she got.
This is what’s beautiful.
In fact, when I do a search for “beautiful,” I expect to see a beach in paradise, because that’s what’s beautiful to me.
Part of what this tells us is that, no matter how we are trying, offline, to make sense of our values and redefine beauty standards, they get replicated – instantiated, in fact – in some of our most important technologies.
Black girls are still not off the hook in these environments; we still have work to do.
I have six nieces and a bonus daughter, and I’m always thinking about what they find when they’re looking for other black girls online.
The text and the websites have changed, but when I wonder what they’re thinking when they look for black girls and images of black girls, these are the kinds of things they see.
I would expect that the two most popular black girls in the United States – Sasha and Malia Obama – might show up above the fold on the first page of results for black girls.
If you talked to my nieces they would really expect Raven Symone to be there.
They were just really disgusted that she wasn’t there.
“Oh, snap” has been heard way too much in our houses.
These are some of the things that I care about.
All right, so now I’m trying to imagine something different in my work.
Instead of being bombarded by problematic ideas and images, what if we imagined search differently?
What if, instead of starting in the red-light district when we’re looking for black girls or Asian girls or Latinas, we had to actually go to the web’s red-light district?
We could move our search box there, if that’s what we’re looking for.
You see, part of the trouble is not necessarily that you get pornography back; it’s that you don’t have to use the word “porn” or the word “sex” with “Latinas” or “black girls” or “Asian girls” to get the porn.
So I try to imagine a different kind of way of looking at search.
Now listen. I said it before: I’m not a computer scientist, I’m a social scientist. So, those of you out there who are computer scientists – we should be working together.
There are so many amazing web scholars – not just me – who are thinking about the implications of these biases that are not visible, these practices that we can’t see.
In my imagined engine, you actually have to choose your biases.
They’re visible to us.
So listen: if you want racism, check a box. You want sexism? Check the box.
But you have to opt in for that rather than that being the default.
That’s a different way of imagining looking for information – in many ways, a much more powerful way.
I think about some of my students right now – my transgender and LGBTQ students – looking for information, going to what seems like a blank page: a value-free, neutral search engine with no biases that we can see, just white space. Looking for information about themselves last week, they were bombarded with vitriol from the Westboro Baptist Church.
If you want that, check your box and then you can get it – rather than be assaulted by it.
There are a lot of hidden processes in our technologies.
I guess you could say that I am in pursuit of socially responsible technologies –a way to be socially responsible in how we find information.
These other kinds of processes that are hidden from us, that we don’t think about, are processes that are also affecting women of African descent around the world.
For example, women and children in the Democratic Republic of Congo – and coltan.
Coltan is essential to making microprocessor chips.
Every electronic device that we use – probably even the technology holding up our conference right now – contains coltan.
Simultaneously, the mineral wars happening in the Democratic Republic of Congo are significantly contributing to the Congo being the worst site of sexual violence in the world, according to the United Nations.
It’s hidden from my view. We don’t see it.
We have to make these processes visible.
For those of you in this room who are engineers: how could we design microprocessor chips that wouldn’t be so dependent upon these kinds of exploitative, incredibly oppressive conditions?
And you know – I know – everybody wants to upgrade to that next iPhone 12.
I don’t know what’s coming right?
It’s coming, we know it’s coming
Well, what happens to the five, six, seven, eight, nine, ten, and eleven? Where do they go?
For us, they might just go into a recycling container – but then that container gets put with a lot of other containers, and then it’s on a ship going to the coast of West Africa, where giant e-waste cities are emerging.
Where, again, black women and children, and boys and men with small hands, are taking apart all of our devices under the most extremely toxic and dangerous conditions.
I believe we should be in pursuit of something different.
We made these technologies; we make them better. Every time we use them, we make them smarter. But in the process of relying on things like algorithms to tell us the answers, we should actually be the ones deciding.
My mom was curating information for me when I was a little girl, knowing that that curation would have an effect on me in the future.
How will we curate, manage code and tag our identities?
What will the legacy be –that we leave –about who we are?
How will other people know what we were committed to?
They’ll know by the traces that we leave – by the symbols, the signs, the signals. They’ll know that at one point in time – in 2011, like I did – pornography was actually the highest value we attributed to black girls, and somebody had the power to influence that.
And it wasn’t black girls.
So I ask that we be in pursuit of a different conversation.
That we make visible the ethics that undergird the choices that we’re making.
FEATURED IMAGE CREDIT: Corine Bliek