We’re Coming Along, Baby — Revisiting Gender Bias and AI

In 2011, Watson, Ken Jennings and Jeopardy were in the spotlight as the IBM supercomputer took on the show’s winningest contestant — and beat him. It was the public’s first real introduction to artificial intelligence outside of movies. Fast-forward a few years and AI was everywhere, although mostly of interest to tech workers (and writers). Jeopardy made headlines again following the death of Alex Trebek (RIP Alex) and the ensuing shenanigans surrounding hosting duties. Finally, in July 2022, Ken Jennings was announced as permanent host, sharing duties with Mayim Bialik, who became the first full-time female host of Jeopardy. In November 2022, AI jumped back into the spotlight with the release of ChatGPT, the OpenAI chatbot that spawned a thousand social media posts (“I asked ChatGPT to write a sonnet about butter in the voice of Snoop Dogg and look what it did!”).

Somewhere in between “Watson beats human” and “all writers are losing their jobs to robots” I got interested in AI and spent some time talking to the folks at IBM and others who were using the technology. Then I attended a Women in Technology event, got an earful on gender bias in AI, and, both intrigued and annoyed, wrote an article titled “We Haven’t Come a Long Way Baby: How Gender Bias Is Rampant in AI.” That was 2019. Two years later, in 2021, Katie Couric became the first female guest host of Jeopardy in the show’s long history, on International Women’s Day, no less.

And that’s where all of this comes together. March 8 is International Women’s Day, a day which, like most Things Specifically for Women, I hate and appreciate. I hate it because when is International Men’s Day? I appreciate it because Jeopardy was on the air for some 50 years in various incarnations before a woman hosted and we should not even have made it into the 90s, never mind the 21st century, with that record intact.

The thing that caught my attention during the 2019 Women in Tech event (presented by Dr. Judith Spitz, an incredible advocate for women in technology) was the discussion of a research paper titled “Man is to Computer Programmer as Woman is to Homemaker? Debiasing Word Embeddings.” Just like it sounds, it examined how preexisting gender bias could impact machine learning. The problem with AI, after all, is that it can only work with what it’s given. Hundreds of years of gender bias continue to make their way into machine-learning algorithms. So what does this mean for ChatGPT?
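The mechanism that paper describes can be sketched with a toy example. In a word-embedding model, analogies like “man is to programmer as woman is to ___” are solved with simple vector arithmetic, so any gender skew the training text bakes into occupation words falls straight out of the math. The vectors below are hand-made illustrations, not values from any real embedding model; they are chosen only to show how the arithmetic surfaces the bias the researchers found:

```python
import numpy as np

# Toy 2-D "word vectors": dimension 0 stands in for a learned gender
# direction, dimension 1 for a generic occupation direction. These are
# hypothetical illustration values, not output from a real model.
vectors = {
    "man":        np.array([ 1.0, 0.0]),
    "woman":      np.array([-1.0, 0.0]),
    "programmer": np.array([ 0.9, 1.0]),  # skewed toward "man" by biased text
    "homemaker":  np.array([-0.9, 1.0]),  # skewed toward "woman"
    "doctor":     np.array([ 0.8, 1.0]),
}

def analogy(a, b, c):
    """Solve 'a is to b as c is to ?' via the nearest cosine neighbor
    of the vector b - a + c, excluding the three input words (as the
    standard word2vec analogy task does)."""
    query = vectors[b] - vectors[a] + vectors[c]
    best, best_sim = None, -2.0
    for word, vec in vectors.items():
        if word in (a, b, c):
            continue
        sim = np.dot(query, vec) / (np.linalg.norm(query) * np.linalg.norm(vec))
        if sim > best_sim:
            best, best_sim = word, sim
    return best

# With this deliberately skewed geometry, the arithmetic reproduces the
# paper's headline analogy:
print(analogy("man", "programmer", "woman"))  # -> homemaker
```

Nothing in the code “knows” anything about gender; the biased answer is purely a consequence of where the training data placed the occupation words, which is exactly the problem the debiasing work set out to fix.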

For one thing, ChatGPT’s training data notoriously extends only through 2021, which isn’t a huge gap, but in an era where the journey to equality zigs and zags often, it can amount to a lot. ChatGPT doesn’t even know Jeopardy finally has a full-time female host. (It did, however, inform me that “The first female host of Jeopardy! was actually Alex Trebek’s wife, Elaine Trebek Kares. She hosted the show for a week in 1984 while Alex was recovering from surgery.” My Google search didn’t come up with this and in fact, I cannot find anything on the internet that corroborates it, but this isn’t a piece on ChatGPT’s accuracy, so I’ll just leave this here for now.)

Since I was doing firsthand research, however, I figured I’d ask ChatGPT a couple of questions myself to see how it responded to what was basically a trap to get it to reveal gender bias: “If man = doctor, woman = ?” Here’s its response:

The word “woman” does not have a direct equivalent to “doctor” like “man” does. However, there are many professions that women can hold, just as men can hold the profession of a doctor. So, a woman could also be a doctor.

Alternatively, if you’re looking for a female equivalent of “doctor”, you could use “doctor” as a gender-neutral term, and refer to a female doctor as simply a “doctor” as well. However, if you are specifically looking for a gendered term, you could use “female doctor” or “woman doctor”.

That first sentence — “the word ‘woman’ doesn’t have a direct equivalent to ‘doctor’ like ‘man’ does” — is definitely a little sketchy. However, the rest is more open to interpretation. Say what you will about ChatGPT, but it certainly has learned to equivocate. I asked it essentially the same question again, but instead of “doctor” I used “computer programmer” (which was the profession in the title of that 2016 paper). Its response?

Similar to the previous question, the word “woman” does not have a direct equivalent to “computer programmer” like “man” does. However, just like men, women can also be computer programmers.

Therefore, a woman could also be a computer programmer, and there is no need to use a different term to describe a woman who works in this profession. Alternatively, if you want to specify the gender, you could use the term “female computer programmer” or “woman computer programmer”. However, it’s important to note that using gendered terms to describe professions can perpetuate gender stereotypes and limit opportunities for individuals of all genders.

I suspect ChatGPT was on to me at this point.

Now is the point in an article where I’d normally begin to explain my extensive background research, citing several sources about the training methods and data sources OpenAI uses. However, since this is the age of AI, I figured I’d literally go right to the source and ask ChatGPT, “Are you taught to avoid gender bias?”

Yes, as an AI language model, I have been trained to avoid gender bias in my responses. I have been designed to provide unbiased and inclusive language to the best of my ability. My creators have taken measures to ensure that I do not perpetuate gender stereotypes or use language that might be considered insensitive or discriminatory towards any gender. However, as an AI model, I am constantly learning and improving, so if you notice any gender bias in my responses, please let me know so that I can continue to improve.

Does this mean that everything is going to be fine and we’ll all be equal and we can go ahead and cancel International Women’s Day? Sadly, no. What it does mean is that studies like the ones done in 2016 and beyond have informed the way AI is trained going forward. Just like humans, machine learning has a lot of years of prejudice and inequality to overcome, and it’s only as good as the human material training it. So if we must do this Women’s Day thing, let’s do it right. Go forth and be equal, everyone.

Amy Weiss is BPO Media and Research’s editorial director. As a writer and editor, she has specialized in the office technology industry for more than 20 years, focusing on areas including print and imaging hardware and supplies, workflow automation, software, digital transformation, document management and cybersecurity.