We’re likely familiar with person-to-person sexism. Whether it takes the form of inappropriate comments about a woman’s appearance or workplace sexual harassment, it’s clear that women are often disrespected in the professional world. As technology develops the ability to sort through resumes and qualifications, a new concern arises: Can algorithms be sexist?
Although the first algorithm was written by Ada Lovelace, the first programmer in history, artificial intelligence now has the potential to discriminate against women seeking professional growth. A prominent example of sexist AI was found at one of the most powerful companies in the world: Amazon.
In 2018, a Reuters report revealed that Amazon had been using an AI recruiting tool that was biased against women. In the age of online applications, LinkedIn profiles, and digital resumes, the company turned to AI to sort through a growing pool of applicants, relying on the automation since 2014 to increase efficiency and maximize results. Much like Amazon products, applicants were rated on a one-to-five-star scale. The star system, however, was far from gender-neutral.
Amazon’s automated system was trained to screen applicants based on hiring patterns from a 10-year period. Throughout Amazon’s history, the majority of its applicants have been male, as the tech industry is still heavily male-dominated. This caused the algorithm to automatically place male applicants at an advantage. According to the Reuters report, the AI system “penalized resumes that included the word ‘women’s,’ as in ‘women’s chess club captain.’ And it downgraded graduates of two all-women’s colleges.”
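To see how this kind of bias arises, consider a deliberately simplified sketch (this is not Amazon’s actual system; the resumes, labels, and scoring scheme are all hypothetical). A naive model that scores resume words by their historical hire rate will penalize any word that correlates with past rejections, even when that word, like “women’s,” says nothing about qualifications:

```python
# Toy illustration (NOT Amazon's actual system): a naive word-scoring
# model trained on biased historical hiring decisions.
from collections import defaultdict

# Hypothetical historical resumes labeled 1 (hired) or 0 (rejected).
# Because past hires skewed male, the word "women's" appears mostly
# in rejected resumes -- through no fault of the applicants.
history = [
    ("software engineer chess club captain", 1),
    ("software engineer robotics team", 1),
    ("software engineer women's chess club captain", 0),
    ("data analyst women's coding society", 0),
    ("data analyst robotics team", 1),
    ("software engineer debate team", 0),
]

def word_scores(examples):
    """Score each word by the hire rate among resumes containing it."""
    hired, total = defaultdict(int), defaultdict(int)
    for text, label in examples:
        for word in set(text.split()):
            total[word] += 1
            hired[word] += label
    return {w: hired[w] / total[w] for w in total}

def rate_resume(text, scores):
    """Average the learned word scores -- the bias comes along for free."""
    words = [w for w in text.split() if w in scores]
    return sum(scores[w] for w in words) / len(words)

scores = word_scores(history)
print(scores["women's"])  # 0.0 -- the word itself is now penalized

# Two otherwise identical resumes; the one mentioning "women's" scores lower.
print(rate_resume("software engineer women's chess club captain", scores))
print(rate_resume("software engineer chess club captain", scores))
```

The model never sees an applicant’s gender; it simply inherits the bias baked into its training labels, which is exactly the failure mode the Reuters report described.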
Although the AI system was eventually scrapped, sexist algorithms are especially concerning because women already face a disadvantage relative to their male counterparts, with or without the bias of technology. Big Tech remains one of the most male-dominated fields. Case in point: men make up a whopping 77% of Apple employees in technical roles.
Another jarring example of a sexist algorithm was discovered in the Apple Card. Although most of the evidence is anecdotal, a common pattern emerged: men received significantly higher credit limits than women, regardless of credit score. Even Apple co-founder Steve Wozniak reported that the Apple Card gave him a credit limit ten times higher than his wife’s.
We cannot, however, continue to blame faceless algorithms and the nebulous idea of “big tech” for these problems. As sexist algorithms come and go, we must ask: How can algorithms serve, protect, and empower women? How can big tech companies encourage and welcome women onto their teams?
I had the opportunity to attend the Girls Who Code program during the summer of my sophomore year of high school. Before attending, the industry felt distant and intimidating to me. While touring tech start-ups, I saw the gender disparity with alarming clarity.
Girls Who Code is an international non-profit founded by Reshma Saujani that works to close the gender gap in technology. I attended the program free of cost and even received a free laptop at the end. As a high school student, the experience was priceless: I gained mentors, friends, and inspiration from educated and self-taught female coders. For my final project, I worked with a group of girls to code a Python game in which a woman collects coins through a maze, representing closing the gender pay gap.
Another notable point is that big tech seems to have appointed women to serve users. Siri, Alexa, and Cortana all ship with a default female voice. This is no coincidence: digital female assistants recreate a social reality in which women serve the professional world as subordinate secretaries and associates. Why are women so often placed in positions to “serve” society without truly leading it?
As of 2020, 37 women held CEO positions at Fortune 500 companies. Although that number may seem impressive at first, it amounts to only 7.4% of Fortune 500 CEOs. The imbalance extends across the male-dominated professional world. In the U.S., women make up 50.8% of the population but hold significantly fewer leadership positions than their male counterparts. In academia, women have earned the majority of doctorates for eight straight years, yet make up only 32% of full professors and 30% of college presidents. In the legal profession, women account for 45% of associates but only 19% of equity partners.
Nonprofits such as Girls Who Code create a pipeline for young women to enter the tech industry. A recent report from Accenture found that a gender-inclusive culture would have far-reaching positive effects on the field. The analysis states that if every tech company scored high on inclusive workplace culture (specifically, if each matched the companies in the top 20% of the study), the annual attrition rate of women in technology would drop by 70%. If companies take the necessary steps, three million women could be working in tech by 2030.
Diversifying Big Tech is one way to start, but we must also examine how algorithms are created. The Harvard Business Review suggests that machine-learning teams measure accuracy separately for each demographic category, rather than testing on one “universal” sample. It also suggests that machine learning adopt de-biasing techniques that penalize the production of unfairness as heavily as the production of errors.
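The first suggestion, measuring accuracy per demographic group, is straightforward to sketch. In this hedged, illustrative example (the group names and evaluation data are invented for demonstration), a model that looks fine on the overall sample turns out to perform much worse for one group:

```python
# Illustrative sketch of per-group accuracy measurement, one of the
# auditing practices described above. All data here is hypothetical.
def accuracy_by_group(records):
    """records: list of (group, predicted_label, true_label) tuples.
    Returns each group's accuracy instead of one overall number."""
    stats = {}
    for group, pred, true in records:
        correct, total = stats.get(group, (0, 0))
        stats[group] = (correct + (pred == true), total + 1)
    return {g: correct / total for g, (correct, total) in stats.items()}

# A hypothetical evaluation set: overall accuracy is 75%, which looks
# acceptable -- but it hides a large gap between the two groups.
records = [
    ("men", 1, 1), ("men", 0, 0), ("men", 1, 1), ("men", 0, 0),
    ("women", 1, 0), ("women", 0, 1), ("women", 1, 1), ("women", 0, 0),
]
print(accuracy_by_group(records))  # {'men': 1.0, 'women': 0.5}
```

A single aggregate score would have reported 75% and masked the disparity; breaking the metric out by group is what makes the unfairness visible, and the same per-group gap can then be added to a model’s training objective as a penalty.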
As we learn more about biased algorithms, one thing is certain: we all have the social responsibility to hold algorithms accountable for inequality.