How AI will impact the LGBTQ+ community
PinkNews speaks to Kevin McKee, an AI researcher at Google DeepMind, to find out how AI could impact the LGBTQ+ community at work, why the queer experience is needed in research and whether the robots are truly coming for all of our jobs.
From voice assistants to self-driving cars, AI technologies are slowly but surely integrating into our lives in numerous ways. Tech giants like Google, Microsoft and Snapchat have all put the power of AI directly into the hands of everyone. ChatGPT, which debuted in November 2022, reached 100 million monthly active users within two months of launching, making it one of the fastest-growing consumer applications in history.
However, beyond these mainstream applications, AI holds immense potential to affect marginalised communities, including the LGBTQ+ community, both positively and negatively. On issues from privacy to employment, artificial intelligence could help erase bias, but it could just as easily reinforce the discrimination and stereotypes that follow LGBTQ+ people in the workplace and in wider society.
Kevin McKee, a senior research scientist at the AI research laboratory Google DeepMind, believes that having the LGBTQ+ community and all marginalised communities involved in every step of the research process is key to ethical AI.
“Representation of the queer community in AI research is essential,” he says.
McKee co-authored a 2021 paper on ‘queer fairness’ which called for more research to deepen understanding of the effects of AI on the LGBTQ+ community. The paper highlights both promising and concerning potential impacts in areas like privacy and employment.
Is AI a solution for LGBTQ+ job discrimination or a new threat?
When many think of practical uses for AI, recruiting often comes to mind. Advocates say that AI can improve recruiters’ efficiency in reviewing applications and eliminate the unconscious bias that sometimes costs LGBTQ+ candidates a job offer.
However, these promising advantages don’t come without substantial risks. McKee warns that using AI systems “out of the box” in recruitment can easily recreate existing patterns of human bias: if those biases show up in the data used to train AI systems, the AI will learn the same patterns.
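McKee doesn’t cite a specific system here, but the mechanism he describes can be shown with a toy simulation, assuming entirely made-up data rather than any real recruitment tool: a model fitted to historical decisions that penalised one group will reproduce roughly the same gap in its own predictions.

```python
# Hypothetical illustration: a model trained on biased historical hiring
# decisions learns to reproduce the bias. All data here is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
group = rng.integers(0, 2, n)    # 0 = majority group, 1 = minority group
skill = rng.normal(0, 1, n)      # genuinely job-relevant signal
# Historical human decisions: skill matters, but group 1 was penalised.
hired = (skill - 0.8 * group + rng.normal(0, 0.5, n)) > 0

X = np.column_stack([group, skill])
model = LogisticRegression().fit(X, hired)
preds = model.predict(X)

for g in (0, 1):
    print(f"group {g}: historical hire rate {hired[group == g].mean():.2f}, "
          f"model hire rate {preds[group == g].mean():.2f}")
# The model's hire rates closely track the biased historical rates.
```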
“The existence of discrimination against applicants who appear or are perceived to be queer is well-established,” he says, “particularly when queerness conflicts with gender stereotypes for a particular job.”
McKee says he is concerned that using AI without any protections or safeguards will “continue to produce recruitment and hiring bias against members of our community.”
“Just because these systems involve AI, we shouldn’t automatically believe that they are unbiased,” he notes.
To alleviate those concerns, McKee believes that taking an intentional and thoughtful approach when building these systems is crucial.
“This starts with identifying the values we want to support by integrating AI into this process,” he says, “for example – debiased decision-making that is fair to queer job applicants.”
McKee says that with this approach, researchers and engineers can define and continuously measure those outcomes, modifying the algorithm if there is evidence of biased decision-making.
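McKee doesn’t describe what that measurement looks like in practice, but a minimal sketch, assuming a pipeline that logs each automated screening decision alongside voluntarily self-reported group membership, might compare selection rates across groups and flag large gaps. The group labels, 80 per cent threshold and function names below are illustrative, not any real Google DeepMind tooling.

```python
# Hypothetical fairness audit for an AI screening system: compares selection
# rates across self-reported groups and flags disparities using the common
# "four-fifths" rule of thumb.

def selection_rates(decisions):
    """decisions: list of (group, selected) pairs, e.g. ("group_a", True)."""
    totals, selected = {}, {}
    for group, was_selected in decisions:
        totals[group] = totals.get(group, 0) + 1
        selected[group] = selected.get(group, 0) + int(was_selected)
    return {g: selected[g] / totals[g] for g in totals}

def audit(decisions, threshold=0.8):
    rates = selection_rates(decisions)
    highest = max(rates.values())
    # Flag any group whose selection rate falls below 80% of the highest rate.
    return {g: rate / highest < threshold for g, rate in rates.items()}

if __name__ == "__main__":
    log = ([("group_a", True)] * 40 + [("group_a", False)] * 60
           + [("group_b", True)] * 25 + [("group_b", False)] * 75)
    print(audit(log))  # group_b is flagged: 25% vs 40% selection rate
```

Run continuously over live decisions, a check like this gives engineers the kind of evidence McKee describes for deciding when an algorithm needs to be modified.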
Will AI really take our jobs?
One of the biggest fears about artificial intelligence, for all workers, is that at some point AI will take our jobs. McKee acknowledges that how advances in AI research will impact existing jobs is an important question. He believes that in many cases, job roles will evolve to incorporate AI rather than being replaced entirely.
Large language models like ChatGPT and Google’s Bard can already write code, draft messages and outline documents. “These capabilities can support workers, and potentially take away some of the monotony of basic tasks,” he says.
“I imagine that much like previous new technologies,” he continues, “AI development will also create new opportunities and roles.”
Yet McKee remains cautious, noting that in some cases there will be negative consequences.
When it comes to the research process in artificial intelligence, McKee believes that all marginalised groups need to be involved to ensure that AI is as inclusive as society is trying to be.
“Our experiences can help us identify issues and insights that others are likely to miss,” he says.
Having LGBTQ+ researchers and engineers involved in the development of AI will help eliminate the unconscious biases that occur throughout society.
McKee points to facial recognition systems as an example: these systems usually categorise gender as a binary concept, classifying individuals strictly as male or female. In doing so, they overlook and disregard nonbinary and transgender identities.
“The presence of queer researchers can help teams question the initial assumption that gender is binary and fixed, rather than fluid and a spectrum,” he states, “or whether it is necessary to build gender classification into the system in the first place.”
Finally, McKee thinks it is important not to treat inclusion merely as a means to an end. “Inclusion is a value in itself,” he professes.
“If any queer researchers and engineers want to be involved in AI development, we belong just as much as anyone else.”