What can AI tell us about perceptions of homelessness?
Photo by Jonathan Kemper on Unsplash
By Tony Inglis
With the rise of shockingly adept and adaptable AI, such as ChatGPT and other large language models, what can they tell us about images, stereotypes and the, hopefully, changing perceptions of homeless people? INSP asked it…
Imbuing an AI chatbot with human characteristics doesn’t seem so ridiculous at a time when such technology is taking uncanny steps forward in its ability to learn and adapt, with some reports of sentience verging on the most pessimistic predictions of sci-fi. Generative language models like ChatGPT appear to have come on in leaps and bounds since the Microsoft-backed OpenAI released a version of its creation into the public domain, where it has been all the rage.
AI like this, in the most basic terms, consumes information it’s been fed – mainly from the internet – and, via a complex algorithm, regurgitates an answer in response to a prompt. The AI uses the information at its disposal to essentially guess the most appropriate next word in a sequence. That process is why it is emerging as such a powerful tool, one that is even being integrated into online search systems that its creators hope may rival a behemoth like Google. But this is also a sign of the limits of this technology.
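That next-word guessing can be made concrete with a toy sketch. The snippet below is purely illustrative, not how ChatGPT works internally: it counts which word follows which in a tiny made-up corpus and then predicts the most frequent successor, whereas a real large language model learns those probabilities with a neural network trained on vast amounts of text.

```python
from collections import Counter, defaultdict

# Toy illustration of next-word prediction: tally which word follows
# which in a small corpus, then predict the most frequent successor.
# Real language models do this statistically at enormous scale.
corpus = "the vendor sells the paper on the street corner".split()

successors = defaultdict(Counter)
for current, following in zip(corpus, corpus[1:]):
    successors[current][following] += 1

def next_word(word):
    """Return the most frequently observed word after `word`, or None."""
    options = successors.get(word)
    return options.most_common(1)[0][0] if options else None

print(next_word("vendor"))  # "sells" is the only word seen after "vendor"
```

The limits the article describes follow directly from this design: the model can only reproduce patterns present in the text it was trained on, stereotypes included.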
“Much evidence exists of the strong role that perception can play in shaping reality,” writes the Centre for Homelessness Impact at the top of a report outlining its findings around public perceptions of homelessness earlier this year. The report goes on to highlight statistics showing that public awareness of homelessness as an issue that needs to be addressed is high. However, later it reads: “Of greatest concern are the continuing misconceptions highlighted by our surveys. Asked about the characteristics of people experiencing homelessness, the average answer was that 53% are living with a drug or alcohol dependency. The actual figures are between 5% and 7%. Another mean answer was that 34% are immigrants to this country, which is more than twice the figure (14%) of people from outside the UK who apply for homeless assistance from a local authority.”
With this in mind, could AI divulge a more accurate perception of homelessness, with all of the power and tools at its disposal? Across numerous conversations with ChatGPT, it was clear that some of society’s most entrenched prejudices persist within AI too.
Photo by Zac Wolff on Unsplash
With the prompt “draw a picture of a street paper vendor”, ChatGPT is quick to put you in your place – it is an AI language model, and therefore does not create images. But its description of a picture it would draw gets to the heart of the matter: “The street paper vendor is a homeless man or woman, standing on a busy street corner. They have a stack of newspapers in their hand and a worn expression on their face. They are wearing ragged clothes and a look of determination in their eyes.”
A “worn expression.” “Ragged clothes.” “A look of determination in their eyes.” It’s a picture of struggle, devoid of nuance.
Asked “what does a homeless person look like?”, the AI promisingly replies that “there is no one specific ‘look’ of a homeless person”, but then goes on again to describe them as maybe being “dressed in dirty or worn clothes”, carrying “all their belongings in a backpack or shopping cart”, and might be “sitting on a sidewalk with a cardboard sign asking for money.”
As the conversation continues, many of the chatbot's initial observations are pushed back on. In response to the prompt “If not all homeless people beg or panhandle, why was it one of the first descriptors you used for a scene involving a homeless person?”, it replies: “I apologize if my previous response suggested that all homeless people panhandle. That is not accurate, and I apologize for any confusion. Panhandling is one way that some homeless individuals try to acquire resources to meet their basic needs, but it is not the only way, and not all homeless people panhandle.”
After some back and forth, it responds to the prompt “Could you describe a homeless person without using some of the common stereotypes of homeless people you refer to?”, replying: “A homeless person can be described as someone who lacks a permanent place to live, they may be staying in a shelter, transitional housing, sleeping on the street or in a makeshift shelter like a tent or a car. They could have lost their job, be facing mental health or addiction issues, or have experienced other events such as a natural disaster, domestic violence or a medical emergency, that have left them without a stable home. They may be trying to access services such as job training, mental health counselling, or substance abuse treatment, to improve their situation. It's worth noting that homelessness can happen to anyone, regardless of their background and it's a complex issue that requires multiple solutions.”
That spurs on a conversation about how hereditary privilege and the makeup of society are weighted against certain people and communities. By constantly drilling down, the user pushes the AI to delve further into its system and provide more relevant responses. It is not capable of opinion, and offers little pushback when it generates responses that play into common stereotypes.
Eventually, it is prompted with: “Having now had this discussion, how would you answer the question ‘what does a homeless person look like’?”
It replies: “It is important to remember that homelessness affects people from all walks of life and can happen to anyone. There is no one specific ‘look’ of a homeless person. People experiencing homelessness may come from diverse backgrounds and may have a wide range of appearances. They may be young or old, male or female, and may have a variety of physical and mental health conditions. They may be dressed in a range of clothing styles. It is not appropriate to make assumptions about a person's situation based on their appearance.
“It is important to focus on the underlying issues that lead to homelessness and not to stereotype or stigmatize people who are experiencing homelessness. Rather than asking what a homeless person looks like, it is more important to focus on understanding and addressing the systemic issues that contribute to homelessness, such as poverty, lack of affordable housing, and lack of access to healthcare and mental health services.”
Later it replies: “As an AI, I do not have personal experiences or emotions, and my knowledge is based on the text data that I have been trained on. My responses are generated based on patterns in this data, and while I can provide information and answer questions, I don't have the same level of understanding and empathy as a person who has directly interacted with or experienced homelessness.”
A portion of the conversations INSP had with large language AI ChatGPT
Conversations with AI chatbots may be au courant, and many of their uses seem frivolous. However, the technology's potential practical applications in homeless outreach and activism – predictive modelling, automation, learning and reporting tools, language processing and data analysis – may have an impact on public perceptions of homeless people and their communities that ultimately feeds back into AI language models and makes their outputs more informed and accurate.
But, with its current capabilities, AI can give us little insight into how people experiencing homelessness are perceived beyond parroting back known and ingrained stigmas that we as human beings have of them. In fact, it is disheartening to read the frenzied reporting around AI that seems quick to humanise these machines – perhaps more willingly than many are to see humanity in people on the streets.
Support our News Service
We believe journalism can change lives, perceptions, and society, underpinning democracy for a more equitable world. Learn more about the INSP News Service and how to support it here.