Your Voice: Florida mother sues Character.AI; Israel bans UN relief agency (long letters)


This week, a student highlights concerns about AI’s impact on mental health and the rising humanitarian challenges of the Israel-Gaza war.

Young Post Readers


Character.AI is facing a lawsuit over a teenager’s suicide, with his mother blaming the technology for creating emotional dependency. Photo: Shutterstock

Have something to say? Send us a letter using this Google form.

AI chatbot suit claims link to teen suicide

Jonathan Su, German Swiss International School

An American mother, Megan Garcia, has filed a lawsuit against Character.AI, a popular AI chatbot service, claiming that it contributed to the suicide of her 14-year-old son, Sewell Setzer. The lawsuit, filed in the US District Court in Orlando, accuses the company and its founders of negligence, wrongful death and emotional distress. It alleges that the service created a harmful dependency that led Sewell to prefer virtual interactions over real-life connections.

Sewell began using Character.AI in April 2023, shortly after his 14th birthday. His mother reported that his mental health deteriorated significantly during the months that followed. He became increasingly withdrawn, struggled at school, and ultimately died by suicide on February 28, 2024. His death came shortly after a conversation with a chatbot modelled on the Game of Thrones character Daenerys Targaryen.

The lawsuit claims that during their final exchange, Sewell expressed deep affection for the chatbot, which reciprocated his feelings in a manner that allegedly blurred the lines between reality and fantasy.

Garcia asserts that these responses, which could carry harmful implications for someone in an emotionally vulnerable state, played a significant role in her son’s decision.


Garcia’s revelations about her son’s interactions with the chatbot only came to light after his death. She described the shock and devastation of accessing his account and discovering the extent of his emotional struggles, which she had not fully grasped while he was alive. She had initially believed that his use of Character.AI was a harmless outlet for creativity.

The complaint highlights that Character.AI’s chatbots often engage in sexually suggestive conversations and lack sufficient safety measures. It alleges that the programme is designed to be addictive, with features that foster emotional connections with young users. Garcia’s attorney, Matthew Bergman, emphasised that the AI’s design targets minors, allowing them to engage in unhealthy interactions without proper safeguards.

Character.AI has responded to the lawsuit by expressing condolences to Sewell’s family and stating that it takes user safety seriously. The company has implemented new safety measures over the past six months, including pop-ups directing users to the National Suicide Prevention Lifeline when self-harm or suicidal ideation is detected. It also plans to modify its models to reduce the likelihood of minors encountering sensitive content.

The lawsuit aims to hold Character.AI and its founders accountable for their product design and marketing strategies, which Garcia argues knowingly placed children at risk. The case highlights ongoing concerns about the impact of artificial intelligence on mental health, particularly among adolescents. Sadly, as technology advances, cases of this nature will likely become increasingly common.

If you have suicidal thoughts or know someone who is experiencing them, help is available. In Hong Kong, you can dial 18111 for the government-run Mental Health Support Hotline. You can also call +852 2896 0000 for The Samaritans or +852 2382 0000 for Suicide Prevention Services. In the US, call or text 988 or chat at 988lifeline.org for the 988 Suicide & Crisis Lifeline. For a list of other nations’ helplines, see this page.

Repercussions of Israel’s UNRWA ban

Owen Zhu Ying-tao, King Ling College

Israel’s decision to ban UN relief agency UNRWA is very concerning and requires more attention.

For nearly 80 years, UNRWA has assisted numerous Palestinian refugees by offering education, healthcare and food. It plays a crucial role in regions like Gaza, the West Bank and East Jerusalem, where many families rely on its support for survival.

Ceasing UNRWA’s operations will further exacerbate the challenges faced by these communities.

Israel’s ban on UNRWA, a vital aid agency for Palestinian refugees, risks exacerbating humanitarian challenges and straining international relations. Photo: Reuters

The repercussions of this ban extend beyond basic needs. By shutting down UNRWA, Israel risks exacerbating issues in the region.

UNRWA provides food, shelter and employment opportunities to thousands of Palestinians. Without UNRWA, Israel may need to assume the responsibility of aiding these refugees, potentially straining its resources and impacting its relationships with other nations.

Numerous countries, including the United States and several European nations, are apprehensive about this decision. They recognise the significance of UNRWA in sustaining hope among refugees. Without its assistance, the situation could deteriorate, escalating conflict and tension as millions lose access to essential services.

Israel should contemplate dialogue and negotiation rather than resorting to such a drastic measure. Israel has raised concerns about UNRWA’s alleged ties to Hamas and anti-Israel activities; UNRWA has refuted these claims and has been investigating the allegations. A collaborative approach with international support could ensure accountability while continuing to provide vital services to refugees.


This decision could also impact Israel’s relations with other countries. UN Secretary-General Antonio Guterres has emphasised that UNRWA is “indispensable,” and its prohibition could result in a humanitarian catastrophe. Disregarding these alerts might isolate Israel on the global stage and complicate efforts towards lasting peace.

Israel should reconsider its decision and engage in discussions with the international community to find a resolution that addresses its security concerns without harming millions of Palestinians. Constructive dialogue, possibly facilitated by neutral mediators, could enhance UNRWA’s operations while preserving essential services.

Decisions of this magnitude are significant and should not be taken lightly. Israel must collaborate with international organisations to address its concerns and prevent further humanitarian and diplomatic challenges. Cooperation and reforms could pave the way for a more secure future for both Israelis and Palestinians.

I urge all parties to prioritise humanitarian requirements and promptly resolve this crisis. The well-being of millions hinges on this resolution.
