ChatGPT: A Growing Emotional Support Tool in Pakistan

Many in Pakistan, including professionals like Mehak Rashid and Khizer Iftikhar, are turning to ChatGPT for emotional support and advice. While the AI chatbot provides instant comfort, psychologists warn of potential negative impacts on real human connections and social isolation. The trend of seeking emotional safety in AI reflects the lack of mental health resources in the country.

Jun 25, 2025 - 10:56

Comfort or isolation: Pakistanis weigh pros and cons of ChatGPT as confidant

When Mehak Rashid looks back on a restless, emotionally fragile phase of her life earlier this year, an unlikely confidant comes to mind.

“When nobody else was listening to you and everybody else thought you were crazy, ChatGPT was there,” Rashid, a metallurgy and materials engineer from Lahore, told Arab News.

“I just wanted to be heard… It will not give you a judgment and that’s so beautiful.”

Rashid began using the chatbot after noticing her children experimenting with it for schoolwork. Now, she often turns to it for “answers” and “different perspectives.”

“It helps me in every way,” she said.

Since its launch in November 2022, ChatGPT has attracted hundreds of millions of users and, by mid-2025, logged nearly 800 million weekly active users. Many in Pakistan, among the top 20 countries for ChatGPT traffic, use it daily for emotional support, venting feelings, or late-night reassurance when friends aren’t available.

Globally, an estimated 40 percent of ChatGPT conversations relate to mental well-being, and a Sentio University survey found nearly half of users with ongoing mental health issues rely on it for support: 73 percent for anxiety, 63 percent for advice, and 60 percent for help with depression.

While this instant comfort helps some cope, psychologists warn that heavy reliance on AI can weaken real human connections and deepen social isolation in a country already short on mental health resources.

A March 2025 study by OpenAI and MIT found frequent users reported increased dependence and loneliness, suggesting that AI companionship can erode human bonds and intensify feelings of isolation rather than resolve them.

For Lahore-based designer Khizer Iftikhar, ChatGPT began as a professional aid but gradually crept into his personal life and started affecting his relationships, especially with his wife.

Many experts say relying on AI models can weaken bonds over time, reduce empathy, and make people more emotionally self-contained, preferring the predictable reassurance of a machine to the give-and-take of genuine human connection.

Despite once trying therapy, Iftikhar now uses ChatGPT to process emotions and trusts people only for practical advice.

In Islamabad, 26-year-old Tehreem Ahmed initially used ChatGPT for office transcriptions and calorie tracking, but it eventually became an emotional lifeline.

The chatbot encouraged her to pause and reflect before reacting.

While Ahmed does not fully trust the bot, she said she prefers it to people who might dismiss her feelings.

For one anonymous Lahore-based tech professional, ChatGPT quickly shifted from a practical helper to an emotional crutch during a difficult relationship and the ongoing war in Gaza.

Still, she was careful not to project too much onto the tool.

Psychologists caution that without the challenges and messiness of real interactions, people using chatbots may lose vital social skills and drift further into isolation.

It is precisely that messiness many are trying to avoid when they turn to chatbots. But Khan, one such psychologist, warned that AI's constant affirmation could have unintended consequences.

The trend is especially troubling in a country where mental health care remains deeply under-resourced: Pakistan has fewer than 500 psychiatrists for a population of over 240 million, according to WHO estimates.

Little wonder, then, that even people with clinical mental health issues are turning to AI.

Khan recalled the case of a young woman who used ChatGPT so often that it replaced nearly all her social interaction.

Eventually, she cut everyone off.

One day, she asked the chatbot what would happen if she overdosed on phenyl.

“ChatGPT said, ‘There are no consequences. In case you overdose yourself, you might get paralyzed,’” Khan recalled.

The young woman read only the first half and attempted suicide.

Source: Arab News PK.
