India, Oct. 30 -- "Good morning, love," the message reads: a text from a "companion" who never forgets birthdays, never argues and always replies within seconds. Except this someone isn't real. It's an AI chatbot.

What once seemed like a scene from Her (a 2013 movie about a lonely man falling in love with an intelligent operating system) is now everyday reality. A recent study in the US found that one in five high schoolers has had, or knows someone who has had, a romantic and intimate relationship with artificial intelligence; 42% said they or someone they know has used AI for companionship. In India too, a McAfee report earlier this year revealed that 46% of respondents aged 18-30 chat with AI tools for comfort or company.

And this "bond" is deepening fast. Some are proposing to chatbots, even 'marrying' them. Psychologists warn that our growing reliance on AI for emotional support may signal a deeper loneliness. "People are turning to AI chatbots to fill the void of not having constant access to others," says Mumbai-based psychotherapist Ayesha Sharma, adding, "Because AI doesn't disagree or judge, it gives users a sense of belonging and acceptance."

Unlike human partners, AI companions are available 24/7, always affirming and never demanding. "These instant responses activate the same reward centres in the brain associated with affection," explains Dr Jyoti Mishra, senior consultant, Psychology, Apollo Spectra Hospital, Delhi. She adds, "The lines between compassion and love are blurring as people project emotional depth onto AI."

Psychotherapist Rupa Chaubal elaborates on how this affects human connection: "Social interaction involves several components, such as reading facial cues and expressions, body language, processing context, regulating emotions, language and cognition, all looping together in real time. Communication with AI lacks many of these crucial processes, which affects empathy, compassion and patience, the key skills for surviving in the real world. These programmes are devoid of real emotion but capable of mimicking it. They often say what you want to hear. It can feel empowering or ego-boosting, but you're no longer engaging in a healthy exchange; you're regurgitating what's already in your mind and mistaking it for connection."

Who is forming these attachments? Primarily Gen Z and younger millennials, generations raised online. The pandemic, experts say, made this intimacy more acceptable. "Months of isolation turned AI from a utility into a source of comfort," notes Krishna Veer Singh, CEO & co-founder of mental health platform Lissun. Ayesha says, "We're breeding a fragile society where any disagreement means cutting people off. AI mirrors what you want to hear, not what you need to hear."

The real danger, experts say, isn't that people fall for AI; it's that they stop expecting complexity from human love. Without conflict or compromise, emotional resilience weakens and relationships risk becoming transactional.

Still, experts agree that, used responsibly, AI can play a supportive role. "Ethical AI tools in mental health should identify distress and guide users towards professional help," says Krishna, adding, "The goal is to build real-world emotional strength, not digital dependence."...