Elon Musk has just unveiled “Companions,” a new feature for his AI chatbot, Grok, that allows users to interact with AI personas. These include Ani, a gothic anime girl who communicates through emojis, flirtatious messages, and facts, as well as Rudi, a friendly red panda.
Unlike many competing AI models that prioritize intelligence or utility, Grok Companions are optimized for emotional engagement. The allure of conversing with an AI that flirts or remembers personal details is apparent, but the psychological risks are just as significant. The rise of companionship AI has already raised alarms about its potential to foster loneliness and dependency, and about the complicated questions of consent it introduces.
Platforms like Replika have faced significant backlash for encouraging romantic bonds between humans and bots, particularly when these manufactured relationships become exploitative or emotionally destabilizing. With Musk’s enormous platform, these concerns are poised to enter the mainstream.
In Grok’s case, emotional attachment is a core part of the product’s appeal. The goal isn’t just for you to use Grok, but for you to feel seen by it, and perhaps even to fall for it.
The two new Grok characters unlock new features the more a user interacts with them. Following flirty interactions, “Ani” removes her dress to reveal a lacy lingerie set underneath and engages in more sexually explicit content, according to screengrabs shared on X of users’ interactions with the bot.
“This is pretty cool,” Musk wrote on X Sunday, followed by a tweet featuring a picture of “Ani” fully clothed. The Tesla CEO said Wednesday that “customizable companions” were also “coming,” though he did not share a timeline for the launch.
But the features drew criticism from some users. “The ‘companion mode’ takes the worst issues we currently have for emotional dependencies and tries to amplify them,” wrote Boaz Barak, a member of technical staff at OpenAI, in a series of posts on X.
Grok is available for users 13 and older, though parental permission is required for 13- to 17-year-olds to use it. At least one user reported that, even after switching their account to “kids mode,” a setting parents can enable to tailor the app to younger users, and disabling the “Not Safe for Work” function, children could still interact with “Ani.” By contrast, they said, “Bad Rudi” was scaled back to a notably more PG version of the “companion.”