
AI Company That Made Robots For Children Went Bust And Now The Robots Are Dying

Embodied, maker of the AI robot called Moxie, is shuttering. With their closing, parents have to explain to their kids that Moxie is dead.

A child embracing a Moxie robot. This robot is now dead. Credit: Embodied, Inc.

AI company Embodied announced this week that it is shutting down following financial difficulties and a sudden withdrawal of funding. Embodied's main product was Moxie, an AI-powered social robot designed specifically with autistic children in mind. The robot cost $799, and now, with Embodied closing, it will cease to function.


Moxie is a small blue robot with a big, expressive face straight out of a Pixar movie. It used large language models in the cloud to answer questions, talk, and function; with Embodied out of business, the robot will soon no longer be able to make those calls. This outcome was always likely: any cloud-based device is only as viable as the company behind it, and LLMs are not cheap to run. It has happened before, too, with Vector, a robot whose maker, Anki, shut down. The shocking part is that Moxie was not an old device: it was fairly recent, expensive, and still being sold.

In a Closing FAQ emailed to users and posted on its website, Embodied made clear that Moxie was likely to stop working within days. No refunds will be given, and if you bought the device on a payment plan, that's out of the company's hands. No repairs or service can be offered, and Embodied has no clear plan for who, if anybody, will take things over. Short of a miracle, customers will be stuck holding the bag.

The response from Moxie owners has been fairly emotional. Parents have to explain to their kids that Moxie is, functionally, dying, although the company has "included a letter from the G.R.L. (Global Robotics Lab) to help guide this conversation in an age-appropriate way." There are videos on TikTok of children and adults crying and asking Moxie what's going to happen now. On some level I understand that reaction: even setting aside the 800 bucks spent on this thing, it's like watching a friend die. I could see a future in which another company takes over, or even one in which these devices are hacked for local operation.

But relying on large language models to socialize children, particularly neuroatypical ones, seems like a bad idea on every single level. I do not think any child should be learning language and social interaction from an LLM, let alone children with special needs. What's more, this points to a larger issue with the AI bubble: these devices are costly to run, and Silicon Valley has been attempting to outrun or rationalize that fact. When this entire thing comes tumbling down, what happens to the people who decided to develop ill-advised relationships with AI?

H/T Ashleigh Stoneman
