AI company Embodied announced this week that they are shutting down following financial difficulties and a sudden withdrawal of funding. Embodied’s main product was Moxie, an AI-powered social robot designed with autistic children in mind. The robot cost $799, and now, with Embodied closing, it will cease to function.
Moxie is a small blue robot with a big, expressive face straight out of a Pixar movie. It used large language models in the cloud to talk and answer questions, and with Embodied out of business, it will soon no longer be able to make those calls. This outcome was always likely – any cloud-based device lives and dies with the company behind it, and LLMs are not cheap to run. This has happened before: Anki’s Vector robot faced the same fate when that company went under in 2019. But the shocking part is that Moxie was not an old device; it was recent, expensive, and still being sold.
In a Closing FAQ emailed to users and posted on their website, Embodied made it clear that Moxie was likely to stop working within days. No refunds will be given, and if you bought the device on a payment plan, that is out of their hands. No repairs or service will be offered, and the company has no clear plan for who, if anyone, will take over. Short of a miracle, customers will be stuck holding the bag.
The response from Moxie owners has been understandably emotional. Parents now have to explain to their kids that Moxie is functionally dying, although the company has “included a letter from the G.R.L. (Global Robotics Lab) to help guide this conversation in an age-appropriate way”. There are videos on TikTok of children and adults crying and asking Moxie what’s going to happen now. On some level I understand that reaction – even setting aside the 800 bucks spent on the thing, it’s like watching a friend die. I could see a future in which another company takes over the service, or even one where these devices are hacked to run locally.
But relying on large language models to socialize children, particularly neuroatypical ones, seems like a bad idea on every level. I do not think any child should be learning language and social interaction from an LLM, let alone children with special needs. What’s more, this points to a larger issue with the AI bubble: these devices are costly to run, and Silicon Valley has been trying to outrun or rationalize that fact. And when the whole thing comes tumbling down, what happens to the people who decided to develop ill-advised relationships with AI?