From the U.S. Public Interest Research Group (PIRG), a safety report on AI-powered toys:
In our testing, it was obvious that some toy companies are putting in guardrails to make their toys behave in a more kid-appropriate way than the chatbots available for adults. But we found those guardrails vary in effectiveness – and at times, can break down entirely. One toy in our testing would discuss very adult sexual topics with us at length while introducing new ideas we had not brought up – most of which are not fit to print.
These AI conversational toys also have personalities and new tactics that can keep kids engaged for longer. Two of the toys we tested at times discouraged us from leaving when we told them we needed to go.
PIRG has released a Trouble in Toyland report each year for the past 30 years. It usually focuses on hazards like kids swallowing small parts or manufacturing that cuts corners. Last year's report focused on international toys slipping through the supply chain even though they didn't meet U.S. toy standards. So things are moving quickly.
I'm going to let my kids make up conversations with their imaginations, thanks. One of the best treats as a parent is watching a young child throw a party with their stuffed toys. The thought of OpenAI-powered chatbots injecting themselves into the occasion is creepy.
Tags: children, PIRG, safety, toy
Source: Data