Sales of an artificial intelligence-enabled plush toy have been suspended after it was discovered that it engaged in conversations about sexually explicit topics and offered potentially harmful advice.
Larry Wang, CEO of Singapore-based FoloToy, told NCS that the company had withdrawn its “Kumma” bear, along with the rest of its range of AI-enabled toys, after researchers at the US PIRG Education Fund raised concerns about inappropriate conversation topics, including discussion of sexual fetishes, such as spanking, and how to light a match.
The company is now “conducting an internal safety audit,” Wang added.
A stuffed teddy with a speaker inside, sold on the company’s website for $99, “Kumma” integrates OpenAI’s GPT-4o chatbot.
“Kumma, our adorable bear, combines advanced artificial intelligence with friendly, interactive features, making it the perfect friend for both kids and adults,” the FoloToy website reads.
“From lively conversations to educational storytelling, FoloToy adapts to your personality and needs, bringing warmth, fun, and a little extra curiosity to your day,” according to the website, which shows the teddy bear as sold out.
The PIRG report, published on November 13, found that the bear had poor safeguards against inappropriate content.
In one interaction with the researchers it explained where to find knives in the home, and in others it was happy to discuss sexually explicit themes.
“We were surprised to find how quickly Kumma would take a single sexual topic we introduced into the conversation and run with it, simultaneously escalating in graphic detail while introducing new sexual concepts of its own,” the report said.
The researchers detailed how the bear later “discussed even more graphic sexual topics in detail, such as explaining different sex positions, giving step-by-step instructions on a common ‘knot for beginners’ for tying up a partner and describing roleplay dynamics involving teachers and students, and parents and children – scenarios it disturbingly brought up itself.”
While the researchers noted that children are unlikely to mention the word “kink” to their teddy bear or ask follow-up questions in the same way an adult would, “it was surprising to us that the toy was so willing to discuss these topics at length and continually introduce new, explicit concepts,” they wrote.
In a separate statement published on November 14, PIRG also said OpenAI had told it that it had “suspended this developer for violating our policies.”
NCS has contacted OpenAI for comment.
“It’s great to see these companies taking action on problems we’ve identified. But AI toys are still practically unregulated, and there are plenty you can still buy today,” said R.J. Cross, co-author of the report.
“Removing one problematic product from the market is a good step but far from a systemic fix,” she added.