New York —

Popular artificial intelligence chatbots like ChatGPT and Meta AI are increasingly blurring the line between real-world and digital relationships by allowing romantic and sometimes sexual conversations, while scrambling to ensure kids aren't accessing that adult content.

But Microsoft wants no part of that, the company's AI CEO Mustafa Suleyman told NCS.

“We are creating AIs that are emotionally intelligent, that are kind and supportive, but that are fundamentally trustworthy,” Suleyman said. “I want to make an AI that you trust your kids to use, and that means it needs to be boundaried and safe.”

Microsoft is locked in a race with tech giants like OpenAI, Meta and Google to make its Copilot the AI tool of choice in what Silicon Valley believes will be the next big computing wave. Copilot now has 100 million monthly active users across Microsoft's platforms, the company said in its most recent earnings call. That's well below rivals like OpenAI, whose ChatGPT has 800 million monthly active users.

But Microsoft is betting its approach will win it a wider audience, especially as AI companies grapple with how to shape their chatbots' personalities amid reports of AI contributing to users' mental health crises.

“We must build AI for people; not to be a digital person,” Suleyman wrote in a blog post earlier this year.

The interview came ahead of a series of new Copilot features that Microsoft unveiled on Thursday, including the ability to refer back to earlier chats and engage in group conversations, improved responses to health questions, and an optional, sassy tone called “real talk.”

Some of Microsoft's AI rivals are facing intense pressure to keep young users safe on their platforms.

Families have sued OpenAI and Character.AI claiming their chatbots harmed their children, in some cases allegedly contributing to their suicides. A string of reports earlier this year raised concerns that Meta's chatbot and other AI characters would engage in sexual conversations even with accounts identifying as minors.

The tech companies behind popular AI chatbots say they've rolled out new protections for kids, including content restrictions and parental controls. Meta and OpenAI are also implementing AI age-estimation technology aiming to catch young users who sign up with fake, adult birthdates, though it's unclear how well those systems work. OpenAI CEO Sam Altman announced earlier this month that with its new safety precautions in place, ChatGPT will soon let adult users discuss “erotica” with the chatbot.

Suleyman said Microsoft is drawing a bright line at romantic, flirtatious and erotic content, even for adults. “That's just not something that we will pursue,” he said.

That means, for now, Microsoft is unlikely to roll out a “young user” mode like some of its rivals, because users shouldn't need one, Suleyman said.

A key focus for Microsoft is training Copilot to encourage users to interact with other humans, not just AI. That's key for a company that has built its business around providing work-oriented productivity tools.

Its new “groups” feature will let up to 32 people (think classmates working on an assignment, or friends planning a trip) join a shared chat with Copilot, where the chatbot can chime in with suggestions.

That theme of pointing users to real people applies to Copilot's health updates, too. The chatbot will recommend nearby doctors for certain medical queries, and will otherwise draw on “medically trusted” sources such as Harvard Health.

Suleyman said he believes this push to get Microsoft's AI chatbot to help strengthen human-to-human relationships “is a very significant tonal shift to other things that are happening in the industry at the moment, which are starting to see these things as deep simulations where you can go off into your own world and have an entire parallel reality, including, in some cases, adult content.”
