New York
CNN
—
AI chatbots have been touted as productivity tools for consumers: They can help you plan a trip, for example, or give advice on writing a confrontational email to your landlord. But they often seem uptight or oddly stubborn or just downright weird.
And despite the proliferation of chatbots and other AI tools, many people still have trouble trusting them and don’t necessarily want to use them every day.
Now, Microsoft is trying to solve this problem by focusing on the “personality” of its chatbot and how it makes users feel, not just what it can do for them.
Microsoft on Tuesday announced a major update to Copilot, its AI system, which it said marks the first step toward creating an “AI companion” for users.
The updated Copilot has new features, including real-time voice interactions and the ability to interpret images and text on users’ screens. Microsoft also claims that it is one of the fastest AI models on the market. But the most important innovation, according to the company, is that the chatbot will now interact with users in “a warm tone and distinct style, providing not only information but also encouragement, feedback and advice while confronting the daily challenges of life.”
These changes could help Microsoft’s Copilot stand out in a growing sea of general-purpose AI chatbots. When Microsoft launched Copilot, then called Bing, early last year, it was seen as a leader among its big tech peers in the AI arms race. But in the 18 months since, it has been overtaken by competitors with new features, like bots that can have voice conversations, and easily accessible (if imperfect) AI integrations with tools that people already use regularly, like Google Search. With the update, Copilot catches up on some of these features.
When I tried out the new Copilot Voice feature at Microsoft’s launch event on Tuesday, I asked for advice on how to support a friend who is about to have her first baby. The bot responded with practical suggestions, like providing meals and running errands, but it also offered more emotional guidance.
“This is exciting news!” the tool said in an upbeat male voice – Copilot is designed to subtly reflect users’ tone – which the company calls Canyon. “Being there for her emotionally is an important task. Listen to her, reassure her and be her cheerleader… Don’t forget to celebrate this moment with her.”
The Copilot update reflects Microsoft’s vision for how everyday people will use AI as the technology develops. Mustafa Suleyman, CEO of Microsoft AI, says people need AI to be more than a productivity tool; they need it to be a kind of digital friend.
“I think going forward, the first thought you’ll have is, ‘Hey, Copilot,’” Suleyman told CNN in an interview before Tuesday’s announcement.
“You’re going to ask your AI companion to remember it, or buy it, or book it, or help me plan it, or teach me… It’s going to be a confidence boost that will be there to support you, that’s going to be your hype man, you know?” he said. “It will be present on many, many surfaces – all your devices, in your car, in your house – and it will really start to live alongside you.”
The previous iteration of the Microsoft AI chatbot received backlash over unexpected shifts in tone and sometimes downright unsettling responses. The bot would start an interaction seeming empathetic, but could become sassy or rude during long exchanges. In one case, the bot told a New York Times reporter he should leave his wife because “I just want to love you and be loved by you.” (Microsoft later limited the number of messages users could exchange with the chatbot in a single session, to avoid such responses.)
Some experts have also expressed broader concerns about people forming emotional attachments to bots that seem too human, to the detriment of their real-world relationships.
To address these concerns while developing Copilot’s personality, Microsoft has a team of dozens of creative directors, language specialists, psychologists and other non-technical workers who interact with the model and provide feedback about the ideal ways for it to respond.
“We really designed an AI model that’s designed for conversation, so it feels smoother and more user-friendly,” Suleyman told CNN. “It has, you know, real energy… It has character. It pushes back from time to time, it can be a little funny, and it really optimizes for that long-term conversational exchange, rather than a question and answer.”
Suleyman added that if you tell the new Copilot that you love it and would like to get married, “it’ll know that’s not something it should talk to you about.” It will politely and respectfully remind you that that’s not what it’s there for.
And to avoid the kind of criticism that dogged OpenAI over a chatbot voice that sounded like actress Scarlett Johansson, Microsoft paid voice actors to provide training data for four voice options intentionally designed not to imitate well-known figures.
“Imitation is confusing. These things are not human and they shouldn’t try to be,” Suleyman said. “They should give us enough of a feeling that they are comfortable, fun and familiar to talk to, while still remaining separate and distant… That boundary is how we form trust.”
Building on the voice functionality, the new Copilot will have a “daily” feature that reads users the weather and a summary of the day’s news, thanks to partnerships with media outlets including Reuters and the Financial Times.
Microsoft has also integrated Copilot into its Microsoft Edge browser: when users need an answer to a question or text translated, they can type @copilot in the address bar to chat with the tool.
Power users who want to experiment with features still in development will have access to what Microsoft calls “Copilot Labs.” They can test new features like “Think Deeper,” which the company says can reason through more complex questions, and “Copilot Vision,” which can see what’s on your computer screen and answer questions or suggest next steps.
After negative reactions to the privacy risks of a similar AI tool released for Windows earlier this year, called Recall, Microsoft says Copilot Vision sessions are entirely opt-in and that no content the tool sees is stored or used for training.