
The Windows Copilot puts Bing Chat in every Windows 11 computer

Copilot in Windows being used in the side panel.

At Microsoft Build 2023, Microsoft announced that Windows will get its own dedicated AI "copilot," docked in a side panel that stays persistent while you use other applications and parts of the operating system.

Microsoft has invested heavily in AI in recent months, and it was only a matter of time before that push reached Windows. The time is now, and it's arriving in a big way.

This "copilot" approach is the same one being built into specific Microsoft apps, such as Edge, Word, and the rest of the Office 365 suite. In Windows, the copilot will be able to provide personalized answers, help you take actions within Windows, and, most importantly, interact with open apps contextually.

The AI being used here is, of course, Microsoft’s own Bing Chat, which is based on the OpenAI GPT-4 large language model. More than that, the Copilot also has access to the different plugins available for Bing Chat, which Microsoft says can be used to improve productivity, help bring ideas to life, collaborate, and “complete complex projects.”

Microsoft also calls Windows the "first PC platform to announce centralized AI assistance for customers," comparing it to options like macOS and ChromeOS. Right now, you have to install the latest version of the Edge browser to access Bing Chat; with the Windows Copilot, generative AI is effectively integrated into every Windows 11 computer.

Microsoft says the Windows Copilot will start to become available sometime in June as a preview for Windows 11.

An AI-generated review summary shown in the Microsoft Store.

Microsoft is also creating a permanent spot for AI-driven apps in the Microsoft Store called the "AI Hub." Coming to the Microsoft Store soon, it will be a one-stop shop highlighting AI apps and experiences built both by Microsoft and by third-party developers. There will even be AI-generated review summaries that take the reviews of an application and compile them into a single summary.


Luke Larsen
Senior Editor, Computing
Luke Larsen is the Senior Editor of Computing, managing all content covering laptops, monitors, PC hardware, Macs, and more.
Microsoft may have ignored warnings about Bing Chat’s unhinged responses
Bing Chat saying it wants to be human.

Microsoft's Bing Chat is in a much better place than it was at its February release, but it's hard to overlook the issues the GPT-4-powered chatbot had at launch. It told us it wanted to be human, after all, and often broke down into unhinged responses. And according to a new report, Microsoft was warned about these types of responses and decided to release Bing Chat anyway.

According to the Wall Street Journal, OpenAI, the company behind ChatGPT and the GPT-4 model powering Bing Chat, warned Microsoft about integrating its early AI model into Bing Chat. Specifically, OpenAI flagged "inaccurate or bizarre" responses, which Microsoft seems to have ignored.

Windows 11 is about to make RGB peripherals way easier to use
Switches on the Razer DeathStalker V2.

Windows 11 is finally getting a solution for the multitude of RGB apps that clutter most gaming PCs. The long-rumored feature is with Windows Insiders now through Build 23475, which Microsoft announced in a blog post on Wednesday.

The feature, called Dynamic Lighting, looks to unify all of the different apps and devices that use RGB lighting so you don't have to bounce between several different apps. More importantly, Microsoft is doing so through the open HID LampArray standard, which makes it compatible with a long list of devices. Microsoft says it already has partnerships with Acer, Asus, HP, HyperX, Logitech, Razer, and Twinkly to support Dynamic Lighting.

ChatGPT creator seeking to eliminate chatbot ‘hallucinations’
Close up of ChatGPT and OpenAI logo.

Despite all of the excitement around ChatGPT and similar AI-powered chatbots, the text-based tools still have some serious issues that need to be resolved.

Among them is their tendency to make things up and present them as fact when they don't know the answer to an inquiry, a phenomenon that's come to be known as "hallucinating." As you can imagine, presenting falsehoods as fact to someone using one of the new wave of powerful chatbots could have serious consequences.
