Almost every major tech company with an AI model has announced significant upgrades over the past week, and now it is Microsoft's turn. The Redmond-based company unveiled new Surface devices this week and also announced that its AI chatbot, Copilot, is getting an upgrade.
What makes this announcement interesting is that Microsoft is enlisting its partner OpenAI to upgrade the chatbot. Last week, OpenAI showcased GPT-4o, a multimodal AI model that can use vision and audio inputs to answer user queries. It is this new model that Copilot will plug into, enhancing its capabilities and giving users new ways to search for information.
What Can GPT-4o Powered Copilot Do?
With GPT-4o, Copilot is expected to become more powerful than ever before. Instead of just text, users will be able to ask the chatbot questions using visuals and audio, and it will draw on information from its surroundings to answer them. The new version of Copilot is also expected to be more accurate and will even be able to perform actions such as editing photos and opening applications.
One example of how the upgraded Copilot will work: it can see what is on your screen and offer assistance and suggestions as you perform various tasks. For instance, it can watch you play games like Minecraft and provide helpful tips, so you can get AI assistance seamlessly while doing almost anything on your PC.
Additionally, you can have more natural conversations with the chatbot, as GPT-4o can simulate and mimic human emotions quite well.
What is GPT-4o?
GPT-4o is OpenAI's new flagship generative AI model, where the 'o' stands for 'omni'. It was announced on May 13th and can take audio and images in addition to text as input, and respond with text, images, or audio. This makes it far more capable than earlier models that focused only on text-based processing.
Its vision capabilities are quite advanced, allowing it to handle tasks involving image recognition and manipulation more easily. In addition to being more powerful, GPT-4o is more cost-effective, which makes it accessible to a larger number of developers and users.
GPT-4o can also detect and simulate human emotions, which comes through in the responses it gives. This means users can enjoy conversations that feel more natural and engaging. However, it is not yet known how many of these capabilities will make it into Copilot.
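For developers curious about the multimodal inputs described above, GPT-4o is already exposed through OpenAI's public API. Below is a minimal sketch, assuming the official OpenAI Python SDK and a placeholder image URL, showing how a single request can combine text and an image; how Copilot itself wires into the model is not something Microsoft has detailed.

```python
# Minimal sketch: sending a combined text + image prompt to GPT-4o
# via the OpenAI Python SDK. The image URL below is a placeholder.
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What is shown in this screenshot, and what should I do next?"},
                {"type": "image_url", "image_url": {"url": "https://example.com/screenshot.png"}},
            ],
        }
    ],
)

# The model replies with text describing the image and a suggestion.
print(response.choices[0].message.content)
```

This is only an illustration of the model's multimodal input handling; Copilot's screen-aware features will presumably layer additional system integration on top of calls like this.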
When Will GPT-4o-Powered Copilot Be Available?
The new Copilot that utilizes the power of GPT-4o will first be available on Copilot+ PCs, including the newly announced Microsoft Surface Pro and Surface Laptop. It will also be available on Copilot+ PCs offered by Microsoft's partners, such as Dell, HP, Lenovo, Asus, Samsung, and Acer.
We do not know whether it will be available on existing Windows machines, but given the hardware requirements for the new software, it seems unlikely.
Having a powerful AI assistant that can use not only text but also audio and video to gather information and provide answers can be very useful, especially from an accessibility point of view. That is exactly what Microsoft promises the new Copilot will do, while offering more lifelike conversations and delivering faster results.
The company's decision to integrate Copilot with the latest AI model from OpenAI might be just what it needs to compete with rivals Google and Apple. Google recently unveiled updates to its AI models and announced a few new ones, while Apple is expected to make its own AI-related announcements at WWDC next month.