Microsoft's Copilot AI Service Boosted by Local PC Integration and Enhanced Features
The Artificial Intelligence Podcast - A podcast by Dr. Tony Hoang
Microsoft's Copilot AI service will soon be able to run locally on PCs, thanks to built-in neural processing units (NPUs) delivering over 40 trillion operations per second (TOPS). Running more of Copilot locally should reduce lag and may improve performance and privacy. Currently, Copilot runs primarily in the cloud, which introduces delays even for small tasks. Intel's Lunar Lake chips, shipping in 2025, will triple the NPU speed of its current chips.

Microsoft is also expanding Copilot's capabilities in Teams, allowing it to pull insights from both meeting chat and call transcripts, help rewrite messages, and generate new messages based on chat context. In addition, Microsoft is introducing features to improve hybrid meetings, such as individual video feeds for each attendee and automatic camera switching for the best view. These updates will roll out in the coming months.

--- Send in a voice message: https://podcasters.spotify.com/pod/show/tonyphoang/message