How to use Copilot Vision: Professionals and Educators Guide 2025


Geeky Gadgets, 5 hours ago

Have you ever wished your computer could truly understand what you need—without endless clicks, menus, or instructions? With Copilot Vision in Microsoft Copilot, that wish is no longer a distant dream. Imagine seamlessly navigating your workday by simply sharing your screen and giving voice commands, while an AI assistant handles the heavy lifting in real time. Whether you're managing a complex project in OneNote, solving math equations step by step, or delivering a polished lesson as an educator, Copilot Vision transforms how you interact with technology. This isn't just a tool; it's a smarter, more intuitive way to work, learn, and collaborate.
In this how-to, based on a walkthrough by Mike Tholfsen, you'll discover the full potential of Copilot Vision, from its AI-powered screen sharing to its ability to generate automatic transcripts of your sessions. We'll explore how it integrates with tools like Microsoft Edge and OneNote, making even the most intricate tasks feel effortless. Along the way, you'll find features that cater to educators, students, and professionals alike, such as personalized learning support and real-time feedback. Whether you're new to Copilot Vision or looking to master its advanced capabilities, this guide will help you unlock a more productive, hands-free workflow. After all, the future of work and learning isn't just about doing more; it's about doing it smarter.

Getting Started with Copilot Vision
To begin using Copilot Vision, access it through the Consumer Copilot app on your Windows device. If the app is not pre-installed, you can download it from the Microsoft Store. Once installed, activate Copilot Vision to start sharing windows and interacting with the AI assistant. The interface is designed to be intuitive, allowing you to focus on your work while the AI manages technical details in the background.
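If you like to script your setup, a quick check can confirm whether the app is already present before you head to the Microsoft Store. The sketch below is a minimal example that assumes the Windows Package Manager (winget) is available and that the consumer Copilot app's display name contains "Copilot"; the exact package name can differ, so treat it as illustrative rather than definitive.

```python
import shutil
import subprocess

def copilot_app_installed() -> bool:
    """Return True if winget lists an installed package whose name contains 'Copilot'.

    Assumes winget is on PATH; the package's display name is an assumption and
    may differ on your device.
    """
    if shutil.which("winget") is None:
        raise RuntimeError("winget not found; install App Installer from the Microsoft Store first.")
    result = subprocess.run(
        ["winget", "list", "--name", "Copilot"],
        capture_output=True,
        text=True,
        check=False,
    )
    return "Copilot" in result.stdout

if __name__ == "__main__":
    if copilot_app_installed():
        print("A Copilot app appears to be installed.")
    else:
        print("No Copilot app found; download it from the Microsoft Store.")
```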
To ensure a smooth experience, verify that your device meets the system requirements for the app. Once activated, you can immediately begin exploring its features, such as voice-activated commands and real-time content analysis, which are tailored to streamline your workflow.

Key Features of Screen Sharing
One of the most notable features of Copilot Vision is its ability to share specific windows with the AI assistant. This functionality allows the AI to analyze and respond to the content displayed in real time, making it a powerful tool for multitasking. For example, you can share your Microsoft Edge browser or a OneNote document and use voice commands to:

Navigate tools and menus effortlessly without manual input.
Request explanations or clarifications on visible content for better understanding.
Receive constructive feedback to refine your work or ideas.
This hands-free interaction not only saves time but also enhances productivity by allowing you to focus on critical tasks while the AI handles routine operations. The seamless integration of screen sharing with voice commands ensures a fluid and efficient workflow.

Using Copilot Vision for Smarter Work & Learning
Watch this video on YouTube.
Discover other guides on Copilot Vision from our extensive content library that may be of interest.

Enhanced OneNote Integration
Copilot Vision is particularly effective when paired with OneNote, offering a range of features that benefit both educators and students. The AI assistant recognizes toolbar functions, including those in Class Notebook, and provides step-by-step guidance on their use. This integration allows educators to:

Distribute pages to students quickly and efficiently.
Organize and manage student notebooks with minimal effort.
Access detailed explanations of advanced features to maximize functionality.
For students, this integration simplifies the learning process by offering personalized support and guidance. Whether you're navigating OneNote for the first time or exploring its advanced tools, Copilot Vision ensures you can unlock its full potential without requiring extensive training.

Math Tutoring and Visual Learning
Copilot Vision excels in supporting math-focused tasks, offering robust tutoring capabilities that break down complex problems into manageable steps. For example, it can guide you through problems based on the Pythagorean theorem by providing clear, step-by-step instructions (a worked example follows the list below). Additionally, it supports:

Drawing and annotating graphs or shapes to enhance visual understanding.
Offering interactive tools for solving complex mathematical concepts.
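To make that tutoring flow concrete, here is the kind of step-by-step breakdown the assistant might talk you through for a right-triangle problem, written out as a short Python sketch with illustrative numbers:

```python
import math

# Worked example of the Pythagorean theorem: a^2 + b^2 = c^2.
# Given the two legs of a right triangle, find the hypotenuse step by step.
a, b = 3.0, 4.0

a_squared = a ** 2                  # Step 1: square the first leg  -> 9.0
b_squared = b ** 2                  # Step 2: square the second leg -> 16.0
c_squared = a_squared + b_squared   # Step 3: add the two squares   -> 25.0
c = math.sqrt(c_squared)            # Step 4: take the square root  -> 5.0

print(f"{a}^2 + {b}^2 = {c_squared}, so the hypotenuse c = {c}")
```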
These features are invaluable for subjects that demand detailed explanations and visual aids. By combining interactive problem-solving tools with AI-driven guidance, Copilot Vision makes learning math more engaging and accessible, catering to a wide range of educational needs.

Automatic Transcript Creation
Another significant feature of Copilot Vision is its ability to automatically record and generate transcripts of all interactions. This functionality is particularly useful in educational and professional settings, where revisiting explanations or instructions can reinforce understanding (see the short sketch after this list). The transcript ensures that no detail is overlooked, providing a reliable reference for:

Reviewing key points discussed during a session.
Tracking progress and identifying areas for improvement.
Sharing detailed records with team members or students for collaboration.
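As a simple illustration of how a saved transcript can be put to work, the sketch below pulls out questions and likely action items so they can be reviewed or shared later. The plain-text export format, the keyword list, and the file name are assumptions made for the example, not a documented Copilot Vision format.

```python
from pathlib import Path

# Hypothetical example: scan a Copilot Vision session transcript exported as
# plain text (one utterance per line) and collect lines worth revisiting.
KEYWORDS = ("?", "todo", "action item", "follow up")

def extract_highlights(transcript_path: str) -> list[str]:
    """Return transcript lines that look like questions or action items."""
    highlights = []
    for line in Path(transcript_path).read_text(encoding="utf-8").splitlines():
        text = line.strip()
        if text and any(keyword in text.lower() for keyword in KEYWORDS):
            highlights.append(text)
    return highlights

if __name__ == "__main__":
    # "copilot_vision_session.txt" is an assumed file name for this sketch.
    for item in extract_highlights("copilot_vision_session.txt"):
        print("-", item)
```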
By offering a comprehensive record of interactions, Copilot Vision supports continuous learning and ensures that users can revisit important details whenever needed.

Applications in Education
Copilot Vision is designed to enhance collaboration and learning experiences, making it an invaluable tool for educators and students alike. Its AI-powered tools and real-time feedback create an interactive environment that fosters engagement and productivity. Key educational applications include:

Improving lesson delivery with AI-guided tools that simplify complex topics.
Providing personalized support to students during assignments or projects.
Facilitating content sharing and annotation for group activities.
By integrating AI assistance with collaborative features, Copilot Vision transforms traditional learning methods into dynamic, interactive experiences. Its ability to adapt to various educational scenarios ensures that both teachers and students can benefit from its capabilities.

Availability and Regional Support
Copilot Vision is available globally, although access may vary depending on your region. Microsoft provides localized support to ensure users can understand and use the feature effectively in their area. This approach ensures that Copilot Vision is accessible to a diverse audience, making it a valuable tool for users worldwide.
To check availability in your region, visit the Microsoft website or consult the app's settings for localized options. By offering tailored support, Microsoft ensures that users can maximize the benefits of Copilot Vision regardless of their location.
Media Credit: Mike Tholfsen
Filed Under: AI, Guides


Related Articles

How is Tesla expected to remotely control its robotaxis, and what are its limitations?

Reuters, 25 minutes ago

June 20 (Reuters) - Tesla (TSLA.O) is expected to tiptoe into its long-awaited robotaxi service in Austin, Texas, as soon as Sunday with about 10 of its Model Y SUVs that will operate within strict limits. CEO Elon Musk has said the company is being "super paranoid" about safety and that humans will remotely monitor the fleet. Remote access and control, known in the industry as "teleoperation", is used in varying degrees by the handful of robotaxi startups operating around the globe. The technology has clear advantages and important limitations. Here are some details of how it works:

Teleoperation is the control of machines by humans in a different location, usually over a wireless network. It is used to train robots to operate autonomously, monitor their autonomous activity, and take over when required. The global robotaxi industry is still in test mode, as companies deploy the vehicles in limited geographic areas and continually adjust the artificial intelligence software that controls them. Teleoperation is often used to intervene when a vehicle is unsure of what to do.

Alphabet's (GOOGL.O) Waymo, for example, has a team of human "fleet response" agents who respond to questions from the Waymo Driver, its bot. "Much like phone-a-friend, when the Waymo vehicle encounters a particular situation on the road, the autonomous driver can reach out to a human fleet response agent for additional information," Waymo said in a blog post last year. Former Waymo CEO John Krafcik told Reuters, "the cars aren't being actively monitored," adding that the software is "the ultimate decision-maker." A Waymo video shows a car asking a remote operator whether a street with emergency response vehicles is open to traffic. When the human says yes, the vehicle proceeds. In contrast, other companies, such as Baidu's Apollo Go in China, have used fully remote backup drivers who can step in to virtually drive the vehicles. Baidu declined to comment.

Driving vehicles remotely on public roads has a major potential problem: it relies on cellular data connections that can drop or operate with a lag, disconnecting the vehicle from the remote driver in dangerous situations. Philip Koopman, a Carnegie Mellon University engineering professor and autonomous-vehicle safety expert, said that approach could work for a small test deployment of 10 vehicles, such as Tesla's initial effort in Austin, but he called teleoperation "inherently unreliable technology." "Eventually you will lose connection at exactly the worst time," he said. "If they've done their homework, this won't ever happen for 10 cars. With a million cars, it's going to happen every day." Former Waymo CEO Krafcik agreed, adding that the time delay in cell signal makes remote driving "very risky." On the other hand, relying on the vehicle to reach out for help and allowing the vehicle to be the decision-maker are risky as well, Koopman said, as it does not guarantee the vehicle will make the right decision. Waymo declined to comment on the limitations of its approach. Koopman also noted there are limits to how many vehicles one person can safely monitor.

A group of Democratic Texas lawmakers asked Tesla on Wednesday to delay its robotaxi launch until September, when a new autonomous-driving law is scheduled to take effect. The Austin-area lawmakers said in a letter that delaying the launch "is in the best interest of both public safety and building public trust in Tesla's operations."
Musk has for years promised, without delivering, that Tesla's Full Self-Driving (Supervised) advanced driver assistance software would graduate to completely self-driving and control robotaxis. This year, he said Tesla would roll out a paid service in Austin underpinned by an "unsupervised" version of the software. "Teslas will be in the wild, with no one in them, in June, in Austin," Musk told analysts and investors in January. In May, he told CNBC that the robotaxi would only operate in parts of Austin that are safe for it, would avoid difficult intersections, and would use humans to monitor the vehicles.

What those teleoperators will do is not clear. For years inside Tesla, company executives have expected to use teleoperators who could take over in case of trouble, said one person familiar with the matter. For instance, if a robotaxi were stuck in a crowded pedestrian area and confused about what to do next, a human teleoperator could take over and guide it, the source said. Tesla has advertised for teleoperation positions, saying the company needs the ability to "access and control" autonomous vehicles and humanoid robots remotely. Such employees can "remotely perform complex and intricate tasks," it said in the advertisements. Tesla did not respond to a request for comment. "We are being super paranoid about safety, so the date could shift," Musk said in a post on X last week while providing a tentative launch date of June 22.
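The "phone-a-friend" pattern described above can be sketched in a few lines of Python: the onboard software stays the decision-maker and only escalates a question to a remote fleet-response agent when its own confidence is low. This is a toy illustration of the general idea, not any company's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Perception:
    scene: str         # e.g. "emergency vehicles blocking the lane"
    confidence: float  # how sure the onboard software is about its own plan

def fleet_response_agent(question: str) -> str:
    """Stand-in for a remote human agent answering a yes/no question."""
    print(f"[remote agent] asked: {question}")
    return "yes"  # e.g. "yes, the street is open to traffic"

def decide(perception: Perception) -> str:
    # The vehicle remains the ultimate decision-maker; it asks for advice
    # only when unsure, rather than handing over driving control.
    if perception.confidence >= 0.8:
        return "proceed"
    answer = fleet_response_agent(f"Is it safe to proceed past: {perception.scene}?")
    return "proceed" if answer == "yes" else "pull over and wait for guidance"

if __name__ == "__main__":
    print(decide(Perception("emergency vehicles blocking the lane", confidence=0.4)))
```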

Exclusive: Nvidia, Foxconn in talks to deploy humanoid robots at Houston AI server making plant

Reuters, 41 minutes ago

TAIPEI, June 20 (Reuters) - Taiwan's Foxconn and U.S. artificial intelligence chips maker Nvidia (NVDA.O) are in talks to deploy humanoid robots at a new Foxconn factory in Houston that will produce Nvidia AI servers, two sources familiar with the matter said. This would be the first time that an Nvidia product will be made with the assistance of humanoid robots and would be Foxconn's first AI server factory to use them on a production line, the sources said. A deployment, expected to be finalised in the coming months, would mark a milestone in the adoption of the human-like robots that promises to transform manufacturing processes.

Foxconn is developing its own humanoid robots with Nvidia and has also trialed humanoids made by China's UBTech. The sources said it was not clear what type of humanoid robots are being planned for use in the Houston factory, what they will look like or how many will be deployed initially. They said the two companies are aiming to have the humanoid robots at work by the first quarter of next year, when Foxconn's new Houston factory will begin production of Nvidia's GB300 AI servers. And while it was not clear what exactly the robots will be doing at the factory, Foxconn has been training them to pick and place objects, insert cables and do assembly work, according to a company presentation in May. Foxconn's Houston factory was ideally suited to deploy humanoid robots because it will be new and have more space than other existing AI server manufacturing sites, one of the sources said. Nvidia and Foxconn declined to comment. The sources did not wish to be identified as they are not authorised to speak to the media.

Leo Guo, general manager of the robotics business unit at Foxconn Industrial Internet, a subsidiary of Foxconn that is in charge of the group's AI server business, said last month at an industry event in Taipei that Foxconn plans to showcase at the company's annual technology event in November two versions of humanoid robots that it has developed. One of those will be with legs and the other will use a wheeled autonomous mobile robot (AMR) base, which would cost less than the version with legs, he said, without disclosing details.

Nvidia announced in April that it planned to build AI supercomputer manufacturing factories in Texas, partnering with Foxconn in Houston and Wistron in Dallas. Both sites are expected to ramp up production within 12 to 15 months. For Nvidia, using humanoid robots in the manufacturing of its AI servers represents a further push into the technology, as it already supplies humanoid makers with a platform they can use to build such robots. Nvidia CEO Jensen Huang predicted in March that their wide use in manufacturing facilities was less than five years away. Automakers such as Germany's Mercedes-Benz and BMW have tested the use of humanoids on production lines, while Tesla (TSLA.O) is developing its own. China has also thrown its weight behind humanoids, betting that many factory tasks will eventually be performed by such robots.

Keir Starmer's AI tsar to step down after six months in role

The Guardian, an hour ago

Keir Starmer's artificial intelligence tsar, a key figure in steering the government's approach to artificial intelligence, is stepping down after six months in the role. Matt Clifford, the author of the government's AI opportunities action plan, said he would leave his post next month for personal reasons. He described his work on drafting and implementing the 50-point plan as a 'privilege', adding he was 'hugely optimistic about the UK's potential to be an AI superpower'. 'For family reasons, I will step back from my role as the prime minister's adviser on AI opportunities at the end of July, but I'm delighted that this important work will continue across government.'

A government spokesperson said Starmer had thanked Clifford, who was appointed in January, for his 'dedicated work' on AI policy. 'We will be building on this work to bolster AI expertise across government and cement the UK's position as a world leader in AI,' the spokesperson said.

Clifford came to prominence as a tech investor – he is the chair of the investment firm Entrepreneur First – but was already established as an influential political adviser before Labour won the 2024 general election. The 39-year-old played a key role in organising the global AI Safety Summit, hosted by Rishi Sunak in 2023, and in establishing the government's AI Safety Institute, now called the AI Security Institute.

Clifford published the action plan in January and its recommendations were accepted in full by the government. They included: creating AI 'growth zones' to host data centres that are the 'central nervous system' of the technology; embedding AI in the public sector; and creating 'national champion' AI companies. The plan also recommended changes to the UK's copyright regime, reflecting the need for AI companies to use copyright-protected data to train their systems. The issue has become a battleground between the government and the tech sector on one side and the creative industries on the other, who argue that it poses a serious threat to creative professionals' livelihoods.

Beeban Kidron, a crossbench peer and a leading campaigner against the proposed copyright changes, criticised the government for taking guidance from tech sector-linked advisers such as Clifford. At the time, sources told the Guardian that Clifford had agreed not to buy or sell any of the companies he part-owns while working for the government, or to be involved in decisions on new investments made by Entrepreneur First.
