Spatial Meetings Go Beyond Remote Learning With 'Distance Zero'

Forbes, 2 days ago

Inside the Spatial Meeting environment, students have an immersive experience of being in the classroom while also being able to engage with renderings of objects, achieving an experience Cisco calls "Distance Zero."
Cisco Systems has partnered with Apple and H-Farm, an Italian educational institution and technology incubator, to launch an innovative educational initiative called 'Spatial Meetings.' This collaboration leverages Apple's Vision Pro headset and Cisco's immersive meeting infrastructure to achieve 'Distance Zero,' a term coined by Cisco to refer to the ability of technology to replicate the immediacy and interpersonal dynamics of face-to-face meetings, even when participants are geographically distant.
Cisco's Spatial Meetings utilize high-resolution, stereoscopic 3D video combined with true-to-color rendering, delivered through Apple's Vision Pro headset. This combination allows participants to see detailed facial expressions and body language and even interact with objects in a virtual yet realistic environment. As Snorre Kjesbu, Senior Vice President & General Manager of Collaboration Devices at Cisco, explains, 'Good technology should always be the supporting actor. It should never get in the way—it should support what you're doing.' The goal, after all, is not to have the most realistic virtual environment possible; the goal is to have highly effective meetings in which the focus is on the content of the meeting and not the experience of the conference room.
The site for this experiment has been the campus of H-Farm. Located near Venice, Italy, H-Farm began as a startup incubator and has evolved into an educational hub, serving over 3,000 students across high school and university levels. The institution prioritizes digital innovation and, as such, was a natural early adopter of Cisco's Spatial Meetings. Diego Pizzocaro, Director at H-Farm College, recalled their initial experience of seeing the Spatial Meetings technology: 'When we tried it, it was basically a lightbulb moment.'
Describing a Spatial Meeting is challenging, as it is very much the kind of thing one needs to experience for oneself. What is remarkable about the environment is how easily it achieves many of the benchmarks essential for an immersive conference experience. To begin with, the head sizes and positions of the other participants effectively match your own, which gives the sense of people actually sharing a common space rather than being technologically mediated. The other key element is the ability to readily convey non-verbal information. Depending on the context, non-verbal communication can account for anywhere from 60% to 90% of what is communicated. Anyone who has sat in a seminar while an argument between two people goes off the rails knows the experience of turning to a classmate and sharing an eye roll. That is an essential part of a rich environment, and one that is typically lacking in standard video conferencing. In Spatial Meetings, these types of interactions occur naturally.
Educational applications of Spatial Meetings at H-Farm focus heavily on hybrid learning environments, where the technology significantly enhances student engagement and concentration. 'Our students reported increased immersion and focus,' Pizzocaro noted. 'They can attend a math lecture from literally anywhere and feel like they're in the front row.' Another significant advantage is that, unlike in typical video conferencing environments, students are fully immersed in the experience, free from the distraction of competing apps or objects in their surroundings.
Cisco's Spatial Meetings allow educators and students to interact with physical objects alongside virtual renderings, facilitating interactive and comparative analysis in fields such as design, architecture, and engineering. Cisco's technology includes specialized hardware already installed in thousands of locations globally, which makes integrating this immersive meeting experience seamless. Standing at the front of a properly equipped conference room, a professor can demonstrate the properties of a physical object and then send a digital copy to a remote participant, who can explore it just as if she were in the classroom.
In addition to bringing remote participants into the classroom and classroom objects to remote participants, the space created by Spatial Meetings can also facilitate interaction between students and AI. Since the meeting space is virtual, remote students and AI-generated participants join the same virtual space in the same way, giving the attendees a kind of ontological parity; the fact that only one of them exists elsewhere in physical space ceases to be a point of significance.
Moving forward, Cisco and Apple plan to deepen their collaboration, enhancing the capabilities of the Spatial Meetings platform. H-Farm continues to develop educational content and tools to take advantage of the environment, enabling educators to integrate immersive experiences into their teaching more easily. Demonstrations and further integrations of this technology are planned to showcase its broader applicability and effectiveness. For now, the biggest challenge is the limited availability of the headsets, but this should change as newer models are released and prices come down. At the end of the day, the collaboration among Cisco, Apple, and H-Farm provides a striking illustration of how technology can not only bridge physical distances but also fundamentally enhance the educational experience.

Related Articles

Apple's most underrated app could change soon, and you're going to love it

Digital Trends, 25 minutes ago

Apple's Shortcuts app is a power user's dream. I think it's one of the most underrated features you can find on an iPhone, and even on Macs. In case you haven't used it yet, it allows you to perform a multi-step task in one go, or even trigger certain actions automatically. One of my favorite shortcuts instantly generates a QR code for a Wi-Fi network, so I don't have to read out a complex password. I've got another one that automatically deletes screenshots after a 30-day span. There are a few in my library that trigger Do Not Disturb mode for a certain time slot, turn any webpage into a PDF, snap Mac windows into place, and activate my smart devices when I reach home.

All that sounds convenient, but creating those shortcuts isn't a cakewalk. The UI flow and action presets can overwhelm even tech-savvy users when it comes to creating their own automations. Apple may have a user-friendly solution, thanks to AI, and you just might get it this year.

Apple has the foundation ready

According to Bloomberg, Apple is preparing an upgraded version of the Shortcuts app that will put AI into the mix. 'The new version will let consumers create those actions using Apple Intelligence models,' says the report. The AI models could be Apple's own, which means they would be better suited for integration with system tools and apps than a third-party AI model.

Take, for example, the Siri-ChatGPT integration. OpenAI's chatbot can handle a wide range of tasks that Siri can't accomplish, but ChatGPT isn't able to interact with other apps and system tools on your iPhone. That means it can't assist you with making cross-app shortcuts either.

At WWDC 2025, Apple is rumored to reveal its own AI models and open them to app developers as well. The idea is to let developers natively integrate AI-driven features in their apps without having to worry about security concerns. Microsoft is already using in-house AI models for a wide range of Copilot experiences on Windows PCs, and it also offers its Phi family of open AI models to developers for building app experiences. Apple just needs to follow in Microsoft's footsteps. With developers adopting Apple's AI foundations and the company extending them to the Shortcuts app, it would become much easier to create multi-step workflows. How so? Well, just look at Gemini on Android phones.

Shortcuts needs an AI makeover

Imagine simply narrating a workflow to Siri and having it turned into a shortcut. That's broadly what AI tools are already capable of, but instead of creating a rule for the future, they just execute the task at hand immediately. With AI in Shortcuts, a request could go something like this: 'Hey Siri, create a shortcut that automatically replies to all messages I get on weekends regarding my unavailability, and tells them to reach me again on Monday. Trigger the action when I say the words I'm out.'

With natural language processing handled by AI models, that's feasible. Look no further than how Gemini works on Android devices, especially those with on-device Gemini Nano processing. With a voice command, Gemini can dip into your workspace data and get work done across Gmail, Docs, and other connected apps. It can even handle workflows across third-party apps such as WhatsApp and Spotify. The list keeps growing, and as capabilities like Project Mariner and Astra roll out through Gemini Live, new possibilities will open up. With a revamped Shortcuts app, Apple just needs to get the voice processing right and convert the prompts into actionable commands.

Apple's partner, OpenAI, already offers a feature called Operator that can autonomously handle tasks on the web. Creating a chain of commands across mobile apps running locally should be easier and less risky than browsing websites. With ChatGPT's language chops already baked into the heart of Apple Intelligence, I won't be surprised if the next-gen Shortcuts app exploits it to the fullest.

Oh hey, here's a sample

Speaking of ChatGPT and its integration with iOS, there's already an open-source project out there that gives a rough idea of how voice commands turn into actions on an iPhone. Rounak Jain, an iOS engineer at OpenAI, has created an AI agent that transforms audio prompts into actions on an iPhone.

'🚨🤖 Today, I'm launching an AI agent that gets things done across iPhone apps. It's powered by OpenAI GPT 4.1 and is open source. Try it out!' — Rounak Jain (@r0unak) June 1, 2025

Jain says the demo is built atop OpenAI's GPT-4.1 model, and it can get work done across multiple apps with a single voice command. For example, users can have it turn on the flashlight after sliding down the Control Center, take and send a picture to one of their contacts, or text travel details and book a cab. Jain's demo is a clear sign that integrating an AI model at the system level, and having it perform tasks across apps, is feasible. A similar pipeline could be used to turn those voice commands into shortcuts instead of executing them immediately.

I am just hoping that when Apple implements AI within Shortcuts and lets users create their own routines with natural language commands, it offers a flow where users have the flexibility to modify them at will. I believe the best approach would be to show users the chain of commands and let them make adjustments before the prompt is turned into a shortcut.

The Trump administration is delaying a 25% tariff on Chinese-made graphics cards

Yahoo, 3 hours ago

Graphics cards and motherboards assembled in China are avoiding President Donald Trump's import taxes, for now. In a three-page notice published Saturday in the Federal Register, the Office of the U.S. Trade Representative said it was 'appropriate' to extend a moratorium that exempts vendors of electronics equipment from tariffs on graphics cards and graphics processing units.

The measure stems from a long-running Section 301 investigation into Chinese economic policies and whether they are harming American companies. That federal probe started under the first Trump administration, and it would apply a 25% tariff if officials conclude that Chinese companies have an unfair advantage. The 25% tariff has been inactive due to a string of reprieves from both the Biden and Trump administrations.

China remains the world's largest electronics manufacturer, according to the International Trade Centre; the nation is still subject to a minimum 30% import tax by the Trump administration, barring some exemptions. Major tech companies including Apple (AAPL), Nvidia (NVDA) and Microsoft (MSFT) were spared the brunt of Trump's 'reciprocal' tariffs in April. A federal court battle is now brewing over the fate of those tariffs.

Need to Sign or Scan Papers? Here's How To Use Your iPhone's Hidden Document Scanner

CNET, 3 hours ago

These days, it's pretty easy to digitally sign important documents, but sometimes you just need to sign a physical piece of paper and scan it to send over email. When you have to put your signature on a physical document and digitally upload it, and you don't have a standalone scanner handy, the easiest method is right in your pocket: using your iPhone to turn images into PDFs.

Yes, your iPhone doubles as a document scanner. It may not produce images as sharp as a dedicated scanner would, but it does a respectable job, even when the phone is positioned at odd angles while trying to capture text. iPhones have had this hidden feature since iOS 11 launched in 2017, but as the cameras built into Apple phones have improved, so has their ability to take decent scans of documents and turn them into PDFs you can email. You won't need to download additional software or pay for a third-party app; Apple's Notes app, which comes preinstalled on iPhones, does the trick. The good news is that it's quick and easy to scan a document, save it, and send it wherever it needs to go. Keep in mind that iOS 18 changes the icons you use to select document scanning, which we've noted below; if you've upgraded, the process will be slightly different, but we'll walk you through it. Here's how to scan a document with your iPhone.

Scan a document with your iPhone or iPad

To scan a document with your iPhone or iPad, first place the document on a flat surface in a well-lit area. Open the Notes app and either open an existing note or start a new one by tapping the New Note button in the bottom right corner (pencil-in-square icon). On iOS 17 and older, tap the Camera button at the bottom of the screen (or, if you're editing a note, the same Camera icon above the keyboard) and tap Scan Documents. If you're on iOS 18, instead of a Camera icon, you'll tap the Attachments button (the paperclip icon) and likewise tap Scan Documents. This will open a version of the Camera app that just looks for documents. Once you position your iPhone over the document that needs scanning and in view of the camera, a yellow rectangular layer will automatically appear over the document, showing approximately what will be captured. Hover over the document for a few seconds and the iPhone should automatically capture and scan it, but you can also tap the Shutter button in the bottom center.

Sign, share or save your scanned document

Once you've captured a document, you can tap it, and any others you've captured in the same session, to edit them before saving. You can also tap Retake in the top right corner to start again. When you edit the document, you can re-crop it from the original photo (if you need to tweak its edges) and switch between color filters (color, black and white, grayscale or the unedited original photo). Then you can save the scanned document. Once it's saved into a note, you can tap the Markup button (circled pen icon) at the bottom to sketch or scribble with different colors. If you tap the Add button on the bottom right (plus sign icon), you can add text, your signature, shapes or even stickers. To send or locally save the document, tap the Share button at the top (the square-and-arrow icon) to send it via Messages or other apps, copy it, save it locally to the Files app, print it on a linked printer, or choose other options.

How to export your scanned document as a PDF

Understandably, you may want to send your scanned document as a PDF. Tap the Share button at the top (the square-and-arrow icon) and scroll down below the contact and app roulettes to the additional list of options. The easiest way to send your scanned document as a PDF is a bit convoluted: from that list, tap Print, then tap the Share button at the top (square-and-arrow icon) once more -- this will share your PDF-converted document. Then pick your share method of choice, most easily via email, though you can also upload it to cloud storage or send it via text message if you want. You can also use a third-party app to convert your document to PDF if you so choose. Scroll down past the Print button to find your app of choice. For instance, if you have the Adobe Acrobat app downloaded to your device, you can select Convert to PDF in Acrobat to do so -- though you'll need to wade past several screens attempting to upsell you on Adobe subscriptions first.

Why can't I find the camera button to scan documents?

If you're running iOS 18, the Camera button has been replaced with an Attachments button (paperclip symbol). It should function just the same: tap it and choose Scan Documents from the dropdown menu. If you can't see the Camera or the Attachments button, check whether you've opened the note in either the iCloud section or the On My iPhone section; you'll only be able to scan documents and save them in one of these places. If you can't tell, tap Folders in the top left corner of the Notes screen and select either iCloud or On My iPhone.

The document scanner is just one of many unnoticed iPhone features that come prepackaged in Apple's handsets, often nested in the apps that come with your phone. Some hidden iOS 18 features add even more surprising capabilities to your iPhone. You can also find ways to do other tasks, like making a GIF on your iPhone, using third-party apps and through your browser.
