
I Always Dreamed of Expanding My Desktop With Glasses. This Software Made It Real
Display glasses like Xreal's already work as plug-in displays for lots of devices, showing a virtual monitor that can feel big and TV-like. But they can't do multi-app multitasking, and that's why my recent test-drive of Spacetop's software got me so intrigued. I can see a future forming here, if other software companies figure out how to work better with devices like these glasses.
Spacetop is made by Sightful, a startup I met with several years ago, back when the concept was a display-free laptop: a keyboard base bonded to tethered Xreal glasses that served as the laptop's monitor. That product never shipped. Instead of building custom Chromebook-like laptop bases with Qualcomm processors, the founders pivoted to the AI-focused NPUs in recent thin AI laptops with Intel processors, which Sightful's team says has delivered better performance without requiring a new device to work with glasses.
"The moment we saw [Microsoft's] announcements about AI computers -- that everyone's computers, in the coming few years, are going to be AI computers -- it made perfect sense to say we can enable the audience earlier and faster than if we built our own integrated solution," Sightful's founders, Tamir Berliner and Tomer Kahan, told me when Spacetop transitioned to its new business plan last fall. Instead of a whole new "AR laptop," Spacetop is subscription software that runs on certain Windows laptops and connects with a particular model of Xreal Air 2 Ultra smart glasses.
I can't shoot photos to show what I saw, but think of it as a large curved space where apps from your laptop can be laid out.
Scott Stein/CNET
The experience: A 180-degree floating desktop
What you get, running this software layer, is a curved desktop space that floats in the air, its bounds indicated by small arrays of dots. You can open Windows apps onto it, drag them around and resize them as needed. It feels like a desktop for my laptop, but one that's larger and doesn't need the laptop screen at all. Provided you're OK wearing display glasses, this is the way I'd prefer to work: making my own screens wherever I go and feeling like I have a larger-scale office without needing to prop anything else open.
Spacetop opens up the conversation around what glasses could be doing when connected with our own computers. That's the part that's missing on most phones and laptops and tablets right now.
Xreal's most recent glasses, the Xreal One, can already anchor a curved display in space. Spacetop's software pushes the capabilities further with a deeper handshake with the laptop's software, which manages what apps show up in the glasses. Qualcomm began working on this type of software with Spaces, which ran on Android phones and interfaced with connected glasses, and Google's upcoming Android XR software looks like it could do the same down the road. Apple's Vision Pro, which can run a variety of iPad apps and float them anywhere while simultaneously mirroring a Mac monitor, is a bulky device in comparison, and you need both a Vision Pro and a MacBook to float apps around the way Spacetop's software enables.
Spacetop's rendering isn't exactly how I saw it, but it's close enough to describe the effective experience (the field of view in-glasses is smaller, but you can turn your head to see apps all around you).
Spacetop
You can't do much more than open individual 2D apps, though. That's fine for everyday work, and Spacetop's software is aimed at business subscriptions, for people who might want more work space beyond their laptop screen while on the road. I could see a use for this in meetings, or in situations where you'd want to look at something in the real world while floating windows in the air around you. That might sound bizarre, but I used the Xreal One glasses back in January to take notes on my phone while watching a presentation: my notes app just hovered off to the side of the live speakers in the room with me.
Clever details and awkward moments
Spacetop's little software touches are clever. A little toolbar handles app launching, and a duplicate of your laptop display rests on the bottom of the floating desktop, lining up mostly with the actual laptop display that's open. I found that I could glance around at the open floating windows and then go down to the laptop screen and adjust settings if I needed to without feeling strange. My mouse cursor came along with me, either floating in air or appearing back on the laptop screen again as needed, mostly automatically.
The glasses connect via USB-C cable to one of the laptop's Thunderbolt-enabled ports for video and audio to work.
Scott Stein/CNET
That doesn't mean there aren't quirks: I found the floating displays were sometimes slow to launch, or didn't launch at all, a problem Sightful suggested fixing by unplugging the glasses and plugging them back in.
There's also the glasses' limited field of view to consider. As good as Xreal's glasses are at projecting a quality OLED display in the air, the viewing area is still limited to what feels like a boxed-out rectangle in the middle of your vision. It feels about the same size as a medium-to-large monitor and easily fits a couple of windows (or one large one) into view, but to see the rest of the floating apps you'll need to turn your head to bring the other parts of the curved desktop into view. The Xreal Air 2 Ultra glasses can also dim your surroundings like sunglasses or turn more transparent as needed, and they have their own speakers.
Prescription inserts are needed for me to use these Xreal Air 2 Ultra glasses, adding an extra layer of thickness. But there are adjustable nose pads.
Scott Stein/CNET
A potential future for glasses (but ideally without a subscription)
The Spacetop subscription is $200 a year, on top of a specific $699 pair of Xreal Air 2 Ultra glasses (Sightful sells the glasses and one year of the software together for $899). Sightful needs these particular glasses because they have full room-tracking capabilities built in, which a travel mode can use to keep the floating monitor centered wherever the laptop is. The software also needs to run, for now, on particular Windows AI laptops with Intel NPUs; I tested on an HP EliteBook.
It's hardly something for the average person right now, but it does show me exactly what I really want: ways for my own laptops and tablets and phones to work better with glasses-as-displays. I think it can happen. Microsoft, Google, and Apple are going to have to wake up and play a better part. In the meantime, Sightful's Spacetop is making some things happen on its own.
Related Articles
Yahoo, 4 hours ago
Microsoft (MSFT) Unveils AI-Powered 'Copilot Mode' for Edge Browser
On July 28, Microsoft Corporation (NASDAQ:MSFT) announced the launch of a new 'Copilot Mode' for its Edge browser. Copilot Mode leverages artificial intelligence to improve the browsing experience, helping carry out tasks, organize browsing, and even compare results across tabs without the need to switch between them. According to Microsoft, the feature presents a page with a single input box combining chat, search, and web navigation, and Copilot will also support voice navigation for browsing in Edge. While not available at the moment, users will eventually be able to permit Copilot to access additional browser context, such as history and credentials, so it can take more actions on their behalf. Copilot Mode can also access browsing content when enabled, and provides visual cues so users know when it is active in the background. It will be available at no cost across Copilot markets on Windows and Mac PCs for a limited time. Microsoft Corporation provides AI-powered cloud, productivity, and business solutions, focusing on efficiency, security, and AI advancements.


Time Business News, 7 hours ago
Kmode Exception Not Handled: What It Means and How to Fix It
If you've encountered the dreaded Kmode Exception Not Handled error on your Windows PC, you're not alone. This common Blue Screen of Death (BSOD) can be alarming, especially when your system crashes suddenly and restarts without warning. Fortunately, this error is fixable, and understanding what it means is the first step toward resolving it. In this article, we'll break down what causes the Kmode Exception Not Handled error, how to diagnose the issue, and several methods to fix it.

What the error means

The Kmode Exception Not Handled error is a system-level problem in Windows, usually triggered when a kernel-mode program generates an exception that the error handler fails to catch. In simple terms, it's a crash caused by a driver or a piece of software trying to access restricted or corrupted memory. When this happens, Windows displays a blue screen with the error message, sometimes along with a specific driver file name, which gives clues about what's causing the issue.

Common causes

Several underlying issues can trigger this error. The most common are:

Faulty or outdated drivers: device drivers, especially for network cards, graphics cards, or storage devices, are often responsible.
Corrupt system files: damaged or missing system files can lead to improper handling of kernel exceptions.
Faulty RAM or hardware: memory or hardware failures can also trigger the error.
Incompatible software: new programs or recently updated software might conflict with system operations.
Overclocking or BIOS issues: aggressive system tuning or outdated BIOS settings can destabilize the system.

Identifying the root cause is key to resolving the error for good. Below are step-by-step solutions; start with the most straightforward options and move to the more technical ones if needed.

1. Update or reinstall drivers

Outdated or corrupt drivers are the leading cause of this error. Press Windows + X and choose Device Manager, then look for any yellow exclamation marks indicating driver issues. Right-click the device and select Update driver. If the problem persists, uninstall the driver and reboot your system; Windows will attempt to reinstall it automatically. If the BSOD message displayed a specific driver file name, focus on updating or replacing that driver.

2. Repair system files

Corrupted system files can cause this error, and Windows includes a tool to check and repair them. Open Command Prompt as Administrator, type sfc /scannow and press Enter, then wait for the process to complete. If any files are fixed, reboot your system.

3. Disable Fast Startup

Fast Startup is a Windows feature that can sometimes cause instability with certain drivers. Go to Control Panel > Power Options > Choose what the power buttons do, click Change settings that are currently unavailable, uncheck Turn on fast startup (recommended), then save changes and restart your PC.

4. Test your RAM

Faulty memory can also result in this BSOD. Open the Windows Memory Diagnostic tool and choose Restart now and check for problems. The system will reboot and scan your memory; if errors are detected, consider replacing the faulty RAM module.

5. Update the BIOS

Outdated BIOS versions can cause compatibility issues with drivers or hardware. Caution: updating the BIOS carries some risk, so be sure to follow the instructions from your motherboard or system manufacturer carefully.

6. Troubleshoot from Safe Mode

If the error prevents Windows from starting normally, you can enter Safe Mode. Interrupt the boot process three times to enter the Windows Recovery Environment, navigate to Troubleshoot > Advanced options > Startup Settings > Restart, then press F4 to boot into Safe Mode. From there, you can uninstall the software or drivers that might be causing the error.

Preventing it from recurring

Once you've resolved the issue, it's important to prevent it from coming back: keep Windows and all drivers up to date, avoid downloading drivers or software from untrusted sources, regularly scan your system for malware, run maintenance tools like Disk Cleanup periodically, and avoid aggressive overclocking or unstable system tuning. If you've tried all the above fixes and the error still occurs, the problem may involve deeper hardware issues or conflicts that require advanced diagnostics; in that case, contacting a technician or visiting an authorized service center is advisable.

The Kmode Exception Not Handled error might be frustrating, but it's usually solvable with a combination of driver updates, system checks, and careful troubleshooting. In most cases, addressing driver issues or disabling Fast Startup resolves the problem quickly. The next time you see this message, don't panic; just follow the steps outlined here and get your system back to normal.
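For readers comfortable with the command line, several of the software-side fixes described above can be run directly from an elevated Command Prompt. This is a sketch using standard Windows commands; the DISM repair is an addition beyond the article's SFC instructions (a common companion step), and all of these require Administrator rights.

```shell
:: Run from an elevated (Administrator) Command Prompt.

:: Scan and repair protected system files (the SFC check described above)
sfc /scannow

:: Repair the Windows component store that SFC draws from
:: (not listed in the article; a common companion fix when SFC fails)
DISM /Online /Cleanup-Image /RestoreHealth

:: Disable Fast Startup by turning off hibernation entirely
:: (Fast Startup depends on hibernation, so this switches it off too)
powercfg /h off

:: Schedule the Windows Memory Diagnostic to run at next reboot
mdsched.exe

:: Restart straight into the Recovery Environment / Startup Settings
:: (an alternative to interrupting the boot process three times)
shutdown /r /o /t 0
```

Note that `powercfg /h off` is broader than unchecking Turn on fast startup in Control Panel: it also removes the hibernation file, so re-enable it later with `powercfg /h on` if you use hibernate.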
Yahoo, 8 hours ago
Zuckerberg Outlines His Vision for the Development of AI 'Superintelligence'
This story was originally published on Social Media Today. Meta CEO Mark Zuckerberg has outlined his view of the development of AI 'superintelligence,' a goal he believes is now in sight as development of the latest AI systems accelerates. In technical terms, at least, these systems are not 'AI' at all, as they are not 'intelligent' as such. The current slate of generative AI tools can crossmatch data patterns to come up with human-like responses based on whatever inputs you give them, but that doesn't equate to thought, and these systems aren't capable of original ideas or human-like thinking. Indeed, Meta's own AI chief Yann LeCun has repeatedly highlighted the limitations of LLMs, noting that while such tools will be useful in a range of applications, they're not likely to lead to artificial general intelligence (AGI), the holy grail of AI development, because such systems have no way of understanding the physical world, nor what their outputs mean in a practical sense. In this respect, the current gen AI models are more akin to calculators than to human-like thinking. And it's that next stage, the possibility of systems that can think for themselves, that Zuckerberg is now aiming for with his new superintelligence team. Which is a little scary, but Zuck has the money and the time, so it's happening whether we like it or not. As Zuckerberg explains: 'I am extremely optimistic that superintelligence will help humanity accelerate our pace of progress. But perhaps even more important is that superintelligence has the potential to begin a new era of personal empowerment where people will have greater agency to improve the world in the directions they choose.'
Zuckerberg says his new superintelligence project aims to 'bring personal superintelligence to everyone,' providing the power of advanced machine learning to everyday applications: 'As profound as the abundance produced by AI may one day be, an even more meaningful impact on our lives will likely come from everyone having a personal superintelligence that helps you achieve your goals, create what you want to see in the world, experience any adventure, be a better friend to those you care about, and grow to become the person you aspire to be.' Some of these are a little concerning, in reflecting Zuck's worldview and the value of such tools. Being a better friend, for instance, seems like it should remain in the realm of purely human experience, but maybe, for people like Zuckerberg, human connection is a key element that true AI can help with. Which may or may not be where we want to be headed. Either way, Zuckerberg has assembled his own Avengers-like team of AI development superstars after gathering up staff from other AI projects. That team will be led by Shengjia Zhao, the co-creator of ChatGPT, and Alexandr Wang, the former CEO of Scale AI, with the two of them now tasked with building a system that can replicate the synapses of the human brain in digital form. That will be some feat, though Meta is likely the frontrunner in this race. For years, Meta has been working on computer systems that can understand more about their environment in order to factor it into their responses. Meta's V-JEPA 2 world model, for example, aims to mimic human understanding of the physical world, while Meta's 'Brain Decoding' process, first previewed in 2023, aims to simulate neuron activity and understand how humans think. Meta even has direct insight into brain activity itself, based on its previous efforts to build a human brain-computer interface.
That project has been in varying levels of exploration since 2017, and while Meta has since stepped back from its initial brain-implant approach, it has been using the same MEG (magnetoencephalography) tracking to map brain activity in its more recent mind-reading projects. So Meta is already well advanced in understanding how the human brain functions and how neurons might be translated to computer chips; now it's looking to cross the next threshold in building systems that can capitalize on this knowledge. Can it be done? Again, Zuck is confident that it can, so much so that he's investing 'hundreds of billions' of dollars into making it a reality. 'Personal superintelligence that knows us deeply, understands our goals, and can help us achieve them will be by far the most useful. Personal devices like glasses that understand our context because they can see what we see, hear what we hear, and interact with us throughout the day will become our primary computing devices.' A world of empowered AI that you can summon at any time via a wearable device: it's either utopia or a disaster, with seemingly little in between. 'The rest of this decade seems likely to be the decisive period for determining the path this technology will take, and whether superintelligence will be a tool for personal empowerment or a force focused on replacing large swaths of society.' Zuckerberg is obviously angling for the former, but really, we have no idea where this path leads, or what might come from Meta's superintelligence push.