
Orbs Perpetual Hub Integrates With PriveX to Support Privacy-First Perps Trading on COTI
The expansion of Orbs' omnichain Perpetual Hub to COTI's Layer-2 will significantly enhance user experience and capital efficiency for PriveX traders. Combining Orbs' advanced trading infrastructure with COTI's robust privacy features will ensure a secure and efficient perpetual trading experience.
The integration enables intent-based perpetual futures trading enhanced by Orbs' aggregated liquidity solution. By leveraging COTI's implementation of Garbled Circuits, PriveX ensures transaction confidentiality, protecting traders from front-running and Miner Extractable Value (MEV) exploits.
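To make the Garbled Circuits idea more concrete, here is a deliberately simplified toy in Python: a single garbled AND gate with hash-derived pads and no oblivious transfer. It illustrates the general technique only and is not COTI's actual construction; every name below is illustrative.

```python
import hashlib
import secrets

def _enc(key_a: bytes, key_b: bytes, message: bytes) -> bytes:
    """XOR a 32-byte message with a pad derived from both wire labels."""
    pad = hashlib.sha256(key_a + key_b).digest()
    return bytes(p ^ m for p, m in zip(pad, message))

def garble_and_gate():
    """Garbler: pick two random 32-byte labels per wire (one per bit) and
    encrypt each output label under the pair of input labels that selects it."""
    labels = {w: [secrets.token_bytes(32), secrets.token_bytes(32)] for w in ("a", "b", "out")}
    table = [
        _enc(labels["a"][bit_a], labels["b"][bit_b], labels["out"][bit_a & bit_b])
        for bit_a in (0, 1)
        for bit_b in (0, 1)
    ]
    secrets.SystemRandom().shuffle(table)  # hide which row belongs to which input pair
    return labels, table

def evaluate(table, label_a: bytes, label_b: bytes, out_labels) -> int:
    """Evaluator: holding exactly one label per input wire, try every row; only
    the matching row decrypts to a valid output label. Checking membership in
    out_labels stands in for the garbler's output-decoding information."""
    for row in table:
        candidate = _enc(label_a, label_b, row)  # XOR with the same pad inverts _enc
        if candidate in out_labels:
            return out_labels.index(candidate)
    raise ValueError("no row decrypted to a valid output label")

labels, table = garble_and_gate()
bit_a, bit_b = 1, 0  # each party's private input bit
print(evaluate(table, labels["a"][bit_a], labels["b"][bit_b], labels["out"]))  # 0 == AND(1, 0)
```

In a full protocol the same principle scales to whole circuits, and input labels are delivered via oblivious transfer so that neither party's input bits are revealed.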
Orbs' Perpetual Hub supports over 300 trading pairs with up to 60x leverage, automated liquidations, and deep liquidity routing for superior capital efficiency. Its addition to PriveX provides a CEX-like trading experience within a fully decentralized, privacy-first environment.
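As a rough illustration of how leverage translates into a liquidation threshold, the sketch below uses generic isolated-margin arithmetic; the 0.5% maintenance-margin rate is an assumed placeholder and none of this reflects Perpetual Hub's actual risk parameters.

```python
def liquidation_price(entry_price: float, leverage: float, is_long: bool,
                      maintenance_margin_rate: float = 0.005) -> float:
    """Approximate liquidation price for an isolated-margin perp position.

    Initial margin is 1/leverage of notional, so the position is liquidated
    after an adverse move of roughly (1/leverage - maintenance_margin_rate).
    """
    buffer = 1.0 / leverage - maintenance_margin_rate
    return entry_price * (1.0 - buffer) if is_long else entry_price * (1.0 + buffer)

# A 60x long opened at $3,000 is liquidated after roughly a 1.2% drop.
print(round(liquidation_price(3000.0, 60.0, is_long=True), 2))  # ≈ 2965.0
```

The higher the leverage, the thinner that buffer, which is why automated liquidations and deep liquidity routing matter at 60x.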
Orbs Chief Business Officer Ran Hammer said: 'We're thrilled to partner with PriveX to optimize privacy-focused perpetual trading. This integration showcases the power of Orbs' Layer-3 infrastructure in enabling more efficient onchain trading experiences that rival anything centralized platforms can offer.'
PriveX is designed to overcome one of the major shortcomings of traditional DEXs, which publicly expose every action before order execution. Its architecture keeps order-level behavior hidden, protecting trading strategies and reducing slippage. This allows users to trade hundreds of assets with institutional-grade execution and partial onchain privacy.
Using PriveX, traders specify desired outcomes, and a backend solver network, now enhanced by Orbs' Perpetual Hub, sources optimal liquidity to fulfill those intents. PriveX's combination of advanced AI tools and deep liquidity, coupled with robust privacy features, has the potential to redefine the onchain perps landscape.
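The intent-then-solve flow can be pictured roughly as follows. The Intent and Quote shapes and the settle helper are hypothetical illustrations of the pattern, not PriveX's or Orbs' actual interfaces.

```python
from dataclasses import dataclass

@dataclass
class Intent:
    """What the trader wants, not how to route it."""
    market: str       # e.g. "ETH-PERP"
    side: str         # "long" or "short"
    size: float       # contracts to open
    max_price: float  # worst acceptable entry price for a long

@dataclass
class Quote:
    solver: str
    price: float      # entry price this solver can fill at

def settle(intent: Intent, quotes: list[Quote]) -> Quote | None:
    """Pick the best executable quote for the intent, or None if none qualify."""
    viable = [q for q in quotes if q.price <= intent.max_price]
    return min(viable, key=lambda q: q.price, default=None)

intent = Intent(market="ETH-PERP", side="long", size=2.0, max_price=3010.0)
quotes = [Quote("solver-a", 3012.5), Quote("solver-b", 3004.2), Quote("solver-c", 3006.9)]
print(settle(intent, quotes))  # Quote(solver='solver-b', price=3004.2)
```

The trader only ever states the outcome; competition among solvers, backed by aggregated liquidity, is what produces the fill.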
Orbs is a decentralized Layer-3 (L3) blockchain designed specifically for advanced onchain trading. Using Proof-of-Stake consensus, Orbs acts as a supplementary execution layer, facilitating complex logic and scripts beyond the native functionality of smart contracts. Orbs-powered protocols such as dLIMIT, dTWAP, Liquidity Hub, and Perpetual Hub push the boundaries of DeFi and smart contract technology, bringing CeFi-level execution to onchain trading.
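As an example of what an execution protocol in this family does conceptually, the sketch below splits a large order into evenly spaced slices, the classic TWAP pattern; it is a generic illustration rather than dTWAP's onchain logic.

```python
from datetime import datetime, timedelta

def twap_schedule(total_size: float, start: datetime, duration: timedelta, chunks: int):
    """Split a large order into equal slices spaced evenly over the window,
    reducing price impact compared with executing the full size at once."""
    interval = duration / chunks
    slice_size = total_size / chunks
    return [(start + i * interval, slice_size) for i in range(chunks)]

# Six 20-unit slices, one every 10 minutes.
for when, size in twap_schedule(120.0, datetime(2025, 1, 1, 12, 0), timedelta(hours=1), 6):
    print(when.time(), size)
```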
Learn more: https://www.orbs.com/
About PriveX
PriveX is a next-gen DeFAI platform built at the intersection of AI, privacy, and intent-based execution. It enables users to deploy autonomous trading agents and trade on the integrated DEX with CEX-grade liquidity – all fully private and permissionless. Powered by COTI's privacy layer and integrated with Symmio's advanced settlement infrastructure, PriveX creates a seamless environment where strategies are tokenized, agents act independently, and users retain full control. It's the foundation for a new era of intelligent, composable, and censorship-resistant finance.