
Latest news with #dataquality

When economic data quality deteriorates: Two thoughts for investors

Yahoo

2 days ago

  • Business
  • Yahoo

When economic data quality deteriorates: Two thoughts for investors

A version of this post first appeared on

Policymakers, politicians, business leaders, and investors all use economic data. So, most people agree that any data being cited should be high quality. But everyone relies on data differently depending on their goals and interests, which means the implications of data quality vary depending on who's using it. Today, I'm going to provide some updates on deteriorating data quality and share some thoughts from the stock market investor's perspective.

We got some unsettling news about data quality last week. From the Bureau of Labor Statistics (BLS) on Tuesday:

"Due to minor errors to weights associated with the introduction of a redesigned Current Population Survey (CPS) sample, some April 2025 estimates will be corrected on June 6, 2025. Major labor force measures, such as the unemployment rate, labor force participation rate, and employment–population ratio were unaffected. While corrections will be made to many estimates, the impact is negligible. In April 2025, the CPS began to phase in a redesigned sample that is based on information from the 2020 Census. During the introduction of this new sample in April, a derived geographic variable used in the weighting process was miscoded, treating micropolitan areas like metropolitan areas, which led to misapplied noninterview weights for some cases."

That doesn't instill confidence. And it didn't end there. Here's The Wall Street Journal on a BLS notice published on Wednesday:

"The Bureau of Labor Statistics, the office that publishes the inflation rate, told outside economists this week that a hiring freeze at the agency was forcing the survey to cut back on the number of businesses where it checks prices. In last month's inflation report, which examined prices in April, government statisticians had to use a less precise method for guessing price changes more extensively than they did in the past. Economists say the staffing shortage raises questions about the quality of recent and coming inflation reports. There is no sign of an intentional effort to publish false or misleading statistics. But any problems with the data could have major implications for the economy."

"These errors have consequences," UBS's Paul Donovan wrote on Friday. "Less understanding of U.S. inflation increases the chances of the Federal Reserve making a policy error (especially with the mantra of 'data dependency')."

The news only adds to ongoing concerns about the quality of government data, which relies on extensive surveys and analysis of those surveys. One of the more discussed concerns in recent years has been the falling response rates to these surveys. The BLS publishes those response rates, and they've mostly been going down and to the right.

It takes a lot of resources to conduct these surveys. But the cost is justified by the value they bring to those making decisions about economic policy, monetary policy, and business. It's not totally clear how much these developments are affecting the accuracy of the data. But they have certainly affected the robustness of the data and the confidence of those using it.

First, remember TKer's rule No. 1 of analyzing the economy: Don't count on the signal of a single metric. Even when the response rates for these BLS surveys were higher, the results were still susceptible to revisions — and sometimes those revisions were significant. And even when the data is accurate, it's possible that the bulk of other data tells a conflicting story that may actually be the correct one.
As always, I'd also caution against reading too much into one month's worth of data. Data can zigzag over short periods. The true picture always becomes clearer when you zoom out and examine trends, not single data points. "For investors, it is important to remember that broad trends matter, and data precision is increasingly an illusion," Donovan said.

This is why, when analyzing the economy, it's important to consider the confluence of data holistically and over time. (Kind of like how we do in TKer's weekly review of the macro crosscurrents.) It's extremely unlikely that all of the available data will be simultaneously wrong in the same direction over an extended period of time.

Second, the good news is that reported earnings from publicly traded companies are pretty much always accurate. Recall TKer Stock Market Truth No. 5: "News about the economy or policy moves markets to the degree they are expected to impact earnings. Earnings (a.k.a. profits) are why you invest in companies."

What investors really care about are earnings, because they're the most important driver of stock prices. And economic data has mattered because it has helped us calibrate our expectations for those earnings.

Every quarter, publicly traded companies report their earnings along with comprehensive financial statements. This information is not deduced from a sample like what we get in economic surveys. These quarterly statements cover all of the financial transactions that are executed, and the numbers are audited by third-party accountants. Outside of very rare occasions (e.g., accounting fraud, major failure of internal processes), these numbers are accurate and do not get revised. So regardless of the accuracy of the economic data, what matters to investors is whether companies are delivering on earnings.

To that second point, I like to think of quarterly earnings season as a time to reset and recalibrate my views as an investor in the stock market. And it's not just because the reported financial figures are complete and accurate. We also learn how successfully companies have been able to adapt and execute in what may arguably be a difficult business environment as defined by the economic data.

This is not to suggest we should be dismissive of economic data. Rather, we should just be mindful of what "hat" we're wearing as we consider data. When we're wearing our stock market hat, economic data matters to the degree it's expected to impact earnings.

To be clear, deteriorating economic data quality is a negative development for investors. While investors have the benefit of getting audited financial figures every quarter, the companies they invest in are affected by decisions made by policymakers. If policymakers are acting on bad data, their decisions may create inefficiencies in the economy and hinder business activity. Everyone should be in favor of preserving and improving the quality of economic data, especially when that data is informing policy decisions.

There were several notable data points and macroeconomic developments since our last review:

👍 The labor market continues to add jobs. According to the BLS's Employment Situation report released Friday, U.S. employers added 139,000 jobs in May. The report reflected the 53rd straight month of gains, reaffirming an economy with growing demand for labor. Total payroll employment is at a record 159.6 million jobs, up 7.3 million from the prepandemic high.
The unemployment rate — that is, the number of workers who identify as unemployed as a percentage of the civilian labor force — stood at 4.2% during the month. While it continues to hover near 50-year lows, the metric is near its highest level since November 2021. While the major metrics continue to reflect job growth and low unemployment, the labor market isn't as hot as it used to be. For more on the labor market, read: 💼 and 📉

💸 Wage growth ticks higher. Average hourly earnings rose by 0.4% month-over-month in May, up from the 0.2% pace in April. On a year-over-year basis, this metric is up 3.9%. For more on why policymakers are watching wage growth, read: 📈

💼 Job openings tick higher. According to the BLS's Job Openings and Labor Turnover Survey, employers had 7.39 million job openings in April, up from 7.20 million in March. During the period, there were 7.17 million unemployed people — meaning there were 1.03 job openings per unemployed person. This continues to be one of the more obvious signs of excess demand for labor. However, this metric has returned to prepandemic levels. For more on job openings, read: 🤨 and 📈

👍 Layoffs remain depressed, hiring remains firm. Employers laid off 1.79 million people in April. While challenging for all those affected, this figure represents just 1.1% of total employment. This metric remains below prepandemic levels. For more on layoffs, read: 📊

Hiring activity continues to be much higher than layoff activity. During the month, employers hired 5.57 million people. That said, the hiring rate — the number of hires as a percentage of the employed workforce — has been trending lower, which could be a sign of trouble to come in the labor market. For more on why this metric matters, read: 🧩

🤔 People are quitting less. In April, 3.19 million workers quit their jobs. This represents 2.0% of the workforce. While the rate is above recent lows, it continues to trend below prepandemic levels. A low quits rate could mean a number of things: more people are satisfied with their jobs; workers have fewer outside job opportunities; wage growth is cooling; productivity will improve as fewer people are entering new, unfamiliar roles. For more, read: ⚙️

📈 Job switchers still get better pay. According to ADP, which tracks private payrolls and employs a different methodology than the BLS, annual pay growth in May for people who changed jobs was up 7% from a year ago. For those who stayed at their job, pay growth was 4.5%. For more on why policymakers are watching wage growth, read: 📈

💼 Unemployment claims tick higher. Initial claims for unemployment benefits rose to 247,000 during the week ending May 31, up from 239,000 the week prior. This metric continues to be at levels historically associated with economic growth. For more context, read: 🏛️ and 💼

🏭 Business investment activity declines. Orders for nondefense capital goods excluding aircraft — a.k.a. core capex or business investment — declined 1.3% to $74.7 billion in April. Core capex orders are a leading indicator, meaning they foretell economic activity down the road. The recent decline could portend slowing growth in the months to come. For more on core capex, read: ⚠️

🤷🏻‍♂️ Services surveys were mixed. From S&P Global's May Services PMI: "Service sector growth has improved more than first estimated in May, with confidence about the year ahead also lifting higher, buoyed in part due to pauses on higher rate tariffs. Companies have matched that optimism with increased spending and hiring.
That said, the improvements come from a low base, following a very gloomy April, which saw growth nearly stall as confidence sank to a two-and-a-half-year low. Reports from companies underscore how uncertainty about the policy outlook continued to act as a deterrent to expansion plans in May." The ISM's May Services PMI reflected contraction in the sector.

👎 Manufacturing surveys weren't great. From S&P Global's May Manufacturing PMI (emphasis added): "The rise in the PMI during May masks worrying developments under the hood of the US manufacturing economy. While growth of new orders picked up and suppliers were reportedly busier as companies built up their inventory levels at an unprecedented rate, the common theme was a temporary surge in demand as manufacturers and their customers worry about supply issues and rising prices. These concerns were not without basis: supplier delays have risen to the highest since October 2022, and incidences of price hikes are at their highest since November 2022, blamed in most cases on tariffs. Smaller firms, and those in consumer-facing markets, appear worst hit so far by the impact of tariffs on supply and prices." The ISM's May Manufacturing PMI reflected further contraction in the sector.

Keep in mind that during times of perceived stress, soft survey data tends to be more exaggerated than actual hard data. For more on soft sentiment data, read: 📊 and 🙊

🔨 Construction spending ticks lower. Construction spending decreased 0.4% to an annual rate of $2.152 trillion in April.

💳 Card spending data is holding up. From JPMorgan: "As of 30 May 2025, our Chase Consumer Card spending data (unadjusted) was 1.2% above the same day last year. Based on the Chase Consumer Card data through 30 May 2025, our estimate of the US Census May control measure of retail sales m/m is 0.45%." From BofA: "Total card spending per HH was up 0.5% y/y in the week ending May 31, according to BAC aggregated credit & debit card data. Relative to last week, airlines & transit saw the biggest rise in spending growth. Furniture saw the biggest decline." For more on consumer spending, read: 😵‍💫 and 🛍️

⛽️ Gas prices tick lower. From AAA: "The summer driving season is underway, and while gas prices normally peak this time of year, drivers are getting a reprieve. The national average for a gallon of regular is $3.14, down two cents from last week. Pump prices are 36 cents cheaper than last June, thanks to this year's consistently low crude oil prices. Currently, oil supply in the market is outweighing demand. June gas prices haven't been this low since 2021." For more on energy prices, read: 🛢️

🏠 Mortgage rates tick lower. According to Freddie Mac, the average 30-year fixed-rate mortgage declined to 6.85%, down from 6.89% last week. From Freddie Mac: "The average mortgage rate decreased this week, which is welcome news to potential homebuyers who also are seeing inventory improve and house price growth slow."

There are 147.8 million housing units in the U.S., of which 86.1 million are owner-occupied; about 34.1 million of those are mortgage-free. Of those carrying mortgage debt, almost all have fixed-rate mortgages, and most of those mortgages have rates that were locked in before rates surged from 2021 lows. All of this is to say: Most homeowners are not particularly sensitive to movements in home prices or mortgage rates. For more on mortgages and home prices, read: 😖

🏢 Offices remain relatively empty.
From Kastle Systems: "Peak day office occupancy was 60.3% on Wednesday last week, as many workers extended the three-day holiday weekend. Occupancy on Tuesday after Memorial Day was 58.8%, down 3.4 points from the previous week. Washington, D.C. had the biggest drop around the holiday, falling 5.8 points to 30.3% on Friday and 4.5 points to 57.7% on Tuesday. The average low was on Friday at 30.6%, down 4.2 points from the previous week." For more on office occupancy, read: 🏢

📈 Near-term GDP growth estimates are tracking positive. The Atlanta Fed's GDPNow model sees real GDP growth rising at a 3.8% rate in Q2. For more on GDP and the economy, read: 📉 and 🤨

🚨 The Trump administration's view on tariffs threatens to disrupt global trade — with significant implications for the U.S. economy, corporate earnings, and the stock market. Until we get more clarity, here's where things stand:

Earnings look bullish: The long-term outlook for the stock market remains favorable, bolstered by expectations for years of earnings growth. And earnings are the most important driver of stock prices.

Demand is positive: Demand for goods and services remains positive, supported by healthy consumer and business balance sheets. Job creation, while cooling, also remains positive, and the Federal Reserve — having resolved the inflation crisis — has shifted its focus toward supporting the labor market.

But growth is cooling: While the economy remains healthy, growth has normalized from much hotter levels earlier in the cycle. The economy is less "coiled" these days as major tailwinds like excess job openings and core capex orders have faded. It has become harder to argue that growth is destiny.

Actions speak louder than words: We are in an odd period given that the hard economic data has decoupled from the soft sentiment-oriented data. Consumer and business sentiment has been relatively poor, even as tangible consumer and business activity continue to grow and trend at record levels. From an investor's perspective, what matters is that the hard economic data continues to hold up.

Stocks are not the economy: Analysts expect the U.S. stock market could outperform the U.S. economy, thanks largely to positive operating leverage. Since the pandemic, companies have adjusted their cost structures aggressively. This has come with strategic layoffs and investment in new equipment, including hardware powered by AI. These moves are resulting in positive operating leverage, which means a modest amount of sales growth — in the cooling economy — is translating to robust earnings growth.

Mind the ever-present risks: Of course, this does not mean we should get complacent. There will always be risks to worry about — such as U.S. political uncertainty, geopolitical turmoil, energy price volatility, cyber attacks, etc. There are also the dreaded unknowns. Any of these risks can flare up and spark short-term volatility in the markets.

Investing is never a smooth ride: There's also the harsh reality that economic recessions and bear markets are developments that all long-term investors should expect to experience as they build wealth in the markets. Always keep your stock market seat belts fastened.

Think long-term: For now, there's no reason to believe there'll be a challenge that the economy and the markets won't be able to overcome over time. The long game remains undefeated, and it's a streak long-term investors can expect to continue.

A version of this post first appeared on

UK inflation number for April too high after data blunder

BBC News

5 days ago

  • Business
  • BBC News

UK inflation number for April too high after data blunder

The UK's statistics agency has said the headline inflation rate for April was too high after it discovered it had been given incorrect road tax data by the Department for Transport.

The Office for National Statistics (ONS) said the pace of general price rises should have been 3.4%, instead of the 3.5% it had announced.

It comes as the ONS faces a crisis of confidence in its work after concerns about the quality of its data. Those concerns make it more difficult for the government and companies to make fully informed decisions about the UK economy.

The ONS said it had spotted an error in Vehicle Excise Duty data. It found that the number of vehicles people were paying road tax on in the first year of registration was too high in the data that was provided.

The statistics agency said it would not be amending April's inflation figure, in line with a policy that it only carries out revisions in exceptional circumstances. But it did say it would be reviewing how it checks the quality of data from outside the agency "in light of this issue".

Both the Consumer Prices Index (CPI) and Retail Prices Index (RPI) inflation figures were 0.1 of a percentage point too high for the year to April.

Last month, the ONS's former head Sir Ian Diamond resigned with immediate effect due to health issues.

In April, the UK's Office for Statistics Regulation had set out its concerns about the quality of the data provided by the ONS. The concerns focused on, but were not limited to, widely recognised problems with the Labour Force Survey, which is used to measure the unemployment rate in the UK.

Since the pandemic, statistics agencies around the world have struggled to get good enough response rates to ensure their data is of the quality they would like. The regulator said it would like more assurance that the ONS had sufficient steps in place to regularly review and improve sample design and representativeness, tackling bias, survey methodology, and imputation.

Clean, Connected Data Is The Foundation Of AI—iPaaS Makes It Possible

Forbes

27-05-2025

  • Business
  • Forbes

Clean, Connected Data Is The Foundation Of AI—iPaaS Makes It Possible

Businesses need connected, clean and contextual data to unlock AI's true power. iPaaS makes that possible. It cuts through complexity to lay the foundation for AI success.

"If we don't address the chaos of disconnected systems now, the AI revolution could slip through our fingers before we even begin to harness its power," writes Steve Lucas in his new book, Digital Impact: The Human Element of AI-Driven Transformation.

"Chaos" isn't an overstatement. The average large organization runs hundreds of cloud applications, resulting in thousands of data sources spread across on-premise and cloud locations. Most of these apps and data sources are disconnected. Layering AI on top of this fragmented and siloed digital infrastructure won't produce good results; it only adds more chaos. Simply put, poor data quality makes AI useless.

Organizations that don't solve their data problems to prepare for AI are risking irrelevance, Lucas warns. "Businesses with disconnected systems will not survive the coming change," he writes in Digital Impact. "You can have the shiniest, fanciest AI model at the heart of your business, but AI won't understand your business if your systems aren't integrated and processes aren't automated."

To properly harness AI's power, companies need connected, quality, real-time, context-rich data. Yet digital transformation projects that completely overhaul a company's apps and data systems are extremely time-consuming, expensive and disruptive. Lucas argues that integration platform-as-a-service (iPaaS) provides an elegant solution to these data woes. In addition to being a national best-selling author, Lucas is also chairman and CEO of Boomi, which launched the cloud-based iPaaS category nearly two decades ago.

iPaaS is a cloud-based solution that connects and automates the flow of data across an organization's software systems. "iPaaS is like a centralized hub of data activity that allows companies to connect systems and manage data with minimal custom development; it serves as a foundational platform for integrating AI with structured enterprise data," says Andy Park, co-founder of Team Central, a no-code iPaaS tool. Team Central was developed at my company, Centric Consulting, and later spun out into its own organization.

Until recently, iPaaS has flown somewhat under the radar as a tool for helping companies get AI-ready, and market demand is poised for growth, say both Lucas and Park. Even though many companies have complex, thorny data issues to tackle before they can fully leverage AI, iPaaS is well suited for the challenge. It can solve the main roadblocks companies face in getting their data ready for AI. Lucas refers to iPaaS as a "Swiss Army knife" for data access and orchestration.

For one, iPaaS eliminates data silos. It connects all the data sources across an enterprise. This data connection is the foundation of AI success. "Every app, every API, every piece of enterprise knowledge has to be accessible to AI," Lucas says in an interview. He compares an organization's data to the human nervous system, and iPaaS to the tool that connects all the neural pathways. Just as a human brain can't function with faulty neural pathways, AI can't work without clear and easy access to all the information across an organization.

Furthermore, iPaaS helps companies maintain clean, consistent data. It syncs across all the systems in an organization, so whenever an employee or customer updates data in one place, iPaaS automatically updates it everywhere else.
For example, if a customer updates their email in an online store, iPaaS makes sure the information is also changed in your marketing email and shipping platforms. This way, employees across the entire organization always have the most up-to-date information.

iPaaS also understands the context for an organization's data, which Lucas says is especially crucial for AI agents to function. "For example, if I asked an agent for my sales forecast, the agent must know where the information lives and understand how you think about forecasting. That's contextual metadata. iPaaS has the location and the context, not just the data itself," he says.

Finally, because iPaaS automatically maintains clean, connected, context-rich data, analytics are not only accurate but easy to access, helping companies achieve informed, agile decision-making, an essential capability as the pace of new technology accelerates.

Perhaps the biggest advantage of iPaaS is that it allows companies to begin unlocking AI benefits without embarking on a full-scale digital transformation project. Or iPaaS can serve as a first step toward digital transformation, allowing a company to begin modernizing and achieve some quick wins without going through a giant overhaul or a rip-and-replace transformation.

Many organizations have put off cleaning up their data and modernizing their tech stack for a long time, because leaders know it's a complicated, expensive, time-consuming and risky process, especially for organizations carrying heavy tech debt. "Companies know they have all this technology that's been cobbled together with duct tape and chicken wire over the last few decades," Park says. "Cleaning up data and implementing automation technology to programmatically govern data quality has been long overdue, but it's an intimidating prospect. By leveraging iPaaS, businesses can start preparing their data for AI without the need for a complete modernization from the ground up."

iPaaS offers a practical alternative to full-scale digital transformation because the technology is low- or no-code and relatively easy to implement. iPaaS is a cost-effective and low-risk solution because it requires little to no custom development. Its lightweight nature, especially when compared to other platforms, makes it a practical choice for businesses looking to streamline integration without heavy IT investment.

"A more traditional way of integrating might take 10,000 lines of code; there are zero lines of code in our solution," Park says. "So the amount of time savings and the amount of complexity in the solution is dramatically reduced. You don't have to create a new framework or structure to manage security, accessibility and logic. iPaaS provides a pre-built, scalable infrastructure that reduces this complexity so you can focus more on proper data mapping, business rules and data transformation."

Businesses need connected, clean and contextual data to unlock AI's true power. iPaaS makes that possible. It cuts through complexity to lay the foundation for AI success.
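
To make the hub-and-spoke sync pattern described above concrete, here is a minimal, hypothetical Python sketch. It is not the API of Boomi, Team Central, or any real iPaaS product; every class and function name is an illustrative assumption, showing only how a single update can fan out to every connected system.

```python
# Hypothetical sketch of the hub-and-spoke sync pattern described in the
# article. No real iPaaS product's API is used; names are illustrative.

from dataclasses import dataclass, field
from typing import Callable

@dataclass
class SyncHub:
    """Central hub: systems register a callback and receive others' updates."""
    connectors: dict[str, Callable[[str, dict], None]] = field(default_factory=dict)

    def register(self, name: str, apply_update: Callable[[str, dict], None]) -> None:
        self.connectors[name] = apply_update

    def publish(self, source: str, record_id: str, changes: dict) -> None:
        # Fan the change out to every connected system except the source,
        # so each platform ends up holding the same, current record.
        for name, apply_update in self.connectors.items():
            if name != source:
                apply_update(record_id, changes)

# Hypothetical downstream systems: each just prints what it would persist.
def marketing_update(record_id: str, changes: dict) -> None:
    print(f"[marketing] customer {record_id} -> {changes}")

def shipping_update(record_id: str, changes: dict) -> None:
    print(f"[shipping]  customer {record_id} -> {changes}")

hub = SyncHub()
hub.register("store", lambda rid, ch: None)  # the online store itself
hub.register("marketing", marketing_update)
hub.register("shipping", shipping_update)

# A customer changes their email in the online store; the hub syncs the rest.
hub.publish(source="store", record_id="cust-42", changes={"email": "new@example.com"})
```

Real iPaaS platforms add connectors, mapping, retries and governance on top of this idea, but the core design choice is the same: systems talk to one hub rather than to each other pairwise, so an update entered once propagates everywhere.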

Checkup Time: The Lurking AI Danger That Can Kill A Successful Business

Forbes

27-05-2025

  • Business
  • Forbes

Checkup Time: The Lurking AI Danger That Can Kill A Successful Business

Businesses need quality data. If they operate on the wrong data, their tools can guide them in the wrong direction.

When's the last time your organization had a thorough data checkup? Not just a quick scan, but a deeper, more thorough search for hidden problems that could lead to disaster? The chances are it's been far too long. Surveys have found that the vast majority of businesses rely on stale data, which ushers in a host of hazards. "Failing to update data regularly can have significant consequences for businesses, including targeting the wrong contacts, making decisions based on outdated information, and exposing the organization to unnecessary risks," Moody's warns.

While these dangers have existed for decades, they've gotten far bigger in the last couple of years. Businesses are rapidly adopting and deploying AI-powered technologies that offer tremendous benefits, like vastly improved CX (customer experience). But if they operate on the wrong data, these same tools can guide an organization in the wrong direction. Just as AI can attract customers, it can also repel them. As Nextiva explained in its research report The Leader's Guide to CX Trends in 2025: "AI can predict what customers need, assist during live conversations, and save time through automation without making reps feel marginalized. However, poorly implemented AI will frustrate customers and drive them away."

Data hygiene is key

One of the biggest mistakes companies make in implementing AI is poor data hygiene. "AI learns from the data it's fed, and the results can be damaging if that data is outdated, inconsistent, or biased," Nextiva explained in a blog post. "Bad data can lead to inaccurate predictions, skewed insights, and poor personalization. For example, if your CRM (customer relationship management tool) includes duplicate customer records or missing information, an AI-driven email campaign could send irrelevant messages, or worse, the same message twice."

Today's consumers want to feel that brands recognize and respect them. They want interactions to be personalized. Irrelevant or duplicate messages send the exact opposite signal. Working with organizations across a wide array of industries, I see this happen all the time. And these days, data decays (becomes obsolete) more quickly than ever, given the rapid pace of change in both business operations and consumer habits. People constantly update what they want or need, and their expectations keep climbing higher and higher.

Sending them the wrong messages is just one way your business can go wrong. Recent research highlights other pitfalls as well. For example, a study on financial risk management found that poor data hygiene "can expose firms to legal consequences and damage customer trust." Also, independent researcher Anshul Vyas wrote, "As AI models increasingly demand real-time or streaming data for predictions, the pressure on IT infrastructure, cloud security, and cyber resilience intensifies."

Then there's the environmental impact. "Storing and processing data has an environmental cost, so 'data hygiene' is gaining importance," researchers Sergiu-Alexandru Ionescu, Vlad Diaconita, and Andreea-Oana Radu of the Bucharest University of Economic Studies wrote in a recent study. This problem is especially big for financial institutions, which are "notorious for hoarding data due to regulatory needs (e.g., storing years of transaction records) and analytical ambitions," they added.
These concerns are likely to grow as businesses put increased focus on climate efforts. In addition to tracking their carbon footprint, some are now looking at their "nature footprint," which the World Economic Forum describes as "a holistic understanding of a company's impacts and dependencies on nature."

How to get a clean bill of data health

Businesses should use data cleansing tools to automate validation, eliminate duplicate records, purge stale information, and more (a minimal sketch of such a pass follows at the end of this piece). They should also conduct regular audits. It's helpful to assign a data steward as well. By one definition, "A data steward is an information technology employee who controls the quality of data a company gathers and the method it uses to collect it. These stewards are important for a company's data security by creating and enforcing data policies."

Crucially, businesses should combine all their communications into a single AI-powered tool like a unified customer experience management platform (UCXM). These kinds of technologies can automate numerous processes, including data hygiene, while protecting privacy and security.

The importance of data hygiene is a powerful reminder that, even in this era, humans are still in charge. AI tools are packed with potential, but even the best are only as good as the data they run on. With employees overseeing data hygiene, your business can proceed much more confidently into the new world of AI, and trust that you'll maintain a clean bill of health.
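
As promised above, here is a minimal, hypothetical Python sketch of the kind of automated cleansing pass the article describes: validate records, drop duplicates, and purge stale entries. The field names, the email-based duplicate check, and the 180-day staleness threshold are all illustrative assumptions, not a prescription from any tool or vendor mentioned.

```python
# Hypothetical data-hygiene pass: validate, deduplicate, purge stale records.
# Field names and the 180-day threshold are assumptions for illustration.

from datetime import datetime, timedelta

STALE_AFTER = timedelta(days=180)

def clean(records: list[dict], now: datetime) -> list[dict]:
    seen_emails: set[str] = set()
    kept: list[dict] = []
    for rec in records:
        email = (rec.get("email") or "").strip().lower()
        if "@" not in email:                       # validation: malformed contact
            continue
        if email in seen_emails:                   # deduplication
            continue
        if now - rec["updated_at"] > STALE_AFTER:  # purge stale information
            continue
        seen_emails.add(email)
        kept.append(rec)
    return kept

now = datetime(2025, 6, 1)
customers = [
    {"email": "a@example.com", "updated_at": datetime(2025, 5, 20)},
    {"email": "A@example.com", "updated_at": datetime(2025, 4, 1)},   # duplicate
    {"email": "old@example.com", "updated_at": datetime(2024, 1, 5)}, # stale
    {"email": "not-an-email", "updated_at": datetime(2025, 5, 30)},   # invalid
]
print(clean(customers, now))  # only the a@example.com record survives
```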

Sifflet Unveils AI Agents to Redefine Data Observability in the Age of AI

Yahoo

22-05-2025

  • Business
  • Yahoo

Sifflet Unveils AI Agents to Redefine Data Observability in the Age of AI

Sentinel, Sage, and Forge bring memory, reasoning, and intelligent guidance to data teams.

NEW YORK, May 22, 2025 /PRNewswire/ -- Sifflet, the AI-native data observability platform, today shared an early look at their upcoming system of AI agents designed to help modern data teams scale data quality and reliability, reduce incident response times, and stay ahead of complexity. Built to meet the demands of fast-growing data ecosystems and increasingly lean data teams, the new agents extend Sifflet's core observability capabilities with a new layer of intelligence:

  • Sentinel analyzes system metadata to recommend precise monitoring strategies.
  • Sage recalls past incidents, understands lineage, and identifies root causes in seconds.
  • Forge suggests contextual, ready-to-review fixes grounded in historical patterns.

With data volumes doubling year over year and AI workloads moving rapidly into production, data reliability has become a business-critical function and a massive scaling challenge. Sifflet's AI-native approach is already helping customers handle these workloads with existing functionality.

"What impressed us most about Sifflet's AI-native approach is how seamlessly it adapts to our data landscape, without needing constant tuning. The system learns patterns across our workflows and flags what matters, not just what's noisy. It's made our team faster and more focused, especially as we scale analytics across the business," says Simoh-Mohammed Labdoui, Head of Data, Saint-Gobain.

Sifflet's AI agents address the growing challenge and go one step further by replacing manual triage, alert sprawl, and static rule sets with context-aware automation that augments human teams.

"Sifflet's agent-based approach marks a meaningful evolution in data observability," said Sanjeev Mohan, founder of SanjMo and former VP Analyst at Gartner. "Rather than relying on static monitoring, these agents bring memory, reasoning, and automation into the fold, helping teams move from alert fatigue to intelligent, context-aware resolution. It's a strong signal of where the space is headed."

The agentic system is fully embedded in Sifflet's AI-native platform and will soon be available to select customers in private beta. It represents the next phase in Sifflet's mission to make data reliability proactive, accessible, and truly scalable.

About Sifflet

Sifflet is a leading, AI-native data observability platform that delivers data trust at scale for analytics and AI initiatives by bridging the gap between technical and business users. Global customers like Penguin Random House, Carrefour, Auchan, CMA CGM and Adaptavist rely on Sifflet to reach and maintain the high levels of data quality and reliability needed for business-critical analytics and AI. Sifflet is a best-in-class data observability solution for the entire organization, ranked Best Estimated ROI and Fastest Implementation by G2 users. Learn more about what's keeping you from analytics & AI-ready data at

Media Contact:
Romain Doutriaux
Head of Marketing
romain@
