
Wellington man gets shock $16,000 bill after using a Google AI-ready tool, Meridian culls 53 jobs as it offshores billing
Wellington tech contractor Drew Broadley says he racked up a surprise $15,760 bill during May using Google BigQuery, an online tool that's part of the Google Cloud Platform (GCP).
The tech giant describes BigQuery as 'a fully managed, AI-ready data platform for managing and analysing data'.
Broadley says it's

Related Articles




Techday NZ
3 days ago
Snowflake unveils AI tools for data analytics, migration
Snowflake has revealed new artificial intelligence (AI) products aimed at simplifying data analytics and accelerating migration from legacy systems for organisations across Canada and globally. The company announced its expansion of enterprise-grade AI with the introduction of Cortex AISQL and SnowConvert AI, designed to enable customers to extract insights from diverse data types while reducing operational costs and complexity.

Cortex AISQL incorporates generative AI directly into database queries, allowing teams to analyse and act on multiple kinds of data—ranging from structured numbers to text, images and audio—while using the SQL syntax familiar to data professionals. The solution is intended to bring what Snowflake describes as "industry-leading performance and up to 60% cost savings when filtering or joining data." Organisations such as Hex, Sigma, and TS Imagine are among those already leveraging the capabilities of Cortex AISQL.

SnowConvert AI addresses a common challenge faced by enterprises: moving data from existing warehouse and analytics platforms to modern systems. The tool uses AI automation to ease migrations from providers such as Oracle, Teradata, and Google BigQuery, reducing the need for manual re-coding and lowering the risks typically associated with large-scale data projects.

"Every organization recognizes the potential of AI. But too often, harnessing AI means overcoming complex infrastructure, performance limitations, high costs, and a reliance on engineers to build custom pipelines. We're removing those barriers, whether it's enabling anyone to analyze and act on all their data with Cortex AISQL or accelerating migrations off legacy systems through SnowConvert AI. By empowering teams to move faster, work smarter, and turn data into real impact, we're reimagining analytics for the AI era," Carl Perry, Head of Analytics at Snowflake, commented on the new developments.

"In capital markets, speed and precision are everything. For years, SQL has been the gold standard for transforming data — and now, with Cortex AISQL, we're extending that power to unstructured text. With AISQL, our teams can analyze documents, extract insights, and build intelligence directly in the language they already know — all without complex engineering workflows. It's a game-changer for how fast we can respond to markets and deliver value to clients, while leveraging the Snowflake architecture for high performing SQL processing," Thomas Bodenski, Chief Operating Officer of TS Imagine, said, sharing his perspective as a customer.

Snowflake's Cortex AISQL harnesses generative AI—powered by models from providers such as Anthropic, Meta, Mistral, and OpenAI—to introduce advanced query functions into standard SQL. The result is that data analysts can use AI-powered functionalities within the security perimeter of the existing data cloud, without requiring specialist coding or external tools. According to Snowflake, ongoing performance optimisations have demonstrated between 30% and 70% improvements depending on the dataset, and up to 60% cost savings for certain operations.

The integration enables organisations to break down traditional data silos. For example, analysts can merge structured customer data with unstructured data like chat transcripts, images, or social media content, and perform tasks including image classification, call transcript analysis and anomaly detection entirely through SQL queries.

SnowConvert AI, meanwhile, is designed to make IT infrastructure upgrades more efficient, automating code conversion, report migration and data validation to streamline the transition to new platforms. By accelerating the code conversion and testing phases by two to three times, the tool is aimed at reducing the overall timeline and resource demands associated with digital transformation.
Alongside these tools, Snowflake also announced updates to its platform to further support analytics on open source data formats such as Apache Iceberg tables, and launched Standard Warehouse - Generation 2, which introduces hardware and software improvements to boost analytics performance by 2.1 times over previous editions. With the introduction of these AI-powered features, Snowflake is positioning its data cloud as a central platform for enterprises seeking to modernise analytics and data handling capabilities, and to extract actionable business insights from both structured and unstructured sources.
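The core pattern described above, a generative-AI function invoked inline as part of an ordinary query, can be sketched in plain Python. Everything in the snippet is illustrative: the classify() stub stands in for a real model call, and none of the names are Snowflake's actual AISQL functions.

```python
# Illustrative sketch only: an "AI function" applied row by row inside a
# filter, the way AISQL embeds model calls in SQL. classify() is a toy
# stand-in for a generative-AI classifier, not a real Snowflake function.

def classify(text: str) -> str:
    """Toy stand-in for a model-backed classifier of call transcripts."""
    return "complaint" if "refund" in text.lower() else "general"

rows = [
    {"id": 1, "transcript": "I want a refund for this charge"},
    {"id": 2, "transcript": "What are your opening hours?"},
    {"id": 3, "transcript": "Refund still not processed"},
]

# Conceptually: SELECT id FROM rows WHERE classify(transcript) = 'complaint'
complaints = [r["id"] for r in rows if classify(r["transcript"]) == "complaint"]
print(complaints)  # [1, 3]
```

The design point is that the AI call behaves like any other scalar function in the query, so analysts can filter or join on unstructured text without leaving the SQL they already know.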


Techday NZ
4 days ago
Fivetran expands SDK to simplify building custom data connectors
Fivetran has expanded its Connector SDK to enable custom connectors for any data source. The update allows developers to build pipelines connecting even unique or internally developed systems, facilitating the centralisation of company data for analytics, artificial intelligence, and business decision-making.

With the Connector SDK, data teams now have the ability to build secure, reliable pipelines for a range of sources—from various applications and internal APIs to legacy systems. Developers write integration logic in Python, while Fivetran manages infrastructure elements such as deployment, orchestration, scaling, monitoring, and error handling. The process is designed to allow most connectors to be built and deployed within several hours, removing the need for DevOps support or dedicated infrastructure development.

Anjan Kundavaram, Chief Product Officer at Fivetran, discussed the approach companies often take when a prebuilt connector is unavailable. He stated: "When there isn't a prebuilt connector, most teams end up building and maintaining custom pipelines themselves. That DIY approach may seem flexible at first, but it often becomes a long-term burden with hidden costs in reliability, security, and maintenance. The Connector SDK changes that. Now, any engineer can build a custom connector for any source and run it with the same infrastructure, performance, and reliability as Fivetran's native connectors. It gives companies the flexibility they need without the tradeoffs."

The SDK offers the same infrastructure that supports Fivetran's managed connectors, handling automatic retries, monitoring, and alerting to ensure the accurate delivery of data to destinations such as BigQuery, Databricks, Snowflake, and other platforms.

Babacar Seck, Head of Data Integration at Saint-Gobain, shared his perspective on their experience with the Connector SDK. He said: "The SDK was a huge surprise in the best way. We expected to keep using Azure Data Factory for APIs because it was the only option. But once we saw what we could do with Fivetran's Connector SDK, everything changed. We can now build custom connectors in-house and respond to business needs much faster — all while seamlessly delivering data into Snowflake on Azure."

The company noted that the Connector SDK is being demonstrated to the public, with a focus on allowing data engineers to build custom connectors for moving data into cloud destinations tailored for analytics and artificial intelligence workloads. Fivetran is known for working with organisations across various industries, enabling them to centralise data from software-as-a-service applications, databases, files, and additional sources into cloud destinations such as data lakes. The company's approach emphasises high-performance pipelines, security, and interoperability to help organisations enhance or modernise their data infrastructure.
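The division of labour described above, developer-written sync logic in Python with the platform handling orchestration and checkpointing, can be sketched roughly as follows. All names here are illustrative stand-ins, not Fivetran's actual SDK API: update() yields records newer than a saved cursor, and run_sync() simulates the platform's pipeline runner.

```python
# Hypothetical sketch of the connector pattern: the developer writes only
# the source-reading logic; the platform (simulated by run_sync) would
# handle retries, scheduling, and persisting the state between runs.

# Toy in-memory "source system" standing in for an internal API or database.
SOURCE = [
    {"id": 1, "updated_at": "2025-06-01"},
    {"id": 2, "updated_at": "2025-06-02"},
    {"id": 3, "updated_at": "2025-06-03"},
]

def update(configuration: dict, state: dict):
    """Developer-written sync logic: yield rows changed since the cursor."""
    cursor = state.get("cursor", "")
    for row in SOURCE:
        if row["updated_at"] > cursor:
            yield ("upsert", row)                # emit the new/changed record
            state["cursor"] = row["updated_at"]  # advance the sync cursor

def run_sync(state: dict) -> list:
    """Toy stand-in for the platform runner that drives the connector."""
    return [row for _, row in update({}, state)]

state = {"cursor": "2025-06-01"}   # a previous run already saw record 1
synced = run_sync(state)
print([r["id"] for r in synced])   # [2, 3]
print(state["cursor"])             # 2025-06-03
```

The incremental-cursor shape is what makes the pipeline restartable: on the next run, only rows newer than the saved cursor are fetched again.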