
Latest news with #Looker

Qlik Widens Interoperable Data Platform With Open Lakehouse

Forbes

15-05-2025

  • Business
  • Forbes

Qlik Widens Interoperable Data Platform With Open Lakehouse

Software comes in builds. When source code is compiled and combined with its associated libraries into an executable format, a build is ready to run, in basic terms. The construction analogy extends directly to the data architecture that the code is built upon and draws from. Because data architectures today are as diverse as the software application types above them, data integration specialists now have to work across complex data landscapes and remain alert to subsidence, fragilities and leakage. These software and data construction realities drive us towards a point where data integration, data quality control and data analytics start to blend.

Key players in this market include Informatica, SnapLogic, Rivery, Boomi, Fivetran, Tibco, Oracle with its Data Integrator service and Talend, the latter now being part of Qlik. Key differentiators in the data analytics and integration space generally manifest themselves in terms of how complex the platform is to set up and install (Informatica is weighty, and commensurately complex), how flexible the tools are from a customization perspective (Fivetran is fully managed, but less flexible as a result), how natively aligned the service is to the environment it has to run in (no surprise, Microsoft Azure Data Factory is native with Microsoft ecosystem technologies) and how far the data integration and analytics services on offer can be used by less technical businesspeople.

As this vast marketplace also straddles business intelligence, there are wider reputable forces at play here from firms including Salesforce's Tableau, Microsoft's Power BI, Google's Looker and ThoughtSpot, with its easy-to-use natural language data visualizations. Where one vendor will tell us its dashboards are simplicity itself, another will stress how comprehensively end-to-end its technology proposition is. Generally cloud-based and often with a good open source heritage, the data integration, data quality and data analytics space is a noisy but quite happy place.

Looking specifically at Qlik, the company is known for its 'associative' data engine, which offers freeform data analytics that highlight relationships between data sets in non-linear directions without the need for predefined queries. It also offers real-time data pipelines and data analytics dashboards. The organization's central product set includes Qlik Sense, an AI-fuelled data analytics platform service with interactive dashboards that also offers 'guided analytics' to align users towards a standard business process or workflow. QlikView is a business intelligence service with dynamic dashboards and reports, while Qlik Data Integration (the clue is in the name) provides data integration and data quality controls through a web-based user interface that supports both on-premises and cloud deployments.

Qlik champions end-to-end data capabilities, meaning the tools here extend from the raw data ingestion stage all the way through to so-called 'actionable insights' (that term data analytics vendors swoon over), which are now underpinned and augmented by a new set of AI services. The company's AI-enhanced analytics and self-service AI services enable users to build customized AI models, which help identify key drivers and trends in their data.
Not as historically prominent in open source community contribution as some others (although a keen advocate of open data and open APIs, with news of open source Apache Iceberg updates in the wings), Qlik has also been called out for the complexity of its pricing structure. From a wider perspective, the company's associative engine and its more unified approach to both data analytics and data integration (plus its self-service analytics capabilities) are probably the factors that set it apart.

'Qlik's analytics-centric origins and methodical, iterative portfolio development have made it the BI platform for data geeks and data scientists alike, but thankfully haven't made it overly conservative. The company has accelerated its product strategy in the past four years, adding data quality with the Talend acquisition and 'AI for BI' with the AutoML acquisition (originally Big Squid). These, plus modernization capabilities for customers who need them - Qlik Sense for accessibility to broader user bases, Qlik Cloud for an as-a-Service model… and the tools to migrate to them - make Qlik worth watching in today's increasingly data-driven, visualization-driven, AI-empowered enterprise market,' explained Guy Currier, a technology analyst at the Futurum Group.

Looking to extend its data platform proposition right now, Qlik has introduced Qlik Open Lakehouse, a new and fully managed Apache Iceberg solution built into Qlik Talend Cloud. By way of definition, a data lakehouse combines the structure, management and querying capabilities of a data warehouse with the low-cost storage benefits of a data lake. Apache Iceberg is an open source table format for managing large datasets in data lakes with data consistency. Designed for enterprises under pressure to scale faster, the company says Qlik Open Lakehouse delivers real-time ingestion, automated optimization and multi-engine interoperability.

'Performance and cost should no longer be a tradeoff in modern data architectures,' said Mike Capone, CEO of Qlik. 'With Qlik Open Lakehouse, enterprises gain real-time scale, full control over their data and the freedom to choose the tools that work best for them. We built this to meet the demands of AI and analytics at enterprise scale and without compromise.'

Capone detailed the company's progression when talking to press and analysts this month. He explained that for many years Qlik has been known for its visual data analytics services and indeed, the organization still gets customer wins on that basis. 'But a lot has happened in recent times and the conversation with users has really come around to gravitate towards data with a more concerted focus. With data quality [spanning everything from deduplication to analysis tools that validate the worth of a team's data model] being an essential part of that conversation - and the old adage of garbage in, garbage out still very much holding true - the icing on the cake for us was the Talend acquisition [for its data integration, quality and governance capabilities], because customers clearly found it really expensive to cobble all the parts of their data estate together. Now we can say that all the component parts of our own technology proposition come together with precision-engineered fit and performance characteristics better than ever before,' said Capone.

Keen to stress the need for rationalized application of technologies, so that the right tool is used for the appropriate job, Capone says that the Qlik platform enables users to custom-align services for specific tasks, i.e.
software engineering and data management teams need not use a super-expensive compute function when the use case is suited to a more lightweight set of functions. He also notes that the company's application of agentic AI technology extends 'throughout the entire Qlik platform'; this means that not only can teams use natural language queries to perform business intelligence and business integration tasks, they can also ask questions in natural language related to data quality, to ensure an organization's data model's veracity, timeliness and relevance are also on target. But does he really mean openness to any data tool, in a way that gives customers the 'freedom to choose the tools' that work best for them?

'Absolutely. If a company wants to use some Tableau, some Informatica and some Tibco, then we think they should be able to work with all those toolsets and also deploy with us at whatever level works for the business to be most successful. Obviously I'm going to tell you that those customers will naturally gravitate to use more Qlik as they experience our functionality and cost-performance advantage without being constrained by vendor lock-in, but that's how good technology should work,' underlined Capone.

Freedom to choose your own big data tools and analytics engines sounds appealing, but why do organizations need this scope and does it just introduce complexity from a management perspective? David Navarro, data domain architect at Toyota Motor Europe, thinks this is a 'development worth keenly watching' right now. This is because large corporations like his need interoperability between different (often rather diverse) business units and between different partners, each managing its own technology stack with different data architects, different data topographies and their own data sovereignty stipulations.

'Apache Iceberg is emerging as the key to zero-copy data sharing across vendor-independent lakehouses and Qlik's commitment to delivering performance and control in these complex, dynamic landscapes is precisely what the industry requires,' said Navarro, when asked to comment on this recent product news.

Qlik tells us that all these developments are an evolution of modern data architectures in this time of AI adoption. It's a period in which, the company says, the cost and rigidity of traditional data warehouses have become unsustainable. Qlik Open Lakehouse offers a different path: a fully managed lakehouse architecture powered by Apache Iceberg that is said to offer 2.5x–5x faster query performance and up to 50% lower infrastructure costs. The company says that it achieves this while maintaining full compatibility with the most widely used analytics and machine learning engines.

Qlik Open Lakehouse is built for scale, flexibility and performance… and it combines real-time ingestion, intelligent optimization and ecosystem interoperability in a single, fully managed platform. Capabilities here include real-time ingestion at enterprise scale, so (for example) a customer could ingest millions of records per second from hundreds of sources (e.g. cloud apps, SaaS, ERP suites and mainframes) and plug that data directly into Iceberg tables with low latency and high throughput. Qlik's Adaptive Iceberg Optimizer handles compaction, clustering and 'pruning' (removing irrelevant, redundant and often low-value data from a dataset) automatically, with no tuning required. Users can then access the data in those Iceberg tables from a variety of Iceberg-compatible engines, including Snowflake, Amazon Athena, Apache Spark, Trino and SageMaker, without replatforming or reprocessing.
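To make these lakehouse mechanics a little more concrete, here is a minimal, generic sketch of working with an Apache Iceberg table from Apache Spark. It is illustrative only and is not Qlik's implementation: the catalog name, warehouse path, table schema and sample data are all assumptions, and the maintenance step shown manually at the end is the kind of housekeeping Qlik says its Adaptive Iceberg Optimizer performs automatically.

```python
# Illustrative Apache Iceberg + PySpark sketch (not Qlik's pipeline).
# Assumes pyspark plus a matching iceberg-spark-runtime package are available.
from datetime import datetime

from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("iceberg-lakehouse-sketch")
    # Enable Iceberg's SQL extensions so CALL procedures work.
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    # Register a local Iceberg catalog named "lake" (name and path are assumptions).
    .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lake.type", "hadoop")
    .config("spark.sql.catalog.lake.warehouse", "/tmp/iceberg-warehouse")
    .getOrCreate()
)

# Create a namespace and an Iceberg table to receive ingested records.
spark.sql("CREATE NAMESPACE IF NOT EXISTS lake.sales")
spark.sql("""
    CREATE TABLE IF NOT EXISTS lake.sales.orders (
        order_id  BIGINT,
        customer  STRING,
        amount    DOUBLE,
        order_ts  TIMESTAMP
    ) USING iceberg
    PARTITIONED BY (days(order_ts))
""")

# Append a small batch; a real pipeline would land micro-batches like this continuously.
batch = spark.createDataFrame(
    [
        (1, "acme", 120.50, datetime(2025, 5, 1, 9, 30)),
        (2, "globex", 87.25, datetime(2025, 5, 1, 10, 15)),
    ],
    schema="order_id BIGINT, customer STRING, amount DOUBLE, order_ts TIMESTAMP",
)
batch.writeTo("lake.sales.orders").append()

# Table maintenance: compact small data files using Iceberg's built-in Spark procedure.
# (A managed service would schedule and tune this kind of step on the user's behalf.)
spark.sql("CALL lake.system.rewrite_data_files(table => 'sales.orders')")
```

Because the table and its metadata live in an open format, a second engine can read the same files without copying them. As a sketch of that interoperability, the snippet below queries the table through Trino's Python client; it assumes a Trino cluster whose Iceberg connector has been configured to see the same tables (for example via a shared metastore or REST catalog), and the host, user and catalog names are hypothetical.

```python
# Illustrative multi-engine read: query the same Iceberg table from Trino.
# Assumes the `trino` Python package and a suitably configured Trino cluster
# with an Iceberg catalog named "lake" (all assumptions, not Qlik specifics).
from trino.dbapi import connect

conn = connect(host="trino.example.internal", port=8080,
               user="analyst", catalog="lake", schema="sales")
cur = conn.cursor()
cur.execute("SELECT customer, sum(amount) AS total FROM orders GROUP BY customer")
for customer, total in cur.fetchall():
    print(customer, total)
```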
'Although clearly fairly proficient across a number of disciplines including data integration, analytics and data quality controls, one of the challenges of Qlik and similar platforms is the limited scope for truly advanced analytics capabilities,' said Jerry Yurchisin, senior data science strategist at Gurobi, a company known for its mathematical optimization decision intelligence technology. 'This can mean that users have to take on extra configuration responsibilities or make use of an extended set of third-party tools. Data scientists, programmers, analysts and others really want one place to do all of their work, so it's important for all platforms to move in that direction. This starts with data integrity, visualization and all parts of the analytics spectrum - not just descriptive and predictive, but also prescriptive - which is arguably the holy grail for data management at this level.'

Director of research, analytics and data at ISG Software Research, Matt Aslett spends a lot of time analyzing data lakehouse architectures in a variety of cloud computing deployment scenarios. He suggests that products like Qlik Open Lakehouse, which use open standards such as Apache Iceberg, are 'well-positioned' to meet the growing demand for real-time data access and multi-engine interoperability.

'This enables enterprises to harness the full potential of their data for AI and analytics initiatives,' said Aslett. 'As AI workloads demand faster access to broader, fresher datasets, open formats like Apache Iceberg are becoming the new foundation. Qlik Open Lakehouse responds to this shift by making it effortless to build and manage Iceberg-based architectures, without the need for custom code or pipeline babysitting. It also runs within the customer's own AWS environment, ensuring data privacy, cost control and full operational visibility.'

In line with what currently appears to drive every single enterprise technology vendor's roadmap bar none, Qlik has also tabled new agentic AI functions in its platform this year. Here we find a conversational interface designed to give users an avenue to 'interact naturally' with data. While none of us can claim to have ever had a real-world natural data interaction, in this case the term refers to data exploration with the Qlik engine to uncover indexed relationships across data. The agentic functions on offer work across the Qlik Cloud platform and so span data integration, data quality and analytics. It's all about giving businesspeople more intuitive visibility into data analytics for decision making.

Also new is an expanded set of capabilities in Qlik Cloud Analytics. These include functions to detect anomalies, forecast complex trends, prepare data faster and take action through what the company calls 'embedded decision workflows' today.

'While organizations continue to invest heavily in AI and data, most still struggle to turn insight into impact. Dashboards pile up, but real-time execution remains elusive. Only 26% of enterprises have deployed AI at scale and fewer still have embedded it into operational workflows. The problem isn't access to static intelligence, it's the ability to act on it. Dashboards aren't decision engines and predictive models alone won't prevent risk or drive outcomes.
What businesses need is intelligence that anticipates, explains, and enables action without added tools, delays, or friction. Discovery agent, multivariate time series forecasting, write table, and table recipe work in concert to solve a singular problem: how to move from fragmented insight to seamless execution, at scale,' said the company, in a product statement that promises to target 'critical enterprise bottlenecks' and close the gap between data and decisions.

The data integration, data quality, data analytics and AI-powered data services market continues to expand, but we can perhaps pick up on some defining trends here. An alignment towards essentially open source technologies, protocols and standards is key, especially in a world of open, cloud-native Kubernetes. Provision of self-service functionality is also fundamental, whether it manifests as developer self-service tools or as 'citizen user' abstractions that allow businesspeople to use deep tech… or both. A direct embrace of AI-driven services is, of course, a prerequisite now, as is the ability to provide more unified technology services (all firms have too many enterprise apps… and they know it) that work across as wide an end-to-end span as is physically and technically possible. Qlik is getting a lot of that right, but no single vendor in this space can perfect absolutely everything, it seems, so there will always be a need for data integration, even across and between the tools of the data integration space itself.

Grafton preview: Looker keen to add to impressive tally

Courier-Mail

08-05-2025

  • Sport
  • Courier-Mail

Grafton preview: Looker keen to add to impressive tally

Country Championship Final-winning jockey Ben Looker aims to add the $500,000 The Coast to his resume, but not before boosting his extraordinary Grafton tally. The one-time apprentice to local legend and Kosciuszko-winning trainer John Shelton, Looker brought up his 1,300th winner when Point Out won at Quirindi 11 days ago.

Not surprisingly, Grafton is Looker's most successful track with a current total of 259 wins. The most recent of those came on April 13 when Looker steered the lightly-raced and promising Olivershare to his first win at career start number two.

'He's still learning what it is all about but he was really good to win a Maiden from an awkward gate without a lot of luck in the middle stages,' Looker said. 'When he got to the second horse, he got past it, then just floated a little bit.

'Going from a Maiden to a 58 is pretty hard to do nowadays with the Benchmark system.

'He's racing horses that have won more than one or two races but he has got a lot of upside and from the one alley he is going to be able to put himself there and he'll look the winner at some stage.

'He's a nice enough horse that I've got a lot of time for.'

Looker is naturally eager to reacquaint himself with the well-bred Veandechance in the Brett Bellamy Congratulations On 1000 Benchmark 82 Handicap (1435m). The last time he rode the Colt Prosser-trained relative of Melbourne Cup winner Jezabeel was at her most recent start, winning a slogging mile affair at Taree on March 25.

'She hasn't raced for a bit over six weeks; just with all the wet weather there really haven't been any races for her,' Looker explained. 'Whatever she does over the 1400m, I feel that she is going to improve getting back up to a mile and 2000m down the track.

'But Grafton is rain-affected at the moment and she is very dynamic on a wet track.'

Looker's first chance to win on the day comes in the Edwards Irrigation Consulting Maiden (1030m), where he pilots the Showtime two-year-old Ol' Mate Coop.

'He went around a while ago at Coffs Harbour,' Looker said. 'Obviously Shaggy won the race and won quite well, but the second horse has come out and won a race since and my horse ran third.

'Tony Newing's horses race well at their home track.

'He's a nice horse with a lot of upside.'

While Looker's immediate focus centres on Grafton, all roads lead to Gosford on Saturday where the trusty hoop will hop aboard the Dynamic Syndications mare Rapt in a shared quest for the $500,000 The Coast.

'I was meant to ride Overriding in (The Coast) last year but unfortunately Nathan (Doyle) had to scratch,' Looker said. 'Rapt is not a short-price fancy but you can't really fault her form.

'She has raced well at Gosford in the past, she's three from three at the distance, and I thought her run was pretty good in the Provincial Championship Final.

'In a race where not a lot made ground from back in the field, I thought she did a good job to make up the ground she did.

'If she gets the luck, I think she'll beat more home than beat her home.'

■ ■ ■ ■ ■

Donna Grisedale has provided an update on her husband, jockey Jon Grisedale, weeks after a dangerous incident at a set of post-race barrier trials at Grafton.
Grisedale, who has ridden close to 1,350 winners, was dislodged from his mount after it reared and fell on top of him while parading behind the barriers prior to the first of two heats at Grafton on April 13.

'Jon's good,' Donna Grisedale said. 'It's probably halfway through with his broken leg but he is mending well and now it is just a matter of time, hopefully another three weeks.'

Jon Grisedale will be watching on from home as his wife saddles up two of the string at Grafton, namely the stable recruit Ostracised together with the speedy Super One son, Super Jaie.

Ostracised was bought, and sold, for a fourth time in his career when knocked down for $10,000 at the Inglis February Online Sale in 2024. The one-time Team Hawkes resident has been a model of consistency since his change of address, posting four thirds in his nine starts for Grisedale.

'We haven't had a jumpout or a trial so he is literally going straight into it pretty much raw, but he still should run well, he loves the wet,' the trainer said. 'He's a good money-spinner. His first-up run is usually his worst and after that, he is very consistent.

'We've been trying to find a wet track but they just seem to be getting too wet and washed out, so hopefully Friday is his day.'

Super Jaie, meanwhile, seems blessed with a favourable alley (two) in the last of the ten races on the marathon card.

'Ideally a Good 4 would be what he is looking for as opposed to a Heavy track, but we are going four and five weeks between runs with every horse so I am just losing fitness tremendously with too many of them,' said Grisedale.

The Grisedale-trained gelding's three career wins have come at distances ranging from 1106m to 1250m. Super Jaie's broodmare sires read like a who's who of Australia's most recognisable stallions, including Snitzel, Century, Vain, Todman and Heroic. On top of all that, Super Jaie's ninth dam is the 1945 Melbourne Cup winner, Rainbird.
