Latest news with #Qlik


Forbes
4 days ago
- Business
- Forbes
The role of field CTO is now more common in enterprise software.
Software is now much friendlier. Back in the pre-millennial and perhaps pre-nineties days, enterprises got their suite of applications and data services and had to graft them onto their operational structures as best they could. A degree of customization was always possible, but not in the same way that we can now build 'composable' componentized apps - and not in the way that established open source offerings now give companies a route to building freeform, sometimes-experimental or even esoteric use case software services.

The ability to work with software vendors more directly on their platform roadmaps and toolset extensions is now far more prevalent. Big technology vendors host special interest group bodies that examine potential service enhancements, customers are able to provide feedback on use case successes (and failures, or shortfalls) through formalized 'voice of customer' initiatives… and software vendors are just that bit more approachable overall. Because vendors now have to be more engaged, they have to be out in the field more.

This has led to the evolution and rise of the field chief technology officer, an individual who works somewhere in between pre-sales (where customers are laying out the scope of their requirements), sales itself and after-sales (somewhat like a luxury car maintenance service, but without the complimentary vacuum cleaning). But what's really behind the rise of the field CTO, how embedded and formalized is this role, what key functions do field CTOs perform and what should they be doing next? Martin Tombs is VP for global go-to-market for analytics and field CTO for the EMEA region at Qlik. 
He explains that the role itself is very much about thought leadership (he's a trained software engineer) and the ability to look for revenue generation opportunities while still working very closely with customers on the implementation of software toolsets to assess what works well, what could work better and what's not working now but could be made to work tomorrow. 'It's not just about trying to encourage customers to stay on platform and avoid customizations, it's more often about understanding the business requirements and technical scope (including incumbent skillsets) that a customer has when they are implementing software,' explained Tombs, who also gets heavily involved with the activities associated with his firm's customer advisory group.

Commentators and practitioners in this space talk of significant strategic shifts. These are shifts towards data-centric planning, shifts towards developer-driven decision making and (of course, no surprise) shifts towards deeper penetration of AI. 'The enterprise software market has moved beyond the days when you'd simply buy Oracle or SAP (or other) and run your entire business through it. Today we see that software-as-a-service vendors, especially in the realm of data management, now fiercely compete for specific use cases. Just look at Databricks versus Snowflake with their similar offerings but distinct strengths in ML/AI and reporting analytics, respectively. This shift requires product teams to implement an additional strategic layer connecting existing offerings with customer goals and new capabilities. Field CTOs now operate on two fronts: helping customers navigate externally while reducing internal friction between initial concept and delivered value,' said Viraj Parekh, VP of sales engineering, co-founder and former field CTO at orchestration-centric DataOps platform Astronomer.

The field CTO appears to be especially prevalent in platform-type software companies i.e. 
those selling software services designed to get deeply woven into a customer's digital ecosystem. In this regard, perhaps think about Databricks rather than Zoom. These aren't plug-and-play tools; they're significant commitments that transform how businesses operate. 'As these platforms have become more sophisticated, turning business goals into technical reality has become increasingly complex. That's where I come in,' enthused an upbeat Kenneth Stott, field CTO at Hasura, creators of the PromptQL data agent that enables reliable AI systems to work on data.

'Software vendors need someone in a role who can talk tech strategy without sounding like they're just trying to close a deal. The best field CTOs aren't primarily sales-driven - they're relationship builders. Customers need to see them as trusted peers who can think through all the mess i.e. technical requirements, organizational impact, policy implications, change management challenges - all of it. They're typically partnering with the customer's CTO or their direct reports to make things happen.'

So then, is the role becoming more widespread? It appears to be more common in companies selling complex technical platforms, although there aren't many formal studies of this yet (**technology analyst house suddenly thinks about commissioning a report**) and many customers aren't always that familiar with the job title. 'Looking at where we are now in field sales (and questioning how formalized and standardized this position is), it has to be said that some field CTOs are somewhere close to being glorified pre-sales engineers, while others are pure strategic consultants. 
You'll find different reporting structures, influence levels and responsibilities across companies, but they're all operating in that sweet spot between technical expertise and strategic advisory,' details Stott.

As this role starts to further cement itself in management structures, the Hasura tech leader thinks there's no urgent need to reinvent it. He suggests that field CTOs are 'naturally positioned to become ecosystem orchestrators' now, which means they will help coordinate multiple vendors around customer objectives. As technology ecosystems get more complex, having someone who can see the big picture while understanding the technical details becomes even more important.

'Today we can say that field CTOs have a special opportunity because we aren't trapped in the lab or behind a desk. Since we're at technology industry shows and in the room interacting with customers and prospects, we get to see what's really happening across the new, unexpected use cases, the rising pressures and needs and so much more. You only learn about those by being out in the world. The value of listening to leaders about how, where, and why they want to use our software services is immense, because it changes the equation. It's not us telling them about how they need to use our software or our latest engineering marvels. Instead, we get to focus on business problems, which is where the real value comes from,' said Michael Donahue, Pentaho global field CTO. 'The job is really a mix of cheerleading, problem solving and applying front-line feedback directly into how we build and deliver for customers.'

Manesh Tailor agrees. He is field CTO for the EMEA region at New Relic. Tailor began his career as a developer and has spent close to 20 years immersed in the observability and monitoring space, giving him deep industry expertise and a strong technical foundation. 
Tailor assumed the field CTO role at New Relic in January 2025 following a 10-year tenure at the company where he rose through the ranks as a technical account manager, software analytics architect and most recently as the director of field engineering, EMEA. He leads a team of solution architects, who are themselves industry thought leaders. 'The field CTO role at New Relic comprises a deep expertise level across three key areas: our business and technology; our customers' business and technology; and technical thought leadership,' said Tailor. 'But it's not just about technical knowledge and complex problem solving. For me, the role is about leveraging my industry experience to encourage other software engineers to not only succeed, but to be able to differentiate themselves in their markets.'

Nick Jablonski, field CTO at Domino Data Lab, says that the role gravitates around the need to very responsibly 'bridge and forge the gap' between how customers want to use the company's platform and what it actually takes to make that process work in complex, real-world environments. 'It's all about crafting an amalgam and mix of technical fluency and domain expertise and being able to translate that into how a platform can help deliver on business impact. That dual (or perhaps even three-level) perspective i.e. guiding customers and influencing our own roadmap, has become more strategic in recent years. I'm now in deeper conversations about how our platform serves specific industries such as life sciences, financial services and the public sector… and how we evolve it to meet their emerging needs.'

There's a realization at this point that a field CTO isn't there to repeat what's in a vendor's technical product spec whitepaper; they have to provide validated evidence of how a product will behave in the real world, under real pressure, in a customer's exact setup. 
Andy Pernsteiner, field CTO lead at Vast Data, says his company's field CTOs (plural, there are more than one) routinely build 'tailored walkthroughs' with exact commands, outputs and rationale for a client's use case. 'These field professionals run tests on behalf of customers, flag bugs based not on the spec but on how users actually expect things to work and drive feature requests back into engineering based on real operational needs - not imagined ones,' said Pernsteiner. 'We've built a team of field CTOs with deep domain expertise in performance, networking, protocols and AI, not just generalists, but people who've lived the same challenges our customers face. Many of them came from the very organizations that we now support. That context matters. It means they know what's at stake and they know what good (and not just good enough) looks like. The field CTO is there to help us cut through abstraction to deliver software services that hold up under scrutiny.'

Hopefully, the need to surface an analysis of this comparatively modern role is clear here. Professional software engineers who work in this position are becoming more prevalent and prominent as they now also start to receive enough media training to talk to the technology press. While the role itself may still be subject to a fair degree of flux, there appears to be a solid understanding among the software engineering community of how and why this role is now needed. The only apparent challenge is that this community remains almost exclusively a fraternity; when more women take on field CTO roles, things will be solidified.


Business Wire
22-05-2025
- Automotive
- Business Wire
AutoNation Achieves 300% Growth in User Adoption with Qlik Cloud Analytics
PHILADELPHIA--(BUSINESS WIRE)-- Qlik®, a global leader in data integration, data quality, analytics, and artificial intelligence (AI), has announced that AutoNation, the largest automotive retailer in the U.S., has enhanced its marketing analytics operations by adopting Qlik Cloud Analytics™ and Snowflake. The move has expanded the benefits of Qlik across the business, delivering valuable insights into marketing activities and improving return on marketing investment (ROMI). With the cloud-based solution, AutoNation has unlocked new levels of operational efficiency, allowing its engineers and developers to focus on innovation rather than maintaining infrastructure.

As the largest automotive dealer group in the U.S., AutoNation's diverse operations cover everything from new and used vehicle sales to finance, service, and repairs. With its scale and purchasing power, AutoNation offers customers unique advantages that smaller dealerships can't match. The company's marketing operations are central to its success, with the primary focus being ROMI. Upon implementing Qlik Cloud Analytics, AutoNation immediately reaped the benefits. Moving to the cloud resolved connectivity problems, improved efficiency, and provided greater reliability in data processing. Qlik Cloud Analytics allowed users to take full advantage of AutoNation's growing data reserves and Qlik's visualization capabilities.

'The migration was far simpler than I ever expected, and that was down to Qlik DataTransfer,' shared Aaron Corneail, Senior Business Intelligence Developer and Administrator at AutoNation. 'My focus has moved away from fixing broken tools to building new ones, as people ask me to add new tables, dashboards, and maps. And GeoAnalytics allows us to focus on the areas that matter and cut wasteful spending on areas that don't bring us any advantage.' 
With Qlik Cloud Analytics, AutoNation now enjoys a much faster connection to Snowflake and significantly streamlined internal processes. Data is now reloaded and updated at the same time, providing immediate and accurate insights. AutoNation now has access to a reporting platform that empowers its marketing teams to make informed decisions based on up-to-date data. With Qlik Cloud Analytics, the company is able to assess the effectiveness of marketing campaigns, determine ROMI, and adjust future strategies to improve performance.

'The tools we're now giving people help them make better decisions. They could spend hours and hours working through data and not achieve the insights we can with Qlik,' Corneail explained. 'That makes everyone look good and our user base has grown by 300% as a result.'

Looking forward, AutoNation is already trialing Qlik Predict™ and exploring the potential of Qlik Answers™, which promises to take analytics even further across the business by tapping into the potential of unstructured enterprise content. 'Engineers are developing tasks in Qlik Predict to bring predictive AI into our dashboards and apps, taking results and building new KPIs and what-could-happen scenarios,' Corneail says. 'Qlik Answers has a huge potential benefit, and it will take the benefits of Qlik further across the business as people see that they can just ask a question and have charts, graphs, and tables fed back to them. That's really going to be the next big thing.'

About AutoNation
AutoNation is an automotive retailer based in Fort Lauderdale, Florida, that supplies new and pre-owned vehicles and related services across the U.S. Founded in 1996 with 12 locations, AutoNation now operates over 300 retail outlets nationwide.

About Qlik
Qlik converts complex data landscapes into actionable insights, driving strategic business outcomes. Serving over 40,000 global customers, our portfolio provides advanced, enterprise-grade AI/ML, data integration, and analytics. 
Our AI/ML tools, both practical and scalable, lead to better decisions, faster. We excel in data integration and governance, offering comprehensive solutions that work with diverse data sources. Intuitive analytics from Qlik uncover hidden patterns, empowering teams to address complex challenges and seize new opportunities. As strategic partners, our platform-agnostic technology and expertise make our customers more competitive. © 2025 QlikTech International AB. All rights reserved. All company and/or product names may be trade names, trademarks and/or registered trademarks of the respective owners with which they are associated.


Indian Express
21-05-2025
- Business
- Indian Express
From searching to answers: Qlik CTO explains how AI is reshaping data interaction
'If you look at the evolution of data, the earliest uses were basic. People captured data in spreadsheets and notes to make decisions. What has evolved are the techniques and organisational literacy around leveraging it,' said Sharad Kumar, CTO of Qlik, while describing the evolution of data. Data is no longer just columns and rows; it has moved on from being a unidimensional fact and figure to something more dynamic. Today, almost every aspect of our life is governed by data, and we have arrived at a point where data is enabling decision-making for organisations. On the sidelines of the recently held Qlik Connect 2025 in Orlando, we caught up with Kumar, who shared his insights on how AI is shaping data integration and modern business strategy.

During the conversation, Kumar outlined three major transformations in data analytics over the years. He shared that it all began with the centralisation phase with data warehousing. 'When we started building data warehouses like Teradata decades ago, it was the first transformational change. We focused on pulling data once, centralising it in one place, and making it easier for people to access. This gave us a backward view of data, which we call descriptive analytics.'

The next phase was predictive analytics. Kumar explained that this was the phase when machine learning algorithms were being trained on the same data. The world then moved from a historical view to a forward-looking view that could predict outcomes for smarter decisions. 'Think about recommendation engines on Amazon or Netflix—that's machine learning in action.'

According to Kumar, the most recent transformation came with the generative AI wave. 'Suddenly having access to ChatGPT two years ago completely changed the landscape.' What fundamentally changed was how humans interacted with data. 
'Now it's not about searching for information; it's about getting answers—a fundamental switch,' he explained, adding that the evolution continues at an accelerating pace. Kumar went on to state that the next wave is already here: agentic AI. With agentic AI, it is no longer just about asking questions; Kumar feels that one can express an intent, and agents will determine which processes to deploy and in what sequence. 'Going from warehousing to predictive took a long time, but the transitions from predictive to generative and from generative to agentic are happening much faster. The pace of change is compressing,' Kumar said.

As generative AI has become a buzzword across the spectrum, we asked Kumar what was hype and what was real when it came to its enterprise use cases. The Qlik executive acknowledged that while generative AI has captured the attention of the C-suite, its implementation hasn't been an easy one for many. Kumar also said that the ground realities are different. 'When you talk to data and AI practitioners, you find that the data is not ready. It's messy, siloed, low quality, not timely, and often can't be trusted. If you build AI systems on bad data, they will fail,' he said, adding that this was indicative of why success rates remain modest. 'Only about 25 per cent of AI projects are truly succeeding in delivering business value. The biggest challenge is the data foundation,' he said.

When asked how the gap can be closed, Kumar recommended a two-pronged approach. 'Enterprises that are succeeding are starting with narrow AI use cases that are contained and less risky. At the same time, they're focusing on getting their data foundation right, which is the only way to scale AI effectively,' he said. On being asked how Qlik's platform supports the journey from raw data to business outcomes, Kumar explained that the platform offers comprehensive assistance to businesses through their data journeys. The executive said that the journey begins with data collection. 
'First, we provide capabilities to get data from anywhere—databases, SaaS applications, complex systems like mainframe and SAP, files, and streams—at high velocity in near real-time.' Data collection is followed by integration. Kumar said that Qlik allows businesses to join and integrate siloed data. 'Unless you can join data together, you cannot get a complete picture. If customer information is in one system, purchases in another, and return information in a third, you need to connect these to understand your customer.'

After integration comes building trust in data. The company helps businesses assess data quality and preserve data lineage so that records can be traced back to their source. The Qlik platform then enables multiple types of analytics. 'Once you have a trusted data foundation, you can build BI visualisation dashboards for descriptive analytics, machine learning models for predictive analytics, and conversational agents for generative AI,' he explained. Kumar added that, finally, Qlik enables action, as it allows customers to take insights and automate actions on them.

When it came to challenges faced by enterprises in modernising their data, Kumar identified three primary ones: data migration, skill gaps, and funding. Data migration is a challenge because, according to Kumar, most data today continues to sit in on-premise systems, and getting it onto the cloud is a considerable undertaking for many. On skills, with many organisations moving to cloud and AI, Kumar feels that most of them often lack the necessary expertise, especially for AI implementation. Lastly, on funding, most companies think that they don't need much budget for AI, as ChatGPT gives the perception that you can quickly apply models. 'What we're finding is that you need a significant budget to fix your data foundation, which is a heavy lift,' he noted. 
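Kumar's point about joining siloed sources — customer records in one system, purchases in another, returns in a third — can be sketched in a few lines. This is an illustrative stand-in with made-up data and a hypothetical customer_360 helper, not Qlik's integration tooling:

```python
# Illustrative only: three "siloed" record sets that share a customer_id key.
customers = [{"customer_id": 1, "name": "Asha"}, {"customer_id": 2, "name": "Ben"}]
purchases = [
    {"customer_id": 1, "item": "laptop", "amount": 1200},
    {"customer_id": 2, "item": "phone", "amount": 800},
    {"customer_id": 1, "item": "mouse", "amount": 25},
]
returns = [{"customer_id": 1, "item": "mouse"}]

def customer_360(customers, purchases, returns):
    """Merge the three sources into one complete view per customer."""
    view = {c["customer_id"]: {**c, "purchases": [], "returns": []} for c in customers}
    for p in purchases:
        view[p["customer_id"]]["purchases"].append(p["item"])
    for r in returns:
        view[r["customer_id"]]["returns"].append(r["item"])
    return view

view = customer_360(customers, purchases, returns)
print(view[1]["purchases"])  # ['laptop', 'mouse']
print(view[1]["returns"])    # ['mouse']
```

In practice the same idea scales up to keyed joins across databases, SaaS extracts and streams; the point is simply that a shared key (here customer_id) is what turns three partial views into the "complete picture" Kumar describes.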
When asked what his recommendations would be for organisations, Kumar said, 'Funding for data foundation should be rolled into their overall AI initiative funding. If you don't properly fund your data initiatives and have the right technology and the right skills, you'll face challenges.' Lastly, on being asked what excites him the most about the future of data and AI, the Qlik executive said that the potential of AI to streamline data workflows is something he looks forward to. More broadly, he sees AI transforming every aspect of business and daily life.

Bijin Jose, an Assistant Editor at Indian Express Online in New Delhi, is a technology journalist with a portfolio spanning various prestigious publications. Starting as a citizen journalist with The Times of India in 2013, he transitioned through roles at India Today Digital and The Economic Times, before finding his niche at The Indian Express. With a BA in English from Maharaja Sayajirao University, Vadodara, and an MA in English Literature, Bijin's expertise extends from crime reporting to cultural features. With a keen interest in closely covering developments in artificial intelligence, Bijin provides nuanced perspectives on its implications for society and beyond.


Tahawul Tech
20-05-2025
- Business
- Tahawul Tech
Qlik launches new Open Lakehouse
Qlik®, a global leader in data integration, data quality, analytics, and artificial intelligence, recently announced the launch of Qlik Open Lakehouse, a fully managed Apache Iceberg solution built into Qlik Talend Cloud. Designed for enterprises under pressure to scale faster and spend less, Qlik Open Lakehouse delivers real-time ingestion, automated optimisation, and multi-engine interoperability — without vendor lock-in or operational overhead. This marks a major step forward in the evolution of modern data architectures. As organisations accelerate AI adoption, the cost and rigidity of traditional data warehouses have become unsustainable. Qlik Open Lakehouse offers a new path: a fully managed lakehouse architecture powered by Apache Iceberg that delivers 2.5x–5x faster query performance and up to 50% lower infrastructure costs, while maintaining full compatibility with the most widely used analytics and machine learning engines.

'Performance and cost should no longer be a tradeoff in modern data architectures', said Mike Capone, CEO of Qlik. 'With Qlik Open Lakehouse, enterprises gain real-time scale, full control over their data, and the freedom to choose the tools that work best for them. We built this to meet the demands of AI and analytics at enterprise scale — without compromise'.

Qlik Open Lakehouse is built from the ground up to meet the scale, flexibility, and performance demands of modern enterprises — without the tradeoffs. It combines real-time ingestion, intelligent optimisation, and true ecosystem interoperability in a single, fully managed platform.

- Real-time ingestion at enterprise scale: Ingest millions of records per second from hundreds of sources — including cloud apps, SaaS, SAP, and mainframes — directly into Iceberg tables with low latency and high throughput.
- Intelligent Iceberg optimisation, fully automated: Qlik's always-on adaptive Iceberg optimiser handles compaction, clustering, and pruning automatically, delivering up to 5x faster queries and 50% lower storage costs — no tuning required.
- Open by design, interoperable by default: Access data in Iceberg tables using a variety of Iceberg-compatible engines, including Snowflake, Amazon Athena, Apache Spark, Trino, and SageMaker — without re-platforming or reprocessing.
- Your compute, your cloud, your rules: Runs natively in your AWS VPC with Bring Your Own Compute (BYOC), giving you full control over performance, security, and cost.
- One platform, end to end: From ingestion and transformation to governance, data quality, and FinOps visibility, Qlik provides a unified lakehouse experience — no patchwork, no handoffs.

'Enterprises are increasingly adopting lakehouse architectures to unify data across on-premises and cloud environments', said Matt Aslett, Director of Research, Analytics and Data at ISG Software Research. 'Qlik Open Lakehouse, which leverages open standards such as Apache Iceberg, is well-positioned to meet the growing demand for real-time data access and multi-engine interoperability, enabling enterprises to harness the full potential of their data for AI and analytics initiatives'. As AI workloads demand faster access to broader, fresher datasets, open formats like Apache Iceberg are becoming the new foundation. 
Qlik Open Lakehouse responds to this shift by making it effortless to build and manage Iceberg-based architectures — without the need for custom code or pipeline babysitting. It also runs within the customer's own AWS environment, ensuring data privacy, cost control, and full operational visibility. 'The Qlik Open Lakehouse initiative is a significant development we're keenly watching', shared David Navarro, Data Domain Architect at Toyota Motor Europe. 'Large corporations like ours urgently need interoperability between diverse business units and partners, each managing its own technology stack and data sovereignty. Apache Iceberg is emerging as the key to zero-copy data sharing across vendor-independent lakehouses, and Qlik's commitment to delivering performance and control in these complex, dynamic landscapes is precisely what the industry requires'. Qlik Open Lakehouse is available now in private preview and is scheduled to be generally available in July 2025. Private preview is limited — early access is encouraged for teams looking to modernise ahead of GA. To learn more and request early access, visit our website or connect with us at Qlik Connect, May 14–17 in Orlando. Image Credit: Qlik
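One reason the automated compaction mentioned above matters: high-frequency streaming ingestion tends to produce thousands of tiny files, and query engines slow down when they must open all of them. Compaction rewrites many small files into fewer large ones. The sketch below is a deliberately simplified toy model of that idea in plain Python — the TARGET_ROWS threshold and the in-memory "files" are invented for illustration; real Iceberg compaction operates on data files and manifests, not Python lists:

```python
# Toy sketch of table-file compaction (not Qlik's optimiser): merge many
# small ingested files into fewer large ones so queries touch less metadata.
TARGET_ROWS = 100  # hypothetical target row count per compacted file

def compact(files, target=TARGET_ROWS):
    """Greedily merge small files until each output file reaches the target size."""
    compacted, current = [], []
    for f in files:
        current.extend(f)
        if len(current) >= target:
            compacted.append(current)
            current = []
    if current:  # flush any leftover rows into a final, smaller file
        compacted.append(current)
    return compacted

# 50 tiny files of 10 rows each, as a micro-batch ingester might write them
small_files = [[{"id": i * 10 + j} for j in range(10)] for i in range(50)]
big_files = compact(small_files)
print(len(small_files), "->", len(big_files))  # 50 -> 5
```

Real optimisers also cluster rows by frequently filtered columns and prune files using per-file min/max statistics, which is where claimed query speedups of this kind generally come from.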


Economic Times
18-05-2025
- Business
- Economic Times
Qlik bullish on India, aims to cross 1,000 customer base this year
Bullish on Indian market growth, leading data analytics and artificial intelligence player Qlik expects its customer base to cross 1,000 this year. "We have doubled our business in the last two years. We have a good representation of 800 plus customers now, including Indian Oil, Ather Energy, NSE and HDFC Life," Qlik Managing Director (India) Varun Babbar told PTI here. Prodded further about crossing the 1,000 customer milestone, he said, "It should happen this year, but it's difficult to say exactly when". There are a lot of small and midsize businesses that are doing quite well, and they are going to be the next set of partners, both on data analytics and AI strategy. From a headcount perspective, he said, India is the third largest employer within the Qlik ecosystem after the US and Sweden. India is prioritised a lot in terms of investment as well, he said, adding that a data centre was recently opened in India with a large investment. Located in Mumbai, this strategic investment enhances Qlik's global cloud infrastructure and deepens its long-term commitment to the Indian market, meeting the growing demand for local data storage, regulatory compliance, and advanced AI capabilities. Besides, he said, the company is making a lot of investment in advisory services, and through that, vertical guidance is provided to customers in terms of data environment and data strategies for achieving greater efficiency. Qlik senior vice president (APAC) Maurizio Garavello said India headcount has doubled in the last 16 months, and hiring would continue as business is growing in the market. "Go big" is the direction that top management has given as far as the Indian market is concerned, Garavello added. The US-based IT company, earlier this month, unveiled a new agentic experience to drive faster decisions and boost productivity by bringing new simplicity to complex data-driven workflows. 
Besides, the company launched Open Lakehouse, a fully managed Apache Iceberg solution built into Qlik Talend Cloud. The agentic experience will provide a single, conversational interface, allowing users across the enterprise to interact naturally with data, using specialised AI agents to quickly uncover insights. At the heart of this continuous innovation is the Qlik engine, a unique technology that indexes relationships across data, enabling the discovery of unexpected connections.
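The "indexes relationships across data" claim refers to Qlik's associative model: select a value in one field and the engine shows which values in every other field are associated with it and which are excluded. As a rough, hypothetical illustration of the concept only (not Qlik's actual engine, which works on a compressed in-memory index), that behaviour can be modelled like this:

```python
# Toy model of associative selection: rows link field values together, and
# selecting one value reveals associated and excluded values in other fields.
rows = [
    {"region": "North", "product": "Bike", "year": 2024},
    {"region": "North", "product": "Car", "year": 2025},
    {"region": "South", "product": "Bike", "year": 2025},
]

def associated(rows, field, value, other_field):
    """Values of other_field that co-occur with field == value."""
    return sorted({r[other_field] for r in rows if r[field] == value})

def excluded(rows, field, value, other_field):
    """Values of other_field that never co-occur with field == value."""
    all_values = {r[other_field] for r in rows}
    return sorted(all_values - set(associated(rows, field, value, other_field)))

# Selecting region "South" reveals an unexpected gap: no Car sales there.
print(associated(rows, "region", "South", "product"))  # ['Bike']
print(excluded(rows, "region", "South", "product"))    # ['Car']
```

The excluded set is the interesting part: surfacing what is *not* associated with a selection is how this style of model points users at unexpected connections and gaps they did not think to query for.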