Latest news with #JetBrains


Techday NZ
23-05-2025
- Techday NZ
Azul & JetBrains partner to boost Kotlin performance on JVM
Azul and JetBrains have announced a technical collaboration aimed at enhancing the runtime performance and scalability of web and server-side Kotlin applications. The partnership seeks to improve the interaction between Kotlin-generated bytecode and the Java Virtual Machine (JVM) environment. By leveraging Azul's expertise in JVM performance and JetBrains' development of the Kotlin language, the companies aim to optimise how modern applications execute, particularly in terms of scalability and efficiency.

Kotlin, a high-level programming language designed to be fully compatible with the JVM, draws much of its application performance and scaling characteristics from the JVM itself. The collaboration centres on utilising Azul Platform Prime, which features the Zing JDK with Azul's Falcon LLVM-based Just-In-Time (JIT) compiler and the C4 garbage collector.

According to data from the TechEmpower Web Framework Benchmarks, applications built in Kotlin and running on Azul Platform Prime exhibited a 23.9% reduction in latency and a 30.5% increase in throughput compared to the same workloads on OpenJDK. These results highlight a measurable performance boost for web and server-side applications using the new environment.

Azul Platform Prime is available for x86 and ARM64 processors running Linux, and it is tested for compatibility with enterprise workloads through the industry standard Java Compatibility Kit (JCK or TCK). The platform builds on the OpenJDK code base, with specific improvements targeting JIT compilation, garbage collection, and startup behaviour.

The Falcon compiler, integrated within Azul Platform Prime, is based on the LLVM infrastructure and is designed to optimise application code at runtime for increased execution efficiency. The C4 garbage collector, also part of the platform, maintains application execution without disruption across a spectrum of memory requirements, from gigabytes to terabytes, and sustains high allocation rates. This approach is intended to improve throughput and response times while reducing requirements for IT infrastructure and cloud resources.

Vsevolod Tolstopyatov, Kotlin Project Lead at JetBrains, commented on the collaboration's significance: "From its inception, Kotlin was designed with the goal of building robust applications, including server-side solutions where performance is critical. Performance has always been a priority for us because it's essential to our users."

"Kotlin incorporates specific language features, such as inline functions and inline classes, which directly enhance performance. We've developed the Kotlin Coroutines library to facilitate concurrency and asynchronous programming, enabling efficient, scalable applications. Recognising that the JVM runtime is one of the most critical elements in application performance, we believe our collaboration with Azul will unlock new opportunities to further elevate the performance capabilities of Kotlin applications."

Gil Tene, Co-Founder and Chief Technology Officer at Azul, highlighted the performance benefits: "The Azul JVM is the fastest and cheapest way to run Kotlin applications. Azul Platform Prime is built to solve the performance challenges enterprise applications face at scale. By providing a runtime that significantly improves the execution of JVM-based applications, which includes Kotlin-based applications, organisations can deploy applications more rapidly, with less tuning and ensure scalability as demands grow."
"Through our strategic collaboration with JetBrains, we help Kotlin teams to significantly boost their DevOps productivity and runtime application efficiency, which enables them to achieve their business priorities while improving the bottom line." For Kotlin developers seeking to assess these performance improvements, Azul Platform Prime Stream Builds are available for download and are free for evaluation and development use.


TechCrunch
21-05-2025
- Business
- TechCrunch
Mistral's new Devstral model was designed for coding
AI startup Mistral on Wednesday announced a new AI model focused on coding: Devstral. Devstral, which Mistral says was developed in partnership with AI company All Hands AI, is openly available under an Apache 2.0 license, meaning it can be used commercially without restriction.

Mistral claims that Devstral outperforms other open models like Google's Gemma 3 27B and Chinese AI lab DeepSeek's V3 on SWE-Bench Verified, a benchmark measuring coding skills.

'Devstral […] is trained to solve real GitHub issues,' writes Mistral in a blog post provided to TechCrunch. '[I]t runs over code agent scaffolds such as OpenHands or SWE-Agent, which define the interface between the model and the test cases […] Devstral is light enough to run on a single [Nvidia] RTX 4090 or a Mac with 32GB RAM, making it an ideal choice for local deployment and on-device use.'

[Image: Mistral's benchmarking results for Devstral. Image credits: Mistral]

Devstral arrives as AI coding assistants — and the models powering them — grow increasingly popular. Just last month, JetBrains, the company behind a range of popular app development tools, released its first 'open' AI model for coding. In recent months, AI outfits including Google, Windsurf, and OpenAI have also unveiled models, both openly available and proprietary, optimized for programming tasks.

AI models still struggle to produce quality software — code-generating AI tends to introduce security vulnerabilities and errors, owing to weaknesses in areas like the ability to understand programming logic. Yet their promise to boost coding productivity is pushing companies — and developers — to rapidly adopt them. One recent poll found that 76% of devs used or were planning to use AI tools in their development processes last year.

Mistral previously waded into the assistive programming space with Codestral, a generative model for code. But Codestral wasn't released under a license that permitted devs to use the model for commercial applications; its license explicitly banned 'any internal usage by employees in the context of [a] company's business activities.'

Devstral, which Mistral is calling a 'research preview,' can be downloaded from AI development platforms including Hugging Face and also tapped through Mistral's API. It's priced at $0.1 per million input tokens and $0.3 per million output tokens, tokens being the raw bits of data that AI models work with. (A million tokens is equivalent to about 750,000 words, or roughly 163,000 words longer than 'War and Peace.')

Mistral says it's 'hard at work building a larger agentic coding model that will be available in the coming weeks.' Devstral isn't a small model per se, but it's on the smaller side at 24 billion parameters. (Parameters roughly correspond to a model's problem-solving skills, and models with more parameters generally perform better than those with fewer parameters.)

Mistral, founded in 2023, is a frontier model lab, aiming to build a range of AI-powered services, including a chatbot platform, Le Chat, and mobile apps.
It's backed by VCs including General Catalyst, and has raised over €1.1 billion (roughly $1.24 billion) to date. Mistral's customers include BNP Paribas, AXA, and Mirakl. Devstral is Mistral's third product launch this month. A few weeks ago, Mistral launched Mistral Medium 3, an efficient general-purpose model. Around the same time, the company rolled out Le Chat Enterprise, a corporate-focused chatbot service that offers tools like an AI 'agent' builder and integrates Mistral's models with third-party services like Gmail, Google Drive, and SharePoint.
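For developers who want to try the API route mentioned above, here is a minimal Kotlin sketch of a chat-completions request using only the JDK's built-in HTTP client. The endpoint is Mistral's documented chat-completions URL; the Devstral model identifier shown is an assumption, so check Mistral's model list for the exact name:

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    val apiKey = System.getenv("MISTRAL_API_KEY")  // your Mistral API key, set in the environment
    val body = """
        {
          "model": "devstral-small-latest",
          "messages": [
            {"role": "user", "content": "Write a Kotlin function that reverses a string."}
          ]
        }
    """.trimIndent()  // "devstral-small-latest" is an assumed model id; verify against Mistral's docs

    val request = HttpRequest.newBuilder()
        .uri(URI.create("https://api.mistral.ai/v1/chat/completions"))
        .header("Authorization", "Bearer $apiKey")
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()

    val response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())

    println(response.body())  // raw JSON; parse with your preferred JSON library
}
```

Billing follows the per-token pricing quoted above, so a request and response totalling a few thousand tokens costs a fraction of a cent at the listed rates.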


Business Wire
20-05-2025
- Business
- Business Wire
Azul and JetBrains Collaborate to Enhance Runtime Performance for Kotlin Workloads
SUNNYVALE, Calif.--(BUSINESS WIRE)-- Azul, the only company 100% focused on Java, and JetBrains, the leading provider of professional software development tools and creator of the Kotlin programming language, today announced a strategic technical collaboration to enhance the runtime performance and scalability of web and server-side Kotlin applications. This strategic collaboration empowers Kotlin teams to accelerate development cycles and optimize application performance, helping them support their business priorities while driving greater operational efficiency.

Joining Forces to Advance Kotlin Performance

Azul and JetBrains have joined forces with a shared vision: to reexamine how Kotlin-generated bytecode interacts with the Java runtime (known as a Java Virtual Machine or JVM) and uncover new paths to improve application performance. By combining Azul's deep expertise in the JVM and application performance with Kotlin's precise control over bytecode generation, the collaboration creates a unique opportunity to optimize the entire execution stack for modern applications.

While Kotlin is a cross-platform, general-purpose high-level programming language designed to interoperate fully with the JVM, runtime performance and scalability are derived predominantly from the JVM. By leveraging the proven performance of Azul Platform Prime, which includes the Zing JDK incorporating Azul's Falcon LLVM-based JIT compiler and C4 garbage collector, Kotlin web and server-side applications can see a demonstrable performance improvement. When comparing Kotlin applications on Azul Platform Prime vs. off-the-shelf OpenJDK using the TechEmpower Web Framework Benchmarks, Azul Platform Prime reduced latencies by 23.9% and improved throughput by as much as 30.5%. For more details on the benchmark methodology and results, visit the JetBrains Blog.

Azul Platform Prime – Engineered for Speed, Scale and Stability

Azul Platform Prime is available for x86 and ARM64 processors running Linux and is rigorously tested for enterprise workloads and compliant with the Java SE version standards using the industry standard Java Compatibility Kit (the JCK or TCK) test suite. It is based on the same 'HotSpot' JVM and JDK code base used by the OpenJDK project, with specific enhancements relating to JIT compilation, garbage collection and startup/warmup behavior.

Azul's Falcon is an LLVM-based JIT compiler that delivers highly optimized application code at runtime. C4 (Continuously Concurrent Compacting Collector) is a proven, high-performance garbage collector that maintains concurrent, disruption-free application execution across wide ranges of heap sizes from GBs to multi-TBs, and allocation rates from MBs/sec to tens of GB/sec. Together, these runtime features provide significantly improved application operating characteristics and carrying capacity, improving application throughput and response times while lowering the infrastructure and cloud costs required to run business-critical workloads.

'From its inception, Kotlin was designed with the goal of building robust applications, including server-side solutions where performance is critical. Performance has always been a priority for us because it's essential to our users,' said Vsevolod Tolstopyatov, Kotlin project lead. 'Kotlin incorporates specific language features, such as inline functions and inline classes, which directly enhance performance. We've developed the Kotlin Coroutines library to facilitate concurrency and asynchronous programming, enabling efficient, scalable applications. Recognizing that the JVM runtime is one of the most critical elements in application performance, we believe our collaboration with Azul will unlock new opportunities to further elevate the performance capabilities of Kotlin applications.'

'The Azul JVM is the fastest and cheapest way to run Kotlin applications. Azul Platform Prime is built to solve the performance challenges enterprise applications face at scale. By providing a runtime that significantly improves the execution of JVM-based applications, which includes Kotlin-based applications, organizations can deploy applications more rapidly, with less tuning and ensure scalability as demands grow,' said Gil Tene, co-founder and chief technology officer at Azul. 'Through our strategic collaboration with JetBrains, we help Kotlin teams to significantly boost their DevOps productivity and runtime application efficiency, which enables them to achieve their business priorities while improving the bottom line.'

For Kotlin development teams that are looking to enhance the performance of their web or server-side applications, engineers can download Azul Platform Prime Stream Builds, which are free for evaluation and development, to profile the benefits of the Zing JDK with their Kotlin applications.

About JetBrains

JetBrains creates intelligent software development tools used by over 11.4 million professionals and 88 Fortune Global Top 100 companies. Its lineup of more than 30 products includes award-winning IDEs like IntelliJ IDEA and PyCharm, as well as the JetBrains AI-powered coding assistant, the coding agent Junie, and productivity-boosting team tools like YouTrack, Qodana, and TeamCity. JetBrains is also the creator of Kotlin, a cross-platform language used by more than 2.5 million developers worldwide each year. The company is headquartered in Amsterdam, the Netherlands, and has offices around the world. For more information, please visit the JetBrains website.

About Azul

Headquartered in Sunnyvale, California, Azul provides the Java platform for the modern cloud enterprise. Azul is the only company 100% focused on Java. Millions of Java developers, hundreds of millions of devices and the world's most highly regarded businesses trust Azul to power their applications with exceptional capabilities, performance, security, value, and success. Azul customers include 36% of the Fortune 100, 50% of Forbes top 10 World's Most Valuable Brands, all 10 of the world's top 10 financial trading companies, and leading brands like Avaya, Bazaarvoice, BMW, Deutsche Telekom, LG, Mastercard, Mizuho, Priceline, Salesforce, Software AG, and Workday. Learn more on the Azul website and follow us @azulsystems.

The JetBrains name is a registered trademark of JetBrains s.r.o. and Kotlin is a trademark of the Kotlin Foundation.
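For teams that take up the evaluation suggestion above (profiling Kotlin applications on Azul Platform Prime Stream Builds against an OpenJDK baseline), a small Kotlin check such as the sketch below can confirm at startup which runtime a given run actually used. It is a generic sketch relying only on standard JVM system properties; nothing in it is specific to Azul's tooling.

```kotlin
// Prints identifying properties of the JVM the current process is running on,
// so benchmark results can be attributed to the right runtime.
fun main() {
    val keys = listOf(
        "java.vm.name",          // VM implementation name
        "java.vm.vendor",        // VM vendor (Azul, Eclipse Adoptium, Oracle, ...)
        "java.vm.version",       // VM build version
        "java.runtime.version",  // full runtime version string
    )
    for (key in keys) {
        println("$key = ${System.getProperty(key)}")
    }
}
```

Logging these values alongside latency and throughput numbers makes it easy to keep the two JDK configurations apart when comparing results.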
Yahoo
09-05-2025
- Business
- Yahoo
Zencoder Launches Zen Agents: Industry-First Platform for Organization-Wide Custom Agents and Open-Source Marketplace
Breakthrough Technology Empowers Development Teams to Create, Share, and Leverage Specialized AI Tools That Accelerate Software Delivery

SILICON VALLEY, Calif., May 9, 2025 /PRNewswire/ -- Today, Zencoder announces the launch of Zen Agents, delivering two groundbreaking innovations that transform AI-assisted development: a platform enabling teams to create and share custom agents organization-wide, and an open-source marketplace for community-contributed agents. This breakthrough significantly expands Zencoder's ability to empower professional developers to ship better software faster through its deep integration with VS Code, JetBrains, and other industry-standard tools.

Custom Agents with Organization-Wide Sharing

Zen Agents solve real development challenges by letting teams:

- Eliminate redundant coding tasks: Create specialized agents with tailored instructions and toolsets for specific frameworks, codebases, or workflows.
- Standardize development practices: Admin-created agents can be deployed to every team member, ensuring consistent practices and standards.
- Reduce context-switching cost: Connect seamlessly to tools through native integrations and MCP to maintain developer focus.

"Zen Agents create the perfect harmony between human creativity and targeted AI assistance," said Andrew Filev, CEO and Founder of Zencoder. "By enabling teams to craft agents with specific expertise and then deploy them organization-wide, we're helping developers achieve that elusive technical flow state where complex problems seem to solve themselves."

Matt Walker, Co-founder and CTO of Simon Data, shares the real-world impact: "Zen Agents unlocked the next level of productivity for our teams. Through MCP integration with our Jira, Confluence, and GitHub tools, our custom agents automate not just coding but project management and documentation tasks as well. By sharing these specialized agents across teams, we've eliminated hours of context-switching and created an organization that benefits from our collective expertise. Zencoder delivers what modern engineering teams truly need – AI that understands our entire development ecosystem."

Open-Source Agent Marketplace

Complementing the organization-wide functionality, Zencoder introduces an industry-first open-source marketplace for AI agents:

- Community-Driven Library: A public GitHub repository under the MIT License where developers can discover and share agent configurations.
- Transparent Submissions: A streamlined process for contributing custom agents via pull requests, with expert review by Zencoder.
- Web-Based Discovery Portal: A dedicated section on the Zencoder website for browsing the expanding agent ecosystem.

Unlike other AI coding platforms that limit sharing capabilities, Zen Agents creates a truly open ecosystem where teams can publish their specialized agents to the marketplace or discover ready-made agents from other contributors. This collaborative approach exponentially multiplies the value of custom agents while fostering a community that builds on collective expertise.

"Our mission goes beyond building products—we're fostering a community where engineering knowledge multiplies," noted Filev. "The open-source marketplace means everyone benefits from specialized expertise, whether you're building microservices, working with machine learning pipelines, or optimizing legacy code."
Use Cases That Deliver Measurable Results

Zen Agents deliver concrete value for development teams through specialized AI assistants including:

- Framework Experts: Agents that deeply understand React, Django, Spring, or other frameworks, instantly recalling best practices and common patterns.
- Testing Specialists: Focused agents that generate comprehensive test suites aligned with organization-specific testing approaches.
- Refactoring Architects: Specialized tools that implement organization-specific refactoring patterns and practices.
- Documentation Craftsmen: Agents that create standardized, high-quality documentation that follows team conventions.

These specialized agents leverage Zencoder's industry-leading integration capabilities:

- Visual MCP Tool Configuration: An intuitive interface for creating custom Model Context Protocol connections.
- Pre-vetted Tool Library: A curated collection of productivity-enhancing MCP servers, including support for select third-party registries.
- End-to-End Pipeline Support: Connect agents to your entire development workflow with 20+ DevOps integrations.

Availability & Resources

Zen Agents are available now as part of Zencoder's platform. For more information or to start a free trial, visit the Zencoder website. The Zen Agents open-source marketplace is accessible under an MIT license, with initial contributions from Zencoder and a growing community of developers. Browse available agents and learn more about contributing on the Zencoder website.

About Zencoder

Zencoder, headquartered in Silicon Valley, delivers AI coding and testing agents that integrate directly into developers' existing workflows. Founded by Andrew Filev and driven by a global team of 50+ experienced engineers, Zencoder combines deep code understanding (Repo Grokking™) with iterative AI approaches (Agentic Repair) to help organizations ship impactful software products faster. Zencoder holds ISO 27001 and SOC 2 Type II certifications, as well as ISO 42001 for responsible AI management systems.

Media Contact: Ignacio Ramirez, Founder @ Switch PR, email: ignacio@ Phone: 415-517-6708

All product and company names herein may be trademarks of their respective owners. Zencoder is not affiliated with or endorsed by GitHub.

SOURCE Zencoder


Geeky Gadgets
07-05-2025
- Geeky Gadgets
New JetBrains Junie AI Coding Assistant That Adapts to Your Coding Style
What if your coding assistant could not only save you time but also anticipate your needs, adapt to your style, and seamlessly integrate into your workflow? Enter Junie, the latest AI-powered innovation from JetBrains, designed to transform the way developers work in PHPStorm. Unlike traditional tools that often feel rigid or disconnected, Junie promises a level of personalization and efficiency that sets it apart. Whether you're generating complex migrations, running terminal commands, or crafting elegant frontend designs, Junie is engineered to handle it all with minimal effort on your part. It's not just a tool—it's a partner in your development journey.

In this overview, we'll explore how Junie redefines productivity in software development. From its adaptive coding capabilities to its deep integration with PHPStorm, this assistant offers features that cater to both backend and frontend needs. You'll discover how Junie automates repetitive tasks, eliminates the hassle of switching between tools, and even learns your unique coding preferences over time. But is it truly the 'Cursor alternative' developers have been waiting for? Nuno Maduro unpacks its standout features and shows how it measures up to the growing demands of modern development.

Junie: AI Coding Assistant

TL;DR Key Takeaways:

- JetBrains introduced Junie, an AI-powered coding assistant integrated into PHPStorm, designed to enhance productivity and streamline workflows with features like code generation, validation, and terminal command execution.
- Junie automates repetitive tasks such as generating migrations, models, and controllers while adapting to user-defined coding standards, reducing errors and saving time.
- The tool supports frontend development by generating layouts using Blade templates and Tailwind CSS, bridging the gap between backend and frontend workflows.
- Junie personalizes its functionality by learning and adapting to individual coding styles, offering tailored outputs that align with user preferences over time.
- Seamlessly integrated into PHPStorm, Junie eliminates the need for external tools, providing a unified, efficient, and minimalist development environment.

Key Features of Junie

Junie offers a comprehensive set of features tailored to meet the diverse needs of developers. Its core functionalities include:

- Code generation and validation: Automates the creation of code while adhering to user-defined standards and preferences.
- Terminal command execution: Executes commands directly within PHPStorm, including automated linting and testing processes.
- UI and frontend enhancements: Generates layouts using Blade templates and Tailwind CSS for visually appealing designs.
- Customization: Learns and adapts to your unique coding style and preferences over time.
- Seamless integration: Operates entirely within PHPStorm, providing a unified development environment.

Streamlined Code Generation and Validation

Junie transforms the process of writing code by automating repetitive tasks such as generating migrations, models, controllers, and other essential components. It adapts to your specific coding standards, making sure the generated code aligns with your preferences.
For example, if you avoid using fillable attributes or require specific array structures, Junie adjusts its output accordingly. This level of precision reduces the need for manual edits, minimizes errors, and saves valuable time. By automating these tasks, Junie enables developers to focus on solving complex problems rather than spending time on routine coding activities.

[Video: New AI Editor by JetBrains: Junie (Cursor alternative)]

Efficient Terminal Command Execution

One of Junie's standout features is its ability to execute terminal commands directly within PHPStorm. This eliminates the need to switch between tools, creating a more cohesive workflow. Its 'Brave Mode' allows commands to run without requiring confirmation, significantly speeding up processes such as migrations, testing, and deployment. However, this feature is designed for experienced users who can manage the risks associated with automated command execution. Additionally, Junie automates linting and testing, making sure your code adheres to established standards before deployment. By integrating terminal functionality into the IDE, Junie enhances overall efficiency and reduces the cognitive load associated with managing multiple tools.

Enhanced Frontend Development

Junie extends its capabilities to frontend development, offering tools to generate layouts using Blade templates and Tailwind CSS. Whether you need a simple structure or a design inspired by popular styles like Revolut, Junie adapts to your requirements. This feature simplifies the creation of visually appealing and functional user interfaces, allowing developers to focus on backend logic without compromising on frontend quality. By bridging the gap between backend and frontend development, Junie provides a holistic solution for building modern web applications.

Personalized Customization

A defining characteristic of Junie is its ability to learn and adapt to your coding habits. Over time, it refines its outputs based on your feedback, making sure the generated code aligns more closely with your style. For instance, if you prioritize specific database constraints or prefer a particular array format, Junie incorporates these preferences into its workflow. This personalized approach makes Junie a dynamic tool that evolves alongside your development needs, offering a tailored experience that enhances productivity and consistency.

Seamless Integration with PHPStorm

Junie's deep integration with PHPStorm ensures a cohesive and unified development experience. Operating directly within the IDE, it eliminates the need for external tools, allowing developers to code, test, and execute commands in one environment. This seamless integration not only boosts productivity but also ensures that Junie's features are readily accessible whenever they are needed. By consolidating essential tools into a single platform, Junie simplifies the development process and reduces the friction associated with switching between multiple applications.

Minimalism and Efficiency in Development

Junie embodies the principles of minimalism and efficiency, focusing on delivering high-quality outputs with minimal configuration. Its 'vibe coding' approach prioritizes functionality and accuracy, avoiding unnecessary complexity. This makes it an ideal choice for developers who value streamlined workflows and precise results.
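To make the customization and 'minimal configuration' points above concrete, here is a hypothetical sketch of the kind of project guidelines a team might hand to Junie. It assumes Junie's project-guidelines mechanism (often described as a .junie/guidelines.md file); the path and every rule below are illustrative assumptions for a Laravel-style project, not taken from the article:

```markdown
# Project guidelines (hypothetical example)

## Eloquent models
- Do not use $fillable; use $guarded = [] with explicit request validation instead.
- Every migration must define foreign keys with ->constrained() and an index.

## Code style
- Use short array syntax and trailing commas in multi-line arrays.
- Declare new classes final unless extension is explicitly required.

## Frontend
- Build views as Blade components styled with Tailwind CSS utility classes; no inline styles.

## Testing
- Generate a Pest test for every new controller action before marking a task done.
```

A file like this is the sort of lightweight, per-project configuration the article alludes to: the assistant reads the conventions once and then applies them to the migrations, models, and layouts it generates.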
By automating repetitive tasks and reducing the cognitive load, Junie allows developers to concentrate on more complex and creative aspects of their projects. Its emphasis on simplicity and effectiveness ensures that it meets the needs of both experienced developers and those new to the field.

Transforming the Development Landscape

Junie represents a significant advancement in AI-powered development tools. Its robust feature set—ranging from automated code generation and terminal command execution to frontend enhancements and personalized customization—provides a comprehensive solution for optimizing workflows. By integrating seamlessly with PHPStorm, Junie delivers a cohesive and efficient development experience that adapts to the unique needs of each user. Whether you are an experienced developer or just starting your journey, Junie's focus on minimalism, functionality, and adaptability ensures it is a valuable addition to your toolkit.

Media Credit: Nuno Maduro