JetBrains Expands AI Assistant to Boost Developer Productivity with Claude and OpenAI Models
- By John K. Waters
- February 12, 2025
JetBrains has unveiled a significant expansion of its AI-powered coding tool, AI Assistant, adding support for the latest AI models from Anthropic and OpenAI. The update integrates Claude 3.5 Sonnet and Claude 3.5 Haiku, along with OpenAI's o1, o1-mini, and o3-mini models. Additionally, AI Assistant now allows software developers to connect to locally hosted models via LM Studio, enhancing flexibility and data privacy.
For software developers, the update is meant to be more than an incremental upgrade: AI Assistant is designed to act as an intelligent co-pilot that streamlines the coding workflow. Originally developed as an enhancement to JetBrains’ profiling tool, AI Assistant now offers an integrated AI chat, code explanations, automated documentation generation, intelligent naming suggestions, and commit message drafting.
By dynamically selecting the most suitable model for each task, AI Assistant balances performance against cost. JetBrains’ AI service routes each query to that model, using an architecture that supports seamless upgrades to newer AI models without requiring developers to change providers. The latest update capitalizes on this capability by incorporating Anthropic’s Claude 3.5 Sonnet and Haiku models, provisioned through Amazon Bedrock. That global infrastructure gives developers low-latency, uninterrupted access to AI services while adhering to data residency requirements across 17 regions worldwide.
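To make the idea of task-based routing concrete, here is a deliberately simplified, hypothetical sketch in Kotlin. The task names and model identifiers are illustrative assumptions only; JetBrains' actual routing logic lives inside its AI service and is not public.

```kotlin
// Hypothetical illustration of task-based model routing: heavier reasoning
// tasks go to a more capable (and more expensive) model, while quick,
// high-volume tasks go to a fast, low-cost one. This is not JetBrains' code.
enum class Task { CHAT, CODE_EXPLANATION, COMMIT_MESSAGE, NAME_SUGGESTION }

fun pickModel(task: Task): String = when (task) {
    Task.CHAT, Task.CODE_EXPLANATION -> "claude-3-5-sonnet"   // deeper reasoning
    Task.COMMIT_MESSAGE, Task.NAME_SUGGESTION -> "o3-mini"    // fast and cheap
}

fun main() {
    println(pickModel(Task.COMMIT_MESSAGE))  // prints "o3-mini"
}
```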
Anthropic’s Claude 3.5 Sonnet, the flagship of the 3.5 series, has set industry benchmarks in graduate-level reasoning, undergraduate-level knowledge comprehension, and coding proficiency. Developers benefit from its improved understanding of nuance, humor, and complex instructions, all key attributes for refining code and enhancing software logic.
For developers seeking speed and cost-efficiency, AI Assistant now supports OpenAI’s o3-mini and o1-mini models. These compact models prioritize fast responses and are particularly well suited to coding, scientific computing, and mathematical tasks.
Perhaps the most developer-centric enhancement is AI Assistant’s new ability to interface with locally hosted models via LM Studio. This feature was developed to provide greater control over AI-assisted coding environments, allowing developers to work with custom models while maintaining strict data privacy. The ability to run AI workloads locally means developers can fine-tune their coding assistants to align with proprietary standards and unique project requirements.
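For readers curious what a local setup looks like under the hood, the sketch below assumes LM Studio's built-in server is running with its default OpenAI-compatible endpoint on localhost port 1234; the model identifier and prompt are placeholders, and in practice AI Assistant manages this connection through its settings rather than hand-written code. The point it illustrates is the privacy one: the request never leaves the developer's machine.

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    // LM Studio's local server exposes an OpenAI-compatible chat completions
    // endpoint, by default on localhost:1234. "local-model" is a placeholder
    // for whatever model is currently loaded in LM Studio.
    val body = """
        {
          "model": "local-model",
          "messages": [
            {"role": "user", "content": "Explain this Kotlin extension function in one sentence."}
          ],
          "temperature": 0.2
        }
    """.trimIndent()

    val request = HttpRequest.newBuilder()
        .uri(URI.create("http://localhost:1234/v1/chat/completions"))
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()

    // The call stays entirely on the local machine; no code or prompt
    // is sent to an external AI provider.
    val response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())

    println(response.body())
}
```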
Embedded within JetBrains IDEs, AI Assistant has the potential to become an essential tool for developers aiming to optimize their workflows. Whether generating code snippets, debugging issues, or refactoring functions, developers using AI Assistant report saving up to eight hours per week, according to the company, a productivity boost that underscores AI’s growing role in software development.
Prague-based JetBrains makes a lineup of more than 30 intelligent software development tools, including the popular IntelliJ IDEA IDE for Java developers and PyCharm for Python devs. The company is also the creator of Kotlin, a popular cross-platform, statically typed, general-purpose high-level programming language with type inference. The company's tools are used by more than 11.4 million professionals and 88 of the Fortune Global Top 100 companies.
About the Author
John K. Waters is the editor in chief of a number of Converge360.com sites, with a focus on high-end development, AI and future tech. He's been writing about cutting-edge technologies and the culture of Silicon Valley for more than two decades, and he's written more than a dozen books. He also co-scripted the documentary film Silicon Valley: A 100 Year Renaissance, which aired on PBS. He can be reached at [email protected].