Azul Honors Global Innovators with Inaugural Java Hero Awards

Open-source Java platform provider Azul just announced the winners of its first-ever Azul Java Hero Awards, which recognize exceptional achievements in Java deployments worldwide. The company named 17 organizations and individuals "who achieved innovative world-class results with Java to help their businesses become more cost-effective, successful and efficient."

As far as I can tell (and I've looked), Azul is the only company out there right now that is 100% focused on Java. So, even though these awards were given to users of Azul's Platform Prime, a Java virtual machine and runtime platform built on OpenJDK (formerly known as "Zing"), and not Java generally, I think it's worth giving the winners a spotlight. Lest we forget, a great big chunk of the world still runs on Java.

Among the winners highlighted in Azul's announcement was a company called Workday, which snagged the award for "Application Performance in Operations & Efficiency" by leveraging Azul Platform Prime to dramatically reduce application pause times and improve scalability, resulting in a 95% increase in operational efficiency. This transformation not only saved the company millions of dollars, Azul noted in the announcement, but also freed developers to focus on more productive tasks.

In the public sector, Newcastle City Council (NCC) was recognized for its "Best Industry Use Case." NCC, the local government authority providing services to more than 300,000 residents of Newcastle, the largest city in northeast England, addressed critical security vulnerabilities by transitioning from Oracle Java to Azul Platform Core, thereby ensuring the resilience and compliance of its Java-based systems used by more than 1,100 employees. This move significantly reduced security risks across 5,000 desktops, earning high praise from NCC’s head of ICT & Digital, Jenny Nelson.

"Through our strategic partnership with Azul, we significantly reduced our security risk level with our Java applications and Java-based infrastructure, which certainly helps me sleep better at night," Nelson said in a statement.

Travelport, a key player in the global travel technology sector, won the award for "Application Performance in Customer Experience." After implementing Azul Platform Prime, Travelport saw a substantial improvement in application response times and a significant reduction in server usage, enhancing its ability to serve travel content to customers efficiently.

SGX FX, a leading provider of foreign exchange trading technology, was honored for "Best Industry Use Case in Trading Exchanges." The company’s transition to Azul Platform Prime enabled it to handle an enormous volume of transactions with minimal latency, setting new benchmarks for speed and reliability in the industry.

Jagex, the British gaming company behind the popular MMORPG RuneScape, received the award for "Best Industry Use Case in Gaming." By utilizing Azul Platform Prime, Jagex eliminated detectable pauses during gameplay, improving performance by 20% and significantly enhancing the player experience.

Other winners include LMAX Group, recognized for its remarkable performance in high-frequency trading, and Taboola, which received the "Best Industry Use Case in AdTech" award for its success in improving the efficiency and environmental impact of its server infrastructure.

The awards also highlighted individual achievements, such as those of Jeff Korpa from Teledyne Controls, who earned the "Java Migration Trailblazer" award for his strategic move away from Oracle Java SE, resulting in significant cost savings and enhanced security for his company.

"I’m inspired to see the remarkable ways in which companies around the world are leveraging Java to drive innovation, deliver exceptional performance, and optimize costs," said Scott Sellers, co-founder and CEO at Azul, in a statement. "The winners of our inaugural Azul Java Hero Awards embody the best of what can be achieved with Java, showcasing its enduring power and versatility in enterprise software. Their achievements highlight why Java remains the go-to language for businesses striving for excellence and efficiency."

The Java Hero Awards span multiple categories, recognizing excellence in Java migration, cloud cost savings, application performance, and specific industry use cases. These awards underscore the critical role Java continues to play in driving innovation and efficiency in various sectors.

For a full list of winners and categories, visit Azul's website.

Posted by John K. Waters on September 3, 2024


Sway AI's Low-Code/No-Code Platform Comes to Microsoft Azure

Low-code/no-code AI platform provider Sway AI announced this week the integration of its namesake offering with Microsoft Azure, giving Azure customers a unique way to build and deploy secure AI and machine learning (ML) applications directly within the Azure ecosystem.

Sway AI is a fascinating application of low-code/no-code (LCNC) development. It's not a traditional development environment used for writing and managing code. Instead, it serves as a dev platform specifically for creating AI-driven solutions. It's not about coding, debugging, or version control, but about developing AI models, performing data analysis, and deploying AI applications. Users can develop AI models by selecting pre-built algorithms, configuring them, and training them on their data, all without writing code.

Sway AI also allows users to automate complex workflows involving data processing, model training, and deployment. And the platform supports the integration of AI models into existing systems and their deployment in production environments, all handled through a no-code interface.

It's an LCNC tool that makes AI accessible to businesses and individuals across various industries—those citizen developers my colleague Howard M. Cohen writes about in his column.

The drag-and-drop interface is key, of course. It's designed to enable users to create AI workflows by simply connecting pre-built modules. This visual approach allows users to define data inputs, apply machine learning algorithms, and configure outputs without needing to write any code.

The platform also comes with pre-built templates for common AI tasks, such as data analysis, image recognition, and natural language processing. Users can select and customize these templates according to their specific needs. It also automates many of the complex steps involved in AI model training and deployment. Users can upload their data, select an appropriate algorithm, and let the platform handle the training process. Once the model is trained, deployment is also simplified with just a few clicks.

And the platform supports integration with a range of data sources and external applications through APIs, which makes it possible for users to connect their AI models to existing systems without needing to write integration code.

The integration of Sway AI with Azure's cloud infrastructure makes it possible for businesses to create, test, and deploy AI models without requiring extensive programming or data science expertise. This collaboration is expected to streamline AI workflows, reduce development time, and lower costs, making AI more accessible to a wider range of industries.

By integrating with Azure, Sway AI offers seamless compatibility with a bunch of Azure services, such as Azure Kubernetes Service (AKS), Key Vault, and Virtual Networks, which allows organizations to leverage their existing data and infrastructure while benefiting from Sway AI's simplified AI development process.

I think it's fair to call Sway AI a cutting-edge platform, and its integration with Azure a significant step towards making AI development more accessible and secure for enterprises.

Posted by John K. Waters on August 20, 2024


JetBrains Enhances AI-Powered Coding Tools in Latest 2024.2 IDE Updates

It has been said that coding can sometimes feel like trying to explain quantum physics to a cat. If any dev tool maker understands this, it's JetBrains, which is no doubt why they’ve supercharged their AI Assistant in the 2024.2 updates for their suite of IDEs. The latest version introduces advanced and faster code completion for Java, Kotlin, and Python, alongside a smarter AI chat powered by GPT-4o.

The AI Assistant integrated into the JetBrains IDEs can generate code, suggest fixes, refactor functions, and even help you come up with cool names for your variables. It’s like having a genius coder living in your computer, minus the annoying habits. Plus, it’s integrated with GitLab, so it can help you manage your code repositories without breaking a sweat.

JetBrains has trained its own large language models to improve code completion for Java, Kotlin, and Python, but more languages are on the way, the company says. The AI chat is now smarter, multilingual, and can even understand complex questions, thanks to the GPT-4o upgrade. It also boasts new tricks like AI-assisted VCS conflict resolution and customizable prompts for documentation and unit tests.

The updated IDEs also come with a shiny new user interface designed to be easier on the eyes and the brain. The new UI reduces visual clutter, making it simpler to find what you need, while still keeping the old-school UI available. The Search Everywhere feature has been enhanced to let you preview the elements you’re hunting for, and the IDEs now auto-detect and use your system’s proxy settings, so you don’t have to fiddle with them yourself.

Each IDE in JetBrains’ lineup has received individual love and attention. For instance, IntelliJ IDEA 2024.2 Ultimate can now run Spring Data JPA methods directly for instant repository query verification. It also features advanced cron expression autocompletion and a souped-up HTTP Client using the GraalJS execution engine.
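
If "run Spring Data JPA methods directly" sounds abstract, the methods in question are the derived query methods you declare on repository interfaces, like the generic sketch below. The Customer entity and the method name are hypothetical, and this assumes an ordinary Spring Boot project with spring-boot-starter-data-jpa on the classpath; it illustrates the kind of method involved, not the IDE workflow itself.

    import jakarta.persistence.Entity;
    import jakarta.persistence.GeneratedValue;
    import jakarta.persistence.Id;
    import java.time.Instant;
    import java.util.List;
    import org.springframework.data.jpa.repository.JpaRepository;

    // Hypothetical JPA entity used by the repository below.
    @Entity
    class Customer {
        @Id @GeneratedValue Long id;
        String lastName;
        Instant createdAt;
    }

    // Spring Data derives the query from the method name; it is this kind of
    // repository method that the IDE can now execute for quick verification.
    interface CustomerRepository extends JpaRepository<Customer, Long> {
        // roughly: SELECT c FROM Customer c WHERE c.lastName = ?1 ORDER BY c.createdAt DESC
        List<Customer> findByLastNameOrderByCreatedAtDesc(String lastName);
    }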

PyCharm 2024.2 has revamped its Jupyter notebooks and introduced new AI cells for quicker data analysis. It also improved support for Hugging Face models and added the ability to connect to Databricks clusters. PhpStorm now has command auto-completion for Laravel, Symfony, WordPress, and Composer, and GoLand supports the latest Go features and method refactoring.

CLion’s updates include new features with the ReSharper C++ language engine, remote development via SSH, and collaborative development tools. WebStorm now lets you run and debug TypeScript files directly and supports frameworks like Next.js.

DataGrip users will find that the AI Assistant can help improve SQL queries by attaching a database schema for context. Aqua adds Playwright support for Python and Java, while RubyMine supports Hotwire Stimulus and completion for Kamal configuration files.

Finally, Rider has introduced LLM-powered single-line code completion for various languages and support for .NET 9 Preview SDK and C# 13 features.

JetBrains’ 2024.2 updates are packed with features that make coding less of a hair-pulling experience and more of a smooth, brainy ride. These awesome updates will be available soon, the company says.

I know you know this, but JIC: Prague-based JetBrains makes a lineup of more than 30 intelligent software development tools, including the popular IntelliJ IDEA IDE for Java developers and PyCharm for Python devs. The company is also the creator of Kotlin, a popular cross-platform, statically typed, general-purpose high-level programming language with type inference. The company's tools are used by more than 11.4 million professionals and 88 of the Fortune Global Top 100 companies.

Posted by John K. Waters on August 6, 2024


The Hidden Vulnerability in Your Software Supply Chain

I read a lot of industry reports based on surveys of one group or another, mostly developers, but it's not often I lay my eyes on one that makes me laugh and shudder at the same time.

The report, "Know the Enemy: What Execs Need to Understand to Secure their Software Supply Chain," was sent to me by the folks at JFrog, best known for Artifactory, a universal DevOps solution for hosting, managing, and distributing binaries and artifacts, but currently billed more expansively as a universal software supply chain platform for DevOps, Security, and MLOps. The report organizes the findings of a global survey of C-level and senior executives, managers, and individual contributors (analysts, specialists, developers, programmers, engineers, etc.) conducted by Atomik Research on behalf of the company.

Here's the funny bit: The research revealed "significant disconnects between senior executives/managers and developers regarding enterprise application security." No! Really? That execs and devs have divergent views on the state of their organizations' security is a hilarious understatement.

Now, the scary part: According to the researchers, malicious actors see the software supply chain (SSC) as the new "soft target," because there are fewer protections in place than in other enterprise systems. They support this conclusion with a troubling statistic: Nearly a quarter of respondents (23%) to a June 2023 survey said their organization experienced some type of SSC breach, which is an increase of 241% from 2022.
Perhaps even scarier, less than a third of respondents (30%) indicated that a vulnerable software supply chain was a top security gap.

This lack of alignment and communication between decision-makers and the teams implementing security protocols is exacerbated, the survey suggests, by the diversity of programming languages in use and by the integration of AI and machine learning (ML) models into software. More than half of the surveyed organizations use four to nine different programming languages, and a third use more than ten. This variety not only broadens the attack surface but also challenges the ability to maintain consistent security standards.

The report further quantified the disconnect between senior execs and the devs on the ground when it comes to open-source security: While 92% of the responding executives believe their companies have measures in place to detect malicious open-source packages, only 70% of responding developers agree.

The solution might seem obvious: Talk to each other! But the researchers were a bit more specific in their recommendation: Companies should take steps now to adopt a comprehensive, end-to-end application security platform. This platform would unify security practices across the software development lifecycle, integrating automated scanning tools to identify vulnerabilities, unauthorized changes, and compliance issues. And almost as important, it would foster a culture of security awareness and collaboration across all levels of the organization.

The rapid adoption of AI/ML technologies, particularly in regions like the United States, underscores the urgency for robust SSC security frameworks, the report's authors conclude. Executives must recognize that the future of their company hinges, not only on innovation, but also on the resilience and trustworthiness of their software ecosystems.

In the end, securing the software supply chain is not just a technical challenge; it's a strategic imperative. As the lines between development and security blur, the need for cohesive, proactive measures becomes ever more critical. For businesses striving to stay ahead in a competitive market, the message is clear: secure your software supply chain, or risk becoming the next headline in a data breach story.

The folks at JFrog are hosting a webinar focused on the findings of this report on August 20, with Paul Davis, JFrog field CISO, and Aran Azarzar, JFrog's Chief Information Officer. You can register here.

Posted by John K. Waters on July 24, 2024


JetBrains Launches Self-Hosted Version of Qodana

Software development tools maker JetBrains has announced the availability of a self-hosted version of its Qodana code quality platform. An extension of the cloud version launched last summer, this release is also based on the static code analysis engine of JetBrains' IDEs. The platform supports native integration with both those IDEs and VS Code, allowing developers to build quality gates in any CI environment, which helps to enforce coding standards enterprise-wide.

To state the obvious, code quality platforms are tools designed to evaluate the quality of a developer's code. They provide a general assessment of the effectiveness, reliability, and maintainability of the code, as well as how well it adheres to established coding standards. High-quality code is more readable, comprehensible, and modifiable, which reduces the likelihood of errors and enhances its adaptability to changes.

With Qodana, developers can identify issues as a part of their CI/CD pipelines and resolve them from within their IDEs, ensuring the code aligns with established quality standards. This is a time-saving feature meant to enhance overall code quality and reduce the risk of security failures and production issues while accelerating the delivery of new functionality.

Since the company launched the cloud version of Qodana last year, JetBrains has been bombarded with requests for a self-hosted version, Valerie Kuzmina, Product Marketing Manager in JetBrains’ Qodana and IDE Services group, said in a blog post.

"With Qodana, we are on a mission to create an exceptional experience for development teams, making the entire journey – from setup to result analysis and fixes – easier and more enjoyable, increasing the adoption of server-side analysis," Kuzmina wrote.

The Qodana platform was developed to address a number of factors that contribute to the low adoption of static code analysis tools among developers, Kuzmina explained, which poses risks to product quality. Server-side analysis results are either ignored or, at best, grudgingly tolerated, she wrote, because of the number of false positives, conflicts with IDE inspections, misaligned code quality guidelines, convoluted setups, an inability to fix issues quickly, and outdated UIs, all of which add up to what she called a "suboptimal developer experience."

"Following successful Beta tests with some of our clients," she wrote, "we're now launching the first release of Qodana Self-Hosted, allowing you to manage, maintain, and upgrade Qodana entirely on your end."

Currently, Qodana Self-Hosted supports Amazon Web Services (AWS). Additional hosting options will be added in future versions, the company says. If you're interested, you can request a demo here.

Prague-based JetBrains makes a lineup of more than 30 intelligent software development tools, including the popular IntelliJ IDEA IDE for Java developers and PyCharm for Python devs. The company is also the creator of Kotlin, a popular cross-platform, statically typed, general-purpose high-level programming language with type inference. The company's tools are used by more than 11.4 million professionals and 88 of the Fortune Global Top 100 companies.

Posted by John K. Waters on July 10, 2024


Qt Group and LG Electronics Team Up to Revolutionize In-Car Entertainment

When I hear the word "infotainment," I automatically think of TV shows like "Animal Planet" or "The Daily Show." But it's also a term of art in the auto industry referring to in-car systems that combine entertainment, such as radio and music, with driving information, such as navigation. Modern in-vehicle infotainment systems connect with smart automotive technologies, such as Advanced Driver Assistance Systems (ADAS) and Vehicle-to-Everything (V2X) technology, which use sensors, cameras, and wireless connectivity to allow cars to connect to and communicate with their drivers and surroundings.

The entire automotive industry is developing technologies to enable better connectivity solutions, improve vehicle safety, and enhance the "in-vehicle user-experience," so the announcement that Qt Group and LG Electronics (LG) are collaborating to embed the Qt software framework within LG’s webOS-based in-vehicle entertainment platform, ACP, was not surprising. This partnership aims to equip automotive OEM developers and designers with the tools needed to create cutting-edge, immersive content-streaming services for vehicles.

This new initiative leverages Qt’s existing support for LG’s highly customizable, open-source webOS, which has been a staple in consumer electronics like smart TVs, signage, smart monitors, and home appliances. Historically, LG has utilized the Qt framework to develop user-friendly interfaces and intuitive user experiences. Now, the focus shifts to LG’s ACP, a platform specifically designed for enhancing the in-car content-streaming experience.

The collaboration with Qt is set to play a pivotal role in the continued evolution of this automotive content platform as it is integrated into more brands’ infotainment systems. Qt is a cross-platform application development framework for desktop, embedded and mobile. Its robust out-of-the-box features accelerate development processes, offering faster boot times, enhanced performance, and efficient memory usage, thus ensuring reliable and powerful capabilities.

"The development of advanced software is crucial for enhancing in-vehicle experiences, and the partnership between LG and Qt will increase our capabilities in this all-important area of mobility innovation," said Sang-yong Lee, senior VP of R&D at LG, in a statement. "LG will continue to collaborate with innovative partners like Qt to create immersive in-cabin experiences that meet the diverse demands of automakers and their customers."

This announcement coincides with new market research projections, predicting that the global infotainment market will reach USD 35.4 billion by 2030. More broadly, software-defined vehicles are expected to generate more than $650 billion in value for the auto industry by the same year. To support this growth, Qt has recently expanded access to its design and development tools for automotive brands such as General Motors and Mercedes-Benz. Earlier in 2024, Qt’s human-machine interface development platform was also added to the AWS Marketplace.

"LG has been a trusted Qt partner and leader in infotainment innovation for years, so we’re excited to help them enhance immersive in-car experiences," said Juha Varelius, CEO of Qt Group. "There’s a big ecosystem of developers making web-based applications for cars, but with Qt integrated into LG’s ACP powered by webOS, they can more easily build and run these applications natively within the OS. Most automotive players already have Qt-based assets in their software, and this partnership marks another significant milestone for us in the industry."

Helsinki-based Qt Group’s suite of tools for designing, developing, and ensuring product quality aims to foster closer alignment between developers and designers. These tools were created to streamline workflows, enabling concurrent work within the same framework, and are particularly suited for cross-platform development, especially for low-powered and embedded devices.

This partnership between Qt and LG represents a significant step forward in the infotainment space, promising to deliver more innovative and engaging experiences for drivers and passengers alike. But the real message is that developers will have the tools they need to leverage their skillsets as demand increases for so-called modern in-car experiences.

Posted by John K. Waters on June 26, 2024


Eclipse Foundation Announces New Release of Eclipse Temurin Java SE Runtime

The folks at the Eclipse Foundation, in collaboration with the Adoptium Working Group, recently unveiled the latest release of Eclipse Temurin, the working group's OpenJDK distribution. This is the largest release to date, with support for 54 version/platform combinations and five major OpenJDK versions, highlighting a commitment to diverse and comprehensive builds across Linux, Mac, Windows, and various architectures, including x64, ARM, and RISC-V.

"The incredible growth of Eclipse Temurin reflects a strong demand among developers for secure, high-quality, and community-driven open-source Java runtimes," said Thabang Mashologu, vice president of Community and Outreach for the Eclipse Foundation, in a statement. "The Adoptium Working Group’s efforts have been instrumental in delivering enterprise-ready runtime binaries and expanding the potential use cases for open-source Java. Eclipse Temurin is one of the first open-source Java distributions to support RISC-V, introducing new opportunities for Java in Industrial IoT and beyond."

The Eclipse Foundation is one of the world’s largest open-source software foundations. The Adoptium Working Group, which is the successor to AdoptOpenJDK, promotes and supports high-quality, TCK certified runtimes and associated technology for use across the Java ecosystem. Since it was established back in 2021, Adoptium has become the leading provider of high-quality OpenJDK-based binaries.

The list of key updates and developments in this Temurin release includes:

Unprecedented Growth and Adoption: "Growth" was a key word in this announcement. Eclipse Temurin is currently the fastest-growing open-source Java SE runtime, with more than 23 million downloads per month and more than 380 million downloads to date. According to a recent report by New Relic (New Relic, State of the Java Ecosystem, April 2024), Temurin has experienced 50% year-over-year growth, now representing 18% of the Java market as the second most popular JDK vendor.

Security Enhancements: Eclipse Temurin is pioneering software supply chain security practices, with nominated platform builds independently verified and inclusive of a comprehensive software bill of materials. The Foundation published a case study that underscores this commitment.

RISC-V Support: The new release supports RISC-V microprocessors, expanding its applications to embedded technologies, IoT, machine learning, automotive software, and high-performance computing.

The stats cited in that New Relic study are well worth noting:

In 2020, Oracle was the most popular JDK vendor, comprising roughly 75% of the Java market. There was a noticeable movement away from Oracle binaries after the more restrictive licensing of its JDK 11 distribution (before the return to a more open stance with Java 17), and we’ve seen a steady decline year-over-year (YoY) ever since then. While Oracle retained the top spot in 2022 (34%), it slipped to 29% in 2023, and it’s now at 21%—which represents a 28% decrease in one year.

The use of Amazon increased to 31% of the market in 2023 (up from 2.2% in 2020 and 22% in 2022), but has dropped to 18% in 2024, which represents a 43% decrease YoY.

The rising star this year is Eclipse Adoptium, adoption of which rose 50% YoY from 12% to 18%. Because Eclipse Adoptium is community-managed, this JDK tends to be updated more frequently than the Oracle and Amazon JDKs.

Eclipse Temurin is currently available for a wide range of platforms and Java SE versions. Multiple commercial support options are available for Temurin, with enterprise-grade support provided by members of the Adoptium Working Group, including Azul Systems, IBM, Open Elements, and Red Hat.

Posted by John K. Waters on June 4, 2024


Java 22 Packs a Punch with 12 JEPs and Support for GenAI

Last month, Oracle dropped Java 22, adding a fresh batch of performance, stability, and security features to the venerable programming platform. This latest iteration introduces 12 JDK Enhancement Proposals (JEPs) aimed at refining everything from the Java language to its array of development tools. Though not a long-term support (LTS) release (the next LTS will be Java 25), this release is a significant upgrade that includes new features focused on better enabling the use of Java for building AI applications.

Under the hood, Oracle is delivering language improvements from OpenJDK Project Amber, enhancements from Project Panama, features related to Project Loom, core libraries and tools capabilities, and performance updates.

"The new enhancements in Java 22 enable more developers to quickly and easily build and deliver feature-rich, scalable, and secure applications to help organizations across the globe grow their businesses," said Georges Saab, senior vice president in Oracle's Java Platform group and chair of the OpenJDK governing board, in a statement. "By delivering enhancements that streamline application development and extend Java's reach to make it accessible to developers of all proficiency levels, Java 22 will help drive the creation of a wide range of new applications and services for organizations and developers alike."

Among the stand-out features in this release, noted Sharat Chander, senior director of product management for the Java Platform at Oracle, is JEP 463, "Implicitly Declared Classes and Instance Main Methods," which provides a kind of on-ramp for the next generation of developers to become familiar with Java and, eventually, active users of it.

"I'm very excited about  JEP 463," Chander told me. "It sort of welcomes a whole new generation of developers that might think Java is outdated. Of course, it's far from [that], based on all the innovation that have been coming since we started this six-month release cadence."

"The original founders of this entire concept and technology realized that you have to build a user base and a community to have something that's active," he added, "and you see a lot of modern languages, platforms, and solutions that have now realized that the secret sauce is to emulate what Java has done. For us, this is paramount to [Java's] success."

Arnal Dayaratna, research vice president in IDC's software development group, underscored Java's "versatility and comprehensive toolset," which enables it to support the development of production-grade, mission-critical applications at scale. This "positions it as a key enabling technology for innovative use cases such as generative AI."

"After nearly three decades, Java's ability to support complex development tasks that span a wide range of use cases makes the platform as relevant as it has ever been," Dayaratna said in a statement. "

Java 22 includes a number of features in preview, which Saab told me reflects the benefits of the six-month release cadence Oracle continues to maintain.

"In the old days, it was always a problem getting feedback on features," he said. "We would work on something and have early access builds, which a few passionate people would download and try. But people have day jobs! Most just weren't going to download something, try it, and give feedback. And if they did, it was probably too late to do anything about it. Now, the whole intent of the preview feature is put it in something that people are downloading anyway and make it super easy to try it and give us feedback on it."

The list of updates delivered in Java 22 includes:

  • JEP 447: Statements before super(…) (Preview): Gives developers the freedom to express the behavior of constructors. By allowing statements that do not reference the instance being created to appear before an explicit constructor invocation, this feature enables a more natural placement of logic that needs to be factored into auxiliary static methods, auxiliary intermediate constructors, or constructor arguments. It also preserves the existing assurance that constructors run in top-down order during class instantiation, helping ensure that code in a subclass constructor cannot interfere with superclass instantiation. In addition, this feature does not require any changes to the Java Virtual Machine (JVM) and relies only on the current ability of the JVM to verify and execute code that appears before explicit constructor invocations within constructors. (A short sketch follows this list.)
  • JEP 456: Unnamed Variables & Patterns: Helps improve developer productivity by enhancing the Java language with unnamed variables and patterns, which can be used when variable declarations or nested patterns are required but never used. This reduces opportunities for error, improves the readability of record patterns, and increases the maintainability of all code.
  • JEP 459: String Templates (Second Preview): Simplifies the development of Java programs by making it easy to express strings that include values computed at run time, while also improving the security of programs that compose strings from user-provided values and pass them to other systems. Additionally, the readability of expressions mixed with text is enhanced, and non-string values computed from literal text and embedded expressions can be created without having to transit through an intermediate string representation.
  • JEP 463: Implicitly Declared Classes and Instance Main Methods (Second Preview): Helps accelerate learning by offering a smooth on-ramp to Java programming to enable students to write their first programs without needing to understand language features designed for large programs. With this feature, educators can introduce concepts in a gradual manner and students can write streamlined declarations for single-class programs and seamlessly expand their programs to use more advanced features as their skills grow.
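
To make that on-ramp concrete, here is a minimal sketch of an implicitly declared class with an instance main method; it also sneaks in an unnamed variable from JEP 456. The file name and the toy tally logic are mine, not Oracle's, and the implicit-class feature still needs the preview flag in Java 22 (for example, java --enable-preview --source 22 Tally.java).

    // Tally.java: no class declaration, no String[] args, no static -- just main().
    void main() {
        var scores = java.util.List.of(3, 5, 8, 13);

        int count = 0;
        for (int _ : scores) {   // JEP 456: an unnamed variable; only the count matters here
            count++;
        }

        System.out.println("Counted " + count + " scores");  // Counted 4 scores
    }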
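
And circling back to the first item above, here is a rough sketch of what a statement before super(...) looks like. It is loosely modeled on the argument-validation case the JEP itself describes; the class name is mine, and the code needs --enable-preview on Java 22.

    import java.math.BigInteger;

    // JEP 447 (preview): validate a constructor argument before calling super(...),
    // which previously had to be the very first statement.
    public class PositiveBigInteger extends BigInteger {
        public PositiveBigInteger(long value) {
            if (value <= 0) {                        // statement before the explicit super(...) call
                throw new IllegalArgumentException("value must be positive: " + value);
            }
            super(Long.toString(value));             // explicit superclass constructor invocation
        }

        public static void main(String[] args) {
            System.out.println(new PositiveBigInteger(42));   // prints 42
            // new PositiveBigInteger(-1) would throw before BigInteger construction begins
        }
    }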

Project Loom Features

  • JEP 462: Structured Concurrency (Second Preview): Helps developers streamline error handling and cancellation and enhance observability by introducing an API for structured concurrency. This helps promote a style of concurrent programming that can eliminate common risks arising from cancellation and shutdown – such as thread leaks and cancellation delays – and improves the observability of concurrent code. (A sketch follows this list.)
  • JEP 464: Scoped Values (Second Preview): Helps increase ease-of-use, comprehensibility, performance, and robustness of developers' projects by enabling the sharing of immutable data within and across threads.
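
Here is a rough sketch of the structured concurrency API in action. The ProfileLoader class and its fetchUser/fetchQuote placeholders are hypothetical stand-ins for real blocking calls; the fork/join/throwIfFailed shape comes from the preview API, which may still change, and the code needs --enable-preview on Java 22.

    import java.util.concurrent.ExecutionException;
    import java.util.concurrent.StructuredTaskScope;

    public class ProfileLoader {

        record Profile(String user, String quote) {}

        static Profile load() throws InterruptedException, ExecutionException {
            try (var scope = new StructuredTaskScope.ShutdownOnFailure()) {
                var user  = scope.fork(ProfileLoader::fetchUser);   // subtask 1
                var quote = scope.fork(ProfileLoader::fetchQuote);  // subtask 2

                scope.join()            // wait for both subtasks
                     .throwIfFailed();  // propagate the first failure, cancelling the sibling

                return new Profile(user.get(), quote.get());
            }   // leaving the scope guarantees both subtasks are finished: no thread leaks
        }

        static String fetchUser()  { return "user-42"; }     // placeholder for a real call
        static String fetchQuote() { return "carpe diem"; }  // placeholder for a real call

        public static void main(String[] args) throws Exception {
            System.out.println(load());
        }
    }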

Project Panama Features

  • JEP 454: Foreign Function & Memory API: Increases ease-of-use, flexibility, safety, and performance for developers by introducing an API to enable Java programs to interoperate with code and data outside of the Java runtime. By efficiently invoking foreign functions (i.e., code outside the Java Virtual Machine) and by safely accessing foreign memory (i.e., memory not managed by the JVM), the new API allows Java programs to call native libraries and process native data without requiring the Java Native Interface. (A sketch follows this list.)
  • JEP 460: Vector API (Seventh Incubator): Enables developers to achieve performance superior to equivalent scalar computations by introducing an API to express vector computations that reliably compile at runtime to vector instructions on supported CPU architectures.  
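
To give a feel for the new Foreign Function & Memory API, here is a small sketch that calls the C library's strlen without any JNI glue. It assumes a typical 64-bit platform where size_t maps to JAVA_LONG and where the linker's default lookup can resolve strlen.

    import java.lang.foreign.Arena;
    import java.lang.foreign.FunctionDescriptor;
    import java.lang.foreign.Linker;
    import java.lang.foreign.MemorySegment;
    import java.lang.foreign.ValueLayout;
    import java.lang.invoke.MethodHandle;

    public class StrlenDemo {
        public static void main(String[] args) throws Throwable {
            Linker linker = Linker.nativeLinker();
            MethodHandle strlen = linker.downcallHandle(
                    linker.defaultLookup().find("strlen").orElseThrow(),
                    FunctionDescriptor.of(ValueLayout.JAVA_LONG, ValueLayout.ADDRESS));

            try (Arena arena = Arena.ofConfined()) {                           // off-heap memory, freed on close
                MemorySegment cString = arena.allocateFrom("Hello, Panama!");  // NUL-terminated native copy
                long len = (long) strlen.invokeExact(cString);
                System.out.println("strlen = " + len);                         // strlen = 14
            }
        }
    }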

Core Libraries & Tools Features

  • JEP 457: Class-File API (Preview): Helps developers improve productivity by providing a standard API for parsing, generating, and transforming Java class files.
  • JEP 458: Launch Multi-File Source-Code Programs: Enables developers to choose whether and when to configure a build tool by enhancing the Java application launcher to enable it to run a program supplied as multiple files of Java source code.
  • JEP 461: Stream Gatherers (Preview): Helps developers improve productivity by enhancing the Stream API to support custom intermediate operations, which will allow stream pipelines to transform data in ways that are not easily achievable with the existing built-in intermediate operations. By making stream pipelines more flexible and expressive and allowing custom intermediate operations to manipulate streams of infinite size, this feature enables developers to become more efficient in reading, writing, and maintaining Java code.
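
For a taste of what a gatherer buys you, here is a minimal sketch using the built-in windowFixed gatherer to chunk a stream into fixed-size groups, which is awkward to express with the classic intermediate operations. It is preview code, so it needs --enable-preview on Java 22.

    import java.util.List;
    import java.util.stream.Gatherers;
    import java.util.stream.Stream;

    public class GatherersDemo {
        public static void main(String[] args) {
            List<List<Integer>> windows = Stream.of(1, 2, 3, 4, 5, 6, 7)
                    .gather(Gatherers.windowFixed(3))   // group into chunks of three
                    .toList();
            System.out.println(windows);                // [[1, 2, 3], [4, 5, 6], [7]]
        }
    }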

Performance Updates

  • JEP 423: Region Pinning for G1: Helps reduce latency by allowing some garbage collection to happen during some native library calls that would have otherwise needed to pause the collector. This is achieved by tracking which objects need to be blocked during these native library calls and "pinning" just the regions that contain these objects. This allows garbage collection to continue normally in unpinned regions, even during what would have otherwise been a blocking native library call.

 

Posted by John K. Waters on April 10, 2024


A Prompt by Any Other Name: IBM's Watsonx Gets a Generative AI Enhancement

When I first began using the term "prompt engineering" last year, I thought the eye rolling would knock the planet off its axis. I got a similar reaction a dozen years earlier when I proposed writing a book on "social media" to an east coast publisher. And don't get me started on the initial feedback on "the cloud."

Technology nomenclature is a writhing beast, and prompt engineering hit the zeitgeist like a breaching humpback soaking eager whale watchers. This discipline, essentially undifferentiated before the precipitous rise of ChatGPT and other advanced machine learning large language models (LLMs) we're calling "AI," is now commanding a salary range of between $250k and $375k USD, according to Forbes.

All of which is a slightly self-aggrandizing way of getting to the news that IBM is set to integrate a prompt tuner into a component of its watsonx enterprise AI and data platform.

Big Blue created the aptly named "Tuning Studio" to help users write better prompts for generative AI. It will be included in the watsonx.ai component of the platform. As the name implies, organizations will be able to use it to "tune" their foundation models with labeled data for better performance and accuracy.

According to the HR software provider Workable, a prompt engineer specializes in designing, developing and refining AI-generated text prompts, optimizing prompt performance, and improving the AI prompt generation process for a range of applications. (Exactly how "engineer" got tacked onto the job of creating input instructions for genAI engines is beyond me. Like I said, writhing beast.)

IBM's watsonx is an enterprise-focused AI platform the company distinguishes from the generative AI used for "entertainment," such as writing song lyrics or seeing how a version of your wedding vows would sound if written by Hunter S. Thompson. The company debuted the platform in July of this year with three components:

  • watsonx.ai: This new studio for foundation models, generative AI and machine learning can help organizations train, validate, tune, and deploy foundation and machine learning models.
  • watsonx.data: This is for scaling AI workloads, for all data, anywhere, with a fit-for-purpose data store built on an open lakehouse architecture.
  • watsonx.governance: This enables responsibility, transparency and explainability in data and AI workflows, helping organizations to direct, manage and monitor their AI activities.

The watsonx.ai component will get Tuning Studio in the third quarter of this year, the company says. The other two components of the platform will also receive some upgrades:

  • watsonx.data: Planned generative AI capabilities in watsonx.data will help users discover, augment, visualize and refine data for AI through a self-service experience powered by a conversational, natural language interface. The company plans to issue a tech preview in the fourth quarter of this year. It also plans to integrate a vector database capability into watsonx.data to support watsonx.ai retrieval augmented generation use cases, again in a tech preview in the fourth quarter.
  • watsonx.governance: Model risk governance for generative AI: This is yet another tech preview, in which clients can explore automated collection and documentation of foundation model details as well as model risk governance capabilities. IBM said these help stakeholders view relevant metrics in dashboards of their enterprise-wide AI workflows, with approvals that keep humans engaged at the right times.

IBM is also enhancing the watsonx platform with some AI assistants to help users with things like application modernization, customer care, and human resources. And the company plans to embed watsonx.ai tech across its hybrid cloud software and infrastructure products.

Is prompt engineering a "game-changing skill," as some feverish tech reporters have suggested, or will it fizzle as more specialty tools like Tuning Studio emerge? I suspect that both are true... sort of. Generative AI is already changing the way developers work. GitHub Copilot and Amazon's CodeWhisperer are just two examples of a type of AI-supported coding assistant that is certain to become ubiquitous. And the ability to develop and refine AI-generated text for modern applications and systems is likely to find its way into a lot of developer toolboxes.

Posted by John K. Waters on October 9, 2023