Spring Authorization Server Set for November GA

The Spring Security team says it will release version 1.0 of its long-in-the-works Spring Authorization Server in November of this year.

The new authorization framework, which was announced in April 2020, provides implementations of the OAuth 2.1 and OpenID Connect 1.0 specifications and other related specs. It's built on top of Spring Security, which is a highly customizable authentication and access-control framework. The result, say the project's leaders, is a secure, lightweight, and customizable foundation for building OpenID Connect 1.0 Identity Providers and OAuth2 Authorization Server products.
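To give a flavor of what that foundation looks like in code, here is a minimal sketch of registering an OAuth2 client with the framework. This is illustrative only, not the project's official quick-start: it assumes the spring-security-oauth2-authorization-server dependency is on the classpath, and the client ID, secret, and redirect URI are invented placeholder values.

```java
import java.util.UUID;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.security.oauth2.core.AuthorizationGrantType;
import org.springframework.security.oauth2.core.ClientAuthenticationMethod;
import org.springframework.security.oauth2.server.authorization.client.InMemoryRegisteredClientRepository;
import org.springframework.security.oauth2.server.authorization.client.RegisteredClient;
import org.springframework.security.oauth2.server.authorization.client.RegisteredClientRepository;

@Configuration
public class AuthServerConfig {

    // Registers a single confidential client in memory. A real deployment
    // would use a persistent RegisteredClientRepository and a proper
    // PasswordEncoder instead of the {noop} prefix.
    @Bean
    public RegisteredClientRepository registeredClientRepository() {
        RegisteredClient client = RegisteredClient.withId(UUID.randomUUID().toString())
                .clientId("demo-client")                                   // placeholder
                .clientSecret("{noop}demo-secret")                         // placeholder
                .clientAuthenticationMethod(ClientAuthenticationMethod.CLIENT_SECRET_BASIC)
                .authorizationGrantType(AuthorizationGrantType.AUTHORIZATION_CODE)
                .redirectUri("http://127.0.0.1:8080/login/oauth2/code/demo-client")
                .build();
        return new InMemoryRegisteredClientRepository(client);
    }
}
```

From there, the framework's sensible defaults stand up the token, authorization, and OpenID Connect discovery endpoints on top of Spring Security's filter chain.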

This version of the framework will come with a full feature set (it's a long list), and the APIs have stabilized and matured since the project was launched, said Joe Grandja, Spring Security senior engineer, in a blog post. "A lot of effort and care was put into this project to ensure that it can grow and adapt over the next few years," he wrote.

Spring Authorization Server 1.0 will be based on Spring Security 6.0, which will be based on Spring Framework 6.0 (it takes a village). It will require a minimum of Java 17 at runtime, as well as a minimum of Tomcat 10 or Jetty 11 (for Jakarta EE 9 compatibility). Also, this release will inherit the VMware Tanzu OSS support policy. Commercial support, which offers an extended support period, is also available from VMware.

When the project was first announced, the team was careful to give credit where credit was due regarding the projects Spring Authorization Server would effectively be replacing:

"Almost a decade ago, we brought in a community-driven, open-source project, Spring Security OAuth, and made it part of the Spring portfolio of projects," Rob Winch, Spring Security project lead, wrote in a blog post at the time. "Since its inception, it has evolved into a mature project that supports a large portion of the OAuth specification, including resource servers, clients, login, and the authorization server. It is no wonder that it has become the basis for UAA, which, among other things, acts as the identity management service for all Cloud Foundry installations. The Spring Security OAuth project has become a model project and is a testament to what our wonderful community can accomplish."

The need for the new framework emerged gradually. As Winch explained, the original support for the OAuth open-standard authorization protocol was provided very early, and the team could not have anticipated the myriad ways in which it would need to be used. With the new framework, the team was able to address the needs of the entire Spring portfolio and provide a single cohesive OAuth library, Winch explained.

The Spring Security team has posted the release schedule for the Spring Authorization Server on GitHub.

"Over the next couple of months, we will focus on fine-tuning the public APIs and enhancing the configuration model to allow for easier configuration and greater extensibility," Grandja said. "We will also make some minor API changes, resulting in breaking changes, which may require updates to consuming applications."

The Spring Framework continues to be one of the most popular programming and configuration models for building modern Java-based enterprise applications on any type of deployment platform. It's an open-source, layered Java/J2EE framework based on code published in SpringSource founder Rod Johnson's book Expert One-on-One Java EE Design and Development (Wrox Press, October 2002).

Posted by John K. Waters on August 22, 2022

Microsoft Joins Eclipse Jakarta EE and MicroProfile Working Groups

Microsoft boosted its support for Java developers yet again this week by expanding its participation in the Eclipse Foundation to include memberships in two working groups: the Jakarta EE Working Group, which focuses on the overall evolution of enterprise Java, and the MicroProfile Working Group, which focuses on optimizing enterprise Java for a microservices architecture.

"Our goal is to help advance these technologies to deliver better outcomes for our Java customers and the broader community," said Julia Liuson, president of Microsoft's Developer Division, in a blog post. "We’re committed to the health and well-being of the vibrant Java ecosystem, including Spring (Spring utilizes several key Jakarta EE technologies)."

Joining these working groups complements the company's participation in the Java Community Process (JCP) "to help advance Java SE," Liuson said, adding, "We believe our experience with running Java workloads in the cloud will be valuable to the working groups, and we look forward to building a strong future for Java together with our customers, partners, and the community."

Eclipse working groups provide the governance structure for Eclipse projects, making it possible for organizations—even competitors—to collaborate on new technology development. The working groups provide a set of basic services, including intellectual property management and licensing, development processes, IT infrastructure, and ecosystem development.

Microsoft has been a member of the Eclipse Foundation since 2016, when it joined as a Solutions Member. The company became a Strategic Member in 2021. Among other privileges, Strategic Members have seats on the foundation's board of directors and its architecture council, as well as expanded board voting rights on key aspects of the Eclipse ecosystem, including licensing, governing policy development, and amendments to membership agreements and bylaws.

"Microsoft has warmly embraced all things Java across its product and service portfolio, particularly Azure," said the foundation's executive director, Mike Milinkovich, in a statement. "Its enterprise customers can be confident that they will be actively participating in the further evolution of the Jakarta EE specifications, which are defining enterprise Java for today's cloud-native world."

Microsoft has been investing in its support for Java and related technologies for a number of years, including Jakarta EE, MicroProfile, and Spring technologies on Azure in collaboration with its strategic partners. With Red Hat, for example, the company built a managed service for JBoss EAP on the Azure App Service, Liuson noted. Redmond is also collaborating with Red Hat to enable solutions for JBoss EAP on Virtual Machines (VMs) and Azure Red Hat OpenShift (ARO). Working with VMware, Microsoft jointly develops and supports Azure Spring Apps, a fully managed service for Spring Boot applications. And with Oracle and IBM, the company has been building solutions for customers to run WebLogic and WebSphere Liberty/Open Liberty on VMs, Azure Kubernetes Service, and ARO (WebSphere).

"It is great to see Microsoft officially join both MicroProfile and Jakarta EE, as they'd been informally involved in these efforts for a long time," said Mark Little, vice president of the Software Engineering group at Red Hat, in a statement. "I hope to see Microsoft's participation bring experience from their many users and partners who have developed and deployed enterprise Java applications on Azure for several years."

The Eclipse Foundation released the first Jakarta EE specification in August 2019, almost exactly two years after Oracle declared its intention to transfer the responsibility for enterprise Java to that open-source standards organization.

Posted by John K. Waters on July 14, 2022

Fifth Annual Call for Code Challenges Devs to Use their Powers for Good

Organizers of the fifth annual Call for Code Global Challenge have launched their annual invitation to software developers from around the world to create open-source solutions that accelerate sustainability and combat climate change.

Given the growing animus toward so-called Big Tech in some quarters and what I think can fairly be described as generalized opposition to technological innovation, it’s never been more important to remind the world that tech can be an incredibly powerful force for good. The annual Call for Code has grown since the first challenge was announced to become one of the world’s largest “tech for good” programs. It now attracts developers from 180 countries responding to this clarion call to use advanced technologies to design cutting-edge open source-powered hybrid cloud and AI solutions that can tackle the world’s most pressing societal issues.

There’s a refreshing idealism in this program. Call for Code participants are invited to identify the particular sustainability issue they want to solve, form a team, and start building by registering on the new Global Challenge resource site hosted by BeMyApp. Once they’ve registered, participants will be able to attend Challenge Accelerator events to help fast-track their projects, learn from subject matter experts, access exclusive skills-building materials, and use exclusive toolkits, APIs, and data sets from The Weather Company and participating IBM Ecosystem partners.

But that idealism is undergirded by a pragmatic understanding that we need technology to address problems that are having a global impact. Ruth O. Davis, director of the Call for Code Challenge in IBM’s Worldwide Ecosystems group, put it succinctly in a press release, “Technology is the catalyst for scaling solutions to global problems,” she said, “from climate change to humanitarian issues, and even the global pandemic.”

“Of course, the people who participate in the Challenge are idealists in some ways,” Davis told me in an interview. “They’re very passionate about what they’re doing and want what they’re doing to make a difference. But they also know they need resources to make that happen.”

The awards to the winners of this year’s competition are commensurate with the stakes (you know, saving the world). The Grand Prize is $200,000 plus solution implementation support from IBM Ecosystem partners. The first runner-up gets $25,000, and the third and fourth runners-up get $10,000. It’s big money focused on solving big problems.

But even those participants who don’t manage to nab the brass ring have access to some incredible resources while they develop their ideas. They get a trial IBM Cloud account for 2022 that provides access to many free services without a credit card, including the ability to create Kubernetes clusters. They have access to toolkits, APIs, and data sets from Call for Code sponsors. And there are expert webinars, skill-building plans, and even mentors available.

Among the most exciting components of this program are the Challenge Accelerator events. Each Accelerator is a roughly two-week competition designed to help fast-track participants’ projects towards submission to the Global Challenge. (Global Challenge submission is not required). Each participant builds a project to address a specific and targeted use case​ under the theme of “Sustainability.” Each Accelerator is different; some may include technical workshops, mentoring, and additional educational content. And participants may be eligible for additional prizes.

College students will also have the opportunity to compete for the University Prize in a program created by IBM and the Clinton Global Initiative University. In 2021, more than 90,000 students across hundreds of universities around the world surpassed the program goal by nine times, the organizers said. 

David Clark, the CEO of David Clark Cause, is the original Call for Code organizer. He founded the program in 2018, and launched it with IBM, the United Nations Human Rights group, and the Linux Foundation. The list of organizations supporting Call for Code this year includes: Arrow Electronics, the Clinton Foundation and Clinton Global Initiative University, Clemson University, Esri, EY, Ingram Micro, Intuit, the Linux Foundation, Morgan Stanley, New Relic, Persistent Systems, Teach For All, United Nations Human Rights, and the United Nations Office for Disaster Risk Reduction, among others.

It's worth noting, too, that Call for Code has been selected as the preferred innovation platform of the Right Here, Right Now Global Climate Alliance, one of the largest public/private climate partnerships in the world. 

Solutions can be submitted to this year’s event any time before the deadline of October 31, 2022. You don’t need to be on an existing team to participate. The organizers will be hosting a team building session to help participants form and build teams.

Must-read information about Call for Code winners is available here.

Posted by John K. Waters on May 16, 2022

Why Should You Care About JDK 18?

The latest update of the Java Development Kit (JDK 18) goes GA next week, and though it's not a Long-Term Support (LTS) release, it does implement nine JEPs (listed here). And while it's probably also true that your organization is going to want you to wait for the LTS coming in September 2023 (JDK 21), the JEPs implemented in this release are worth a look.

I joined a Zoom presser this week with two Java mavens, Simon Ritter and Steve Poole, to talk about the latest incarnation of the JDK and what it brings to developers.

Ritter is the Deputy CTO of Azul Systems, one of the leading open-source Java development tools and runtimes providers. He's a former Head of Java Technology Evangelism at Oracle, and he's a Java champion who's been working with the language and platform for more than two decades—all the way back to his days at Sun Microsystems.

Ritter said the very fact that this release isn't a headline grabber is a demonstration that Oracle's decision to provide JDK updates on a six-month release cadence is working.

"It's a time-based release model, rather than a feature-based release model," he said. "It doesn't mean that, since JDK 17, we've had six months of development and the people at Oracle and the rest of the contributors to the JDK haven't really been very busy. They've all been getting on with things. It's just that certain features haven't got to the point where they're ready for inclusion in the JDK in this six-month cycle."

Ritter pointed to the Foreign Function & Memory API (JEP 419) as one of the more important JEPs implemented in this release, because it's one of the components being incubated under Project Panama. Those following this years-long project will know that Panama is about simplifying the process of connecting Java programs to non-Java components. This particular feature, in its second incubation iteration, introduces an API through which Java programs call native libraries and process native data without the brittleness and danger of the Java Native Interface (JNI).

"This is a big thing because it's part of Project Panama," Ritter explained. "But also, because replacing the JNI is one of those features that will really help us as Java developers, because there are lots of libraries out there not written in Java—important things, like machine learning, for example."

Poole agreed that the JNI has been the Achilles Heel of Java ever since it was created. And he should know: He was there when it happened.

Currently a developer advocate at Sonatype, a leader in the DevSecOps and repository management space, Poole has been working on Java software development kits and JVMs for 25 years—since the dawn of Java, you could say. He has also been a developer advocate at Red Hat and IBM, as well as a member of the AdoptOpenJDK group, which is now the Eclipse Adoptium project, championing community involvement in OpenJDK.

"The JNI was deliberately created to be complicated," Poole said. "But you have to look at it in context. When Java came out, there was all this legacy code people wanted to connect to. But at the time, we did not want to encourage developers to use dated languages and propagate those environments. So that's the history, but since then, Sun, IBM, Oracle, and others have spent years experimenting with different ways of getting around this JNI thing. And we have to do it. If you look at, say, Python; it can call native code really, really easily. And that makes it very valuable. I would love to see this all finally hit the streets as a good solid practical API, because it's way overdue."

The only JEP implemented in this release that actually impacts the Java language is Pattern Matching for switch (JEP 420), which was first previewed in Java 17 (this is the second preview). Its purpose is to "enhance the Java programming language with pattern matching for switch expressions and statements, along with extensions to the language of patterns."

"We've seen in the last couple of iterations of the platform the introduction of much more pattern machine," Ritter said. "We're going have pattern matching for records and arrays, and I'm sure there will be other situations where we'll use pattern matching. This is one of those things that, again, is really helping developers, because it takes some of those rough edges off the language and eliminates boilerplate code."

JEP 420 is another example of a previewed feature that's part of a larger project, in this case Project Amber, which aims to bring features to the language that can make writing Java code more readable and concise, and target specific use cases such as using generic enums or data classes.
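In its eventual finalized form (JDK 21), the feature lets a switch branch on the runtime type of a value, binding a typed pattern variable in each case with no casts, and with the compiler checking exhaustiveness over a sealed hierarchy. The Shape types below are invented purely for illustration:

```java
public class PatternSwitchDemo {

    // A small sealed hierarchy; records pair naturally with pattern matching.
    sealed interface Shape permits Circle, Square {}
    record Circle(double radius) implements Shape {}
    record Square(double side) implements Shape {}

    static double area(Shape s) {
        // Pattern matching for switch: each case tests the type and binds a
        // variable. Because Shape is sealed, no default branch is needed --
        // the compiler verifies the switch covers every permitted subtype.
        return switch (s) {
            case Circle c -> Math.PI * c.radius() * c.radius();
            case Square q -> q.side() * q.side();
        };
    }

    public static void main(String[] args) {
        System.out.println(area(new Square(3)));  // 9.0
    }
}
```

The equivalent pre-pattern-matching code would be a chain of instanceof checks and explicit casts, which is exactly the boilerplate Ritter describes being shaved away.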

"It's those little steps that are keeping things moving along quite nicely," Ritter said.

They both also pointed to JEP 421: Deprecate Finalization for Removal, which deprecates finalization for removal in a future release. Although the feature remains enabled by default for now, it can be disabled to facilitate early testing.

Finalization is a method used to perform cleanup operations on unmanaged resources held by the current object before the object is destroyed. It allows Java developers to perform "postmortem" cleanup on objects that the garbage collector has found to be unreachable. It has typically been used to reclaim native resources associated with an object.

"Let's be honest," Poole said, "two things were hacked into Java—and I do use the work 'hacked'—way back at the very beginning. One was serialization; the other was finalization. And again, finalization was added because, at the time, there were lots of resources that were in, say, C code, or database handles, and there needed to be a way to explicitly tell databases to close their resources or whatever. And we wanted to do it when Java objects were no longer needed when they were being destroyed. Also, it wasn't specified. The behavior was completely VM-specific, GC-specific, and what threading model you have—so it's a complete nightmare. Now it's finally being deprecated, and I really, really hope that by now there's nothing out there that relies on finalization, because that would be a very bad thing."

Both Poole and Ritter expect few people to use JDK 18 in production, because it's not an LTS release. The JDK 17 LTS release was much more significant, so there was something of a surge in the uptake of that release. Also, Oracle announced last year that it would begin providing an LTS every two years instead of every three, which means the next LTS release (JDK 21) will ship in September 2023.

"For this release," Poole said, "mostly we'll see people kicking the tires."

Posted by John K. Waters on March 16, 2022

Open Source Security Foundation Grows After White House Summit

It's less than two years old, but the Open Source Security Foundation (OpenSSF), a cross-industry group hosted at the Linux Foundation, is attracting an impressive (and growing) roster of members signing up to pitch in on efforts to identify and fix security vulnerabilities in open-source software (OSS), while improving everything from tooling and training to research and vulnerability disclosure practices.

This week, the OpenSSF announced that 19 new organizations have joined that effort, including Citi, Huawei Technologies, Spotify, Alibaba Cloud, and JFrog, bringing the total current membership (by my count) to 60. They're joining a group that already includes Google, Microsoft, AWS, Meta, Cisco, GitHub, Intel, Red Hat, and Snyk. (A complete list of members is available here.)

"The importance of open-source software security is well recognized by the customer, industry, and government," said Dr. Kai Chen, chief security strategist at Huawei, a new Premium Member of the OpenSSF, in a statement. "It is time for the community to take strategic, continuous, effective ,and efficient actions to advance the open-source software security posture…."

The foundation's expanding membership represents what the OpenSSF calls "cross-industry momentum," spurred at least in part by the White House Open Source Security Summit in January. The OpenSSF was there, representing hundreds of communities and projects by highlighting collective cybersecurity efforts and sharing their desire to work with the administration across public and private sectors.

Brian Behlendorf, executive director at OpenSSF, was optimistic about that meeting when I talked with him last week. He said the participants from the administration were well informed on the topic.

"They asked good questions, and we tried to make the point that the government is a major user of open-source software," he told me. "And consequently, has a vested interest in improving its consumption of that software. But also, that there are increasing amounts of code being contributed by governments, or by them through contractors, so they're effectively publishers of open-source software, actually a peer in the community. And we talked about what role they should play."

Behlendorf, who assumed his current role in October, is probably best known as a primary developer of the Apache Web Server and a founding member of the Apache Software Foundation. "We're calling this job 'general manager' to de-emphasize that title," he said. "But even that overstates it. Orchestrator, maybe? I'm really more of a circus ringmaster."

The OpenSSF combines the Linux Foundation’s Core Infrastructure Initiative (CII), an effort to improve OSS security in response to the 2014 Heartbleed bug, and the Open Source Security Coalition (OSSC), which was founded by the GitHub Security Lab to build a community to support open-source security for decades to come.

"As all industries increasingly rely upon open-source software to deliver digital experiences, it is our collective responsibility to help maintain a vibrant and secure ecosystem," said Lena Smart, chief information security officer at MongoDB, a new general member of the foundation. "You can have all the tools in the world, but at the end of the day, it is people across multiple organizations around the world working together that will ensure an expansive cybersecurity program…"

Since it was launched in August 2020, the OpenSSF has reached some important milestones across a variety of its technical initiatives, including:

Alpha-Omega Project Launch
The Alpha-Omega Project focuses on improving global OSS supply chain security by working with project maintainers to systematically look for new, as-yet-undiscovered vulnerabilities in open-source code, and get them fixed. The "Alpha" component will work with the maintainers of open-source projects to help them identify and fix security vulnerabilities and to improve their security posture. The "Omega" component aims to identify at least 10,000 widely deployed OSS projects for which it can apply automated security analysis, scoring, and remediation guidance in their open-source maintainer communities. Microsoft and Google are supporting the project with a $5 million investment.

Scorecards Increases Scans to 1 Million Projects
Scorecards is an OpenSSF project that helps open-source users understand the risks of the dependencies they consume. GitHub and Google recently announced Scorecards v4, and the project has increased the scale of its scans from 50,000 projects to one million projects identified as most critical based on their number of direct dependencies.

Sigstore Project Gains Momentum
Sigstore is a set of tools that lets developers, software maintainers, package managers, and security experts sign and verify software artifacts. A recently released project update reported nearly 500 contributors, more than 3,000 commits, and more than one million entries in Rekor.

Nearly 1,000 Codes for Free MFA Tokens
The Securing Critical Projects Working Group coordinated the distribution of nearly 1,000 codes for free multi-factor authentication (MFA) tokens donated by Google and GitHub to developers of the 100 "most critical" open-source projects. "This is a small but critical step in avoiding supply chain attacks based on stolen credentials of key developers," the foundation said in a press release.

Posted by John K. Waters on March 2, 2022

DevOps Institute Announces New Certifications, Expanded Lineup of 'Educational Experiences'

I recently had a great Tech Talk with Stephen Walters, Solution Architect at xMatters, which was recently acquired by Everbridge ("DevSecOps: Securely Navigating a Shifting Landscape"). Among his other credentials, Stephen is a DevOps Institute Ambassador, so when I saw that the Institute's lineup for 2022 events and webinars included plans for two new DevOps certifications, I just had to pass along the news.

The DevOps Institute is a professional member association and certification authority "for advancing the human elements of DevOps." Basic membership is free, and there are fees for Premium membership ($199, aimed at full- or part-time employees working in the DevOps field), Enterprise membership (based on team size), and Government membership ($99). Lots of goodies here, even for basic members, who get access to the Assessment of DevOps Capabilities (ADOC), the entire library of SKILbooks, the DevOps Institute Career Center, Perks Marketplace, and a 30% discount on exams.

The institute's goal in 2022, according to the announcement, is "to advance the humans of DevOps through skills, knowledge, ideas, and learning," which together make up its "SKIL Framework."

"In 2022, DevOps Institute continues to lead the charge toward human transformation with an exciting lineup of new and expanded opportunities for DevOps professionals," said Jayne Groll, CEO of DevOps Institute, in a statement. "As we ramp up our education and certification programs, we aim to empower the global member community with the skills and knowledge they need to further their careers and advance the DevOps initiatives at their organizations."

This, of course, is great news for anyone who believes in the potential of the DevOps model, now about 20 years old, and yet still not as fully (or effectively) embraced as it probably should be. You know, that thing that has been making it possible for developers to collaborate with operations to deploy software into production faster and with fewer errors? You know.

The list of new certifications the DevOps Institute announced includes:

  • DevSecOps Practitioner is the next level in the DevSecOps certification series. Building on DevSecOps Foundation, the Practitioner certification covers advanced DevSecOps practices and methods, architecture and infrastructure, technical implementation, practical maturity guides, and metrics to deliver better DevSecOps outcomes.
  • DevOps Engineering Foundation explains many aspects of DevOps engineering that leaders and practitioners can execute upon. An engineering approach is critical to DevOps journeys. This certification covers the foundations of knowledge, principles and practices needed to engineer a successful DevOps solution.

Learn more about the Institute's certifications here.

Under the category of "educational experiences," the Institute is adding:

  • SKILup Educational Experiences: IT professionals have always dealt with change, but never at the speed of the current digital transformation. The humans of DevOps are being asked to learn and implement new technologies at a pace that often outruns their current skill level. Upskilling has never been more important.

"SKILup Educational Experiences" are DevOps-focused events designed to provide what the institute calls "just-in-time insights" and education needed by DevOps pros in a range of disciplines. The Institute "aims to disrupt the typical technical conference format and focus on providing relevant content and learning in a safe and fun environment." These are insights attendees "can immediately put… into practice to meet the demands of business agility.

The list of SKILup Educational Experiences include:

  • SKILup Days: One-day virtual micro conferences with a singular, how-to focus. Featuring experts from the industry as well as enterprise DevOps leaders, SKILup Days include all elements of an in-person conference, including virtual sponsor booths, competitions, and networking opportunities with other attendees and speakers.
  • SKILup Hours: Educational webinars for IT professionals. Each SKILup Hour includes a panel session moderated by industry experts, providing discrete, buildable how-to knowledge on topics crossing people, process, and technology.
  • SKILup Festival 2022: A Live DevOps Educational Experience: DevOps Institute is excited to announce that our in-person experiences include high-level content as well as deep-dive technical sessions and workshops with some festival fun and entertainment mixed in. (Dates and locations to be determined.)

The DevOps Institute considers itself "a unifying force of an open and growing professional community of IT practitioners, consultants, talent acquisition and executives helping pave the way to support digital transformation and the New IT."

I do, too.

Posted by John K. Waters on January 20, 2022

Microsoft Joins the Java Community Process

You'd think I'd have seen it coming. All the signs were there. There was the day Microsoft announced that it had joined the OpenJDK project back in 2019. Then there was the company's decision to upgrade its status at the Eclipse Foundation to Strategic Member in August of this year. And when Microsoft CEO Satya Nadella proclaimed in September that "We use more Java than one can imagine," I just should have known that Redmond would soon be joining the venerable technology standards and specifications organization behind the evolution of the Java language and platform, the Java Community Process (JCP).

Bruno Borges, Principal Program Manager for Microsoft's Java Engineering Group, revealed in a blog post earlier this month that the company had signed the Java Specification Participation Agreement (JSPA) to officially join the JCP.

"As we have collectively learned since the announcement of the Microsoft Build of OpenJDK in April 2021," Borges said, "Java usage within Microsoft has grown way beyond Minecraft. We have more than 500,000 JVMs in production running hundreds of internal Microsoft systems. In addition to significant internal Java usage, there are many customers and developers coding and running Java on Microsoft Azure and GitHub. Joining the JCP is a major, yet natural step forward for Microsoft in helping shape the future of the Java Platform."

And the Chair and Director of the JCP, Heather VanCura, gave the new member her blessing: "For the past 23 years, the JCP program has guided the specification of the Java platform in cooperation with the international Java developer community. The JCP program welcomes participation and membership from corporate, open source, individual, and Java User Group participants. We are delighted to welcome Microsoft to the JCP program; it continues to represent the vibrant Java ecosystem. We look forward to seeing their contributions."

I sent her an email, but she hasn't gotten back to me. When she does, I'll try for a less PR-sculpted comment. To be fair, the JCP has been through the wringer over the past decade, and VanCura helmed that troubled ship through some treacherous waters. She helped developers and vendors adapt to the faster Java release cadence, spending most of 2019 demonstrating, teaching, and working with developers and teams. She also led the JCP through the often-painful process of untangling that JSPA Nadella and company just signed, which was notoriously byzantine, and which her predecessor, Patrick Curran, once described to me simply as "big and scary."

Then there's that other Bruno, Mr. Souza, the one in South America who founded the Brazil-based SouJava, the largest Java User Group (JUG) in the world. He was one of the initiators of the Apache Harmony project to create a non-proprietary Java virtual machine. He serves on the Executive Committee of the JCP, and was one of my first guests on "The WatersWorks Podcast."

"The JCP is the place where we define and discuss the future of Java, and where we need the collaboration of all the Java community," Souza said. "Microsoft has been an important part of this community, with their involvement in OpenJDK but also supporting Java User Groups and community events. Because of all that, Microsoft has become a strong partner of SouJava, and we are excited to have them go even deeper on their commitment with the Java community."

RedMonk analyst James Governor sees this development as further evidence of Microsoft's commitment to a future in which Java continues to matter. "Java remains a key context for IT today and for the foreseeable future," Governor said.

Boy howdy.

Posted by John K. Waters on November 16, 2021

The Eclipse Foundation Partners with China's OpenAtom on a New Operating System

The Eclipse Foundation today announced the launch of a top-level project to develop a new open-source, vendor-neutral OS designed to provide an alternative to existing IoT and edge operating systems.

Called Oniro, the new OS is an implementation of OpenHarmony, a distributed multi-kernel operating system developed by OpenAtom, China's first open-source foundation. The purpose of Oniro, Mike Milinkovich, the executive director of the Eclipse Foundation, told me, is to provide the same operating system across a much wider range of devices, from a tiny leak sensor in a home to a Raspberry Pi.

"The interesting thing about Oniro from a technical point of view is that it's a single operating system that will run on multiple kernels," Milinkovich said. "The two we're working on first are Yocto, which is, of course, a variant of Linux that's particularly relevant in the embedded space. And the second one is Zephyr, which is a sort of a lightweight operating system that you would put on much smaller devices."

The Eclipse Foundation announced that it would be collaborating with OpenAtom on the OS last September.

According to its website, OpenAtom is a non-profit, independent legal entity "dedicated to public welfare undertakings in the open-source industry." The purpose of the OpenHarmony project is "to build an open, distributed operating system framework for smart IoT devices in the full-scenario, full-connectivity, and full-intelligence era."

HarmonyOS is a commercial distribution of OpenHarmony developed by Huawei, the Chinese telecom giant. The company announced the developer preview release of HarmonyOS 3.0 last week. Version 2.0 was launched in June of this year, and Huawei has been rolling out HarmonyOS on selected smartphone models, offering users an alternative to Google's Android platform.

The main code base for OpenHarmony is hosted on Gitee, China's version of GitHub. The maintainers of the project wanted to grow its addressable market beyond China, Milinkovich explained, and they needed a Europe-based partner to do that. The Eclipse Foundation, now based in Belgium, was a natural partner, he said.

"I think this is evidence that our strategy of moving to Europe was the right one," Milinkovich said. "If we had still been a North American organization, I doubt that this opportunity would have come to us. People who would never have thought of us before are coming to us with projects."

The Eclipse Foundation announced that it would be moving its legal headquarters from the US last year and formally established its official headquarters in Belgium in January of this year.

To facilitate the governance for the Oniro device ecosystem, the Eclipse Foundation is also launching a new dedicated working group. The Eclipse Foundation’s working group structure provides the vendor neutrality and legal framework that enables transparent and equal collaboration between companies, Milinkovich said.

The initial working group membership roster includes the Eclipse Foundation; OpenAtom; Linaro, a UK-based open-source organization focused on Linux for Arm-based devices; and Seco, an Italian IoT device manufacturer.

"To my knowledge, this is the first time three open-source foundations (Eclipse, OpenAtom, and Linaro) have collaborated on a single piece of technology," Milinkovich said.

Although he acknowledged that there's "a ton of work to do" on this project, Milinkovich emphasized that it's not starting from scratch.

"I saw some numbers today, and it's like 50 percent of the packages that are going into the initial Oniro build are essentially identical to what you'd get in a Debian distribution," he said. "And we're building initially on the Yocto and Zephyr kernels. I always say, don't reinvent the wheel, stand on the shoulders of giants. And that's what we're trying to do here with as much reuse as possible from all the existing work that has been done."

The roadmap for the project includes the development of a number of "blueprints" targeting an initial set of devices, Milinkovich explained.

"That's how we're going to grow the developer enablement and build out the ecosystem," he said, "by making it as simple as possible for developers to grab a blueprint that closely matches their requirements, and then modify it to deliver the piece of functionality they're working on."

I asked Milinkovich what it was like working with a Chinese organization.

"Other than getting phone calls really early in the morning, it's not so bad," he quipped. "But seriously, we don't think of China as a place where open source starts, but primarily as a consumer of open source. I think this is sort of a step in their maturation, of them becoming a first-class citizen in the global supply chain of open-source software, which is really driving innovation everywhere around the world. So, from that, from that point of view I think this is a major step."

Davide Ricci, director of Huawei's Consumer Business Group European Open-Source Technology Center, expressed his organization's enthusiasm for the project in the press release.

"It is so exciting to see everything moving under the expert governance of the Eclipse Foundation," he said. "Under the Eclipse Foundation the project will have its greatest chance at onboarding new contributing members and bringing real products on the shelves of consumer electronics stores around the world. We reckon Oniro is not a sprint, rather a marathon, and we are thrilled and committed to this world changing journey."

Posted by John K. Waters on October 26, 2021

New Trusted AI Dev Tool from IBM Research Communicates 'Uncertainty'

IBM Research added to its growing family of "trusted AI" tools recently with the release of a new open-source developer toolkit called Uncertainty Quantification 360 (UQ360). The new toolkit focuses on what IBM believes will be the next big area of advancing trust in artificial intelligence: communicating an AI's "uncertainty."

Uncertainty quantification is just what it sounds like: a determination of the level of confidence an AI system has in its decisions. The new UQ360 toolkit was designed to give data science practitioners and developers a set of algorithms to streamline the process of quantifying, evaluating, improving, and communicating uncertainty of machine learning models.
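UQ360's actual API is richer than this, but the core idea can be illustrated with a minimal sketch (not UQ360 code, and the names below are made up for illustration): treat disagreement among an ensemble of models as an uncertainty signal, and defer to a human when that signal is too large.

```python
import statistics

def ensemble_uncertainty(predictions):
    """Mean prediction plus a simple uncertainty estimate (standard
    deviation) from an ensemble of model outputs for one sample."""
    mean = statistics.fmean(predictions)
    spread = statistics.stdev(predictions)
    return mean, spread

def decide(predictions, max_uncertainty=0.1):
    """Accept the prediction, or defer to a human reviewer when the
    ensemble members disagree too much."""
    mean, spread = ensemble_uncertainty(predictions)
    if spread > max_uncertainty:
        return ("defer", mean, spread)
    return ("accept", mean, spread)

# Five ensemble members broadly agree: low uncertainty, accept.
print(decide([0.81, 0.79, 0.80, 0.82, 0.78])[0])  # → accept
# Members disagree sharply: high uncertainty, defer to a human.
print(decide([0.15, 0.90, 0.55, 0.30, 0.75])[0])  # → defer
```

A toolkit like UQ360 packages many such estimators (and ways to evaluate them), but the decision pattern is the same: surface the uncertainty alongside the prediction rather than hiding it.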

What we're talking about here, IBM AI researchers Prasanna Sattigeri and Q. Vera Liao explained in a blog post, is a way to enable an AI system or application to express that it is unsure, "giving it intellectual humility and boosting the safety of its deployment."

IBM is billing UQ360, which was released at the 2021 IBM Data & AI Digital Developer Conference, as one of the first toolkits designed to provide both a comprehensive set of algorithms for quantifying uncertainty and the capabilities to measure and improve uncertainty quantification to streamline the development process. The tool comes as a Python package with a taxonomy and guidance for choosing these capabilities based on a developer's needs, the company says.

UQ360 is just the latest toolkit to emerge from IBM Research, alongside AI Fairness 360, the Adversarial Robustness Toolbox, AI Explainability 360, and AI FactSheets 360, all released over the last few years to advance various dimensions of AI trust.

"Trust" in this context refers to the ability of humans to have confidence in the output of an AI-enabled app or system. AI systems have traditionally been black boxes, but, as IBM puts it, "To trust a decision made by an algorithm, we need to know that it is fair, that it’s reliable and can be accounted for, and that it will cause no harm." That level of trust requires transparency.

The fatal highway crash of a Tesla vehicle operating in self-driving mode in June threw another spotlight on the AI safety issue and the growing interest in shining a light in the AI black box. But Sattigeri, with whom I spoke over Zoom, said "miscalibrated uncertainties" are about more than just this kind of obviously critical application of AI.  

"The self-driving example is a scary one," he allowed, "but take the loan approval process, where somebody is using an AI system to assist them in making a prediction that impacts your interest rate. Or in a healthcare setting, where the doctor needs to trust the AI to assist in making a diagnosis."

Quantifying uncertainty can show gaps in the knowledge of the training model, Sattigeri said, so the model can be improved.

"If we know [that the systems] are overconfident or underconfident," he said, "we can use recalibration algorithms to make them either [looser], so you're increasing the margin of error, or [tighter], so you're decreasing the margin of error. And then it's up to the decision maker how they want to use it. If the uncertainty is too large, the loan officer can go ahead and do certain other investigation, maybe collecting additional information about the person."
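One common recalibration technique for overconfident classifiers is temperature scaling (this is a standard method, not necessarily the specific algorithm UQ360 applies). A quick sketch of the idea: dividing a model's raw score by a temperature greater than 1 pulls the resulting probability back toward 0.5, softening the model's confidence.

```python
import math

def temperature_scale(logit, temperature):
    """Convert a raw model score (logit) to a probability, with
    temperature > 1 softening confidence and temperature < 1
    sharpening it. temperature = 1 is the unscaled sigmoid."""
    return 1.0 / (1.0 + math.exp(-logit / temperature))

logit = 2.0  # raw score from an overconfident model
raw = temperature_scale(logit, 1.0)
softened = temperature_scale(logit, 2.5)
print(f"raw probability:      {raw:.3f}")
print(f"after recalibration:  {softened:.3f}")
```

In practice the temperature is fit on a held-out validation set so that the reported probabilities match observed accuracy; here the value 2.5 is just a hypothetical example.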

If you've never visited Big Blue's R&D division website, you've just gotta. On the Trusted AI page alone, you'll find projects ranging from AI Explainability to Adversarial Robustness, Causal Inference to AI Fairness, all concepts behind research projects leading to the development of tools "to make AI more explainable, fair, robust, private, and transparent," IBM says.

AI software development continues to be a land of evolving concepts and esoteric nomenclature that coders with little to no experience in this terrain are increasingly required to navigate. But even AI road warriors need effective tools to keep up with the accelerating pace of software delivery that increasingly includes AI, machine learning, and deep learning. With its open-source trusted AI toolkits, IBM has put up some useful signposts.

Posted by John K. Waters on August 6, 2021