It took two and a half years, but the NetBeans Java-based IDE has finally graduated to Top-Level Project (TLP) status at the Apache Software Foundation (ASF). The popular open source development environment, tooling platform, and application framework now comprises one of the largest codebases at the ASF. (It's 20 years old, after all.)
As a first-class citizen of the ASF, the NetBeans project will now be able to receive more contributions from the open source community; its previous status as an Incubator Project provided more limited access. The Incubator is the official entry path for projects and codebases whose supporters want them to become part of the ASF. It's where those projects are vetted to make sure they comply with the ASF's legal standards and their support communities adhere to the ASF's guiding principles.
The stewardship of the IDE shifted from Oracle to the ASF in late 2016. NetBeans 8.2 was the last release by Oracle. It's worth noting that individual contributors from Oracle continue to work on the project "as part of the worldwide community of individual contributors, both self-employed as well as from other organizations," the ASF said in a statement.
NetBeans was the first Java IDE. It was originally demoed back in 1998, about two years after Sun Microsystems created the Java language. Sun acquired the IDE in 1999 with the goal of evolving the tooling along with the Java platform.
NetBeans 11.0 was released in April 2019. It's the project's third major release since the IDE entered the Apache Incubator.
"Being part of the ASF means that NetBeans is now, not only free and Open Source software, it is also uniquely, and for the first time, part of a foundation specifically focused on enabling open governance," said Geertjan Wielenga, vice president of Apache NetBeans, in a statement. "Every contributor to the project now has equal say over the roadmap and direction of NetBeans. That is a new and historic step and the community has been ready for this for a very long time. Thanks to the strong stewardship of NetBeans in Sun Microsystems and Oracle, Apache NetBeans is now ready for the next phase in its development and we welcome everyone to participate as equals as we move forward."
Wielenga was an Oracle product manager and developer advocate for open source projects. He reportedly did a lot of the heavy lifting when Oracle donated the NetBeans code to the ASF.
Posted by John K. Waters on 05/08/2019 at 10:46 AM
Red Hat has again stepped in to assume the stewardship of OpenJDK projects no longer supported long-term by Oracle. The Raleigh, N.C.-based open source solutions provider and long-time Java community leader has taken on the role of steward of the OpenJDK 8 and OpenJDK 11 update releases. Red Hat will "work with the community to enable continued innovation in Java," the company said in a statement.
Red Hat assumed stewardship of OpenJDK 6 in March 2013, and of OpenJDK 7 in June 2015. The company has actually been involved in OpenJDK, the open source implementation of Java, since 2007, when it signed Sun Microsystems' OpenJDK Community TCK License Agreement. The TCK (Technology Compatibility Kit) is the official test suite for verifying that implementations of Java Specification Requests (JSRs) are compliant; a TCK can be provided only by the spec lead of a JSR. Red Hat was the first big software vendor to license the TCK.
Red Hat is among the largest contributors to the OpenJDK project. Andrew Haley, long-time Java technical lead of Red Hat's Java Platform team, was appointed project lead for OpenJDK 8 and OpenJDK 11 in February 2019. He has been an active member of the OpenJDK governing board for seven years.
In addition to its work within individual OpenJDK communities, Red Hat leads the upstream development of Shenandoah, a high-performance garbage collector that is now part of OpenJDK 12.
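For readers who want to try the collector, note that in JDK 12 Shenandoah ships as an experimental feature, so it must be explicitly unlocked on the command line; a minimal invocation (with `app.jar` standing in for your own application, and assuming an OpenJDK build that includes Shenandoah) might look like:

```shell
# Shenandoah is experimental in JDK 12; both flags are required.
java -XX:+UnlockExperimentalVMOptions -XX:+UseShenandoahGC -jar app.jar

# Optionally add GC logging to confirm the collector is active.
java -XX:+UnlockExperimentalVMOptions -XX:+UseShenandoahGC -Xlog:gc -jar app.jar
```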
In December 2018, Red Hat announced commercial support for OpenJDK on Microsoft Windows. Red Hat plans to launch OpenJDK in a Microsoft installer in the coming weeks and distribute IcedTea-Web, the free software implementation of Java Web Start, as part of the Windows OpenJDK distribution. JDK Mission Control is available as part of Red Hat Software Collections and for Windows through the Red Hat Customer Portal, enabling developers and administrators to collect and analyze data from Java applications running locally or deployed in production environments.
"My guess is we can expect more news on the transition of Java stewardship over the next few years," said Ovum analyst Michael Azoff, in a statement, "and I believe Red Hat is a safe pair of hands to take on that role. It's also a better fit with Java being open source and Red Hat being a leader in the open source software community."
Rich Sharples, Red Hat's senior director of product management, acknowledged that doubts about the future of Java are frequently voiced in public, but rumors of the language's demise, he told ADTmag, have been greatly exaggerated.
"From our point of view, there's still another 20 or 30 years of life left in Java," Sharples said. "There's a very long way to go for Java, and that's why we're continuing to put our resources behind it and the community."
Red Hat recently addressed some specific concerns that Java isn't competitive in emerging cloud-native architectures, such as microservices, containers, and serverless, with its newly released Quarkus framework, which is designed to significantly reduce the footprint and latency of Java applications.
"We're still all-in on Java," Sharples said.
Sharples also noted that Oracle's new faster release cadence is giving the community of Java users additional opportunities to reconsider what they're currently running on.
"People are going to be very consciously upgrading or thinking about upgrading much more frequently," he said. "Quite frankly, if you're upgrading version numbers, you've got to also be thinking about which vendor you source the JVM from."
"Java is in a renaissance moment," Mike Piech, vice president and general manager of Red Hat's Middleware group, said in a statement. "It continues to evolve and be a key component of new, emerging architectures. There is a developer hunger to bring Java into the next generation of development, and Red Hat is a leader in this movement through our involvement in the OpenJDK project. We are helping to lead the way in our efforts to enable users of JDK to have support and innovation in their existing environments. Red Hat remains committed to Java and is excited to have the opportunity to help steward the OpenJDK community."
Posted by John K. Waters on 04/24/2019 at 9:02 AM
In what is sure to be the last chapter in the seemingly unending courtroom drama that is Oracle v. Google, Oracle has responded to Google's Hail Mary request, filed with the Supreme Court in January, to review the appeals court's ruling that the Alphabet subsidiary infringed on Oracle's copyrights with its use of 37 Java APIs in the Android OS. In its 46-page petition, Oracle disputed Google's claim that the lower court's decision will harm software developers.
"Google claims the Court of Appeals' decision imperils the future of 'interoperable' software," the petition reads. "But Google has conceded that it purposely made its platform incompatible with Oracle's. So this is no case to consider the copyright implications of interoperability ... ."
The Oracle petition also shot down Google's fair use argument: "Google cites not a single case -- in any court -- that has ever held that copying this volume of code (or this much structure and organization) into a competing work is fair."
And it re-asserted that Google's actions "inflicted incalculable market harm" and represented "the epitome of copyright infringement."
The petition wraps up with a zinger: "Google's theory is that, having invested all those resources to create a program popular with platform developers and app programmers alike, Oracle should be required to let a competitor copy its code so that it can coopt the fan base to create its own best-selling sequel. That argument would never fly with any other copyrighted work ... ."
Oracle originally sued Google in 2010, but Google argued that its use of the Java APIs owned by Oracle was allowed under the "fair use" provisions of the federal copyright law, and therefore did not infringe on Oracle-owned copyrights. But that argument failed to persuade the court. "There is nothing fair about taking a copyrighted work verbatim and using it for the same purpose and function as the original in a competing platform," a panel of three Federal Circuit judges wrote in that opinion.
If my inbox is any indication, more than a few ADTmag readers are tired of this drama, and I don't blame them. But it's important to keep in mind what was originally at stake here: the court effectively ruled that copyright protections do extend to software interfaces. The impact of that decision is still to come.
Posted by John K. Waters on 04/10/2019 at 10:01 AM
Oracle last week announced the general availability of JDK 12, the third release under its still-newish accelerated release cadence. JDK 10 was the first feature release of the new cycle; Java 11, released six months later, was the first long-term support (LTS) release. The next LTS release will be Java 17, scheduled for September 2021.
Three releases into the new cycle seemed like a good time to touch base with Georges Saab, VP of the Java Platform Group at Oracle and chairperson of the OpenJDK governing board, about the impact of the faster Java release cadence on the developer community and his own team at Oracle.
We last talked about the new release schedule back in 2017 when Mark Reinhold [chief architect of Oracle's Java Platform Group] publicly proposed the idea. But you had already been laying the groundwork with Java 9. You said at the time that you felt "the time is right for the Java ecosystem to make this work." Were you right about that?
Back before we started this, before they actually experienced it, people were understandably skeptical. But in the past few months, especially since Java 11 came out, the developers I talk to are saying that, once they did the hard work of getting up to Java 9, they were able to move forward with almost no difficulty at all. Java 9 was the last of the major disruptions. For people who have something running on Java 9, moving to 10, 11, and 12 is a series of small steps akin to what we used to call "minor releases."
But people still have to make their way from Java 8 to Java 9 to experience all this easy-upgrade goodness.
A couple of years ago, I heard a lot of people saying, well, maybe Java 8 is good enough, and we should just stay on that forever. But now I hear from a lot of people who are very excited about the roadmap we have, because they're seeing concrete evidence on a regular basis that we're making good on that roadmap. The things we're talking about -- Valhalla, Panama, Loom, Metropolis -- are coming into releases. Those projects are exciting, and they understand that getting on a modern release train is going to put them in a great place to take advantage of those things as they come out.
So, you'd say the new release cadence is achieving the goals you originally had for it?
You have to keep in mind what our goals are, and where we were coming from when we decided to do this. In the past, we would work on one of these major releases for two to three, even four years. And we would have a bunch of changes and improvements that landed as a major, disrupting release, which was very hard on everyone. What we've done is found a better way of improving our processes and the way new versions of the spec are revved, so that we can get those changes out in smaller chunks to people much more rapidly. What this is really about is making sure we keep Java relevant and that we're able to deliver functionality into the hands of developers quickly.
This is a big change, and I think lots of people are still having to stop and remind themselves that the latest release isn't going to be this massive, disruptive challenge. Do you think your decision to rev the major release number each time has caused a bit of confusion in this regard?
I do, actually. Because we chose to streamline the numbering, eliminate "major releases," and not use, say, 11u20, 11u30, and 11u40, there is that automatic expectation that these are major disruptions coming every six months. But, of course, our intention is the opposite. We're providing small, evolutionary steps forward and incremental change that is easy to adopt while still getting you the functionality you need. And we're continuing to evolve the language and platform so that Java continues to be relevant for another two decades.
But another advantage with this approach, which I've heard you talk about, is how these incremental changes often will constitute pieces of a bigger picture.
A really good example of that is Switch Expressions [which was included in JDK 12]. People are very excited about this feature. It's small, it makes your code easier to write, and more importantly it makes your code easier to read. We're including it as a preview so we can get feedback and make sure it's as good and solid as it can be when it's finalized, and informed by real, live use cases.
But if you look at JEP 325, way down at the bottom of the page, you'll see under "Dependencies" it says "Pattern Matching (JEP 305) depends on this JEP." It's not an end in itself, but a step toward a much larger and more ambitious improvement to Java. This is something that we would have done previously in one of those large, disruptive releases, all at once. The faster cadence allows us to do this kind of thing in small improvements across multiple releases.
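To make the feature Saab describes concrete, here is a minimal sketch of the arrow-form switch expression that JEP 325 introduces. (On JDK 12 itself it was a preview feature, so compiling and running required the `--enable-preview` flag; the syntax was finalized in a later release.)

```java
public class SwitchDemo {
    // Arrow-form switch: each label maps directly to a value, there is
    // no fall-through, and the switch as a whole is an expression.
    static String dayType(String day) {
        return switch (day) {
            case "SATURDAY", "SUNDAY" -> "weekend";
            default -> "weekday";
        };
    }

    public static void main(String[] args) {
        System.out.println(dayType("SATURDAY")); // weekend
        System.out.println(dayType("MONDAY"));   // weekday
    }
}
```

Compared to the classic statement form, there is no `break` to forget, and the compiler can check that the expression always yields a value.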
But the faster release cadence also makes it easier to leave features out of a release, as you did this time around with Raw String Literals.
That's right. This is a good example of a key advantage of this model. If this had been a two-to-four-year major release, the pressure to keep the feature in would have been extremely high, because if it didn't make it in, you'd be waiting years for it. The stakes of leaving it out were much, much higher.
How has your team adapted to the faster release cadence?
They've adapted wonderfully. Having the predictability of knowing when features come out means we can reduce the overhead of producing them tremendously. We're not constantly having to shuffle different schedules, planning and re-planning when a feature drifts out of scope. So, for the team, it has been quite liberating.
Any doubters in the bunch?
It has been a bit of a cultural shift. Some people on my team were apprehensive about [the faster release cadence] before we moved into it. But now they've seen how it's actually making their lives much easier. Our end game for releases has never been calmer than it has been in the last year and a half. And people love that.
One thing people worried about before we got started was what we would do if we had a release where there wasn't anything ready -- no new JEPs came in? We went through that thought experiment before we introduced the six-month releases. And we realized pretty quickly that the minor updates we did in the past were effectively on a six-month cadence. They sometimes involved quite significant implementation changes or improvements, but they always included a bunch of bug fixes. So, we decided, even if we don't have any JEPs come into a release, we're always going to have somewhere in the neighborhood of 1,500 to 3,000 bug fixes. That in and of itself is worthy of a release.
When the new six-month cadence was announced there was some talk about "release fatigue." Have you seen that in the Java community?
It's sort of like asking, if your kids had Christmas twice a year, do you think they'd experience "Christmas fatigue?" The parents might, I guess. What I'm hearing people say now is that they are seeing so much evidence that updating to 9 and finding the move to 10 and 11 so smooth, they're excited about the new cadence and what's coming down the pike.
Posted by John K. Waters on 03/26/2019 at 1:15 PM
I last sat down with Heather VanCura, chair of the Java Community Process (JCP), in 2017, roughly a year after she'd stepped up to head the Java language and platform standards organization. A lot has happened in the Java community since then. (I mean, a lot.) Now that some of the dust kicked up by all that change has settled, it seemed like a good time for another conversation.
You're about two years into the job now. During that time, the Java community has seen a remarkable amount of change. I'd argue more than at almost any other time in its 20-plus year history -- Java EE moving to the Eclipse Foundation, Oracle's amped up release cadence, etc. How has the JCP managed to keep pace without losing its footing?
To be clear, the organization has been evolving throughout its history, even if that evolution was often pretty slow in the early days. But there's no doubt that things have sped up in recent years. I think the key for us has been constant communication with the community.
When you say "communication," what do you mean?
I mean presenting and interacting at conferences, trade shows, and events, as well as showing up, on-the-ground, to support Java communities around the world.
It's kind of an understatement to say that you travel more than your predecessor.
I like to say I've been to every continent on behalf of the Java community except Antarctica. I said that in a tweet and was then actually invited to submit to a kind of tech-conference cruise event that does go there. It's a 10- or 12-day thing, which sounds kind of exhausting, but if it were a Java community event, I'd definitely do it.
Why is this kind of hands-on interaction with the community so important?
We're getting back to our roots, back to solving real-world problems for real-world developers. But there are many different kinds of developers using Java all over the world today. It used to be that most of our base came from the U.S. and western Europe, but we really see it diversifying, and people have different problems in Nigeria, Morocco, and Ivory Coast than they do in Belgium and the Netherlands. It's important to nurture these communities, and often the best way to do that is to show up at their events when it's possible to do so.
You're talking about Java User Groups (JUGs)?
What do you do at these events?
I talk about changes that are happening, of course, but also about how they can participate in this community and why they should.
I understand that, in addition to showing up for events in person, you interact with these communities virtually.
Yes, I do a lot of virtual JUG visits. I've found that it's sometimes useful to do virtual events in advance to give the groups ideas of how to engage with their own communities. I encourage them to discuss things among themselves: What are your particular interests? What technologies are most important to you? What kinds of activities would you like to participate in? And then maybe, if they plan something like a hack day, I come in person for that. For example, I finally made it to Israel for the first time last year, but they had participated in a virtual hack day a couple of years earlier. It sort of sets things up for a productive visit.
And this is what you mean by "constant" communication?
We try to interact with the Java community wherever and whenever we can, and as much as we can. If I'm at a conference, I always try to engage with the local Java user group, to see if they want me to come and speak -- which they usually do. These communities are already invested in the platform; I want them to also feel connected to the process, to feel like their needs are being addressed by it, and that they can be valuable participants in it.
The message is, the process is community driven?
That's right. We want them to understand that their input is actually very important. But sometimes they need help understanding how to provide that input. That's sometimes the missing piece I try to provide. There's the infrastructure, the projects are there and they're open, now exactly what do I do to get my input in?
This strategy must provide you with lots of feedback on what's happening on the ground in the Java community.
It does. We get questions and comments right after a presentation, of course, and that's useful. But also, I keep in contact with the user group leaders and ask them what sort of feedback they got from a particular presentation, and they also keep me informed in terms of what they plan to do. For example, the London JUG let me know that they wanted to do a Valhalla workshop, and I was able to coordinate my travel so I could be there for that.
For several years now, the JCP has been working to improve itself, most recently under JSR 387 ("Streamline the JCP Program"), the latest release of which (2.11) was announced last December. Where are you now in this process?
We've been looking at the JSR lifecycle, which was really based on an artifact of the waterfall development methodology. That's just not how spec leads are using the development process anymore. So, we've been modernizing that process with fewer discrete milestones. We're keeping the open development period that's transparent and the community review followed by a vote by the EC. But we're definitely moving to more of a continuous delivery model.
What do you think about the faster release cadence implemented by Oracle last year?
I think there are two sides to that story. Faster releases mean you get innovation quicker. You're putting new technology into the hands of developers every six months instead of every three years -- which is the expectation of developers using other languages, tools, and frameworks these days. The faster release cadence combats the perception that Java is for an older generation of developers.
But for some it's been hard to adapt to this new pace of change. It's kind of entrenched in the Java culture that we do this only every three or four years, and when we do it, it's a huge project involving 100-plus changes. With the faster releases, it's only, say, 15 changes. People are having to adapt to a different way of migrating between versions.
What about Oracle's new subscription model?
Choice is the whole foundation of the JCP. I talk about the compatibility triangle. We have the spec, the reference implementation, and the TCK. That's the structure the JCP provides. It's what creates the whole Java ecosystem. Oracle's offering is the reference implementation, but other vendors can have their own implementations, and that's what creates choice for developers. Obviously, Oracle provides a strong foundation, but I think having other choices helps to solidify Java's position in the market, because you're never tied to any single vendor.
How has Oracle's decision to move enterprise Java to the Eclipse Foundation affected the JCP?
It's a big change in some ways, but the platform is the foundation, so we definitely still have a lot to do. We continue to adapt the JCP program itself to meet the needs of developers. We continue to support the spec leads who are developing the core platform through the JCP, as well as the stand-alone JSRs. We continue to run the Executive Committee. And we're committed to the ongoing engagement with the community that I've talked about.
Which is not to say that we're not watching the progress of Jakarta EE closely and responding to questions about it from the community. The transition is actually still ongoing. The Foundation is still defining all the process that will take place. But it's clear that they recognize the value of multiple implementations and the importance of standards.
How is the JCP addressing the demands on developers of the rise of AI and machine learning?
We are addressing AI in a couple of ways. There's Project Panama in the OpenJDK, which addresses big data and machine learning, and there's JSR 381 [Visual Recognition (VisRec) Specification], which is a community-led effort. The approach is to take a specific piece of the AI/ML space -- visual recognition -- and create a Java API for it as a stand-alone JSR, not as part of the platform.
In all your travels, what's the number one concern you're hearing from Java jocks around the world?
People are wondering about all the changes and how they're going to impact them. They want to know if Java is still a good choice for them. Am I with the right technology? What can I expect going forward? People are using multiple tools these days, and they need to be constantly learning, so I think it's fair to ask whether the investment you're making in Java is the right one.
It is, of course.
Posted by John K. Waters on 03/13/2019 at 1:06 PM
The JCP Expert Group in charge of JSR 107, the specification for the Java Temporary Caching API, better known as JCache, recently submitted the maintenance review for JCache 1.1.1. This is just an errata release and not particularly newsworthy, but it marks something of a milestone for the longest running spec request in the history of Java. And it seemed like a good time to talk with its co-author.
Greg Luck is the CTO of Hazelcast and co-spec lead on JSR 107 with Oracle software architect Brian Oliver, who works on the Coherence team. Hazelcast develops, distributes and supports a leading open source in-memory data grid (IMDG), also called Hazelcast. Oracle was the original submitter of the JCache spec, and Luck is the creator of Ehcache, a well-known version of that spec.
How did you become involved with this JSR?
I was doing some work for a big online share-trading company at the time, and they wanted me to create a JCache adapter for Ehcache. I tried to do it, but the spec wasn't complete, so, I told them that, and they said, well, can't you do some work on the spec? And that's how I got roped into it, though I didn't really have time to work on it until I was at Terracotta.
When you started working on caching technologies back in the day, caching had something of a bad reputation, didn't it?
At the time, people felt, in the Java community, very, very strongly, that caching was a dirty trick, that if you were to do that, there was something wrong with your architecture. It was the very strong prevailing view. One of the reasons engineers felt that way is that caching is very seldom black and white. It's probabilistic in nature, a bit like AI techniques using statistics.
But it's a completely different era now. People came to realize that anything done at Web scale must be done with caching. Every single cloud provider now has, as a standard component, a caching service along with the different implementations they provide. It's now well understood that caching has a purpose.
Do I have it right that JSR 107 was the longest-running spec request?
It was, but I think it was worth the wait. It's now considered one of the most -- if not the most -- successful stand-alone specs, ever.
How do you measure that success?
In terms of the number of implementations. There are 13 now, and we recently saw IBM implement JCache for its eXtreme Scale API. It really has been enormously successful. The spec has been implemented now by so many people that it's a standard that can just be used and leveraged in all these different products.
One of those implementations -- Blazing Cache -- was created by the travel Web site Trivago. These guys actually created a cache for their own purposes for that huge site.
At Hazelcast we actually have some of our biggest customers using us for our JCache. I get in trouble if I say who these companies are.... In fact, our biggest production cluster in the world, which is an online store that services billions of people, uses us for our JCache.
At this point there is only one significant implementation in the caching world that has not adopted it: Pivotal's Gemfire, and its open source variant, Apache Geode. I don't entirely understand why. The guys tend to be Spring-centric, and Spring itself has got good support for JCache. I guess you could say that JCache is ubiquitous with one exception.
What's going on with this release?
It's a self-serving comment, but I have to say that 1.0 was pretty well done. There were a few things found and fixed in 1.1, and 1.1.1 is really just an errata release. This thing is pretty mature and stable, and I think it's enormously important to the Java community. It's been out there for four years now, and it can be used as a foundation.
I understand that you'd like to see JCache implemented in Jakarta and MicroProfile, both of which are now at the Eclipse Foundation.
If you look at the surveys, JCache is the number one thing people are asking to be included in the Jakarta spec. And I've been personally lobbying the MicroProfile people to add caching, because it's incredibly important. We [Hazelcast] joined MicroProfile, and I wrote a specification to include JCache in MicroProfile. Tons and tons of Hazelcast gets used with microservices, so it's clear to me that having the framework directly support it would be great. But the MicroProfile guys don't currently see caching as a priority, so nothing has happened yet. Jakarta has become energized under the stewardship of Eclipse, so we'll see what happens there.
What's your pitch to the MicroProfile folks?
With microservices, you take a monolithic application and you break it into pieces. You essentially have a container that you started from Java, and it's probably running in Docker. If it's a busy microservice, you'll want to scale it. With microservices, you can easily scale just the bit you want. Let's say you started with three and you want to scale it up ten or fifteen. If you pop an in-memory data grid (IMDG) in there, it can run in process, so it's in each of the nodes. And then, as you scale the thing, the IMDG scales along with it, literally.
So IMDGs, through their embeddable nature in Java, are a perfect fit for microservices. If someone wants to build a microservice and they use JCache, they can use their IMDG along with the first-class annotations library that's in JCache, and they can swap the IMDG at very low cost.
The MicroProfile people are just leaving this as something the implementor has to deal with. But having caching as a first-class citizen inside the MicroProfile framework would be very useful for a lot of people.
And yet, one of the relatively unique qualities of JCache is that it's a stand-alone spec.
That is true. Because it's one of the few stand-alone specs, people can very easily just plug it in and use it. If you use a caching API in your code, if you use JCache, you can swap out implementations at no cost, which is always the promise of these things.
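The vendor neutrality Luck describes comes from coding against the `javax.cache` API rather than any vendor's classes. As a sketch (assuming a JSR 107-compliant provider such as Hazelcast or Ehcache is on the classpath, since the API jar alone ships no implementation; the cache name here is arbitrary):

```java
import javax.cache.Cache;
import javax.cache.CacheManager;
import javax.cache.Caching;
import javax.cache.configuration.MutableConfiguration;

public class JCacheSketch {
    public static void main(String[] args) {
        // Caching.getCachingProvider() discovers whichever JSR 107
        // implementation is on the classpath -- swapping vendors means
        // swapping jars, not rewriting application code.
        CacheManager manager = Caching.getCachingProvider().getCacheManager();

        MutableConfiguration<String, String> config =
                new MutableConfiguration<String, String>()
                        .setTypes(String.class, String.class);

        Cache<String, String> cache = manager.createCache("quotes", config);
        cache.put("greeting", "hello");
        System.out.println(cache.get("greeting"));
    }
}
```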
Posted by John K. Waters on 02/13/2019 at 9:53 AM
The Apache Software Foundation's (ASF) release of NetBeans 10.0 (incubating) at the end of December launched the venerable Java (now polyglot) IDE into 2019 with a slew of enhancements, including support for JDK 11, the addition of a JUnit 5 library and new PHP features.
The list of JDK 11 enhancements in this release includes:
- update of nbjavac module
- removal of Java EE and CORBA modules from the JDK
- var support for implicitly typed lambda expressions
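The last item refers to JDK 11's JEP 323, which permits `var` as the type of implicitly typed lambda parameters, chiefly so that annotations can be applied to them. A minimal sketch:

```java
import java.util.function.BiFunction;

public class VarLambdaDemo {
    // JEP 323 (JDK 11): var may be used for implicitly typed lambda
    // parameters; the types are still inferred from the target type.
    // This mainly exists so annotations can be attached, e.g.
    // (@Nonnull var s) -> ...
    static final BiFunction<Integer, Integer, Integer> ADD =
            (var a, var b) -> a + b;

    public static void main(String[] args) {
        System.out.println(ADD.apply(2, 3)); // 5
    }
}
```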
This release also adds JUnit 5.3.1, the latest generation of the JUnit testing framework for Java, as a new library. JUnit 5 is now the default version of the framework for Maven projects without existing tests. The JUnit 5 @Testable annotation is also supported, as is the default JUnit 5 test template.
NetBeans 10.0 also adds new features for PHP developers, including support for PHP 7.0 through 7.3, PHPStan and Twig, as well as new editing and debugging enhancements.
This release also includes a number of OpenJDK support features, including:
- Automatically detect JTReg from OpenJDK configuration
- Register the expanded JDK as a Java Platform
- Various improvements to make the OpenJDK project "work better"
This release is the second under the aegis of the ASF, which assumed the stewardship of NetBeans in October 2016. Apache NetBeans 9.0 wasn't released until July of last year because of the heavy lifting involved in migrating the 20-plus-year-old development environment, which provides support for an enormous range of technologies, to the ASF. That first release took as long as it did in no small part because so many files needed to be audited before they could be donated to Apache, explained Geertjan Wielenga, Oracle product manager and developer advocate for open source projects, at the time. Consequently, the decision was made to donate NetBeans in pieces. Because NetBeans is modular, an incremental donation was relatively easy to architect, he added.
Also, that first release had to wait for the approval of the Podling Project Management Committee (PPMC), a group of community members charged with helping a nascent ASF project, called a "podling," learn how to govern itself. According to the ASF, a PPMC works like a regular PMC, but reports to the Incubator PMC instead of the ASF Board. Initially, this group includes the podling's mentors and initial committers. The PPMC is directly responsible for the oversight of the podling, and it also decides who to add as a PPMC member.
Posted by John K. Waters on 02/12/2019 at 9:54 AM
The Eclipse Foundation yesterday announced the release of GlassFish 5.1, considered a major milestone release that belies the modest increase in its version number. GlassFish 5.1 marks the full migration of GlassFish and the associated Technology Compatibility Kit (TCK) code to Foundation stewardship.
This release of the open source Java EE reference implementation is the first since the Foundation became the steward of enterprise Java last year. Now called Eclipse GlassFish, it has been fully tested under both the newly open source TCK and the proprietary Oracle Java EE 8 TCK. It represents more than 13 million lines of code and 95,000 files, the Foundation said in a statement.
"We were able to onboard all of GlassFish, which has a huge, very mature code base," said Mike Milinkovich, executive director of the Eclipse Foundation, in a statement. "And we open-sourced the Java EE TCKs, which was an enormous change for the Java EE ecosystem. Shipping Eclipse GlassFish is a major milestone in fully establishing the Jakarta EE specification process, a major advance for the future of enterprise Java."
In an earlier ADTmag interview, Milinkovich underscored the importance of open sourcing the TCKs, which had been confidential and proprietary. "The fact that the TCKs are now open source and have become an integral part of migrating this code base forward -- that ability to do this open, public testing all the time is really going to help us innovate," he said. "And the community now has the ability to inspect tests and give feedback, and to give us more, better tests, which means we can expand the test coverage over time."
This release of GlassFish was tested with the open TCKs running on Eclipse Foundation hardware, Milinkovich said, ensuring that Eclipse GlassFish is Java EE 8 compatible.
The migration of GlassFish was an enormous engineering challenge, but also a legal one, because of Oracle's previously proprietary interest in the technology.
With this release, the Eclipse GlassFish code base has been re-licensed from CDDL+GPL and Classpath to Eclipse Public License 2.0 plus GPL with the Classpath Exception.
The migration effort actually started with EclipseLink and Yasson, which were already at the Eclipse Foundation. The first projects that were transferred from Oracle GitHub were JSONP, JMS, WebSocket and OpenMQ, work that was finished in January 2018. The GlassFish repository and CTS/TCK repositories were transferred in September 2018.
Because GlassFish is the reference implementation of Java EE, it supports Enterprise JavaBeans, JPA, JavaServer Faces, JMS, RMI, JavaServer Pages and Servlets.
The next version, Eclipse GlassFish 5.2, will be Jakarta EE 8 compatible, the Foundation said, thanks to the support of all of the major vendors with Java EE 8 compatible versions of their commercial products. The companies have all committed to ensuring their products are Jakarta EE 8 compatible as well, the Foundation said.
In a great blog post, Payara's Arjan Tijms, a long-time Java EE developer, provides a chronicle of GlassFish's evolution, from the 1996 release of the Kiva enterprise Java app server.
GlassFish 5.1 can be downloaded here.
Posted by John K. Waters on 01/30/2019 at 7:44 AM
Alphabet's Google subsidiary has petitioned the Supreme Court to review its long-running copyright dispute with Oracle, asking the Court to re-evaluate the Federal Circuit's decision that copyright protections extend to software interfaces, and to decide whether, as a jury found, "petitioner's use of a software interface in the context of creating a new computer program constitutes fair use."
If you're tired of this seemingly immortal struggle between Oracle and Google over those 37 Java APIs, you're not alone (it's been nine years), but the stakes are existentially high. In a blog post announcing the move, Google's SVP of Global Affairs and Chief Legal Officer Kent Walker rightly asserted that the Court's decision on the copyrightability of software "will have a far-reaching impact on innovation across the computer industry."
"Standardized software interfaces have driven innovation in software development," Walker wrote. "They let computer programs interact with each other and let developers easily build technologies for different platforms. Unless the Supreme Court steps in here, the industry will be hamstrung by court decisions finding that the use of software interfaces in creating new programs is not allowed under copyright law."
Oracle originally sued Google in 2010. Google's argument that its use of the Java APIs was allowed under the "fair use" provisions of federal copyright law, and therefore did not infringe on Oracle-owned copyrights, failed to persuade the court. "There is nothing fair about taking a copyrighted work verbatim and using it for the same purpose and function as the original in a competing platform," a panel of three Federal Circuit judges wrote in their March opinion.
The U.S. Copyright Office defines fair use as "a legal doctrine that promotes freedom of expression by permitting the unlicensed use of copyright-protected works in certain circumstances."
What the appeals court found initially was that the declaration code in Oracle's API packages, which Google copied verbatim, was copyrightable. Google developed the implementation code independently, so that wasn't at issue. The court found that the Oracle code had not been merged with the functions performed by the code; that combinations of short code phrases, such as those used in the APIs, can be copyrightable; and the fact that the code serves a function does not preclude its copyrightability if, as the court put it, "the author had multiple ways to express the underlying idea" at the time of creation of the code.
In its latest filing, Google asserts the following:
"Google has never disputed that some forms of computer code are entitled to copyright protection. But the Federal Circuit's widely criticized opinions -- in an area in which that court has no specialized expertise -- go much further, throwing a devastating one-two punch at the software industry. If allowed to stand, the Federal Circuit's approach will upend the longstanding expectation of software developers that they are free to use existing software interfaces to build new computer programs. Developers who have invested in learning free and open programming languages such as Java will be unable to use those skills to create programs for new platforms -- a result that will undermine both competition and innovation. Because this case is an optimal vehicle for addressing the exceptionally important questions presented, the petition for a writ of certiorari should be granted."
In his blog post, Walker points to support for Google's position on the copyright question from companies such as Red Hat (now owned by IBM), Yahoo and others, as well as a long list of computer scientists and academics, who have spoken out and filed court petitions of their own.
"The U.S. Constitution authorized copyrights to 'promote the progress of science and useful arts,'" Walker wrote, "not to impede creativity or promote lock-in of software platforms."
What happens next could be the final stake in the heart of this conflict, but I wouldn't bet on it. Meanwhile, check out our coverage of this case on the WatersWorks blog and throughout the ADTmag Web site.
Posted by John K. Waters on 01/29/2019 at 9:13 AM