Google researchers claimed to have reached a major milestone in the evolution of quantum computing called "quantum supremacy" in a paper published last week in the journal Nature. IBM quickly disputed that claim in a blog post that threw shade on the Alphabet subsidiary's conclusions. The headlines they generated notwithstanding, the claim and counterclaim involve a branch of computing most industry watchers say is still in the budding stage, so why should anyone care who's right?
"Whether or not Google achieved true quantum supremacy or it's more like quantum advantage, what's exciting here is that what they did shows real progress," said Gartner analyst Matthew Brisse. "Assuming the paper survives scrutiny by all the academic folks, this is a fantastic achievement by Google."
Brisse is a Research Vice President for the Data Center and Cloud Infrastructure group within the Gartner for Technical Professionals service. He provides strategic and technical advice for CIOs and tech pros on a range of topics, including quantum computing. Gartner currently has several analysts tracking 62 companies in the quantum computing space, he said, from hardware and software providers to consultants.
"Our guidance to CIO's is that they shouldn't ignore this space, because it's likely to be a real competitive differentiator in five to ten years," he said. "But they wouldn't want to go all in just yet, because it's not clear exactly what it's going to do for them. Let the hardware and software mature, let the algorithms start unfolding. But don't throw a lot of money at it now."
Developers, however, should take the leap sooner rather than later, Brisse advises.
"If you're a programmer interested in quantum computing, get involved now," he said. "Take advantage of the free quantum systems that are available -- things like Microsoft's QDK, D-Wave System's Ocean SDK, Rigetti's Forest SDK, and IBM's Qiskit. Microsoft has developed a domain-specific programming language for expressing quantum algorithms called Q#. And there's a plethora of libraries out there. They should check out the Quantum Algorithm Zoo, for example, which is a repository of quantum algorithms."
The Quantum Open Source Foundation maintains a curated list of open-source quantum software projects on GitHub. Last month, IBM opened a quantum computational center in Poughkeepsie, New York, designed to "support the growing needs of a community of over 150,000 registered users and nearly 80 commercial clients, academic institutions, and research laboratories to advance quantum computing and explore practical applications," the company said in a statement.
Another reason to get started now, Brisse said, is that quantum computing is complicated. It takes time to learn quantum computing algorithm development, and mapping business problems to quantum computing is difficult to get right. Plus, there's a lot of physics involved.
"There's a shortage of physicists in the industry today who know computers and business," Brisse said. "So we're seeing organizations like Microsoft and IBM actually going into the universities and cultivating a new type of quantum computing engineer."
Gartner defines quantum computing as a type of "nonclassical" computing that operates on the quantum state of subatomic particles. The particles represent information as quantum bits (qubits). In classical computing, bits represent information as either 0s or 1s; qubits represent both at the same time until they are read, thanks to a quantum state called superposition. Qubits can be linked with other qubits, thanks to another quantum property called entanglement. As Gartner explains it, "Quantum algorithms manipulate linked qubits in their undetermined, entangled state, a process that can address problems with vast combinatorial complexity."
In other words, quantum computing has the potential to solve some of mankind's greatest technical and scientific puzzles and problems. That potential might explain why companies like Google, IBM, Intel, and others are investing heavily in the technology. An analysis by Nature found that in 2017 and 2018 quantum technology companies received at least $450 million in private funding, a fourfold increase from the $104 million of the previous two years.
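The superposition and entanglement Gartner describes can be illustrated without any quantum SDK at all. The toy state-vector simulator below is a sketch, not how real quantum hardware (or the SDKs mentioned earlier) works internally: a Hadamard gate puts one qubit into superposition, and a CNOT entangles it with a second, producing the classic Bell state.

```python
import math

INV_SQRT2 = 1 / math.sqrt(2)

def hadamard_q0(state):
    """Hadamard on qubit 0 of a 2-qubit amplitude vector [a00, a01, a10, a11]."""
    a00, a01, a10, a11 = state
    return [INV_SQRT2 * (a00 + a10), INV_SQRT2 * (a01 + a11),
            INV_SQRT2 * (a00 - a10), INV_SQRT2 * (a01 - a11)]

def cnot(state):
    """CNOT with qubit 0 as control: flips qubit 1 whenever qubit 0 is 1."""
    a00, a01, a10, a11 = state
    return [a00, a01, a11, a10]

# Start in |00>, apply H then CNOT -> the Bell state (|00> + |11>) / sqrt(2).
bell = cnot(hadamard_q0([1.0, 0.0, 0.0, 0.0]))
probs = [abs(a) ** 2 for a in bell]

# Measurement yields 00 or 11 with equal probability; 01 and 10 never occur.
# That perfect correlation between the two qubits is entanglement.
```

The SDKs Brisse lists express the same two-gate circuit in a few calls; the point of spelling it out by hand is to see exactly what "linked qubits in their undetermined, entangled state" means.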
So Google's efforts are about more than just bragging rights to the coveted quantum supremacy, a kind of black belt earned by computing devices that can solve problems no classical computer can handle. (Quantum advantage is the brown belt: faster, but not unbeatable, though the term is sometimes used as a synonym.) In that peer-reviewed Nature article, Google researchers describe how a team led by experimental physicist John Martinis used Google's Sycamore, a 53-qubit quantum processor, to solve a random number generation problem in 200 seconds, a calculation they said would take a state-of-the-art supercomputer 10,000 years to complete.
"For those of us working in science and technology, it's the 'hello world' moment we've been waiting for," wrote Google CEO Sundar Pichai in a blog post, "the most meaningful milestone to date in the quest to make quantum computing a reality."
In its quickly published rejoinder, IBM claimed that its Summit supercomputer, which it built for the Department of Energy, can do the calculation in two and a half days with greater fidelity.
"This is, in fact, a conservative, worst-case estimate, and we expect that with additional refinements the classical cost of the simulation can be further reduced," Big Blue's team wrote. "Because the original meaning of the term 'quantum supremacy,' as proposed by John Preskill in 2012, was to describe the point where quantum computers can do things that classical computers can't, this threshold has not been met."
Preskill, a professor of theoretical physics at the California Institute of Technology, wrote about Google's claim and the apparently controversial term he coined in a recent Quanta Magazine column. He regrets how the word "exacerbates the already overhyped reporting on the status of quantum technology," he wrote. But that didn't stop him from coining another term a few sentences later. The term is "NISQ," which rhymes with "risk" and stands for "noisy intermediate-scale quantum."
"Here intermediate-scale refers to the size of quantum computers that are now becoming available: potentially large enough to perform certain highly specialized tasks beyond the reach of today's supercomputers," he explained. "Noisy emphasizes that we have imperfect control over the qubits, resulting in small errors that accumulate over time; if we attempt too long a computation, we're not likely to get the right answer."
"The Google team has apparently demonstrated that it's now possible to build a quantum machine that's large enough and accurate enough to solve a problem we could not solve before," he added, "heralding the onset of the NISQ era."
Posted by John K. Waters on 10/29/2019 at 11:07 AM
Google says the Supreme Court should ignore the recent recommendation of the Solicitor General of the United States, which advised the court to refuse to review the appeals court rulings that found Google infringed on Oracle's copyrights to Java code in its Android mobile operating system.
Google filed a writ of certiorari with the Supreme Court earlier this year, asking for a review of the judgment of the U.S. Court of Appeals for the Federal Circuit in this case. Then, last month, the Solicitor General filed an amicus curiae brief to express the views of the United States that Google "identifies no sound basis for further review" by the court.
That this nine-plus-year-old case is still alive should surprise no one. If that earlier ruling stands, Google and its parent company, Alphabet, lose billions -- $8.8 billion, to be exact; if it's overturned, Oracle won't recover the billions it claims to have lost.
Still, the argument Google made in a supplemental brief, filed last week, is a reminder that money isn't the only thing at stake here, and that the company's position is shared by some thoughtful people.
"The Solicitor General's further effort to cabin the Federal Circuit's fair use ruling as fact-bound is refuted by the 175 individuals, companies, and organizations that filed 15 amicus briefs in support of the petition to explain that it is imperative that this Court grant certiorari,” Google's petition reads. "Those submissions recognize that the Federal Circuit has effectively prohibited the widely accepted industry practice of reimplementing software interfaces, inevitably causing serious harm to current practices and future innovation in the software industry.”
But the SG's response to Google's argument is worth keeping in mind: "[L]ower courts have wrestled with issues, not presented here, about whether making temporary copies of existing code to 'reverse engineer' a system, in order to create compatible works that do not incorporate the pre-existing code, constitutes fair use ... . But here, petitioner took lines of code from a rival software platform to make a competing platform that is not interoperable with the Java platform."
Following this story has been improving my vocabulary -- and not just my Latin. Who knew "cabin" could be used as a verb? (It means "confine within narrow bounds.") And it has been a long haul. Oracle originally sued Google in 2010, and the search engine giant's argument that its use of 37 Java APIs was allowed under the fair use provisions of the federal copyright law, and therefore did not infringe on Oracle-owned copyrights, failed to persuade the court. "There is nothing fair about taking a copyrighted work verbatim and using it for the same purpose and function as the original in a competing platform," a panel of three Federal Circuit judges wrote in their opinion.
Why doesn't the U.S. Solicitor General, Mr. Noel Francisco, feel compelled to weigh in on this case? Judging from the amicus curiae brief, he buys Oracle's argument, but why get involved in this long, long battle? My calls to the SG's office were not returned by press time, but it's a good question, so I'll keep asking, though I'm pretty sure the end is near.
Update: The Solicitor General hasn’t returned my calls, but a reader sent me an email answering my question. The SG is responding to an order from the Court itself requesting his views on whether the Court should hear the case. Here’s a link to the April 29 order list. Apparently, the Court gets a lot of these. Also, Hannah Coleman, who was an intern at the National Immigration Law Center at the time, wrote a nice article in 2017 explaining how a “Call for the Views of the Solicitor General” (CVSG) works. “Even though CVSGs are described as ‘invitations,’ the Solicitor General’s Office views them as orders, and the Solicitor General responds to every invitation it receives from the Supreme Court,” she wrote.
Posted by John K. Waters on 10/23/2019 at 9:35 AM
The Eclipse Foundation announced the release of the Eclipse Jakarta EE 8 specification just over two months ago, and we're already seeing the solidifying outlines of Jakarta EE 9 -- not especially fast by current release-cadence standards, but warp speed when you consider how enterprise Java had languished before the Foundation accepted stewardship of the platform two years ago.
Oracle software architect and Eclipse committer Bill Shannon posted his company’s proposed plan for Jakarta EE 9 on October 1 on the jakartaee-platform-dev mailing list, sparking a great deal of discussion in the community. About a week later, he returned to the mailing list with a “recast plan” based on the feedback he received.
“Nothing here is final,” he wrote in that first post, “and we're open to discussion on all these items. Feedback is strongly encouraged. Counterproposals are welcomed. Our intent is to put a stake in the ground to start the planning for Jakarta EE 9.”
Shannon’s subsequent post included a bunch of changes, which suggests that Big O won’t be bigfooting the process. He received the most feedback, he said, around the removal of SOAP support.
“I think we've adequately explained that products can continue to provide SOAP support based on the Jakarta versions of the corresponding specifications,” he wrote, “which will have no changes to their APIs and will continue to use the javax namespace.”
If platform project committers believe SOAP support must be included in version 9, he added, they should speak up now, but also keep in mind that including SOAP support “would have a significant impact on the amount of work to be done for Jakarta EE 9.”
Oracle’s modified plan would “prune” several specs from the version 9 release, including:
- Jakarta XML Registries
- Jakarta XML RPC
- Jakarta Management
- Jakarta Enterprise Bean entity beans
- Jakarta Enterprise Bean interoperability
- Jakarta Enterprise Bean 2.x and 1.x client view
- Jakarta Enterprise Web Services
The plan would also not add some APIs corresponding to Java SE 8 APIs, including:
- Jakarta XML Web Services
- Jakarta SOAP Attachments
- Jakarta Web Service Metadata
- CORBA and RMI-IIOP
Two APIs corresponding to Java SE 8 APIs will be added to Jakarta EE 9:
- Jakarta XML Binding
- Jakarta Activation
Oracle favors the “big bang” approach to package naming -- switching everything from javax.* to jakarta.* all at once.
Some sort of backwards compatibility will be required, of course, to allow existing applications to work unchanged on Jakarta EE 9 products using the jakarta.* namespace, but Oracle is against defining that backwards compatibility in a Jakarta EE spec.
“We strongly encourage the creation of an open source project to produce backwards compatibility support that can be shared by multiple implementations of Jakarta EE 9,” Shannon wrote.
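Compatibility tooling along those lines does exist (the Eclipse Transformer project, for example, rewrites both source and binaries), and a crude version of the idea fits in a few lines. The sketch below is illustrative only: the package list is a small, incomplete subset, and real tooling operates on bytecode and handles many cases a regex cannot.

```python
import re

# Packages that moved from javax.* to jakarta.* -- an illustrative subset,
# not the full list. javax.swing, part of Java SE, must be left untouched.
MOVED_PREFIXES = ("javax.servlet", "javax.persistence", "javax.ws.rs")

def to_jakarta(java_source):
    """Rewrite javax.* references to jakarta.* for packages that moved."""
    def swap(match):
        name = match.group(0)
        if any(name == p or name.startswith(p + ".") for p in MOVED_PREFIXES):
            return "jakarta." + name[len("javax."):]
        return name
    return re.sub(r"javax\.[\w.]+", swap, java_source)

result = to_jakarta("import javax.servlet.http.HttpServlet;\n"
                    "import javax.swing.JFrame;")
# Only the servlet import is rewritten; the Swing import stays javax.
```

The hard part, of course, is not the rename itself but everything around it -- reflection, string-based class loading, serialized forms -- which is exactly why Oracle wants the work shared in an open source project rather than specified.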
Although no additional profiles are being proposed for Jakarta EE 9, Oracle is apparently open to the idea of creating two additional profiles “to help additional vendors enter the Jakarta EE market.”
“We believe that the need of developers to ‘right-size’ their application deployment is best addressed by the use of Java platform modules, not profiles, defined by a future Jakarta EE release,” Shannon wrote.
Oracle also believes all MicroProfile work should be done under the Eclipse Foundation Specification Process, should be considered additions to the Jakarta EE platform under that process, and should be deferred to a future Jakarta EE release. When and how MicroProfile APIs should be added to the Jakarta EE platform is “an open issue.”
Big O is also backing what appears to be strong community support for the idea of splitting up the Jakarta EE TCK project, so that each specification project can manage its own TCK. But the company believes the work involved would be too much to include that change in this release.
“Possibly incremental progress can be made in the Jakarta EE 9 timeframe,” Shannon allowed. “It's essential that work in this area not make it more difficult to test a Jakarta EE 9 product.”
Oracle also wants to keep up the pace, with a Jakarta EE 9 release 12 months or less after the release of Jakarta EE 8.
Posted by John K. Waters on 10/22/2019 at 7:13 PM
The U.S. Justice Department has urged the Supreme Court to deny Google's latest petition to review rulings that the Alphabet subsidiary infringed on Oracle's copyrights to Java code -- rulings that could cost Google billions.
This petition keeps alive the nine-year legal struggle over 37 Java APIs Google used to develop its Android operating system, which Oracle has maintained were copyrighted, and for the violation of which Oracle is demanding $9 billion in damages. The courts have gone back and forth on the issues in this case, but eventually came down on the side of Oracle's claim.
Google filed a writ of certiorari with the Supreme Court earlier this year, asking for a review of the earlier judgment of the United States Court of Appeals for the Federal Circuit in this case.
"Above and beyond the broader implications for copyright law, this case warrants the Court's attention for its sheer practical importance," the petition reads. The ruling " ... threatens the prevailing approach to building computer software ... "
Google cited changing practices among software developers to support its petition. "Developers are not coding programs entirely from scratch," the company argued, "as they may have been in the early days of programming. Instead, new programs now incorporate and rely on preexisting interfaces to trigger certain functions, which saves the wasted effort of reinventing and retesting what came before."
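The practice Google describes -- writing new implementing code behind a preexisting interface so that existing callers keep working -- can be sketched in a few lines. The names here are hypothetical, chosen only to illustrate the distinction between declaring code and implementing code that runs through the case.

```python
from abc import ABC, abstractmethod

class StringOps(ABC):
    """The 'interface': method names and signatures that callers program
    against. This is the declaring code -- it says what, not how."""
    @abstractmethod
    def reverse(self, s: str) -> str:
        ...

class MyStringOps(StringOps):
    """An independent reimplementation: same signature, but implementing
    code written from scratch."""
    def reverse(self, s: str) -> str:
        return s[::-1]

def shout_backwards(ops: StringOps, s: str) -> str:
    """Caller code depends only on the interface, so it works unchanged
    with any implementation of StringOps."""
    return ops.reverse(s).upper()
```

Whether reusing the declarations themselves (rather than the implementations behind them) is fair use is precisely the question the courts have been wrestling with.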
In late September, the Solicitor General filed an amicus curiae brief to express the views of the United States on Google's petition. "Computer code can be used in transformative ways," it reads, "such as by excerpting it in a textbook to illustrate a coding technique. And lower courts have wrestled with issues, not presented here, about whether making temporary copies of existing code to 'reverse engineer' a system, in order to create compatible works that do not incorporate the pre-existing code, constitutes fair use ... . But here, petitioner took lines of code from a rival software platform to make a competing platform that is not interoperable with the Java platform."
"Petitioner copied 11,500 lines of computer code verbatim, as well as the complex structure and organization inherent in that code, in order to help its competing commercial product. The record demonstrates, moreover, that petitioner's unauthorized copying harmed the market for respondent's Java platform ... ."
"Petitioner identifies no sound basis for further review of the fair-use issue," the Justice Department concluded.
Oracle originally sued Google in 2010, and the search engine giant's argument that its use of the Java APIs was allowed under the fair use provisions of the federal copyright law, and therefore did not infringe on Oracle-owned copyrights, failed to persuade the court. "There is nothing fair about taking a copyrighted work verbatim and using it for the same purpose and function as the original in a competing platform," a panel of three Federal Circuit judges wrote in their opinion.
The Supremes are certainly free to consider Google's petition, but they have stayed out of the fight so far. We're surely approaching the conclusion of this seemingly immortal struggle, but given what it will cost the loser, I wouldn't bet on it.
Posted by John K. Waters on 10/09/2019 at 11:11 AM
One of the most-mentioned keynote topics at this year's Oracle OpenWorld-adjacent Code One 2019 conference, wrapping up this week in San Francisco, was the rapid release cadence for Java SE and the JDK, which Oracle announced in 2017 and launched in March of last year.
The rapid release cadence has proved to be one of those slap-the-forehead innovations that revivified the process of evolving the Java platform. Instead of allowing years to pass between releases and putting intense pressure on, well, every contributor in the community, to deliver and bet on big, fully formed enhancements, the new process runs on a cycle that calls for a feature release every six months, update releases every quarter, and a long-term support release every three years.
"That's fast enough to minimize the pain of waiting for the next [release] train," wrote Mark Reinhold, chief architect of Oracle's Java Platform Group, in the blog post in which he first proposed the faster cadence, "yet slow enough that we can still deliver each release at a high level of quality, preserving Java's key long-term values of compatibility, reliability, and thoughtful evolution."
JDK 10 was the first feature release of the new cycle; Java 11, which was released six months later, was the first long-term support (LTS) release. The next LTS release will be Java 17, scheduled for September 2021. Java 13, which went GA this week, is not an LTS release, which means it will be superseded by Java 14 in March 2020.
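The cadence is regular enough to compute. The sketch below encodes the schedule as it stood in 2019 -- a feature release every March and September starting with JDK 10, an LTS every three years starting with JDK 11 -- purely as an illustration; Oracle's official roadmap is the authority.

```python
def jdk_schedule(version):
    """Scheduled GA date and LTS status for a JDK feature release, assuming
    the six-month cadence begun with JDK 10 (March 2018) and an LTS every
    three years starting with JDK 11 (the plan as of 2019)."""
    if version < 10:
        raise ValueError("the cadence applies from JDK 10 onward")
    months_after_jdk10 = (version - 10) * 6
    year = 2018 + months_after_jdk10 // 12
    month = "March" if months_after_jdk10 % 12 == 0 else "September"
    is_lts = version >= 11 and (version - 11) % 6 == 0
    return f"{month} {year}", is_lts

# Java 13 (September 2019) is not LTS; Java 17 (September 2021) is.
```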
And yet it seems people are still getting used to the idea, which I suppose underscores the depth of the change. Java Language Architect Brian Goetz reminded conference attendees during his Code One keynote to remember that the rapid release cadence imposed a truly fundamental change on a long-established process.
"A lot of people, including, quite honestly, us, were pretty skeptical at first," Goetz admitted. "It seemed inconceivable that we could turn a ship as big as Java that quickly. There were even fears that Java 10 and 11 might have no features at all. But looking back, it would be really hard to overstate what a significant change the rapid release cadence has been."
Among the benefits of the new cadence, he said, are the obvious: "We can deliver value more often, and you don't have to wait as long for any particular feature to come out." And the not-so-obvious: "It has brought about this fundamental change in how we design, plan, and deliver new features by lowering the cost of a feature missing the boat. Missing by six months is a whole lot different from missing by several years. [The faster release cadence] has drastically reduced our release-management overhead internally, which allows us to spend more energy on designing and building features and less energy on managing releases."
"And at the same time, having more releases has encouraged us to learn how to break down complex features into smaller ones, so we can deliver in phases," he added. "The results, which has sort of been an unexpected one, is that our feature pipeline has become richer than it has ever been. The whole thing has been one big virtuous circle."
Adapting to the faster release cadence has required the Java community to "recalibrate our expectations," Goetz said, "about what constitutes something worth upgrading.
"In the old world, when we had these big releases every few years, and those big releases tended to have big features, like Generics and Lambda's, there was already plenty of motivation to upgrade. The reality now is, we're not going to be seeing a lot of those big features in the future. And that's not because we're not innovating. It's because those big features are going to get broken up into smaller features and delivered in phases. There's just as much innovation going on -- perhaps more -- but it's going to be spread out over a large number of smaller deliveries."
During his Code One presentation, Georges Saab, VP of Oracle's Java Platform Group and OpenJDK chairperson, also reminisced about the launch of the new release cadence into skeptical waters.
"Many folks were excited about the faster cadence," he said. "Many folks were apprehensive about it. And there are a lot of people who were actually both ... . Some of the most apprehensive people were on my team at Oracle working on Java! When we floated the idea of six months releases, they really thought I was crazy. They said because of the Java Community Process (JCP), through which the Java platform evolves, it just takes too long. This process requires that there be a specification, a reference implementation, and a certified compatible testing kit ... .But I had faith that, like us, many in the JCP Executive Committee (EC) wanted to see a faster and more gradual evolution of Java."
And then he brought out the big guns: Bruno Souza, leader of the Brazilian Java community known as SouJava, and Gil Tene, CTO and co-founder of Azul Systems, the only vendor focused exclusively on Java and the JVM and maker of, among other products, the Zing Java runtime. Both are JCP EC members. (The lively Souza wore a cape that has yet to be explained.)
Tene said he and the other EC members were very skeptical when they first heard about the rapid release cadence for Java. "The benefits of the faster cadence were obvious," he said, "if it could be done, but the shortest we had ever done a spec process at the time was 10 months."
The challenge for the EC, Souza said, came down to one question: "How do we do a fast open-source model and at the same time retain the stability Java has?"
"As we moved to a fast cadence process," he said, "we had to take a very critical look at what [we were] doing a the JCP. And so we removed everything that prevented us from going to an open-source-friendly-early-release-often model, but at the same time we kept and improved on everything that guarantees the stability, long term, of the specification that is so important to the Java community."
Tene agreed that the rapid release cadence was a welcome idea that had to prove itself to the EC in specific ways. The JCP is where the specification, the reference implementation, and the Technology Compatibility Kit (TCK) come together, he said.
"The reason the Java ecosystem has such a rich set of code and libraries -- the reason Maven Central works on all the differentiated cases out there -- is because we do this work, because the specification is specific, and because the TCK lets us verify that all these implementations really will run the same way," he said.
How did it work out?
"We think it's worked out great," Tene said.
Posted by John K. Waters on 09/20/2019 at 4:42 PM
The Eclipse Foundation today announced the release of the first Jakarta EE specification, almost exactly two years after Oracle declared its intention to transfer responsibility for enterprise Java to the open source standards organization.
Two years seems like a long time, but the Foundation's executive director, Mike Milinkovich, says that's just how long it took to get the specs right. I talked with Milinkovich last week about the road to the Eclipse Jakarta EE 8 release.
I've been calling the Foundation's adoption of enterprise Java and the development of a standards process for it "the road to Jakarta EE 8." I'm just assuming it was a bumpy one, but how was it really?
When we started down this road, we always said that our first order of business would be to ship a set of specs that were exactly the same as Java EE 8, so we had that baseline. When we first started saying that, it just seemed like a good, solid, conservative thing to do to make sure that we got it done. It turns out that we were really darned lucky we did it that way. It would have been borderline crazy to do anything else.
How do you mean?
The first time you run through a very large and complex process you inevitably run into unforeseen circumstances. In this release cycle, we took exactly the same specs and ran them through a new process, but we changed a few things -- the TCKs are now open source, the specifications are under a new license, there's no more reference implementation, there are multiple compatible implementations -- but those are all process changes. We're not delivering new technology to developers yet. But every one of those process changes, introduced somewhere along the way, had its own little bit of complexity that you just couldn't foresee.
Seems like it was a ton of work.
We rewrote our IP policy -- twice -- we developed a spec process from scratch, and we updated all of our contribution agreements and got tens of thousands of people to re-sign them. So, yeah, you could say that.
And, as you've often said, you weren't doing this alone.
I can't stress enough how many good people pitched in and worked hard to make this possible. Oracle, Red Hat, IBM, Tomitribe, Payara, Fujitsu, to name a few. It was a community effort to get this thing out the door.
There are a lot of people with skin in this game.
Well, Java is old technology -- more than 20 years old -- but it's a multi-billion-dollar ecosystem, and there are millions of developers who have skills with this platform. Basically, everything we're doing right now is about establishing a baseline that's going to allow us to re-invigorate this platform for the next 20 years.
Why is re-invigorating the enterprise Java platform so important?
Enterprises want to see that there's a way to modernize their applications, in many cases, to take what they have now inside the corporate firewall to the cloud. They want to leverage this new infrastructure model with existing applications that are known to solve their business problems. Also, enterprises have thousands of developers with skills on this platform -- people who understand the businesses they work for. We need to demonstrate to those people that the technology platform they have skills in is still relevant and will keep them gainfully employed for the next 20 years. And then there's the need to attract young talent to a platform by making it exciting again.
So, all the work you've done on this release is essentially about setting the stage, so to speak, for innovations to come?
We had to get this part right. Large companies are making big bets on their product plans and this technology, so we focused on doing something concrete that would reassure the market and the community.
And no small part of your challenge was the fact that the Eclipse Foundation was not a specification organization when you accepted the stewardship of enterprise Java.
That's right! We were creating a brand-new specification process from a blank piece of paper, rewriting our IP policy to handle patents correctly, which is very different from how you do things in open source. Luckily, we were doing all this with smart engineers, experienced standards people, and lots of opinionated lawyers from software companies around the world.
Talk about "Incremental versus Big Bang."
What we ended up with in our negotiations with Oracle, remember, is that every time we add a new API, or we make a change to an existing API, that has to happen in the Jakarta namespace. The javax package names cannot be evolved by the Jakarta EE community. So, do we switch everything from javax.* to jakarta.* all at once -- that's the Big Bang -- or do we make the change a little bit at a time, as needed -- in other words, incrementally? Put another way, do we rip the Band-Aid off, or do we deal with these compatibility issues with every single release for the next 20 years?
Sounds like you favor the Big Bang.
That's my personal preference. Let's just get it over with. But there are valid arguments for an incremental approach. Ultimately, this is about what's best for the customers, what's best for the ecosystem. The vendors have varying opinions on this, based on what they think their customers' perspective would be. But what a lot of it really boils down to is, to what degree can we offer a reasonable set of technical solutions to the backwards compatibility problem? This is a really big decision that has to be made over the next couple of months.
The decision to move enterprise Java to the Foundation elicited a largely positive reaction from the community. Did you get any significant pushback?
Java developers are not shy, and they always let us know what they think. But they were onboard for this.
Were you surprised at how long it took to get here?
I was a little bit. I'm an optimist by nature, and I had a very clear idea from the beginning of where I wanted this to end up. And we got very close to what I hoped for. But it takes a long time to pull a thing like this together. I remember someone saying, "There's no way you're going to get this done in less than a year." And they were right, of course. You can't change legal documents willy-nilly, and you have to take the time to explain to the community -- thousands of people around the world -- what the changes are and why we're making them. But I believe it was time well spent.
Posted by John K. Waters on 09/10/2019 at 9:30 AM
The Eclipse Foundation is gearing up for the Sept. 10 release of Jakarta EE 8, the first version of the enterprise Java platform under the Foundation's stewardship, with, among other things, a livestream event.
The JakartaOne Livestream is a one-day virtual conference aimed at developers and technical business leaders interested in "the current state and future of Jakarta EE and related technologies," with a focus on developing cloud native Java applications.
Perhaps named with a nostalgic nod to the venerable JavaOne conference, which is now CodeOne (I'd like to think so, anyway), JakartaOne is scheduled for the same day as the Jakarta EE 8 release. The event features sessions and keynotes organized by an all-star program committee that includes Reza Rahman, principal program manager for Java on Azure at Microsoft and one of the founders of the Java EE Guardians (and a personal fav speaker); veteran Java SE/EE developer, Java Champion, and popular YouTube educator Adam Bien; Arun Gupta, principal technologist at Amazon Web Services and the guy responsible for the Cloud Native Computing Foundation (CNCF) strategy within AWS (and founder of the Devoxx4Kids chapter in the US); Ivar Grimstad, Java Champion and PMC lead for the Eclipse Enterprise for Java Project (EE4J), the top-level project for Jakarta EE within Eclipse; Josh Juneau, developer, system analyst, and a fav blogger and author; and Tanja Obradovic, who joined the Eclipse Foundation as Jakarta EE Program Manager in June 2018.
Juneau, Grimstad, Bien, and Rahman will be presenting at the event, which will include keynotes by the Eclipse Foundation's executive director, Mike Milinkovich, and the Father of Java, James Gosling. There's also an industry keynote featuring vendors with lots of skin in the enterprise Java game, such as IBM, Oracle, Payara, Tomitribe and Fujitsu. Payara CEO Steve Millidge will be presenting, as will Tomitribe founder David Blevins, JCP star specification lead Dmitry Kornilov, and Java EE Guardians Arjan Tijms and Markus Karg, among others.
Session topics range from Jakarta EE 8 features and a MicroProfile state-of-the-union to Quarkus and Helidon. And the cloud, of course. Among the key findings of an Eclipse Foundation enterprise developer survey, published earlier this year: "The future of the Java ecosystem and Jakarta EE is increasingly driven by new cloud workloads and capabilities."
"Java continues to dominate as the language of choice for organizations deploying applications in production environments," Milinkovich said at the time, "and this latest survey shows the same level of support as our 2018 survey. What's most interesting is to see the acceleration in the adoption of Java in new cloud native architectures. Clearly the future of Jakarta EE is cloud native."
In fact, this free, virtual, time-zone spanning conference was developed, in part, as a reaction to what Obradovic described in a blog post as "huge interest" in the Cloud Native track at the October EclipseCon Europe 2019.
Enterprise Java jocks -- or anyone, really -- can register now for what many (including me) consider to be a must-attend event.
Posted by John K. Waters on 08/14/2019 at 12:59 PM
Jelastic, the Java-focused cloud hosting platform provider, today announced new support for several Java runtimes, including AdoptOpenJDK, Liberica, Zulu, Corretto, OpenJ9 and GraalVM.
Jelastic built the runtimes as certified and secure container-based images with pre-configured automatic vertical scaling, explained Tetiana Fydorenchyk, Jelastic’s VP of marketing, in a blog post. The company made them available across all existing production platform installations, and Jelastic PaaS users are now able to choose the type and version of the OpenJDK distribution while creating the environment or easily change it by redeploying containers afterward.
The Palo Alto, Calif.-based Jelastic (short for Java Elastic), which was founded in 2010 by Hivetext, a Zhytomyr, Ukraine-based start-up focused on Java application development in the cloud, bills itself as the only cloud company whose underlying platform is Java. Originally a Java-based Platform-as-a-Service provider, the company has been evolving a Platform-as-Infrastructure strategy that combines PaaS with Infrastructure-as-a-Service. Jelastic’s unlimited PaaS and container-based IaaS platform is designed to allow developers to deploy Java, PHP, Ruby, Node.js, Python, and .NET enterprise apps for private, public or hybrid cloud.
Although its platform is now multilingual, the company says it maintains “a major focus on Java.”
“We love organizations like Jelastic, who, like us, were created out of a community need by developers for developers,” said Martijn Verburg, director at AdoptOpenJDK, CEO of jClarity, co-organizer of the London JUG, and a member of the Java Community Process (JCP) Executive Committee, in a statement. “With well over 50 Million downloads, AdoptOpenJDK has become the defacto hub for the community to collaborate on, and we’re very happy to be added as a choice to the awesome Java hosting company PaaS that is Jelastic!”
AdoptOpenJDK uses infrastructure, build, and test scripts to produce prebuilt binaries from OpenJDK class libraries and a choice of either the OpenJDK HotSpot or Eclipse OpenJ9 VM. It’s a free and open source implementation supported by a range of companies, from IBM and Microsoft to GoDaddy and Pivotal.
Azul Systems’ Zulu is an implementation of the Java Standard Edition (SE) specification that contains all the Java components needed to build and run Java SE apps. Azul CTO Gil Tene said his company was happy to see “a wide set of OpenJDK variants added to the Jelastic PaaS that provides customers with elasticity in both scale and runtime choice.”
Amazon Corretto is a no-cost, multiplatform, production-ready distribution of the Open Java Development Kit (OpenJDK) certified as compatible with the Java SE standard. It comes with long-term support that includes performance enhancements and security fixes, and it allows developers to build and run Java applications on operating systems such as Amazon Linux 2, Windows, and macOS. Amazon uses it internally.
Eclipse OpenJ9 is an open source JVM optimized for small footprint, fast startup, and high throughput. It can be built as a component of OpenJDK v8 and later, and prebuilt binaries are available at the AdoptOpenJDK project for Linux and Windows. Unsurprisingly, Dan Heidinga, Eclipse OpenJ9 project lead, was pleased to hear about the support. “Jelastic user focus comes across loud and clear in the broad choice of OpenJDK variants they provide on their PaaS,” he said in a statement.
Liberica is a certified, Java SE 12-compliant distribution of OpenJDK 12. It’s a 100 percent open source Java implementation built from OpenJDK by BellSoft.
GraalVM is Oracle's polyglot virtual machine, which runs Java and other JVM languages alongside languages such as JavaScript, Ruby, R, and Python. “We are glad to see Jelastic join the growing number of cloud services and open source projects that are adopting GraalVM,” said Eric Sedlar, VP and technical director at Oracle Labs. “GraalVM provides zero overhead interoperability between programming languages allowing developers to write polyglot applications and select the best language for your task.”
Posted by John K. Waters on 08/13/2019 at 3:13 PM
Call it "A Tale of Two Repositories."
Mercurial is a free, cross-platform, distributed version-control system (DVCS), and the current host of the source files and change histories of most JDK projects since 2008. Git is the free, cross-platform, distributed version-control system that is almost certain to become the new home of those repositories.
It has been almost exactly one year since the launch of Project Skara, an investigation of alternative source-code management (SCM) systems and code review options for the JDK -- and in particular, whether Git might be a better option than Mercurial. Sponsored by the Build Infrastructure Group and led by Joseph Darcy, a member of the Technical Staff at Oracle, the project sparked a serious discussion in the Java community about SCM options for OpenJDK.
For at least two Java jocks (and a whole lot more), that discussion turned into action last week, when Darcy and his colleague on Oracle's Technical Staff, Erik Duveblad, proposed JEP 357, the goal of which is to "Migrate all single-repository OpenJDK Projects from Mercurial to Git."
The list of specific goals of JEP 357 includes:
- Migrate all single-repository OpenJDK Projects from Mercurial to Git
- Preserve all version control history, including tags
- Reformat commit messages according to Git best practices
- Port the jcheck, webrev, and defpath tools to Git
- Create a tool to translate between Mercurial and Git hashes
The JEP's authors also included a list of things they will not do:
- We will not migrate multi-repository OpenJDK Projects, such as the JDK 8 Updates Project. Those Projects can migrate to Git if and when they consolidate into a single repository.
- We will not change the bug system from JBS.
- We will not address the question of whether OpenJDK Git repositories will be self-hosted or hosted by an external provider. That issue will be the topic of a future JEP.
- We will not propose changes to the current JDK development process, though this JEP does enable such changes.
The bottom line for proponents of this move is that Git has the potential to handle larger projects. In the prototypes initially developed via the Skara project, converted repositories showed "a significant reduction in the size of the version control metadata," the JEP authors reported. In one example cited, the version-control metadata for the jdk/jdk repository occupies approximately 300 MB in Git's .git directory, compared with about 1.2 GB under Mercurial, depending on the Mercurial version being used.
"The reduction in metadata preserves local disk space and reduces clone times," they wrote, "since fewer bits have to go over the wire. Git also features shallow clones that only clone parts of the history, resulting in even less metadata for those users who do not need the entire history."
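Shallow clones are easy to try for yourself. The sketch below, with illustrative names and a throwaway repository rather than the actual OpenJDK sources, builds a three-commit history and then clones only its newest commit:

```shell
#!/bin/sh
# Sketch: see what a shallow clone actually fetches.
# All paths and names here are illustrative, not from the JEP.
set -e
work=$(mktemp -d)

# Build a small source repository with three commits.
git init -q "$work/src"
git -C "$work/src" config user.email dev@example.com
git -C "$work/src" config user.name Dev
for i in 1 2 3; do
  echo "change $i" > "$work/src/file.txt"
  git -C "$work/src" add file.txt
  git -C "$work/src" commit -qm "commit $i"
done

# A file:// URL forces the normal transport, so --depth applies
# (plain local-path clones bypass it via hardlinks).
git clone -q --depth 1 "file://$work/src" "$work/shallow"

# The shallow clone carries a single commit's worth of history.
git -C "$work/shallow" rev-list --count HEAD   # prints 1
```

The same `--depth 1` flag against a remote like `https://github.com/openjdk/jdk.git` is what spares users the full metadata download the JEP authors describe.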
And then there's the bulging toolbox Git provides: text editors with Git integration (native and via plugins), including Emacs, Vim, VS Code, and Atom; IDEs that ship with Git integration out of the box, including IntelliJ, Eclipse, NetBeans, and Visual Studio; and multiple desktop clients for local interaction with Git repositories.
The group working on JEP 357 has already prototyped a program to convert a Mercurial repository to a Git repository. The structures of commit messages for both Mercurial and Git jdk/jdk repositories are shown on the JEP page. The group has also prototyped backward compatible ports of several Mercurial tools.
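The JEP doesn't detail the converter's internals, but history-migration tools in this space commonly emit a `git fast-import` stream that Git replays to rebuild each commit. The following hypothetical sketch (the repository, file, committer, and message are all invented for illustration; this is not the Skara converter) shows the mechanism with a single hand-written commit:

```shell
#!/bin/sh
# Hypothetical sketch: feed a minimal fast-import stream to Git,
# the way a Mercurial-to-Git converter might emit translated history.
set -e
repo=$(mktemp -d)
git init -q "$repo"

blob="converted file contents"
msg="Imported from Mercurial"

{
  # One blob, then one root commit that references it.
  printf 'blob\nmark :1\ndata %d\n%s\n' "${#blob}" "$blob"
  printf 'commit refs/heads/master\nmark :2\n'
  printf 'committer Jane Dev <jane@example.com> 1565000000 +0000\n'
  printf 'data %d\n%s\n' "${#msg}" "$msg"
  printf 'M 100644 :1 README\n'
} | git -C "$repo" fast-import --quiet

git -C "$repo" log --format=%s master   # prints: Imported from Mercurial
```

A real converter would walk every Mercurial changeset, rewrite its metadata (the commit-message reformatting listed in the JEP's goals), and emit one such `commit` block per changeset.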
Examples of converted repositories are available at https://github.com/openjdk/.
Posted by John K. Waters on 07/17/2019 at 7:58 AM