Apache Olingo Java Library Graduates to Top-Level Project

On Tuesday, the Apache Software Foundation (ASF) sent out a graduation announcement of sorts. The Apache Olingo project, which provides generic Java and JavaScript libraries designed to implement the OASIS Open Data Protocol (OData), has had a status upgrade from Incubator to Top-Level Project (TLP).

OData is a standardized protocol for creating and consuming data APIs. By implementing OData, which is REST-based and uses HTTP, AtomPub and JSON, Olingo can use uniform resource identifiers (URIs) to address feed resources. The aim is to simplify the querying and sharing of data across disparate apps in the enterprise, in the cloud, and on mobile devices. Building on OData lets Olingo provide a uniform way to expose full-featured data APIs.

Olingo works with browser-based user interfaces, which use it to query data on servers. But it's also used to sync data to mobile devices and to exchange data among server systems. Olingo is part of the technical foundation of SAP's NetWeaver Gateway technology and other enterprise solutions.

Olingo extensions provide additional features, such as support for the Java Persistence API (JPA) and annotated bean classes. The project's documentation, wiki and tutorials highlight several examples of implementing a custom OData service, including a sample Web application built with Apache Maven that can be deployed to any Java Platform, Enterprise Edition (Java EE)-compliant Web application server, such as Apache Tomcat.
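For a sense of what such a service looks like, here is a rough sketch of a servlet wired to the Olingo V4 server library. The edmProvider and entityProcessor fields stand in for application-supplied classes (not shown) that declare the entity model and read the data, and exact package and class names can vary between Olingo releases, so treat this as an illustration rather than a recipe.

    import java.io.IOException;
    import java.util.ArrayList;

    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    import org.apache.olingo.commons.api.edm.provider.CsdlEdmProvider;
    import org.apache.olingo.server.api.OData;
    import org.apache.olingo.server.api.ODataHttpHandler;
    import org.apache.olingo.server.api.ServiceMetadata;
    import org.apache.olingo.server.api.processor.Processor;

    // Sketch of an OData endpoint: a plain servlet, packaged in a WAR and
    // deployable to Tomcat or any other servlet container.
    public class ODataServlet extends HttpServlet {

        // Supplied by the application (wiring not shown): the EDM provider
        // declares the entity model; the processor reads the actual data.
        private CsdlEdmProvider edmProvider;
        private Processor entityProcessor;

        @Override
        protected void service(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            OData odata = OData.newInstance();
            ServiceMetadata metadata =
                    odata.createServiceMetadata(edmProvider, new ArrayList<>());

            ODataHttpHandler handler = odata.createHandler(metadata);
            handler.register(entityProcessor);
            handler.process(req, resp); // parses the OData URI, dispatches, serializes
        }
    }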

Olingo joined the Apache Incubator in July 2013 with a combination of Java server libraries for OData 2.0 provided by SAP, and Java client libraries for OData 3.0 and JavaScript libraries for OData 3.0 from Microsoft's Open Technologies group (MS Open Tech). Since joining the incubator, the Olingo project has produced three releases, which resulted from the work of 20 individual contributors responsible for 495,107 lines of code and 1,102 commits.

"OData v4 recently became an OASIS standard that is increasingly opening up data for an open Web," said Eduard Koller, senior program manager at MS Open Tech, in a statement. "Apache Olingo is open source software to aid in the production of OData v4.0 clients and servers in both Java and JavaScript. The project brings together several companies and community developers and we look forward to welcoming more users and contributors to the community."

Acceptance as a TLP, which is based on a voting process within the organization, means that Olingo has won a place among such projects as the Apache HTTP Server, the Tomcat Java app server, Hadoop, the Lucene search engine and OpenOffice. Graduation, as the Foundation puts it, signifies "that the project's community and products have been well-governed under the ASF's meritocratic process and principles." Every Apache TLP has a project management committee (PMC) and a chair. An Incubator project can also graduate to become a subproject of an existing TLP.

The ASF's Incubator is a temporary container project that is the official gateway to mainstream Foundation activity. All potential Apache projects start in the Incubator, which gives the Foundation a chance to scrutinize them, make sure they meet the ASF's legal standards (it's a 501(c)(3) non-profit organization), and help them adopt Apache procedures.

Posted by John K. Waters on April 9, 2014


JCache, Longest Running Java Spec, Finally Ready

News about Java Specification Requests (JSRs) doesn't usually make it to the front page, but the announcement that JSR-107, the spec request for a Java Temporary Caching API, better known as JCache, has earned final approval should be top-of-the-fold news -- if for no other reason than the time it took to get there.

The JCache project summary explains that the spec standardizes in-process caching of Java objects "in a way that allows an efficient implementation, and removes from the programmer the burden of implementing cache expiration, mutual exclusion, spooling, and cache consistency."
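To make the API concrete, here is a minimal sketch against the javax.cache (JSR-107) interfaces; the cache name, key and value are made up, and whichever compliant provider is on the classpath (Hazelcast, Ehcache and so on) would back it.

    import javax.cache.Cache;
    import javax.cache.CacheManager;
    import javax.cache.Caching;
    import javax.cache.configuration.MutableConfiguration;
    import javax.cache.expiry.CreatedExpiryPolicy;
    import javax.cache.expiry.Duration;

    public class JCacheSketch {
        public static void main(String[] args) {
            // The provider is discovered on the classpath; the API stays the
            // same regardless of which vendor implements it.
            CacheManager cacheManager = Caching.getCachingProvider().getCacheManager();

            // Declare types and an expiry policy instead of hand-rolling eviction.
            MutableConfiguration<String, String> config =
                    new MutableConfiguration<String, String>()
                            .setTypes(String.class, String.class)
                            .setExpiryPolicyFactory(
                                    CreatedExpiryPolicy.factoryOf(Duration.FIVE_MINUTES));

            Cache<String, String> cache = cacheManager.createCache("greetings", config);
            cache.put("en", "hello");
            System.out.println(cache.get("en")); // prints "hello"
        }
    }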

JSR-107 was the longest running spec request in the history of Java and the Java Community Process (JCP). The JSR-107 Expert Group was formed in 2001. But the spec languished for about a decade until Oracle and Terracotta teamed up and began pushing to get the spec into Java EE 7. Terracotta implemented a version of the JCache spec in Ehcache, a widely deployed open-source Java caching solution the company acquired in 2009. But JCache didn't make it into the Java EE 7 release.

"JCache provides a well thought out, standardized API and programming contract for Java caching," said Greg Luck, Hazelcast CTO and co-author of the JCache spec, in a statement. Luck left Terracotta earlier this year to join Hazelcast, which develops, distributes and supports a leading open source in-memory data grid (IMDG). The product, also called Hazelcast, is licensed under an Apache license that allows developers to include the grid in their applications. The company also provides a commercially licensed Hazelcast Enterprise edition, as well as a management and deployment console called Hazelcast Management Center.

Hazelcast is set to release a production-class implementation of JCache, Luck told ADTmag in an e-mail. He plans to continue his work on JCache, "taking up a co-maintenance lead role" in the spec. "Hazelcast plans to deeply bake in JCache, so that Hazelcast is standards based and may be used for caching without vendor lock-in," Luck said.

Hazelcast's senior solution architect, Chris Engelbert, also serves on the JCache expert group. Oracle's Brian Oliver and Cameron Purdy serve as spec leads on the JSR.

Last year, Gartner predicted that in-memory computing is "racing toward mainstream adoption." The analyst firm expected the relatively small IMDG subset of the in-memory computing market (IMC) to grow fast and to reach $1 billion by 2016.

Yet Forrester Research analyst Mike Gualtieri now suggests that JCache may be arriving in an environment populated with products featuring their own in-memory database engines that may eliminate or delay the need among some companies for a caching grid. "The primary use case for in-memory caching is to speed up applications that get bogged down by slower database bottlenecks," Gualtieri said. "A hot trend among the database vendors, such as SAP Hana and Microsoft SQL Server 14, is in-memory capabilities that dramatically speed up database access. These databases aren't just using more memory for old-school indexes or record caching. They have added in-memory database engines that take advantage of the hardware characteristics of RAM. Also, distributed NoSQL databases, such as MongoDB, can perform very well and have the same distributed architecture as Hazelcast and Terracotta."

Posted by John K. Waters on March 25, 2014


EclipseCon 2014: Java 8 Support, Flux Integrates Orion

The big news at the latest edition of EclipseCon North America, which wrapped up in San Francisco on Thursday, was Oracle's Java 8 announcements. The conference planners devoted an entire day at the show to Java 8 (George Saab's opening presentation on "Java Day" was standing room only). The Eclipse Foundation itself is providing Java 8 language support as an add-on to the Eclipse IDE.

"We're pretty excited about the Java 8 language support for Eclipse Kepler (the latest version from last June's the release train)," Eclipse Foundation Executive Director Mike Milinkovich told me in an earlier interview. "The Eclipse Java development tools team has done a great job adding Java 8 support to the IDE, including a formatter, code completion, code navigation, search and indexing, a reconciler, and incremental builder support for all of Java 8. I think that the quick assist support for migrating anonymous classes to lambda expressions will be particularly popular with Java developers as they migrate their code."

During his conference keynote, Milinkovich talked about another Eclipse project that should get at least a bit of shine from the Java 8 spotlight: the Flux project. Formerly called Project Flight, Flux aims to design and implement a new architecture and infrastructure for integrating development tools across the desktop, browsers, and servers. As the project page describes it, "The goal is to provide an extremely flexible platform and infrastructure that allows new cloud-based tooling components to be built highly decoupled from each other and that bridges the gap to existing desktop IDEs at the same time."

This project is all about cloud-based developer tooling, and its initial focus is Java and JavaScript tooling implemented by reusing parts of JDT (the tool plug-ins that implement a Java IDE) and the Eclipse Orion browser-based development platform. The Orion client is written in JavaScript and runs in the browser. Services for other languages will follow, Milinkovich said.

But accomplishing this goal also involves providing connectors to such desktop dev tools as the Eclipse IDE, IntelliJ IDEA, NetBeans, and even plain text editors. This project will be distributed under both the Eclipse Distribution License and the Eclipse Public License, and it'll be hosted on GitHub.

The Flux project is being led by Pivotal's Martin Lippert, who also leads the Spring Tool Suite team at Pivotal, which made the initial Flux code contribution. Pivotal spun off from EMC's VMware in 2012.

Milinkovich said he expected Orion, which was contributed to the Eclipse open source community by IBM, to become an increasingly important dev environment. The Flux integration of the Web-based tools of Orion with the desktop tools of Eclipse will give developers "the ability to use the right tools to work on their code, wherever they are," Milinkovich said. "They can use Orion to work on code on their tablet, or in an environment where a Web-based browser is the right way to go."

Milinkovich's keynote was entitled "Eclipse: The Next 10 Years." His talk covered a bit of Eclipse history before he pulled out his crystal ball to make a few predictions about trends that will affect developers in the future. The Flux project underscores the growing importance of the cloud, but what topped his list was the overall ascendance of software, which, thanks to the explosion of code bases, is quickly becoming the most important element of the enterprise. As an example, he pointed to Airbus, whose planes use four times more onboard code than they did three years ago. Next on his list: the Internet of Things (IoT), which he said is definitely the next big thing for developers. He pointed out that the Foundation currently has 14 projects in the IoT space, with more on the horizon.

Posted by John K. Waters on March 21, 2014


Java SE 8 Is Almost Here, and Lambda Is the Star

The delays are over, the final approvals are in, and the general availability release of the Java Platform, Standard Edition, 8 (Java SE 8) is right around the corner. What has been called a revolutionary upgrade of one of the world's leading software development platforms is due on March 18. Mark Reinhold, chief architect in Oracle's platform group, has described Java SE 8 as the largest ever upgrade in the history of Java, covering the programming model, as well as a "carefully coordinated co-evolution" of the virtual machine, the language, and the libraries.

This release incorporates several high-profile Java Specification Requests (JSRs), the most talked about of which is JSR 335, Project Lambda. Originally planned for the Java SE 7 release, but pushed back, Project Lambda extends support in the Java language and core libraries to enable the Java SE APIs to use lambda expressions (closures), which are anonymous functions.

Reinhold has called the support for lambda expressions in this release the single largest upgrade to the programming model, ever. Mike Milinkovich, executive director of the Eclipse Foundation, calls it a massive change of the Java language, libraries, and JVM.

"Without a doubt the most important new feature in Java 8 is lambdas," Milinkovich told ADTmag. "This is a sweeping change that modernizes the language, allows for a much more readable syntax, allows for better code generation, and helps the JVM make much better use of multicore processors. Inner classes were always a hack, and having proper closures in the Java language has been a missing feature for a very long time."

The Foundation is releasing Java 8 language support as an add-on to the Eclipse IDE at next week's EclipseCon. That add-on includes a formatter, a code completion feature, code navigation, search and indexing, a reconciler, incremental builder support, and "quick assist" support for migrating anonymous classes to lambda expressions. "I think [that feature] will be particularly popular with Java developers as they migrate their code," Milinkovich said.
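As a rough illustration of that migration (my own example, not one taken from the Eclipse tooling): the same sort written first with a pre-Java 8 anonymous inner class and then as a lambda expression.

    import java.util.Arrays;
    import java.util.Comparator;
    import java.util.List;

    public class LambdaMigration {
        public static void main(String[] args) {
            List<String> names = Arrays.asList("Reinhold", "Milinkovich", "Hilwa");

            // Before: an anonymous inner class supplies the Comparator.
            names.sort(new Comparator<String>() {
                @Override
                public int compare(String a, String b) {
                    return Integer.compare(a.length(), b.length());
                }
            });

            // After: the same comparison as a lambda expression.
            names.sort((a, b) -> Integer.compare(a.length(), b.length()));

            System.out.println(names); // [Hilwa, Reinhold, Milinkovich]
        }
    }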

Using lambda expressions will, among other things, make it easier for developers to write code for multicore processors. But on a more fundamental level, lambda expressions introduce the idea of functions from the lambda calculus into the Java language, making the Java SE 8 release look like a step toward functional programming. The functional paradigm -- which emphasizes the evaluation of expressions rather than the execution of commands, and largely eliminates program state by expressing computations as functions that take arguments and return values -- is used by such modern JVM languages as Groovy, Scala, and Clojure.

The lambda support in Java SE 8 alone makes it an important milestone for the language and platform, said IDC analyst Al Hilwa, but it's one of several changes that represent significant new functionality in this release.

"There are a variety of interesting things in SE 8, like the Streams API focused on parallel processing large data sets, Project Nashorn's faster JavaScript engine, and of course implementing Lambda expressions," Hilwa said. "These are significant changes to the language that will have a long-term impact as we shift into a highly parallel world populated with multi-core devices and big data. To see the team do this while simultaneously investing heavily in securing the platform in the face of escalating malware attacks everywhere is a huge achievement."

The list of new features in this release also includes the new Date/Time API (JSR 310), Type Annotations (JSR 308), and a set of Compact Profiles, which allow Java SE 8 implementations to scale down easily.
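For a quick taste of the new Date/Time API from JSR 310 -- a sketch with arbitrarily chosen dates, not an official example -- immutable value types replace the old java.util.Date and Calendar juggling.

    import java.time.LocalDate;
    import java.time.Month;
    import java.time.Period;
    import java.time.ZoneId;
    import java.time.ZonedDateTime;

    public class DateTimeSketch {
        public static void main(String[] args) {
            // Immutable dates and explicit periods replace Calendar arithmetic.
            LocalDate ga = LocalDate.of(2014, Month.MARCH, 18);
            Period wait = Period.between(LocalDate.of(2014, Month.MARCH, 12), ga);
            System.out.println("Days until GA: " + wait.getDays()); // 6

            // Time zones are a separate, explicit type rather than a hidden default.
            ZonedDateTime launch = ga.atStartOfDay(ZoneId.of("America/Los_Angeles"));
            System.out.println(launch);
        }
    }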

One prominent Java initiative, Project Jigsaw, hasn't been included in this release. The Java-native module system is expected in Java SE 9.

"Generally, programming languages evolve slowly and with great focus on stability," Hilwa observed, "because the code has to continue to run for decades. Meanwhile, new innovations have to be accommodated, and so Java's governance model allows it to be evolved and adapt to new innovation, and this is what we are seeing here."

Oracle is hosting a live Java 8 Launch Webcast on March 25.

Posted by John K. Waters on March 12, 2014


Adopt-a-JSR and Java SE 8

The long-awaited, much-anticipated release of Java SE 8 is nearly upon us. March 18 is the official release date, though numerous "launches" and other events will follow. A lot of work went into this release, with contributions coming from many quarters -- including Java User Groups (JUGs) around the world that participated in the Adopt-a-JSR program.

The Java Community Process (JCP), which manages the development of standard technical specifications for Java technology, launched the program in December 2011. Adopt-a-JSR encourages individual members of the Java community -- average developers working with Java day-to-day -- to "adopt" a Java Specification Request (JSR) by following its progress, supporting its expert group, reporting back to the wider community on its progress, and evangelizing its benefits.

The program aims to get JUGs involved in the Java standards process, and through those organizations, to promote "grass roots, developer-level participation in existing and emerging Java standards," the JCP says. The idea is to generate "earlier feedback, leading to more developer-friendly APIs;" "end user/developer expert input;" and to get the developer community to do more of "the heavy lifting."

To get that level of participation, the organization turned to the London Java Community (LJC) with the idea. LJC members Martijn Verburg and Ben Evans got the ball rolling, and the JCP considers the program to be "JUG-led."

"This is one of the best things to happen to the [JCP]," JCP chair Patrick Curran told ADTmag in a recent interview. "It has added some great energy and real enthusiasm."

The initial list of JUGs participating in the Adopt-a-JSR program included SouJava in Brazil, GoJava (Brazil), Houston JUG (US), and Chennai JUG (India). That list now reportedly comprises 20 JUGs, including Belgium JUG, Campinas JUG, CEJUG, Cologne JUG, Congo JUG, Faso JUG, Guadalajara JUG, Hyderabad JUG, Indonesia JUG, Istanbul JUG, Joglo Semar JUG, Jozi JUG, LJC, Madrid JUG, MBale JUG, Morocco JUG, Peru JUG, Silicon Valley JUG, and Toronto JUG.

Participants engage with the program on three levels: Starter, Intermediate, and Advanced (details here).

The JUG members contributed to many of the signature changes coming in Java SE 8, including JSR 335 (Lambda Expressions for the Java Programming Language), and JSR 310 (the new Date and Time API). The latter effort was led by LJC members James Gough and Richard Warburton.

JUGs have long been a valuable community resource for Java professionals. These volunteer organizations create opportunities to share information and to network, in person, with other Java practitioners. Most groups have some kind of Web presence, and there are some virtual groups out there.

Oracle also sponsors an Adopt OpenJDK program, which was launched in 2012. It has the same goals as Adopt-a-JSR, but is focused on the open-source Java reference implementation. The program currently has 147 participants and is administered by the LJC's Verburg.

Of course, the final release of Java SE 8 on March 18 doesn't end the Adopt-a-JSR program. For more information on how to get involved in work already under way for Java SE 9, go here.

Posted by John K. Waters on March 12, 2014


Report: It's Time To Include Localizers on the Agile Team

There was a time when enterprise application development teams simply threw their code over the wall to the people charged with the task of localizing it. Those days are fading, of course; software developers in medium to large companies have been generating ever greater percentages of their organizations' revenues outside the West for the past decade. And the pressure to "go global" faster is ever increasing.

Consequently, say the industry watchers at the Cambridge, Mass.-based research firm Common Sense Advisory, it's time for the team responsible for adapting U.S.-made software to other languages and cultures (a process called localization) to join the Agile team.

"Agile goes so fast that the other teams supporting it have had to get much closer together," said Director of the Global Leaders Service Rebecca Ray. "Testing, documentation, even the marketing people -- everyone needs to get together. And the localization team needs to work closer with the app dev team, too."

Ray recently co-authored a research report ("Localization at the Speed of Agile: Best Practices for Making the Transition") with her firm's founder and chief strategist, Donald A. De Palma. The report looked at the unique challenges associated with implementing Agile localization through interviews with 21 companies that develop software and faced this challenge.

"Agile is the gateway to integrating internationalization and localization as they were meant to be in the software development process," the authors wrote. "No localization team should pass up the chance to make it happen."

You'd think the spread of lightweight development methodologies among app dev teams would offer a nice solution to this "go global" challenge, but Agile is not a natural fit with traditional localization processes, Ray said.

"Agile is really a different way of developing software," she explained. "It's much more circular than linear, so it basically breaks the more linear processes that have been used for localization in the past. It's a huge change for localization teams."

Yet it's a change that must be made, Ray said. The localization team must become a part of the Agile team from the beginning of a project. If you don't want it to break your software, "localization" must become part of the "definition of done."

"Make sure that you talk to the localization people up front, during design," she said. "When you put together your users stories, the localization people should be in that group. The definition of done has to include localization."

During their research, Ray and De Palma found three common strategies among the companies interviewed for "syncing up" localization teams and developers: 1) "lag one sprint behind developers," which allows localizers to test features that may not be finished until the last minute in the previous sprint; 2) "stay current with each drop," which requires a high degree of automation, especially if sprints occur every two weeks or less; and 3) "sync up when most of the user interface (UI) work is complete," which allows the localization team to avoid the huge churn in features that usually happens within the first few weeks after development begins.

The report emphasizes the importance of automation to the success of an Agile localization effort. "Things have to be automated enough so that the translation management software can be connected directly into the source code repositories," Ray said. "That way, the software can just pick up whatever strings have changed and shoot them off to the translation/localization provider."
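In Java shops, that kind of automation generally presumes that UI strings are externalized into resource bundles rather than hard-coded, so changed keys can be detected and routed to translators. A minimal sketch of that convention (this detail is mine, not the report's, and the bundle name and key are hypothetical):

    import java.util.Locale;
    import java.util.ResourceBundle;

    public class Greeter {
        public static void main(String[] args) {
            // Strings live in Messages.properties, Messages_de.properties, and so
            // on, so a translation pipeline can diff and translate them without
            // touching the code. The bundle name and key here are made up.
            ResourceBundle messages = ResourceBundle.getBundle("Messages", Locale.GERMANY);
            System.out.println(messages.getString("greeting"));
        }
    }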

Also, the integration of localization with Agile doesn't have to be an all-or-nothing proposition, the researchers suggested. "There are certain things you must get right and places where you have to align or integrate localization with product development processes," they wrote. "However, you don't need to attend every single Scrum, translate every iteration of every string, or deliver 100% human translation for all sprints. If you spend too much time creating a perfectly Agile process, you will likely lose much of the adaptability and flexibility that comes with the model."

The reprioritization of localization, from last-minute afterthought to upfront concern, mirrors in some ways software testing's evolution from ugly stepchild of the software development process to respected discipline. Proper localization can be a matter of life and death when it involves medical devices or automotive applications.

"I want to be sure that the doctor in Turkey can read my MRI output," Ray said. "And when my Toyota Prius arrives, it better speak English to me, and it better speak it well. There can't be any mistakes in that dashboard."

"People should keep in mind that this should not be painful," she added. "It should just be, this is another business process, and here's what we need to do."

An extract from the report is available on the Common Sense Advisory Web site.

Posted by John K. Waters on March 7, 2014


Colbert at RSA: When Smart Equals Funny

I've been covering tech trade shows and user conferences for more than two decades, and last week's RSA conference was the first in my experience to include a comedic keynoter who actually understood the technology and the issues surrounding it. Stephen Colbert, host of Comedy Central's "The Colbert Report," gave the conference closer in San Francisco on Friday to a packed house, and killed.

"RSA developed this conference in 1991 as a forum for cryptographers to gather and talk shop," Colbert said, "and I assume breed with one another. Of course officially that's called exchanging private keys."

Colbert kidded conference organizers for booking FBI director James Comey as a speaker, and noted the director's comment that "At our best, we are looking for security measures that enhance liberty."

"Well said director," Colbert said. "I'm sure that under enhanced liberty you can have all the privacy that you want-just like under enhanced interrogation you can breathe all the water you want."

He also dinged Scott Charney, head of Microsoft's Trustworthy Computing group, who also spoke.

"Not everyone can book a speaker from an Orwellian dystopia," he said. "I look forward to next year's speech from the executive director of Sweet Dreams Euthanasia Clinic, Incorporated."

Colbert laid into NSA leaker and conference buzz hog Edward Snowden during his talk, calling him "practically a war criminal" for taking top secret U.S. intelligence to China and then to Russia. "Was Mordor not accepting asylum requests?" he asked. (He's a known hardcore Tolkien fan.)

He also had a few choice words for the NSA: "We can trust the NSA," he said, "because without a doubt it is history's most powerful, pervasive, sophisticated surveillance agency ever to be totally pwned by a 29-year-old with a thumb drive."

Colbert addressed this year's boycott of the conference by 13 digital security experts, who canceled their talks after Reuters reported that RSA, the conference organizer and chief sponsor, had a $10 million contract with the NSA to make a flawed random-number-generating formula -- effectively a back door -- the default in its encryption products.

Activists from Fight for the Future appealed to Colbert to join their boycott in an open letter, which read in part: "We know you, Stephen, and we know you love a good 'backdoor' joke as much as we do -- but this kind of backdoor is no laughing matter...We want to hear your speech, but give it somewhere else!"

"The elephant in the room is that I was asked not to come [and] speak here," Colbert told his audience. "That came as something of a shock to me. Normally I'm asked not to be somewhere only after I've spoken."

"I looked at the signatures on the online petition," he added. "Then I looked at the signature-my signature-on the bottom of the contract saying I'd be here today, and my conscience was clear, as long as the check clears...Well, it's not actually a check. They gave me a bitcoin voucher from Mt. Gox, and I'm sure it's going to be fine."

At one point, Colbert offered a kind of acknowledgment of the American people for their support of the NSA's programs.

"We all deserve credit for this new surveillance state that we live in," he said, "because we the people voted for the Patriot Act. Democrats and Republicans alike. We voted for the people who voted for it, and then voted for the people who reauthorized it, then voted for the people who re-re-authorized it."

Colbert also pitched his own data security venture, CloudFog. "We take a novel horizontal approach to vertical socket encryption," he said. "The result can only be described as diagonal."

Here in Silicon Valley, smart often equals rich. I'm glad to see it sometimes equals funny.

Posted by John K. Waters on March 3, 2014


Juniper's Bitar at RSA: 'The Next World War Will Be Fought in Silicon Valley'

It's been a while since I attended a conference keynote presented by a speaker as apparently pissed off as Nawaf Bitar, senior vice president and general manager at Juniper Networks. His RSA Conference talk, entitled "The Next World War Will be Fought in Silicon Valley," was seasoned with infuriation and rife with get-off-your-butt admonitions.

"Our privacy is being invaded," he said. "Our intellectual property is being stolen. The public trust is at an all-time low. The attack on our information is outrageous. But you know what? I don't think we give a damn. I'm fed up with talking about outrage. It's easy to talk about outrage."

He then went on to compare what he considered true outrage -- the self-immolation of Tibetans protesting China's rule of their country, Nelson Mandela's refusal to renounce his views in exchange for release from prison, and the Tiananmen Square protests of 1989 -- with "liking" a cause on Facebook, retweeting a link, posting a bad review, and boycotting a tech security conference (referring, one assumes, to the 13 speakers who canceled their scheduled talks at the event over allegations that RSA agreed to incorporate a flawed encryption formula into one of its security products to satisfy a secret $10 million NSA contract).

The second group, he said, comprises examples of "a new American disease," which he called First World Outrage.

The problem is, we're not really paying attention, Bitar declared, even when we're warned in no uncertain terms. He pointed to a Wired article (http://www.wired.com/threatlevel/2012/03/ff_nsadatacenter/) by James Bamford published in 2012 entitled "The NSA Is Building the Country's Biggest Spy Center (Watch What You Say)," which quoted an unnamed U.S. intelligence official: "Everybody's a target; everybody with communication is a target."

"We were explicitly told that we would be spied upon," Bitar said. "There are data centers and data closets all over the world collecting an unprecedented amount of our personal information…. We now know with stunning clarity how deeply our privacy is being invaded. What's really changing?"

Not that we would sit still for everything, Bitar allowed. Threats to families and livelihoods never fail to move people to action. If Edward Snowden's NSA revelations had raised our taxes, he argued, there would have been rioting in the streets. But it's "high time" that we added our information to that very short list of things we truly care about.

We are now facing the unintended consequences of our response to 9/11, he said, which, among other things, spurred a well-meaning U.S. government to build an information collection system "the like of which the world has never seen."

"Fast forward to today, and we have direct evidence of the depth and breadth of this information collection," he said. "The meta data of our phone calls. Backdoor access to systems. Deciphering encryption keys. Spying on companies. Spying on everyone!"

We should also be concerned (if not actually freaking out) about the changing threat landscape, he said. The past five years have seen attacks against nuclear sites, power grids, and the intellectual property of companies by hacktivists, thieves and, increasingly, nations seeking to exploit weaknesses in the "cyberfabric" of other nations. Government officials have voiced concerns that cyber attacks might now pose a greater threat than terrorism.

In fact, a cyber attack could, he asserted, lead to actual war.

"If an enemy shot down one of our passenger airliners, we would go to war," he said. "If a nation state compromised our air traffic control system and two passenger airliners collided, would we not also go to war?"

It took him a while to get there, but Bitar finally arrived at his call to action.

"Our information is being stolen," he said. "As individuals and citizens, we should be truly outraged -- not first-world outraged. The time for apathy is over. We cannot go on the offensive and hack back; we would lose the moral high ground. But we can no longer remain passive. It's time for a new type of defense, an active defense that disrupts the economics of hacking and challenges convention, a type of defense that interferes with the attackers, that breaks algorithms, that disrupts data collection. It's time for all of us to turn the tables on the attackers."

Bitar's concerns are not new, of course, and he offered no specific solutions. But his effort to create a kind of historical context for the current state of affairs in the world of cyber security was engaging, and his unvarnished outrage was refreshing. He didn't set himself on fire, so I'm not sure whether he could be faulted for expressing First World Outrage, but judging by the audience response to his talk, I suspect that he might have lit a fire under at least a few attendees.

Posted by John K. Waters on February 26, 2014


2014 Developer Challenges and Opportunities, Part III: Extreme Automation, Service Virtualization, Don't Be Afraid of SOAP, More

There's a lot on the horizon for developers in 2014, and I just couldn't let the "predictions" thing go without passing on the observations of two more top industry watchers.

Theresa Lanowitz, founder of industry analyst firm Voke, for example, points out that catastrophic software failures in 2013 could make 2014 the year enterprises begin dialing back the pressure on app dev teams to get to market at warp speed. Why? Because businesses are being held accountable for these software failures. The shutdowns and glitches that plagued U.S. financial exchanges last year resulted in a credit rating downgrade (Goldman Sachs), a major contract termination (CGI Federal for healthcare.gov), and IT people being put on administrative leave.

"I believe that's the first time we've seen repercussions for the technology side," Lanowitz said. "The constant relentless push to get things to market faster is coming back to haunt us. Software is now such an integral part of the enterprise that failures have a serious impact. Businesses can't afford to absorb these big failures. Customers will give you only so many passes. And when Standard & Poor's says they are going to downgrade your credit rating and that you have to have a liquid capital reserves on hand to pay out for damages your faulty software causes…  Well, the C suite is suddenly interested in software."

Time-to-market still matters, Lanowitz said, but more than ever, so does software quality. Those conflicting demands are likely to lead enterprise developers to embrace what she calls "extreme automation."

"Developers are going to have to say, all right, what can I do to make sure that we are delivering on time, but also with a high degree of quality, while understanding my cost and knowing what's going to happen when we have a defect in production when there's a catastrophic event," she said. "I think we're going to see developers answering that question more often in the coming year with extreme automation across the entire application lifecycle."

Essential for this level of automation, she said, is service virtualization, which she believes, though currently underutilized, could become the hub of the modern application lifecycle.

"The app lifecycle needs to be built using things like service virtualization, virtual lab management, and dev/test cloud," she said. "You want to give developers and testers environments as close to production as possible, so that the testers can test and immediately give developers the defects to remediate. This approach also reduces the provisioning time; developers and testers don't have to wait for lab environments. And you want to be able spin up a platform very quickly, and then replicate that platform throughout your software supply chain."

According to Voke's research, organizations that employ service virtualization in this way see significant benefits, Lanowitz said, including fewer defects going into production, shorter wait times, and an increase in the availability of services for end-to-end integrated testing.

"I think the term 'virtualization' has become too closely associated with the data center, and people don't think of it in this context," Lanowitz said. "But this is a technology that spans the entire application lifecycle. Now it's just a matter of education."

Forrester Research analyst Randy Heffner says 2014 should be the year developers begin to focus seriously on what he calls "digital business design," which is an integration strategy that sees the trends that will likely dominate the coming year -- big data and predictive analytics, mobility, and API management -- as pieces of a larger picture.

"There are no stand-alone applications anymore," Heffner said. "So don't design your application as though it is one. Your app will be integrated across a multi-organization ecosystem of business activity. So design the business activity first. Design the transaction, and then figure out how to embody it within the delivered solution. This business-first approach is going to be the key to bringing these trends together in a coherent way that enables sustainable business flexibility going forward."

Heffner also advised developers not to let their enthusiasm for the REST architectural style cause them to miss the continuing value of SOAP.

"There are a lot of what I call quarter truths out there about SOAP," he said. "Of course, REST is very important, but don't fear SOAP. Understand when and how to use them. On the open Web, you'd better be using REST, because that's what developers demand. For BtoB situations, companies are often willing to invest a bit more, and often they're using toolkits where SOAP is easier to use; they're not scripting-language-based, but Java- or C++-based. The truth is, there's a lot of expansion still happening with SOAP."

"Just be wary of those who get very religious about REST -- you've got to have these particular models and you have to use verbs in this way and that way, etc.," Heffner added. "What will rule the day is pragmatic REST, designing to fill the right kind of model based on what you need to do."

Posted on February 5, 2014