
One of the Big Guys Contributes to Open Source

Over the years I have been regularly reminded of how much software permeates industrial activity far beyond the software industry itself. Thirty years ago the procurement director at McDonnell-Douglas told me that software was the most challenging thing to predict when planning and executing aircraft design and production schedules.

A lot of software expertise resides in companies that don’t publish software. In the national defense domain, software development is diffused among Defense Department components and its many contractors.

That’s prelude to why I was intrigued by a press release last month from Lockheed Martin. Engineers in the company’s Information Systems and Global Solutions unit donated a data search engine/discovery tool to the open source community Codice Foundation. Codice is a 2012 offshoot of the Mil-OSS project. Both work to move defense-related software from the proprietary world into the open source world. Codice members modeled the effort after the granddaddy of open source foundations, Apache.

By the way, Apache last month released a new version of the widely used Hadoop framework for distributed, big-data applications. It was a very big deal in the open source and big data application development world. For the uninitiated, here is a good explanation of Hadoop from InformationWeek.

Lockheed donated to Codice what the company describes as the core software in the Distributed Common Ground System (DCGS) Integration Backbone, or DIB. It’s a mouthful, but DOD uses it to share ISR (intelligence, surveillance and reconnaissance) data. The donated core is dubbed the Distributed Data Framework (DDF).

Andy Goodson, a Lockheed program manager, explained that the DIB started as an Air Force-led joint program a decade ago. Its basic function is to make data discoverable. As an open source tool, the DDF can be adapted to any domain where multiple data sources must be rendered discoverable by an application. It makes data available where it resides, both to application algorithms and to the processes that present the data to users.

In effect, the DDF furthers the fast-emerging computing model that treats data and applications as separate and distinct resources. The federal government has been trying to adapt to this model for some time, as is manifest in (among other things) the Digital Government Strategy. In practice, to be useful, data even from disparate sources within a domain must adhere to that domain’s standards. But it need not be tied to a particular algorithm or application, so the data is reusable.
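
To make that idea concrete, here is a minimal sketch in Java of the kind of source-independent discovery layer described above. The names (CatalogRecord, DataSource, FederatedCatalog) are hypothetical and are not drawn from DDF’s actual API; the point is only that an application queries one catalog, each registered source answers with standardized metadata, and the underlying data stays wherever it resides.

import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Hypothetical illustration only; these types are not taken from DDF's actual API.

/** A discoverable record: standardized metadata plus a pointer to where the data lives. */
record CatalogRecord(String id, String title, Map<String, String> metadata, String locationUri) {}

/** Any repository holding data can be wrapped as a source; the data itself stays put. */
interface DataSource {
    List<CatalogRecord> query(String keyword);
}

/** Fans a query out to every registered source and merges the results for the caller. */
class FederatedCatalog {
    private final List<DataSource> sources = new ArrayList<>();

    void register(DataSource source) {
        sources.add(source);
    }

    List<CatalogRecord> query(String keyword) {
        List<CatalogRecord> results = new ArrayList<>();
        for (DataSource source : sources) {
            results.addAll(source.query(keyword));
        }
        return results;
    }
}

public class DiscoveryDemo {
    public static void main(String[] args) {
        FederatedCatalog catalog = new FederatedCatalog();

        // Two toy sources standing in for disparate repositories within one domain.
        catalog.register(keyword -> List.of(new CatalogRecord(
                "a-1", "Imagery report", Map.of("format", "NITF"), "repo-a://a-1")));
        catalog.register(keyword -> List.of(new CatalogRecord(
                "b-7", "Signals summary", Map.of("format", "XML"), "repo-b://b-7")));

        // The application sees only standardized records; it never binds to a specific repository.
        for (CatalogRecord r : catalog.query("report")) {
            System.out.println(r.title() + " -> " + r.locationUri());
        }
    }
}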

Goodson said Lockheed found that the code, because it is independent of any particular data source, was not subject to export control even though it was used in a sensitive military environment. The company had advice and counsel from its federal client in making that determination. Now the software is available through Codice to other agencies, systems integrators and developers grappling with big data applications.
