Interoperability
BY MICHAEL ABRAMSON
© 2007 FrontLine Security (Vol 2, No 1)

It’s been a little over a decade since I began my quest for the holy grail of computing: the delivery of sustainable information INTEROPERABILITY. Known by many names over the years, the terminology that is growing on me is “semantic interoperability.” The objective, most can agree, is the “guaranteed access to quality information requisite to making sound business or operational decisions.” So why, after more than a decade, does this goal still appear as elusive as ever?

Experience shows several reasons for this sad lack of progress in our ability to manage, share and exploit information resources:

  1. Information has generally taken a backseat to many other aspects of the computing environment (such as program code, messages, process, and network infrastructure);
  2. The lack of sound data management, metadata management, and information management in many organizations inhibits the development of the sustainable corporate knowledge needed to produce the functional, technical and performance specifications which form the basis for traditional development processes;
  3. The fluid and dynamic nature of real-world events (such as natural disasters, failing states, international conflicts, terrorism, pandemics, and mergers and acquisitions) requires new and innovative approaches to information assurance (e.g., policy-based middleware), which have been slow to develop;
  4. An entrenched information security culture (focusing only on the protection and control elements) prevents the adoption of the new engineering practices and technologies needed to address the growing need to fuse and share information assets;
  5. The IT community’s perpetual search for the “Silver Bullet” – that one piece of technology that will address the entire requirement, one solution fits all – has meant that few organizations are willing to trial and evolve the innovative approaches and technologies.

To do justice to any of these points would require a research paper; however, I would like to look at one small success story and what was learned while participating in its development.

In the mid-1990s I accepted a contract with Canada’s Department of National Defence (DND) to develop a business case for the Common User Core (CUC) – the specification, design and implementation of a common operating environment for the department. Like many organizations at the time, DND projected that deploying a common suite of computer applications would foster interoperability – individuals using the same applications could more easily share information.

Three years of effort were brought to an abrupt halt by a series of government policies, international agreements and laws that effectively prohibited the department from establishing and maintaining product-based standards. The procurement system thus became a showstopper. That being said, several other issues came to light: 1) DND could not keep pace with the perpetual change in the industry; 2) Successive procurements and product customizations demonstrated that the central tenet of the approach was flawed – applications in themselves did not promote interoperability; 3) DND could not prescribe application usage to its partners; 4) The selection of a specific tool restricted the ability to leverage innovation; and 5) The configuration management, accreditation and deployment issues were far greater than originally anticipated. An alternate approach needed to be found.

During the course of the CUC study, I was presented with a set of specifications for a data-centric approach to interoperability that had been evolving as part of a study conducted by Supreme Headquarters Allied Powers Europe (SHAPE): Army Tactical Command and Control Information System (ATCCIS).

At the time, several NATO partners, led by the United States, Germany, the United Kingdom, and France, were seeking to test and demonstrate the viability of technology-agnostic specifications for data sharing within coalition operations: the Data Model specification for the Generic Hub (GH) – now referred to as the Joint Consultation, Command and Control Information Exchange Data Model (JC3IEDM); and the ATCCIS Replication Mechanism – now referred to as the Data Exchange Mechanism (DEM).

The CUC team was asked to develop a Canadian prototype and take Canada from observer status to active participant in the 1999 demonstration in Ede, Holland. Over the next three years, the team successfully developed, tested and demonstrated the ability of multiple coalition partners to share tactical planning and situational awareness data in near real-time in a heterogeneous technology environment. Each nation implemented the specification using technologies conducive to its own operating environment and national interests, but came together around the data and communications protocols developed by the community. This was a major step forward.

Building on this success, the ATCCIS community – now the Multilateral Interoperability Programme (MIP) – has grown to over 30 participating and observing nations.

Working within the small Canadian contingent for those first three years gave me the opportunity to develop a detailed understanding of the fundamentals of specifying, developing and deploying an interoperability solution. ATCCIS’s technology-agnostic approach clearly demonstrated that the answer is not to be found in a technology solution, but in an open-architecture approach and standards (e.g. the GH and DEM specifications) agreed to by the community. No partner to a coalition or community can dictate a specific technical platform – as demonstrated by the CUC and similar initiatives.

No amount of technology will supplant an organization’s need to discover, understand and manage its information environment. Without the evolution of corporate knowledge, an organization will not effectively deliver and sustain capability. This was illustrated by the difficulties encountered by the more technologically advanced nations in leveraging the benefits of the MIP capability. Nations with little or no legacy found it easier to adopt and integrate basic MIP capability.

In many domains, including the military, relegating information to an afterthought has organizations and agencies scrambling to recover lost corporate knowledge – at significant expense and varying levels of success. Considering that much of this information was gathered as part of Y2K efforts and subsequently discarded – one might wonder if, as a community, we have learned the key lesson: It’s all about the information!

As you might have noticed, I use information, not data, to describe what we should be addressing. The term information implies that, for interoperability to exist, the base element must have meaning to the community to which it applies – data does not. “42” is data, but out of context it has no meaning.

In providing the data alone, we omit the meaning of an element, leaving the system, application, or user to infer that meaning. Depending on the environment, this can lead to results ranging from the innocuous to the devastating. MIP is demonstrating that determining meaning from data takes years of development, and that agreement on significance may not extend to organizations outside a narrow community of interest.
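To make the distinction concrete, here is a minimal sketch in Python (the value, field names and vocabulary are my own illustrative inventions, not drawn from the JC3IEDM or any MIP specification) showing how the bare value 42 becomes usable information only once the community-agreed context travels with it:

```python
from dataclasses import dataclass

# Raw data: a bare value with no agreed meaning.
raw = 42  # 42 of what? Kilometres per hour? Casualties? Vehicles?

# Information: the same value wrapped in the context a community has agreed on.
# All field names and vocabulary here are illustrative only.
@dataclass(frozen=True)
class Observation:
    value: float        # the measured quantity
    quantity: str       # what is being measured (agreed vocabulary)
    unit: str           # unit of measure (agreed vocabulary)
    subject: str        # what the measurement describes
    reported_at: str    # ISO 8601 timestamp of the report

obs = Observation(
    value=42,
    quantity="road speed",
    unit="km/h",
    subject="convoy CAN-031",
    reported_at="2007-02-14T09:30:00Z",
)

# A receiving system can now act on the element without guessing its meaning.
print(f"{obs.subject} moving at {obs.value} {obs.unit} as of {obs.reported_at}")
```

The value itself never changes; only the agreed context around it makes it interoperable.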

The JC3IEDM has been developed over the course of more than a decade and provides an excellent foundation for consultation and the sharing of situational data in the command and control domain. This is demonstrated by the large number of nations wanting to use the data model. Any system, however, will suffer if differing interpretations exist on how to put data into, and take data out of, the prescribed structures.

During the era in which the ATCCIS study developed its specifications, the IT community was focussed solely on data management. The evolving disciplines of Information Management and Knowledge Management were little more than doctoral theses. Ontology and semantics were matters for linguistics, and were nowhere in the lexicon of computer scientists.

Even today, as we discuss issues related to interoperability and information assurance, few are discussing the meaning of the messages (data structures) and the associated business rules needed to assemble and disassemble payloads of data, which constitute meaningful information within a community.
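By way of illustration, the sketch below (hypothetical only – the field names and rules are invented for this example and are not taken from the JC3IEDM or the MIP specifications) shows the kind of business rules that must be shared by both ends of an exchange: the sender applies them to assemble a releasable payload, and the receiver applies the same rules to disassemble it. If the two sides interpret the rules differently, the “same” data no longer carries the same information.

```python
from typing import Dict

# Community-agreed business rules for a hypothetical "unit position" payload.
# Sender and receiver must apply the SAME rules, or the exchange loses meaning.
REQUIRED_FIELDS = {"unit_id", "latitude", "longitude", "reported_at"}

def assemble_payload(record: Dict[str, object]) -> Dict[str, object]:
    """Sender side: build a payload only if the agreed rules are satisfied."""
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        raise ValueError(f"cannot release record; missing fields: {sorted(missing)}")
    if not -90 <= record["latitude"] <= 90:
        raise ValueError("latitude out of range")
    # Only the agreed fields leave the national system.
    return {key: record[key] for key in REQUIRED_FIELDS}

def disassemble_payload(payload: Dict[str, object]) -> Dict[str, object]:
    """Receiver side: accept a payload only if the same agreed rules hold."""
    missing = REQUIRED_FIELDS - payload.keys()
    if missing:
        raise ValueError(f"rejecting payload; missing fields: {sorted(missing)}")
    return dict(payload)

# The round trip carries meaning only because both ends share the rules.
record = {"unit_id": "CAN-031", "latitude": 52.05, "longitude": 5.67,
          "reported_at": "2007-02-14T09:30:00Z", "internal_note": "do not release"}
payload = assemble_payload(record)      # internal_note is stripped before release
received = disassemble_payload(payload)
print(received)
```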

To the credit of the MIP community, many of the early identified shortfalls in knowledge, process and technology are now being addressed in its efforts to define a standard set of business objects for the JC3IEDM, again demonstrating the ability of this particular community to identify challenges and adapt to its environment.

So what do we know of the relative successes of ATCCIS and MIP? First, we know that by participating, a nation can develop the ability to share planning and situational data with other MIP Partners. We also know that making this capability operational is still proving to be a challenge.

The Stumbling Blocks
The loosely coupled legacy environments of many participating nations impede the staging of national data for release and further impede the use of MIP data within national systems.

There is no quick fix to this challenge. It will take recognition of the problem, and the commitment of resources by senior management within these environments, to resolve it. It will also take some out-of-the-box thinking and non-traditional approaches to get there. We will never fully understand and document the ‘as is’ condition – it is always changing – so it is unrealistic to treat it as the starting point. We need a process of discovery, experimentation, testing, and rapid deployment: “Develop the capability we understand and build from there in a progressive manner.”

Information Assurance (IA) concerns related to the releasability of information to coalition partners impede deployment.

We must address IA as a double-edged sword – getting information to the decision makers who need it and denying it to those who don’t. Having the mission fail because information was denied is no form of success, yet there are too many instances of this occurring. The reality is that decision makers are going to break rules to complete their missions – the real risk is nobody knowing it happened. We need the tools to make data releasable in a known and auditable manner during operations.
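What a known and auditable release might look like is sketched below (a hypothetical illustration only – the policy table, caveats and audit format are invented for this example and do not reflect any national or MIP mechanism): every release decision, including an override by a decision maker, is written to an audit trail, so the fact of the release is never invisible.

```python
import json
from datetime import datetime, timezone
from typing import Optional

# Hypothetical releasability policy: which caveats each partner may receive.
RELEASE_POLICY = {
    "PARTNER-A": {"UNCLASSIFIED", "COALITION-RELEASABLE"},
    "PARTNER-B": {"UNCLASSIFIED"},
}

AUDIT_LOG = []  # in a real system this would be a tamper-evident store

def release(item: dict, partner: str, override_by: Optional[str] = None) -> bool:
    """Release an item to a partner if policy allows, or if a named decision
    maker overrides; either way, the decision is recorded in the audit trail."""
    allowed = item["caveat"] in RELEASE_POLICY.get(partner, set())
    decision = allowed or override_by is not None
    AUDIT_LOG.append(json.dumps({
        "when": datetime.now(timezone.utc).isoformat(),
        "item": item["id"],
        "caveat": item["caveat"],
        "partner": partner,
        "released": decision,
        "override_by": override_by,  # None means policy alone decided
    }))
    return decision

# Policy would deny this release, a commander overrides – and it is on record.
item = {"id": "SITREP-042", "caveat": "COALITION-RELEASABLE"}
release(item, "PARTNER-B", override_by="duty officer")
print(AUDIT_LOG[-1])
```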

Additionally, the delivery of interoperability will not be addressed until the IA community starts focussing on the asset being protected rather than the infrastructure that protects it. No matter how good the infrastructure, we will not secure information until we understand how it is collected, aggregated (fused) and used. We are currently spending 90% of our time on 10% of the problem.

The MIP community has resisted the movement of specifications into the commercial domain where they can be standardized and integrated into commercial-off-the-shelf (COTS) products.

I have been at the forefront of an effort to generate a commercial standard for the JC3IEDM and its transactional business rules. The benefits of this standardization would be enormous across wide segments of the Command and Control (C2) community. At present, both the user community and system integrators are reluctant to mandate the use of the JC3IEDM because of the programmatic liability that may ensue. By its very nature, acceptance of a standard by a broad community provides some immunity from these liabilities. The standard would facilitate the adoption of the JC3IEDM and a set of agreed semantics that transcend COTS C2 applications – promoting out-of-the-box interoperability.

There is a need for the development of an engineering approach to rapidly and progressively integrate national environments and systems.

Traditional engineering approaches are inapplicable to a domain where the majority of the requirements are either not well understood by stakeholders or subject to inordinate amounts of change. Both of these conditions face users, projects and integrators as they wrestle with this seemingly insurmountable challenge. The Object Management Group’s Model Driven Architecture (MDA) approach, together with its aligned efforts in information and application assurance, ontology management, UML profiles (such as UPDM for DODAF and MODAF), Shared Operational Picture Exchange Services (SOPES), the Information Exchange Framework (IEF) and policy management, appears to be building the foundations of this engineering approach.

For those of you thinking that this article is solely focused on the military – think again. Consultation, Command and Control (C3) is applicable to a wide range of organizations and agencies that need to share planning and situational awareness information. Public safety, homeland security, policing, crisis response, emergency response, failed state reconstruction, business continuity and government operations are classic examples. Could the JC3IEDM be used in these environments as well? The answer is a resounding “YES.” Can the JC3IEDM be used to bridge these communities? Again the answer is “YES.” So what is needed?

Greater participation of organizations and agencies in developing the practices, standards and tools is needed to build on the successes of ATCCIS and MIP. We are getting close!

====
Michael Abramson is a founder and CEO of Advanced Systems Management Group Ltd. and has been a consultant to government and private sector organizations for more than 25 years. He is currently the Co-chair of the OMG C4I Domain Task Force.
