


Template Methodologies

N-ary relations, called "templates" in ISO 15926 parlance, can be defined using a wide variety of approaches. When such an approach takes the form of a systematic process, we call it a "methodology".
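
To make the idea concrete, the sketch below shows one hypothetical way an n-ary relation could be represented as a template: a named relation with a fixed set of typed roles. The class names, role names, and entity types are invented for illustration and are not drawn from any ISO 15926 part.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Role:
        name: str          # role label within the template
        entity_type: str   # expected type of the role filler

    @dataclass(frozen=True)
    class Template:
        name: str
        roles: tuple       # the arity of the relation is the number of roles

    # A 3-ary template relating a pump, the system it serves, and a time period.
    INSTALLED_IN = Template(
        name="PumpInstalledInSystemDuringPeriod",
        roles=(
            Role("pump", "CentrifugalPump"),
            Role("system", "CoolingWaterSystem"),
            Role("period", "PeriodInTime"),
        ),
    )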

The Purpose of a Methodology

Two templates are much more likely to work well together in any given scenario if they have been generated using the same methodology.

In general this is the very purpose of a methodology: to create a set of templates that work well together to solve some interoperability problem.

Retain the Record of Methodology

Because multiple methodologies can be used to create definitions in the same modeling system, it is crucial to record the methodology under which a definition was created, along with any other methodologies for which it has been proven.

This allows the selection of a cogent, orthogonal set of definitions to solve a specific interoperability problem using a given modeling system. That is to say, the intersection of modeling system and methodology selects the working set of definitions.
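
As a hypothetical illustration of that selection, the sketch below tags each definition with its modeling system, the methodology that created it, and any other methodologies it has been proven for, then filters a working set on the intersection of modeling system and methodology. All names are invented for the example.

    from dataclasses import dataclass, field

    @dataclass
    class TemplateDefinition:
        name: str
        modeling_system: str                           # e.g. "ISO 15926"
        created_by: str                                # methodology that produced it
        proven_for: set = field(default_factory=set)   # other methodologies it is proven for

    def working_set(definitions, modeling_system, methodology):
        # The intersection of modeling system and methodology selects the working set.
        return [d for d in definitions
                if d.modeling_system == modeling_system
                and (d.created_by == methodology or methodology in d.proven_for)]

    defs = [
        TemplateDefinition("PumpInstalledInSystem", "ISO 15926", "coarse-to-fine"),
        TemplateDefinition("ClassifiedAs", "ISO 15926", "fine-to-coarse", {"coarse-to-fine"}),
    ]
    print([d.name for d in working_set(defs, "ISO 15926", "coarse-to-fine")])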

Coarse-to-Fine Approach

A coarse-to-fine approach takes relationships that already exist in highly agglutinated or generalized forms and then, where needed, breaks them down into finer relationships in order to explore the structure of the data.

It is important to recognize that such an approach often need not break the relationships down much, if at all: models built this way are frequently intended purely for recording the data, with little emphasis on analysis of the structure beyond the level necessary to solve specific problems.

Constraints on the Structure of the Data

Coarse-to-fine approaches are typically empirical, based on actual data: they model information as it is recorded by humans or other systems. This exploits two very important features:

  • the approach is often purpose-driven - the model fits the data because it has been developed as an abstraction of the patterns already in the data.
  • it tends to reflect the way that humans in the disciplines that work with the problem set think about the data.

Most data model design processes follow this approach - not because it necessarily results in the best possible model, but because it quickly results in a model that fits the problem and the data.

Perhaps more importantly, it is popular because it is successful, and it is successful because it does not challenge the participants to alter the way they think about the structure of the data.

Note: this is not a disparaging comment. Humans are not strictly logical; humans think by making generalizations about observable patterns, and such generalizations need not be comprehensive - that is to say, it is implicit that the generalizations are not intended to cover all cases. Human language is built to communicate these kinds of observable patterns and the cases that fit them.

Frequently, this means that humans hold to lore that, while useful, might not be correct. But its usefulness extends beyond its value as an approximation or rule of thumb - it is useful because human language makes it concise to communicate to other humans (human language being based on the same principles: generalizations not intended to be comprehensive in scope).

As a result, these kinds of approaches can only be used for interoperability where the problem set is roughly shared across the communicating parties - that is to say, if one party wants to use the data to solve a different problem, there is a good chance it is actually useless to them.

Coarse-to-Fine Approach

The Coarse-to-Fine approach takes information and models it in its extant form (as it is exposed through language, data, or usage). It then breaks those "coarse" relations down into finer relations to the depth required to address a specific problem set.
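
A minimal sketch of one such step, under the assumption that the coarse relation is something like an equipment-list row: the row is decomposed into finer relations only to the depth a particular problem requires. The relation and property names are hypothetical.

    # One coarse, agglutinated relation as it might be recorded by a human or system.
    coarse = {
        "relation": "EquipmentListRow",
        "tag": "P-101",
        "type": "CentrifugalPump",
        "rated_power": "50 kW",
    }

    def decompose(row):
        # Break the coarse row into finer relations, but only as far as needed.
        return [
            ("classifiedAs", row["tag"], row["type"]),
            ("hasRatedPower", row["tag"], row["rated_power"]),
        ]

    print(decompose(coarse))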

Fine-to-Coarse Approach

The Fine-to-Coarse approach seeks to model information from a set of founding principles. These principles determine a starting set of finely grained relations, from which successively coarser sets of relations are built until (perhaps) relations that are directly useful for solving specific problem sets are reached.
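
By contrast, a fine-to-coarse step might look like the sketch below: starting from fine-grained founding relations, a coarser relation that is directly useful for a problem set is composed from them. The relation names are again hypothetical and continue the example above.

    # Fine-grained founding relations about one tag.
    fine = [
        ("classifiedAs", "P-101", "CentrifugalPump"),
        ("hasRatedPower", "P-101", "50 kW"),
    ]

    def compose_row(tag, relations):
        # Assemble a coarser, directly usable relation from the fine relations about one tag.
        row = {"relation": "EquipmentListRow", "tag": tag}
        for name, subject, obj in relations:
            if subject != tag:
                continue
            if name == "classifiedAs":
                row["type"] = obj
            elif name == "hasRatedPower":
                row["rated_power"] = obj
        return row

    print(compose_row("P-101", fine))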

The Reach of a Methodology

Coarse-to-fine approaches tend to be more neutral with respect to the modeling system - the shallower the coarse-to-fine approach (that is to say, the coarser the concepts), the more neutral it tends to be. This allows very shallow coarse-to-fine methodologies to be applied across different systems.

Fine-to-coarse approaches are (by definition) founded on a specific modeling system, and so these methodologies are generally confined to a single modeling system.



