ISWC 2015 Ontology Design Principles Tutorial

Title:  Ontology Design Principles in Support of the Analyst:  Expressivity, Non-brittleness, and Seamless Integration at Scale

Abstract: A few years back, a set of ontology design principles was intensively vetted and finalized by world-class ontologists, classical data scientists, and IC analysts. Eric Peterson had originally assembled these principles from his experiences in ontology-based data integration - most recently as chief modeler on the DataSphere integration project and as one of two principal ontologist/modelers on Synapse. This session will cover two dozen foundational, preventative solutions for many of the modeling problems encountered in serious semantic modeling efforts. While designed to support semantic integration at scale, these principles also promote reasonable levels of temporal, geospatial, and analytical expressivity and sophistication. They also facilitate non-brittle, future-proof adaptation to changing requirements, for enterprise ontology modeling and the semantic web alike.

Motivation:  As an ontologist for the intelligence community, Eric has observed, experienced, and facilitated surprising levels of agreement over data models and model fragments when the participants first agreed on design principles. Once key principles are established, most competing models or model fragments are automatically weeded out. Data modelers serve many masters, but this presentation helps show that high expressivity, low brittleness, and ease of integration are not only jointly achievable but mutually facilitating and reinforcing - bordering on inseparable. Good semantic design is evident when unintended benefits continue to arise long after the design is codified, implemented, and first utilized.

The modeling principles can enhance virtually all subject domains while focusing on key areas such as space, time, and events.

This expressive modeling approach supports present, near-term, and mid-term intelligence (military, business, etc.) analysis and analyst tools. These principles were developed in the trenches on projects with deadlines, accountability to customers, and strict operational requirements.

Audience:  This tutorial should be of interest to both semantic professionals and the academic community, as it straddles the line between applied research and what is reasonably achievable for a customer. Participants with all levels of ontology modeling experience are encouraged to attend; there is much to be absorbed and enjoyed at every level of semantic expertise. Ideal preparation would be an understanding of edge reification, quad statements, Allen's interval algebra, and classical data normalization.
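As a brief refresher on two of the listed prerequisites, the sketch below (plain Python tuples standing in for RDF terms - purely illustrative, not drawn from the tutorial material, with all `ex:` names hypothetical) contrasts a plain triple, an edge reified as a statement node, and a quad whose fourth element names a graph/context:

```python
# Illustrative only: plain tuples standing in for RDF terms.
# All ex: identifiers are made up for this sketch.

# A plain triple: subject, predicate, object.
triple = ("ex:Alice", "ex:knows", "ex:Bob")

# Edge reification: the edge itself becomes a node ("ex:stmt1")
# that further statements can describe, e.g. with a validity time.
reified = [
    ("ex:stmt1", "rdf:subject",   "ex:Alice"),
    ("ex:stmt1", "rdf:predicate", "ex:knows"),
    ("ex:stmt1", "rdf:object",    "ex:Bob"),
    ("ex:stmt1", "ex:validFrom",  "2014-01-01"),
]

# Quad statement: a fourth element names the graph (context),
# so provenance or time can attach at the graph level instead.
quad = ("ex:Alice", "ex:knows", "ex:Bob", "ex:graph2014")

# Both devices let one say things *about* an edge, which is why
# temporal and provenance modeling leans on them.
assert quad[:3] == triple
assert any(p == "ex:validFrom" for _, p, _ in reified)
```

Both devices carry the same underlying assertion; they differ in where metadata about the edge attaches, which is the design choice the tutorial's temporal and provenance principles turn on.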


Eric Peterson is at Noblis/National Security Partners as an enterprise data modeler for a large government agency. He recently provided ontology-based spatial and temporal reasoning support for the METEOR natural language understanding project. He has accumulated nearly three decades of service to the Intel and DoD communities utilizing semantic technology, AI, knowledge representation/engineering, and NLP. He was Chief Modeler on the DataSphere semantic data integration project and one of two chief semantic integration modelers on Synapse. He was Chief Technologist for a DoD data integration effort at Harris Corporation. His early formal semantic influences include working for Leo Obrst at MITRE, doing ontology-based agent communication and semantic integration of product catalogs at Vertical Net, and working with Bill Anderson on Knowledge Bus.

Eric presented a much less comprehensive version of this material as a tutorial at Semantic Technology for Intelligence, Defense, and Security (STIDS) 2014.

Eric also presented this material at a SIGSEM meeting at Ft. Meade attended by Joe Rockmore and other semantic experts and stakeholders.

Length:   Half day.