Application Authoring in RDF

In ~2014, we published an open source authoring technology called glue-ml, as part of glue.ai v1.0, under the Apache 2.0 license (it used to be glue.ai, but... ask me what happened, after class).

In structural terms, this technique is authoring by graph:  

  • Authors create declarative descriptive graphs, governed by ontology types.
    • In the simplest example, an author can save a .ttl (RDF-Turtle) file from a text editor, or from an RDF-aware editor such as Protege or TopBraid (a minimal authored-graph sketch appears just after this list).
    • Other authors may instead use spreadsheets, or a customized domain-specific editor.
    • Semantic and logical validity of authored graphs is up to the author and tools.  
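
For concreteness, here is a minimal sketch of what such an authored partial-profile might contain, and how little machinery is needed to read it back. The ex: prefix, class name, and property names are hypothetical placeholders rather than the real glue-ml / AppProfile vocabulary, and Apache Jena is used here only as one convenient RDF reader.

```scala
import java.io.StringReader
import org.apache.jena.rdf.model.ModelFactory

object AuthoredGraphSketch {
  // Hand-authored partial-profile text, exactly as it might be saved from a
  // plain text editor as e.g. robot01_speech.ttl. The ex: vocabulary is a
  // hypothetical placeholder, not the real glue-ml / AppProfile ontology.
  val partialProfileTurtle: String =
    """@prefix ex:   <http://example.org/appProfile#> .
      |@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
      |
      |ex:speechOut_01  a  ex:SpeechOutputComponent ;
      |    rdfs:label   "default speech output" ;
      |    ex:voiceName "en-US-standard" .
      |""".stripMargin

  def main(args: Array[String]): Unit = {
    // Any RDF-aware reader recovers the same triples; only the information
    // content of the graph matters, not the tool that produced it.
    val model = ModelFactory.createDefaultModel()
    model.read(new StringReader(partialProfileTurtle), null, "TURTLE")
    println(s"Parsed ${model.size()} triples from the authored partial-profile")
  }
}
```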

  • Applications read the graphs as RDF using the most convenient technology.
    • The app reading approach is independent of the authoring tool used, since only the information content of the RDF model is important.
    • Application code knows the ontology used by the authored graph, which provides the logical design contract.
      • App impl may use a code-binding approach, when that feels comfortable.
      • Alternatively, the app impl may treat the authored RDF as a data source for query.
      • Semantic validity is application dependent
        • Pretending otherwise is the most common sales con in software, forever!
    • Favorite example:  Switchable, layered profile graphs chosen for a runtime launch
      • Strings passed on the command line (or via other args) refer to named partial-profile graph files in our given deployment
        • Assume the deployment offers some file resource space, such as the JVM classpath
        • Layering is implicit in conventions for file names and folders
        • Layering is merged away in the next step
      • Named input model files are read and merged into a single cached in-memory model, which is the profile of the launched application (see the merge sketch after this outline).
        • The profile is just the graph-sum of all the input Turtle files that were selected by the launcher, e.g. at the command line.
        • The ontologies used by the input partial-profile files should generally be the same
          • Easy case:  All partial-profiles owl:import the same version of some MyAppProfile_owl.ttl 
      • The launching app now configures its components using the profile, which is just a single graph of typed individuals (see the configuration-reading sketch after this outline).
        • The actual configuration meaning is specific to the application, naturally (else, see THE BIG CON above)
        • Usually this single graph contains pointers to other artifacts, which may be graphs or other resources.
          • The app may apply the pattern recursively, if we want "turtles all the way down".
      • AppProfile = example ontology for some component descriptions, released as part of the cogchar layer.
        • The 2016 AppProfile schema as a turtle file.
      • The configuration reading code is in the appdapter + cogchar codebase - TODO: look up exact packages
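
As a rough illustration of the launch-time merge described above, the sketch below loads the named partial-profile files from the classpath and folds them into a single in-memory model. Apache Jena stands in as the RDF library, and the profiles/ folder convention and file names are assumptions made for the example; the real launcher and reader code lives in the appdapter + cogchar codebase.

```scala
import org.apache.jena.rdf.model.{Model, ModelFactory}

object ProfileLauncherSketch {
  // Load one named partial-profile graph from the deployment's file resource
  // space (here: the JVM classpath). The "profiles/" folder convention and the
  // file names are hypothetical.
  def loadPartialProfile(name: String): Model = {
    val model = ModelFactory.createDefaultModel()
    val in = getClass.getClassLoader.getResourceAsStream(s"profiles/$name.ttl")
    require(in != null, s"No partial-profile resource found for '$name'")
    model.read(in, null, "TURTLE")
    model
  }

  // The profile is just the graph-sum of the selected inputs: layering is
  // merged away into a single in-memory model.
  def buildProfile(selectedNames: Seq[String]): Model =
    selectedNames.foldLeft(ModelFactory.createDefaultModel()) { (merged, name) =>
      merged.add(loadPartialProfile(name))
    }

  def main(args: Array[String]): Unit = {
    // e.g.  ... ProfileLauncherSketch base_profile robot01_overrides
    val profile = buildProfile(args.toSeq)
    println(s"Launch profile contains ${profile.size()} triples")
  }
}
```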
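
Here is a similarly hedged sketch of the configuration-reading step, showing both styles mentioned above: walking typed individuals (a lightweight code-binding flavor) and treating the same profile graph as a SPARQL data source. The ex: namespace and the SpeechOutputComponent class are hypothetical stand-ins for real AppProfile ontology terms.

```scala
import org.apache.jena.query.QueryExecutionFactory
import org.apache.jena.rdf.model.Model
import org.apache.jena.vocabulary.RDF

object ProfileConfigReaderSketch {
  // Hypothetical vocabulary; the real terms would come from the AppProfile
  // ontology (e.g. the MyAppProfile_owl.ttl that the partial-profiles import).
  val NS = "http://example.org/appProfile#"

  // Code-binding flavor: walk the typed individuals and pull out the property
  // values a component constructor needs.
  def configureSpeechComponents(profile: Model): Unit = {
    val speechType = profile.createResource(NS + "SpeechOutputComponent")
    val voiceName  = profile.createProperty(NS + "voiceName")
    val indivs = profile.listResourcesWithProperty(RDF.`type`, speechType)
    while (indivs.hasNext) {
      val indiv = indivs.next()
      val voice = indiv.getRequiredProperty(voiceName).getString
      println(s"Would configure ${indiv.getURI} with voice '$voice'")
    }
  }

  // Query flavor: treat the same profile graph as a SPARQL data source.
  def listComponentLabels(profile: Model): Unit = {
    val q =
      s"""PREFIX ex:   <$NS>
         |PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
         |SELECT ?c ?label
         |WHERE { ?c a ex:SpeechOutputComponent ; rdfs:label ?label }
         |""".stripMargin
    val qexec = QueryExecutionFactory.create(q, profile)
    try {
      val results = qexec.execSelect()
      while (results.hasNext) {
        val soln = results.next()
        println(s"${soln.getResource("c")} -> ${soln.getLiteral("label").getString}")
      }
    } finally qexec.close()
  }
}
```

If a property value in the profile is itself the URI of another graph, the same reading pattern can be applied recursively, per the "turtles all the way down" note above.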
