Many times, I am asked, “What is the difference between the Zachman Framework and other Frameworks, notably TOGAF (The Open Group Architecture Framework)?” There is a profound difference, and Allen Brown, then President and CEO of The Open Group, expressed it in introducing me at a TOGAF conference in Johannesburg several years ago when he said something to the effect of, “For years, I thought it was either TOGAF OR Zachman. That is incorrect. It is TOGAF AND Zachman.” This article is my effort to elaborate on Allen’s observation using a “Chemistry Metaphor”.
John Zachman, CEO, Zachman International
John will be presenting the course, ‘Zachman Enterprise Architecture Certification: Modelling Workshop’, 10-12 March 2020, London. This course will also be available via live-stream.
John keynoted at the IRM UK Enterprise Architecture Conference Europe 2019, 21-24 October, London, on the subject ‘Building and Running Systems is not Enterprise Architecture and Vice Versa’.
The next Enterprise Architecture & Business Process Management Conference will take place 26-29 October 2020, London.
TOGAF is a methodology … step 1, do this; step 2, do that; step 3, do this; etc. DoDAF, DND-AF, XYZ-AF are typically methodologies or methodology derivatives … they specify “composite” models (implementation “views”, relationships, pictures, documents, etc.) that constitute either methodological steps or inputs or outputs to methodological steps. This should not be surprising, because for 75 years those of us from the information community have perceived ourselves to be in the business of building and running systems. We are very “building” (or “manufacturing”) oriented and therefore process (methodology) dominant. The methodology constitutes the actual work. The end object historically has been to get to implementation as soon as possible in order to improve quality, save time and save money (better, faster, cheaper) for the Enterprise. Implementations, by definition, require multiple variables. For example, in the programming domain, to get a program to run you need data, instructions, addresses, document or screen formats, control structures and conditions (rules), that is, what, how, where, who, when and why components: a composite of more than one single thing. Wherever there are implementations, there have to be multi-variable, “composite” constructs, and therefore our traditional “Frameworks” tend to be either methodologies or the inputs or outputs of methodology steps, formalizations of multi-variable, “composite” artifacts (models).
The Zachman Framework is different. It is not a methodology, nor does it classify or specify “composite” constructs. It is an Ontology: the structured classification of all of the relevant, “primitive” (single-variable) concepts (facts) for describing an Enterprise. The Zachman Framework does not do anything. It does not imply anything about what to implement or how (methodologically) to build (manufacture) implementations. That is, the Framework does not imply anything about what composite implementations (systems) to build or, methodologically, how to build them. The Framework is inert. It is simply a schema, the intersection of two classifications that have been used by humanity for thousands of years.
The Zachman Framework is an ontology like the Periodic Table is an ontology, a structured classification of all the chemical elements that exist in nature and that are scientifically used (reused) for manufacturing chemical compounds. The Periodic Table does not imply anything about which chemical compounds to produce or how to produce them. There is a fixed set of chemical elements in the chemical ontology. There is an infinite number of compounds that can be created by an infinite number of chemical processes from the fixed set of chemical elements. The discipline, the science of Chemistry, did not exist before Mendeleyev published the Periodic Table. There were alchemists, practitioners, who developed processes that could produce chemical compounds, in retrospect, relatively trivial compounds, by trial and error, best practice, whatever they could learn in a single lifetime of practice. For every different compound, there is a different process, in fact, there may be many different processes for creating a single compound.
TOGAF, DoDAF, etc. are like the chemical processes (methodologies) for creating compounds, composite implementations. As an example of a chemical process, here is a process for producing salt water:
Pour a bottle of hydrochloric acid into a bottle of an alkali and you will produce salt water,
a relatively simple process and a simple compound that clearly could have been discovered by alchemists. TOGAF, DoDAF, etc. are similar in that they are methodologies (processes) for producing implementations (compounds, composites). They are different from my simple saltwater example in that they, DoDAF, TOGAF, etc., are collections of implementation composites (“compounds”, models, documents) and processes. They do not produce one composite; they produce many composites. How many different composites (documents, models, matrices, pictures, etc.) are identified in one of these “Frameworks”? Many. How many will there be in the future? How many should there be? (My opinion is, there is always going to be one more … one more … one more, etc., etc.) Potentially, there is an infinite number of possible composites. It is a “beauty in the eye of the beholder” issue … whatever you like or want or need at the moment … and the changing external environment precipitates ever-changing implementation requirements, that is, different composites.
The process (methodology) for creating aspirin, a fairly sophisticated chemical compound, is:
Synthesize aspirin from salicylic acid by acetylation with acetic anhydride.
All the nouns in this sentence are compounds (composites, implementations) made up of more than one element from the Periodic Table. The textual definition does not express the ontological components unless you have studied the Periodic Table and learned the linguistic conventions for describing compounds. I seriously doubt that the compound “aspirin” could have been created by trial and error by alchemists. Whoever figured out how to create this one relatively complex compound, “aspirin”, would have had to have a serious understanding of the Periodic Table.
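As a worked illustration (standard chemistry, added here for the metaphor, not anything from the original text), the acetylation sentence resolves into a balanced equation over Periodic Table elements:

```latex
\underbrace{\mathrm{C_7H_6O_3}}_{\text{salicylic acid}}
\;+\;
\underbrace{\mathrm{C_4H_6O_3}}_{\text{acetic anhydride}}
\;\longrightarrow\;
\underbrace{\mathrm{C_9H_8O_4}}_{\text{aspirin}}
\;+\;
\underbrace{\mathrm{C_2H_4O_2}}_{\text{acetic acid}}
```

Each side carries exactly 11 Carbon, 12 Hydrogen and 6 Oxygen atoms. The compound nouns of the textual definition become precise only once they are expressed in terms of the elemental ontology.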
The ontological question is, “does the ontology non-redundantly and comprehensively classify all relevant, single facts (elements) in a logical structure reflecting the natural laws of classification (like the Periodic Table does)?” The Periodic Table is a TWO-dimensional classification, not a one-dimensional hierarchy or decomposition or taxonomy (where the same thing can be classified in more than a single category – “DE-normalized”). One dimension of the Periodic Table classifies the elements of the universe by the number of protons and neutrons in the nucleus (the atomic weight) and the other dimension by the number of electron shells. It is a two-dimensional “schema”, one fact in one place, “normalized”. And there is a fixed (finite) set of elements in total.
The process (methodology) question is, “what product (compound, composite) are you trying to create and will the process you have defined precisely create whatever you intended to create?”
There actually are a couple more relevant process (methodology) questions: “Did you use the elements and structure of the ontology in scientifically defining your process for creating the product (compound), such that it is based on the laws of nature embodied in the ontology … or, in contrast, did you just create the product (compound) by trial and error, using whatever compounds (implementations) already exist or could be acquired? Or did you use some process (best practice) somebody else defined by trial and error for creating some product? And then, is the product (compound, composite) that your, or someone else’s, best practice created identical to the product (compound, composite) you are trying to create?”
There is another very difficult problem … to extend the chemistry metaphor: to create salt water, one would add Hydrochloric acid (HCl), a compound of Hydrogen and Chlorine, to another compound, Sodium Hydroxide (NaOH), a compound of Sodium, Oxygen and Hydrogen. Both Hydrochloric Acid and Sodium Hydroxide exist in nature, as an acid and an alkali, but before Mendeleyev published the Periodic Table, no one could know the ontological contents of either of these compounds. So … there was no explanation of how or why the process worked … it just worked in practice, “best practice”. Take a bottle of this stuff and pour it into a bottle of that stuff and you are going to get this other stuff. It just works.
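Written ontologically, the “best practice” above is the simple neutralization reaction (standard chemistry, shown here only to extend the metaphor):

```latex
\mathrm{HCl} + \mathrm{NaOH} \;\longrightarrow\; \mathrm{NaCl} + \mathrm{H_2O}
```

The salt (NaCl) dissolved in the water (H₂O) is the “salt water”. Before Mendeleyev, the practitioner could name the bottles but not these elemental contents, which is exactly why the process worked without explanation.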
The problem with TOGAF, DoDAF, FEAF, MODAF, and XYZ-AF is, not only are they complex, robust methodologies that are producing many composite artifacts (models, documents, matrices, etc.) but they are using ontologically composite objects (compounds) in the process. If you examine the meta model of the artifacts being produced, the entities in the meta model are composites (like compounds) themselves. There is no ontological specification of the contents. (Just like pre-Mendeleyev alchemy.) There is nothing wrong with any of these methodologies. The question is, how are they defined … or how are they being employed and how do they work?
I recently participated in an exercise to map the metamodel of one of the aforementioned methodologies against my Framework metamodel of “primitive”, single-variable, ontological components. Even though we had extensive textual definitions of the meta entities in the methodology Framework, they were not precise in that they did not specify the ontological components of the meta entities … we had to make assumptions about what the author of the methodology Framework had in mind. Then the question rapidly becomes, were the assumptions correct?
Here is an example of a composite (meta) concept: “Capability”. What is a “Capability”? There are bunches of different textual definitions these days for “capability” … they are not particularly consistent and not very precise, even though they might be rather verbose. What are the ontological components of “Capability”? Are there Inventories (“things”, “Entities”) involved in the “capability”? Are there Processes (Transformations) in there? Are there Distribution Networks (“Locations”) in there? Are there Responsibilities (“Roles”/“Work Products”) in there? Are there Timings (“Cycles”, “Events”) in there? Are there Motivations (Objectives, Strategies) in there? Are there Business Concepts in there? Is there Systems Logic in there? Is there Technology Physics in there? Are there Tooling Configurations in there? Are there Implementation Instances in there? Is there more than one, or some combination, of many of these ontologically “primitive” components in there?
Clearly, there could be any number and any combination of these ontological components that could make up a “capability”. No one has yet offered a precise, ontological definition and therefore anyone can assume any definition they want to assume. That explains why there is considerable inconsistency and diversity in the definitions and why the use of such a complex composite construct in methodologies is imprecise and produces inconsistent results. Maybe some alchemist can make it work … but it is likely not repeatable nor predictable nor reusable in any other context than the one in which it is being employed. Predictability and repeatability are characteristics only derived from an ontology.
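To make the point concrete in software terms, here is a minimal sketch of the difference between a composite whose ontological contents are explicit and one that is only a textual label. All names (the primitive kinds, the “Order Fulfilment” capability) are hypothetical illustrations, not part of the Zachman Framework or any AF specification:

```python
from dataclasses import dataclass

# The six interrogative ("primitive") categories: what, how, where,
# who, when, why. The names are illustrative, not an official vocabulary.
PRIMITIVE_KINDS = {"Inventory", "Process", "Network",
                   "Responsibility", "Timing", "Motivation"}

@dataclass(frozen=True)
class Primitive:
    """A single-variable ('elemental') fact about the Enterprise."""
    kind: str
    name: str

    def __post_init__(self):
        # Reject anything that is not a declared primitive kind.
        if self.kind not in PRIMITIVE_KINDS:
            raise ValueError(f"not a primitive kind: {self.kind!r}")

@dataclass
class Composite:
    """A multi-variable construct ('compound'), e.g. a 'Capability'.

    Unlike a bare textual definition, its ontological contents are
    explicit: we can always ask exactly which primitive kinds it binds.
    """
    name: str
    components: tuple

    def kinds(self):
        return sorted({p.kind for p in self.components})

# A hypothetical "Order Fulfilment" capability, spelled out primitively.
capability = Composite("Order Fulfilment", (
    Primitive("Inventory", "Customer Order"),
    Primitive("Process", "Pick and Pack"),
    Primitive("Network", "Regional Warehouse"),
    Primitive("Responsibility", "Fulfilment Clerk"),
))

print(capability.kinds())
```

Two teams can now agree, or precisely disagree, about what this “Capability” contains, instead of each silently assuming a different definition.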
As soon as a composite construct or composite formalism is created, it is fixed. It reflects a moment in time. It is a snapshot. Multiple independent variables are hard-bound together. If one variable has to be changed, the composite is obsolete. Those of us who come from the Information Community have understood this in principle almost from the origin of the information domain … late binding! If you want flexibility, do not hard-bind independent variables together until you click the mouse! We even practice this in some IT specialties, notably in “OSI Layering” and “data independence”. Why are we not practicing this in the ENTERPRISE, in Enterprise Architecture? I would submit, we have not identified and separated the ontologically independent variables. Therefore any Enterprise Architecture efforts that do not create and manage the ontologically independent variables likely create point-in-time snapshots (composites) that risk becoming obsolete the moment any one of the component variables changes. In fact, a notable CIO in the U.S. Federal sector recently was proud to proclaim having eliminated many Enterprise Architecture projects and saved the Federal Government lots of money because “Enterprise Architecture projects take a lot of time and cost a lot of money and only produce pictures that go on the shelf.” Tragically, he probably was right, if many of the “pictures” were “composite” in nature and therefore obsolete the moment they were created.
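The late-binding idea can be sketched in a few lines. This is an illustrative toy, assuming a hypothetical store of primitives; the names and structure are inventions for this example, not a real Enterprise Architecture tool:

```python
# Late binding for Enterprise models: composites ("views") are generated
# on demand from a store of single-variable primitives, instead of being
# hand-drawn, hard-bound snapshots.

primitives = {
    "Process":  {"Take Order": {"owner_role": "Sales Clerk"}},
    "Role":     {"Sales Clerk": {"location": "London Office"}},
    "Location": {"London Office": {}, "Manchester Office": {}},
}

def compose_view(process_name: str) -> dict:
    """Bind process, role and location together at request time."""
    process = primitives["Process"][process_name]
    role = primitives["Role"][process["owner_role"]]
    return {
        "process": process_name,
        "role": process["owner_role"],
        "location": role["location"],
    }

# The composite reflects the primitives as they stand right now ...
view_before = compose_view("Take Order")

# ... and when one independent variable changes, no snapshot goes stale:
# the next request simply binds the current value.
primitives["Role"]["Sales Clerk"]["location"] = "Manchester Office"
view_after = compose_view("Take Order")

print(view_before["location"], "->", view_after["location"])
```

A hand-drawn diagram of the same “view” would have been obsolete the moment the clerk moved; the late-bound composite is regenerated from the primitives on every request.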
In the exercise in which we were mapping the meta model of one of the above methodology “Frameworks” against the Zachman Ontology, I would estimate that 60-75% of the meta entities in the Framework we were examining were composite in nature. We couldn’t tell which ontological components were included, even with the extensive textual definitions of the meta entities from the originator of the methodology Framework. Furthermore, the textual definitions certainly were not standard across all the other methodology-based “Frameworks” and therefore not reusable.
I have only looked at DoDAF, DND-AF, FEAF, TOGAF, and the other AFs in a cursory fashion. (Life is too short!) However, they prove my point … the plethora of artifacts (composites, models, “views”, pictures, formalisms) is robust. It looks to me like it is unbounded … and the artifacts are infinitely complex individually as well as in their abundance. Where is the ontology, the fixed set of Primitive components, which is infinitely less complex and from which one could create ANY complex, composite “view” (model) on demand through “late-binding”, which, in the manufacturing industry, would be called “Mass-Customization”?
Of course, there is no ontology, because those of us who have the inclination and ability to do Enterprise Architecture have 75 years of inertia building and running systems. We have been “manufacturing”. We live in the composite modeling domain. We have not been ENGINEERING, certainly not engineering ENTERPRISES, which is an ENGINEERING issue that requires single-variable, PRIMITIVE models and ontological disciplines, not composite, manufacturing, implementation models. I would suggest that we, the information people, are pre-Mendeleyevian. We are alchemists. The information domain is still based on “best practices”. The world is in the middle of a transition from the Industrial Age to the Information Age. Therefore we, the Information People, are in the middle of the transition from building and running systems, a manufacturing discipline, to an engineering discipline that might someday be called Enterprise Engineering and Manufacturing (like Chemical Engineering and Manufacturing, or Airplane Engineering and Manufacturing, or Computer Engineering and Manufacturing, etc., etc.) … or maybe it will just be called “Enterprise Architecture”!
However, having said all of this, I would never criticize any of these existing Frameworks (methodologies or complex sets of composites) or tell anyone to stop doing whatever they are doing. They are all doing good work, BUT … I doubt that what they are doing is going to produce what they want it to produce: ENTERPRISE integration, ENTERPRISE flexibility, ENTERPRISE interoperability, ENTERPRISE reusability, ENTERPRISE alignment, and so on. All of these characteristics are engineering-derived, not manufacturing- or implementation-derived. Engineering Enterprises requires ontologically defined, single-variable, “primitive” models, not implementation-oriented, multi-variable “Composite” models. Therefore, for the NEXT iteration (and I am positive there is going to be another iteration), I would recommend that we, the Enterprise Architecture practitioners (the alchemists), start to get the ontological Primitives in place and learn how to use them to create whatever composite (methodological) implementations we want … dynamically, on demand … late-binding … mass-customization of the ENTERPRISE!
Within 50 years of the publication of the Periodic Table, the alchemists had become Chemists and Physicists and were splitting atoms. Art (best practices) was transformed into a science. My perception is, Enterprise Architecture is in the same kind of transition from a practice (an art) into a SCIENCE, much like the one the chemical industry experienced in the early 1900s. After Mendeleyev published early versions of the Periodic Table, around 1869, friction went to zero! By the 1940s, the chemists and physicists were splitting atoms! It got very sophisticated, very quickly! An ontology provides the basis for research. Predictability and repeatability are based on reason, logic and the laws of nature, not dependent upon trial and error, life experience, art. Alchemy becomes Chemistry. Alchemists become Chemists. Systems become Architecture.
Similarly, in a mere hundred years, other disciplines rapidly matured: building construction, airplane manufacturing, electronics, ship-building and every other kind of industrial product manufacturing got really sophisticated really quickly. They all got started with a standard set of formalisms, single-variable, descriptive representations of the object they envisioned and intended to produce … ARCHITECTURE. No one in the building industry, the airplane industry, the computer industry, etc., etc. is arguing about what constitutes architecture, the structured set of single-variable descriptive representations for the object under construction. Architecture is Architecture is Architecture. In fact, that is how I learned about Enterprise Architecture. I did not invent the Zachman Framework. I just happened to see the pattern of descriptive representations for any and every other known object, and I simply put Enterprise names on those same descriptive representations. It was only much later that I discovered that the Zachman Framework was, technically, an ONTOLOGY.
So, what do YOU think is going to happen with Enterprise Architecture?
Is it just a broader perspective or scope for building and running systems? More of the same, only broader in context? Will it go the way of the alchemists … a collection of best practices, just another technological “silver bullet” that was a slight blip in the annals of information technology history?
Or, will it become the ontological structure that forms the basis for human collaboration, ENTERPRISES, of the Information Age? Will we, the Enterprise Architects of today be able to embrace the single-variable, Enterprise engineering, primitive modeling, new paradigm, architectural concepts that constitute the basis for creating, changing, operating, managing every other complex object known to humankind and specifically, in our case, ENTERPRISES?
I am confident that Enterprise Architecture IS the “Issue of the Century”, the key to the Information Age, much the same as architecture for industrial products (like buildings, airplanes, computers, ocean liners, etc., etc.) was the key to the Industrial Age. If the building people, the airplane people, the computer people, the ship-building people, the Industrial Product people were still arguing about what architecture is for industrial products, the standard set of descriptive representations for describing things, we would not have hundred-story buildings, Boeing 747s, super-computers, automobiles, or any of the plethora of Industrial Age products that have become so commonplace in our lives today. Architecture was the issue of the Industrial Age.
Furthermore, I am confident that the serious enterprise architecture practitioners of today will become the Enterprise Engineers and Manufacturers and Enterprise Architecture will become the science essential to an Enterprise’s participation in the Information Age.
To reiterate Allen Brown’s observation from the outset of this article: it is NOT “EITHER the Zachman Ontology OR a Methodology”. It IS “the Zachman Ontology AND a Methodology”. The Architect will simply have to have a serious understanding of the ontology and use the ontological components in the definition of the methodology (process) and in the creation of the composite implementation artifacts that constitute the basis for creating, changing, managing and operating the Enterprise.
John A. Zachman is the originator of the “Framework for Enterprise Architecture” (The Zachman Framework™) which has received broad acceptance around the world as an integrative framework, an ontology for descriptive representations of Enterprises. Mr. Zachman is not only known for this work on Enterprise Architecture, but is also known for his early contributions to IBM’s Information Strategy methodology (Business Systems Planning) as well as to their Executive team planning techniques (Intensive Planning). Mr. Zachman retired from IBM in 1990, having served them for 26 years. He is Chief Executive Officer of his own education and consulting business, Zachman International® and Owner and Executive Director of the Federated Enterprise Architecture Certification Institute in Washington, D.C.
Copyright John Zachman, CEO – Zachman International. https://www.zachman.com/
 “John Zachman’s Concise Definition of the Zachman Framework” by John A. Zachman. www.Zachman.com
Although additional protons and neutrons could hypothetically be added to the nucleus, creating different elements, as the atomic weight increases the electrical forces become insufficient to hold the nucleus together and the elements would be unstable. Therefore, the Periodic Table classifies the total set of elements that exist in nature.
 “Architecture Is Architecture Is Architecture” by John A. Zachman. www.Zachman.com
 “Enterprise Architecture: the Issue of the Century” by John A. Zachman. Database Programming and Design magazine. March 1997.