LEGAL KNOWLEDGE ENGINEERING METHODOLOGY FOR LARGE SCALE EXPERT SYSTEMS

VOLUME 1

by

Pamela N. Gray, LL.B. (Melb.), B.A. (Melb.), LL.M. (Syd.)

Barrister and Solicitor (Vic., 1968; Aust., 1968; N.T., 1974)

Solicitor (Eng., 1974)

This dissertation is submitted for the degree of Doctor of Philosophy

University of Western Sydney

School of Law

March 2007

© Pamela N. Gray, 2007

Dedicated to my son, Xenogene Gray.

ACKNOWLEDGEMENTS

I would like to acknowledge the invaluable suggestions and guidance given to me throughout my candidature by my supervisor, Dr. Scott Mann. He was chosen wisely as my principal supervisor by Professor Caroline Sappideen. His support and understanding of the work were consistent and excellent. I also thank Professor Yan Zhang for his support and willingness to help where he could, given the nature of the interdisciplinary study.

I am particularly grateful for Dr. Mann's recognition of the relevance to the thesis of the work of Abraham Fraunce (1588), and his support in obtaining a small grant from the Law School to fund the translation of the graphics in the book from Norman French. In this regard, I am also very grateful to Professor Sybil Jack for undertaking the translation during her retirement. It was most fortunate that I was able to find a translator of Norman French in Sydney who also had expertise in the period of the book and its historical context. So many threads of good fortune such as this have assisted the work throughout the candidature that I am well and truly reminded of the reassurance that should give hope to all researchers in their magnum opus:

Seek and you will find.

Jesus, Sermon on the Mount concerning law

(Bible, Matthew 7:7; see also Luke 11:9).

In the second year of the candidature, my colleague, Philip Argy, Senior Partner in the law firm, Mallesons Stephen Jaques, and now President of the Australian Computer Society, asked me to design a legal expert system based on my previous work. This request was the impetus for my design of eGanges, a major turning point in the thesis. I also thank Philip for his strong support in trialling eGanges.

Further, I acknowledge the practical support given to the work by my son, Xenogene Gray, who demonstrated the validity of the thesis by programming the eGanges software design, so that it was possible for me to provide applications of the shell. He filled in the link that is missing in the thesis. Throughout the period of my candidature, Xenogene was also a delightful home companion who balanced my daily striving with joy, family devotion, and loyalty – a true kindred spirit.

I would also like to acknowledge the wonderful support and assistance of Law Librarian, David Sinfield, in the final literature search toward the end of 2006, shortly after his appointment to his position. He was significantly helpful with a mastery of the databases and international library loan facilities that are now available at the Whitlam Library of the University of Western Sydney.

Professor Reg Matthews, in a timely manner, released me from my lectureship duties at Charles Sturt University so that I could take up my Australian Postgraduate Award to pursue the research full time at the University of Western Sydney Law School. To wind up the candidature, his successor, Associate Professor Ross Wilson, supported Charles Sturt University grants of Special Study Leave and a Banksia Award. I thank them again for their support.

Pamela N. Gray

Sydney, Australia

March 2007

STATEMENT OF AUTHENTICATION

This dissertation has been presented in fulfilment of the requirements of a PhD in Law at the University of Western Sydney. I certify that this dissertation is original and the result of my own work; any assistance received in its preparation and all sources used have been acknowledged in the text. I also certify that the substance of this dissertation has not already been submitted for any degree and is not currently being submitted for any other degree.

Signature: ......

Date: 27 March 2007

TABLE OF CONTENTS

List of Figures
Abstract

Chapter One: Meta-epistemology
1.1 Meta-epistemological methodology
1.1.1 Scope
1.1.2 Legal epistemology
1.1.3 Ontology of legal possibilities
1.2 A specific meta-epistemological method
1.2.1 Five procedural steps and five epistemological stages
1.2.2 Deep model of legal domain epistemology
1.2.3 Epistemological standard in legal knowledge engineering
1.3 Demonstration prototypes
1.3.1 Five prototypes
Stage 1 Prototype – Legal domain epistemology
Stage 2 Prototype – Computational epistemology of 3d legal logic
Stage 3 Prototype – Shell epistemology of eGanges
Stage 4 Prototype – Programming epistemology of eGanges
Stage 5 Prototype – Vienna Convention application epistemology and ontology
1.3.2 Teleological epistemology
1.3.3 Technological epistemology

Chapter Two: Artificial Legal Intelligence and Meta-epistemology
2.1 Problem of artificial legal intelligence
2.2 Extensions of Master's work
2.2.1 Legal intelligence and legal epistemology
2.2.2 Collective legal intelligence and individual legal expert epistemology
2.2.3 Human intelligence and computational legal intelligence
2.2.4 Deconstruction and reconstruction of legal intelligence
2.2.5 Reconstructed legal intelligence and intellectual artefacts
2.2.6 Jurisprudential systems of legal choice and their transformation
2.2.7 Jurisprudential systems science and specific meta-epistemological method
2.2.8 Several and integrated knowledge engineering methodologies
2.2.9 Paradigms of legal intelligence and logic
2.2.10 Rule-based legal intelligence and case epistemology
2.2.11 Survival metasystem and survival abduction
2.2.12 Science of legal choice and epistemology of legal choice
2.2.13 Small and large legal expert systems
2.2.14 Justinian and technological codification
2.2.15 Workstation and shell
2.2.16 Aims and their realisation

Chapter Three: Computational legal epistemology
3.1 Computational epistemology of 3d legal logic
3.1.1 Monads
3.1.2 Rivers
3.1.3 Fans
3.1.4 Strata
3.1.5 Nested Logic
3.1.6 Triads
3.1.7 Spectra
3.1.8 Double Negatives
3.1.9 Star Poles
3.1.10 Adversarial fishbone
3.1.11 Neutrality
3.1.12 Criss-crossing
3.1.13 Rings
3.1.14 Sphere
3.1.15 Universe
3.2 Intellectual Artefacts
3.3 Stage 2 specific meta-epistemological method

Chapter Four: Shell epistemology
4.1 Design of eGanges
4.2 Interface
4.2.1 Dialogue epistemology
4.2.2 eGanges interface statics
4.2.3 eGanges interface dynamics
4.2.4 eGanges Rivers
4.2.5 Transformation of 3d visualisation by extraction of 2d River
4.2.6 Object-oriented logic
4.2.7 Extraction of 2d River from Sphere
4.2.8 Extraction of deductive River from mixed argument River
4.2.9 Extrapolation of River from mixed logic decision trees
4.2.10 Rivers and flowcharts
4.2.11 Rivers and other 2d epistemology
4.2.12 3d epistemology
4.3 Communication system
4.3.1 Legal choice
4.3.2 Question and answer logic
4.3.3 Current result logic
4.3.4 Glosses
4.3.5 Communication by Notes
4.3.6 Build Functionality
4.3.7 Conveniences
4.3.8 Communication system heuristics
4.3.9 Conclusion
4.4 Stage 3 specific meta-epistemological method

Chapter Five: Further developments: application epistemology and ontology
5.1 Application Epistemology
5.1.1 Sample application
5.1.2 Prior analytics, mereology
5.1.3 Jurisprudence of legal knowledge engineering
5.2 Constraints
5.2.1 Equity epistemology
5.2.2 Daemons for disparate repetition
5.2.3 Evidentiary assessment systems
5.2.4 Precedent retrieval
5.2.5 Topological abduction
5.2.6 eGanges affects
5.3 Significance
5.3.1 Hypotheses
5.3.2 Epistemology
5.4 Jurisprudence of Legal Knowledge Engineering
5.5 Stage 5 specific meta-epistemological method

List of References
Further Reading
APPENDIX

LIST OF FIGURES

Figure 1.1: J. Popple's directed acyclical graph
Figure 1.2: Page from M. Maimonides, Mishneh Thorah (c. 1180)
Figure 1.3: Horrock's Porphyry tree (2005)
Figure 1.4: Photograph of model of reasoning of A. Korzybski
Figure 1.5: Model of reasoning of A. Korzybski
Figure 1.6: Gray's Specific Meta-epistemological Method
Figure 1.7: Development of the filing cabinet 3-D scheme for data retrieval
Figure 1.8: Case retrieval cuboid system of A. Kowalski (1991)
Figure 1.9: The dialectic method of Peter Ramus
Figure 3.1: River map – a system of rules
Figure 3.2: Fan of wholly formalised rule streams in a River system
Figure 3.3: List of wholly formalised rule streams of a River system
Figure 3.4: Two formalised rule streams of a River system locked together
Figure 3.5: River map of formalised rule streams
Figure 3.6: River map with six Strata below it – wire
Figure 3.7: River map with six Strata below it – rendered
Figure 3.8: Complex River – extensive system of rules
Figure 3.9: Initial map for Ok to send message – Australian Spam Act 2003
Figure 3.10: Submap for No Australian link – Australian Spam Act 2003
Figure 3.11: Adversarial triads – contradictory and uncertain correspondence
Figure 3.12: Initial map for prosecution – Australian Spam Act 2003
Figure 3.13: Triad Spectra
Figure 3.14: Star showing Poles and pole streams
Figure 3.15: eGanges compliance map without submapping (chaos map) – Australian Spam Act 2003
Figure 3.16: Adversarial Fishbone
Figure 3.17: Ishikawa (1985, p.63) Fishbone: Cause and Effect Diagram
Figure 3.18: Sphere of legal knowledge
Figure 3.19: Legal Universe
Figure 4.1: System of binary relations and Situate system
Figure 4.2: Ramist chart of columns as outline of an art
Figure 4.3: Celaya's The Geometry of the Mind
Figure 4.4: Tartaret's Logic in Space
Figure 4.5: Arguments for the Queen in Northumberland's Case by A. Fraunce
Figure 4.6: Arguments for the Queen in Northumberland's Case by A. Fraunce
Figure 4.7: Judgment for the Queen in Northumberland's Case by A. Fraunce
Figure 4.8: P.N. Gray's eGanges River of Queen's arguments in Northumberland's Case
Figure 4.9: Tree of Porphyry (c. 300 AD)
Figure 4.10: Tartaret's Tree of Porphyry (c. 300 AD)
Figure 4.11: A Wigmore Chart
Figure 4.12: J.H. Lambert (1728–77) – 'Some A is B'
Figure 4.13: One of Peirce's early graphs, in considering non-relative logic
Figure 4.14: B. Coecke, D.J. Moore and S. Smets: a photon logic lattice
Figure 4.15: Tree showing root node and leaf nodes
Figure 4.16: Latent Damage Law Tree – top level
Figure 4.17: Latent Damage Law Tree – breach of duty
Figure 4.18: Kelsen tree of R.E. Susskind
Figure 4.19: And/or tree of R.E. Susskind
Figure 4.20: eGanges River map of Susskind's Kelsen tree by P.N. Gray (2006)
Figure 4.21: eGanges River map of Susskind's and/or tree by P.N. Gray (2006)
Figure 4.22: Multiple root node trees – P.H. Winston
Figure 4.23: Baird, D.G., Gertner, R.H. and Picker, R.C.: Figure 2.1
Figure 4.24: Baird, D.G., Gertner, R.H. and Picker, R.C.: Figure 2.2a
Figure 4.25: Baird, D.G., Gertner, R.H. and Picker, R.C.: Figure 2.2b
Figure 4.26: Baird, D.G., Gertner, R.H. and Picker, R.C.: Figure 2.3
Figure 4.27: Baird, D.G., Gertner, R.H. and Picker, R.C.: Figure 2.7
Figure 4.28: Baird, D.G., Gertner, R.H. and Picker, R.C.: Figure 5.1
Figure 4.29: Baird, D.G., Gertner, R.H. and Picker, R.C.: Figure 5.2
Figure 4.30: Graphics of Terrell's four dimensions of legal reasoning
Figure 4.31: Cube of lawyers' epistemology of M. Conover
Figure 4.32: Toms, E.: Holistic logic: A formalisation of metaphysics

ABSTRACT

Legal knowledge engineering methodology for epistemologically sound, large scale legal expert systems is developed in this dissertation. A specific meta-epistemological method is posed for the transformation of legal domain epistemology to large scale legal expert systems; the method has five stages:

1. domain epistemology;

2. computational domain epistemology;

3. shell epistemology;

4. programming epistemology; and

5. application epistemology and ontology.

The first of these stages, domain epistemology, is dealt with in Chapter One in the course of defining the problem that the thesis addresses. The nature of legal epistemology is defined in terms of a deep model that divides the information of the ontology of legal possibilities into the three sorts of logic premises, namely, (1) rules of law for extended deduction, (2) material facts of cases for induction that establishes rule antecedents, and (3) reasons for rules, including justifications, explanations or criticisms of rules, for abduction. Extended deduction is distinguished for automation, and provides a map for locating, relatively, associated induction and abduction. Added to this is a communication system that involves issues of cognition and justice in the legal system.

The dissertation continues the work of the candidate's LL.M. thesis (Syd) which was completed in 1990, subsequently revised, and published as a book, Artificial Legal Intelligence, in 1997. The Master's work began with an exploration of legal intelligence, historically and in the contemporary field of legal knowledge engineering; it developed the Science of Legal Choice and Technological Jurisprudence, and concluded with the visualisation of 3d legal logic as a discovery in legal knowledge engineering. The book made clear the likelihood of a technological codification of law with the vitality of intelligence, and an associated opportunity for designer civilization. Chapter Two of the thesis sets out this earlier work and how it is advanced by this thesis.

Epistemologically sound large scale legal knowledge engineering is required for the technological codification of law and technological civilization design. It is shown that legal intelligence, defined in terms of epistemology for discovering and processing the ontology of legal possibilities, is a basis for designer civilization with the vitality of large scale artificial legal intelligence.

The second and third stages of the specific meta-epistemological method are demonstrated by prototypes in Chapters Three and Four respectively, which extend the deep model of the legal domain epistemology for computation. Chapter Three sets out the computational epistemology of 3d legal logic which is an articulation of the logic implicit in the visualisation of 3d legal logic, as the basis for determining knowledge representation and heuristics for automating the extended deductive processing of the ontology of legal possibilities. The visualisation provides logic structures of extended deduction, induction and abduction, for the ontology of legal possibilities; induction and abduction are located by reference to the components of extended deduction. This satisfies the requirements of the domain epistemology identified in Chapter One.

The design of the shell for legal knowledge engineering, eGanges (electronic Glossed adversarial nested graphical expert system), is described in Chapter Four. In order to design the shell, recourse is had to both the computational epistemology of 3d legal logic and other matters in the legal domain epistemology, particularly those matters relevant to the design of the communication system between the user and an application of the shell. The fourth stage, programming epistemology, is presumed by the production of operating software, namely the eGanges shell, programmed by the candidate's son, Xenogene Gray, a computational physicist, according to her doctoral design set out in Chapter Four; a small sample finance law application of eGanges, which can show the software in operation, is available at: www.grayske.com

The fifth stage, application epistemology and ontology, is discussed in Chapter Five with a view to demonstrating the suitability of the prototypes for large scale legal knowledge engineering, and identifying further development of the specific meta-epistemological method that might be undertaken, particularly the prior analytics required to prepare the formalised extended deductive premises, and the

interrogation for the communication system. The Appendix sets out a sample of draft rule maps of the United Nations Convention on Contracts for the International Sale of Goods, known as the Vienna Convention, to illustrate that the substantive epistemology of the international law can be mapped to the generic epistemology of the shell.

The Ontological School of legal knowledge engineering arose in the 1990s to remedy the shortfalls of rule-based, case-based and logic programming identified by Valente (1995); it promised a new synthesis of the earlier methods that would produce what they had failed to deliver. However, the Ontological School's use of the philosophical concept of ontology as epistemology, in the field of Artificial Intelligence, has failed to obviate the need for sound legal domain epistemology. This thesis deflects the ontological solution back to the earlier rule-based, case-based and logic advances, with a definition of artificial legal intelligence that rests on legal epistemology; added to the definition is a transparent communication system of a user interface, including an interactive visualisation of rule maps, and the heuristics that process input and produce output to give effect to the legal intelligence of an application. The additions include an epistemological use of the ontology of legal possibilities to complete legal logic, for the purposes of processing specific legal applications.

While the specific meta-epistemological methodology distinguishes domain epistemology from the epistemologies of artificial legal intelligence, namely computational domain epistemology, program design epistemology, programming epistemology and application epistemology, the prototypes illustrate the use of those distinctions, and the synthesis effected by that use. The thesis develops the Jurisprudence of Legal Knowledge Engineering by an artificial metaphysics.

CHAPTER ONE:

META-EPISTEMOLOGY

1.1 META-EPISTEMOLOGICAL METHODOLOGY

1.1.1 Scope

The objective of this thesis is to clarify and develop certain areas of legal knowledge engineering. It does this by posing and demonstrating a specific meta-epistemological method for legal knowledge engineering that is suitable for large scale expert systems. The use of the specific meta-epistemological method is demonstrated by two major prototypes and a sample of a third prototype:

(1) the computational epistemology of 3d legal logic, which is devised by reference to acquired legal domain epistemology, and is a development of the 3d visualisation of law (Gray, 1990, 1995, 1997) through the specification of the logic of the visualisation,

(2) the design of an expert system shell with the constraints necessary for a fifth generation computer language, named eGanges, as a program epistemology, based on the computational epistemology of 3d legal logic and other legal domain epistemology that is concerned with cognition, communication and justice, and

(3) a sample of a large scale application of the eGanges shell, that illustrates how the shell is suitable for large scale applications. This large scale application is in the field of the United Nations Convention on Contracts for the International Sale of Goods, known as the Vienna Convention, and is characterised by an eGanges application epistemology; it is the first large scale application of the shell and the sample can serve as an international precedent for the construction of further applications in a national or international field of law.

The extensive, complex knowledge of the legal domain calls for a large scale legal knowledge engineering methodology. However, no epistemologically sound large scale methodology per se has been developed for, or used in, large scale legal knowledge engineering. Legal domain epistemology is shown to provide a sound deep conceptual system for legal knowledge engineering. However, jurisprudential development of legal domain epistemology is required to identify its deep conceptual system for epistemologically sound legal expert systems. In posing the specific meta-epistemological method and demonstrating its use, the thesis clarifies and develops jurisprudence for large scale legal knowledge engineering based on sound legal domain epistemology.

It is not suggested that the specific meta-epistemological method is the only possible legal knowledge engineering method, the best possible method, or the only effective method for expert system construction; nor is the specific meta-epistemological method posed as the only possible, the best possible or the most effective meta-epistemological method. This thesis poses the specific meta-epistemological method as a possible effective method, at this stage of the technology, for epistemologically sound legal knowledge engineering, that is suitable for large scale systems development. The conclusion of the thesis sets out the limits reached in terms of further work that might be undertaken to extend the jurisprudential developments of the thesis.

1.1.2 Legal epistemology

Since epistemology was established by Aristotle as a branch of Metaphysics for transforming ontological statements into knowledge, three types of epistemology are now distinguishable. Epistemology may be a sound process or method for:

(1) the transformation of ontology to knowledge;

(2) the transformation of knowledge to ontology; or

(3) the application of knowledge to the real world for some practical purpose.

Legal epistemology involves all three of these types of epistemology:

(1) Legal ontology is transformed to legal knowledge when law-making authority verifies its rule structure as law. The power of legal authority is the form of empiricism in the legal domain that establishes the truth of a rule of law for the purposes of its deductive application to cases. Once this truth is established, a rule of law is legal knowledge.

(2) Legal knowledge may then be applied to a case by matching the ontology in the rule to the facts or ontology of the case, in order to determine the legal outcome of the case; thus knowledge is transformed to ontology;

(3) Through the transformation of rule knowledge to the real world ontology of a case, a practical purpose is achieved, namely the consequent of the rule is applied to the case. Also, any new ontology in a case may be absorbed by authoritative adjustment of the established rules of law; the law is developed to provide for new disputes.

Other legal knowledge is used in legal epistemology; this is the knowledge adopted by the judiciary in case judgements: firstly, the inductive ontologies that are instances of rule ontologies, and secondly, the abductive ontologies that justify the rules of law.

Every rule has a legal consequent which may be adopted as a practical purpose. The rule sets out the conditions required to produce that consequent. If there is an adoption of a consequent as a goal, then the means to achieving that goal are the conditions or antecedents in the rule that must be carried out or put into place. This is a practical perspective on the law; various perspectives on the law are possible by way of evaluation or use of the law. There are ontologies of perspectives and uses of the law that are legal practitioner ontologies; as the ontologies of potentialities used in practical reasoning, these are also part of legal epistemology. For instance, a practitioner's ontology of potential conflict in a building contract is used to select contractual terms that will prevent such conflicts in the transaction. An ontology for drawing a cohabitation contract, using eGanges, was developed to illustrate the extension of eGanges by the addition of a negotiation aid (Gray, Gray and Zeleznikow, 1997).

Eight sub-epistemologies can be identified in legal expertise:

1. Profession and authority – how power to make and administer law is distributed;

2. Rules of law – how an expert opinion or judgment is determined;

3. Cases – how precedent cases are given effect in the formulation of expert opinion or judgment;

4. Evidence – how findings of fact are determined (see Wigmore, 1913, 1931 and 1937);

5. Litigation – how court orders are obtained;

6. Commercial practice – how benefits of law are obtained;

7. Legal strategies – how gains are maximised and losses are minimised through law;

8. Justice – how justice is achieved in the legal system.

An exhaustive study of how these eight sub-epistemologies are related or interact is outside the scope of this thesis. As a modus operandi of lawyers, law is distinguished from facts; similarly, antecedents in a rule are distinguished from the inductive ontologies that are instances of the antecedent(s) or consequent in a rule. The law of evidence is a legal sub-epistemology for determining the facts of a case. Legal domain epistemology is concerned with the application of law to case facts that are determined by its sub-epistemology of the law of evidence; it is also concerned with the derivation of rules of law by reference to case facts. The eight sub-epistemologies provide some indication of the multiple perspectives on the rules of law that may indicate how law may be used; sub-epistemologies are not sub-systems but provide alternate uses. However, any use of law requires an understanding of the jurisprudential system dynamics of legal expertise; uses, including automated use, must be consistent with each other.

The two Greek words that make up the term epistemology are episteme, meaning knowledge, and logos, meaning plan. Epistemology may be defined as the plan or scheme of knowledge; an epistemology is a particular plan or scheme of particular knowledge (cf. Foucault, 1966, 1969, 1983). Inevitably, a plan or scheme is a synthetic concept with structure and operations. The outcome of providing epistemological adequacy or soundness in a particular knowledge engineering project must be a synthetic program. The larger the program, the larger will be the task of creating that synthesis or operational system. In knowledge engineering, the structure of expert knowledge is provided by knowledge representation, and algorithms provide the operations on that structure.

Algorithms were identified by Kowalski (1979a and 1979b) as the form of human intelligence that might be automated. He set out a formula for developing algorithms for artificial intelligence:

ALGORITHM = LOGIC + CONTROL
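Kowalski's separation can be sketched in code: a declarative rule base supplies the logic, and a processing strategy supplies the control. The following is an illustrative sketch only; the rules and names are invented for the example and are not drawn from the thesis or from eGanges.

```python
# LOGIC: declarative rules of the form (set of antecedents, consequent).
# The rule content here is invented for illustration.
RULES = [
    ({"offer", "acceptance", "consideration"}, "contract"),
    ({"contract", "breach"}, "liability"),
]

# CONTROL: one possible strategy -- forward chaining to a fixed point.
def forward_chain(facts, rules):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for antecedents, consequent in rules:
            if antecedents <= facts and consequent not in facts:
                facts.add(consequent)
                changed = True
    return facts

derived = forward_chain({"offer", "acceptance", "consideration", "breach"}, RULES)
print(sorted(derived))
# -> ['acceptance', 'breach', 'consideration', 'contract', 'liability', 'offer']
```

The same rule base could instead be driven by backward chaining from a goal; changing the control while keeping the logic fixed is precisely the separation Kowalski's formula expresses.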

Pre-determined premises are required for logical processing; they are the knowledge to be represented in the program. As extensively reviewed by Peirce (1931-1935, 1958), there are three forms of logic, namely deduction, induction and abduction, each of which uses premises. Davis (1972, p.22) explains Peirce's processing view of logic:

Peirce, in his early “Faculties” essays, refers to cognitive processes of all types as “inferences.” ... and the word “inference”, or better yet, “synthesis” represents well his alternative view that all cognitive processes are movements of the mind from one thing to another. ... For Peirce there are three kinds of reasoning processes: deduction, induction, and abduction.

Although law-making authorities do not lay down the law expressly as premises identified for use in one or more of these three forms of logic, what they say can be construed as such; they implicitly do determine authoritative logic structures of legal ontology. It could be said that legal information is laid down in three posits, each of which integrates ontology and epistemology in different ways:

(1) Deductive posits: ontologies to be used as deductive premises, in the form of rules of law that may be formalised as conditional propositions;

(2) Inductive posits: ontologies to be used as inductive premises, that may be formalised as existential statements which are definitional, and are usually the material facts of cases for use as instances of antecedents or instances of consequents in the deductive rules of law. Inductive instances may be extended by common knowledge, and dictionaries of synonyms and antonyms.

(3) Abductive posits: ontologies to be used as abductive premises for explanation or justification of deductive rules or inductive instances. Abductive premises may be formalised as logical arguments within their abductive context; they may be strong or weak reasons and they may be strong enough to displace a deductive rule or inductive instance and justify modification of the rule system.
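The three posits can be kept apart in a program by representing each as a distinct data structure, so that a rule cannot be confused with an instance or a reason. This is a minimal sketch under assumed names; the thesis does not prescribe this representation.

```python
from dataclasses import dataclass

@dataclass
class DeductivePosit:       # a rule of law as a conditional proposition
    antecedents: list
    consequent: str

@dataclass
class InductivePosit:       # a material fact as an instance of an antecedent
    antecedent: str
    instance: str

@dataclass
class AbductivePosit:       # a reason for a rule or instance, graded by strength
    target: str             # the rule or instance it justifies or criticises
    reason: str
    strength: str = "weak"  # a "strong" reason may displace a rule or instance

rule = DeductivePosit(["person deliberately hits another"], "assault and battery")
fact = InductivePosit("person deliberately hits another",
                      "Jane hit Bill with a cricket bat")
why = AbductivePosit("assault and battery",
                     "protection of bodily integrity", "strong")
print(rule.consequent)
# -> assault and battery
```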

Legal ontologies are given an integral logic structure by law-making authorities. What a judgment says about a premise, and how it is used in the judgment, determine the nature of its logical posit. Thus when a premise is called, or treated as, a rule of law, it is a true deductive premise; when a premise is described, or treated, as a material fact, it is a true inductive instance of an antecedent or consequent in a rule of law. When a premise is described or given as a reason for a rule or part thereof, or a reason for accepting a material fact as an instance of an antecedent of a rule of law, then it is a true abductive premise. In a limited way, this legal epistemology is recognised by Waller (1995, p.170-1), in an introductory law text:

In any area where people use deduction they may employ one of two kinds of syllogism. They may begin, if the task is of a theoretical kind, by using the word “all”. The ancient example is:

1. All men are mortal.

2. Socrates is a man.

3. Therefore Socrates is mortal.

This method is simple. If the first two propositions are correct, the conclusion is obvious. The first proposition is called the major premise, the second the minor premise. But, of course, you may want proof of either premise. “Is it true that all men are mortal? Is it true that Socrates is a man?” In this example long experience shows plainly that both are correct. In any event, the logician would answer that he or she is merely making an assumption. Consequently then the answer is true as a theory. ...

Lawyers, and most other thinkers, prefer in practice to employ the second kind – the hypothetical deduction. That begins with “if” instead of “all”. For example there is this syllogism:

1. If a person deliberately hits another with a cricket bat that person has committed the crimes of assault and battery.

2. Jane deliberately hit Bill with a cricket bat.

3. Therefore Jane is guilty of these crimes.

The hypothetical method is often superior for use because it does not say “all”. It is another kind of assumption, not so hard to prove and likely to be correct. ...

So “If P then Q” is relevant as a guide – tautological though it may be. It remains the best and most common kind of inference for courts though they rarely use the actual terms: syllogism, major or minor premises. But they do constantly say, “If that is the law, then it follows that the plaintiff was entitled” or “the defendant is guilty”.
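Waller's hypothetical deduction can be sketched computationally as the application of a rule to established facts. The Python fragment below is a minimal illustration only; the rule and fact labels are invented for the example and do not come from any cited system.

```python
# A minimal sketch of the hypothetical syllogism 'if P then Q' as rule
# application. The rule and fact names are illustrative assumptions.

def modus_ponens(rule, facts):
    """Apply 'if all antecedents then consequent' to a set of established facts."""
    antecedents, consequent = rule
    if all(a in facts for a in antecedents):
        return consequent
    return None

# Major premise: if a person deliberately hits another with a cricket bat,
# that person has committed assault and battery.
rule = ({"deliberate_hit_with_cricket_bat"}, "guilty_of_assault_and_battery")

# Minor premise: Jane deliberately hit Bill with a cricket bat.
facts = {"deliberate_hit_with_cricket_bat"}

print(modus_ponens(rule, facts))  # -> guilty_of_assault_and_battery
```

If the minor premise is not established, the function returns nothing: the rule simply fails to apply, which mirrors the court's position when a material fact is not proved.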

In 1962, Waller was the candidate's lecturer in the first year law degree subject, Introduction to Legal Method, at Melbourne University; he (p.168) also recognised that precedent cases provide inductive examples, even in the formulation of new antecedents or rules. Some inductive instances are determined by analogy, and some by common sense or authoritative iteration. Waller also explored the logic used by lawyers that is outside the realms of deduction and induction, especially in keeping rules consistent and providing for new cases. The third form of logic is abduction, so these other forms of argument are categorised in this thesis as abductive, meaning leading from and to the rules (cf. McCarthy, 1980).

For the purposes of legal knowledge engineering, it is a requirement of the jurisprudential system of legal epistemology that deductive processing may use only the deductive premises. If inductive or abductive premises are interspersed non-monotonically in extended legal deduction, as happens in practice, they may or may not reinforce the deduction, but they do not change the necessity of the deductive conclusion. Induction may operate to select the appropriate antecedents, and thus the appropriate rule for the continuation of extended deduction; abduction may be strong enough to require modification of the rule system. Otherwise, extended deductive application of a rule system is monotonic in its continuity; it is the most certain and objective form of reasoning in the application of law, and minimises the risks of judicial bias and other corrupting factors that are the concern of the American Realist School of jurisprudence. The division of premises into deductive, inductive and abductive structures of legal knowledge representation is a prerequisite for using the logical operations of the appropriate jurisprudential algorithm.

Thus there are three major concerns of legal knowledge engineering, all of which are epistemological: legal knowledge, legal logic, and jurisprudential system controls which might be meta-rules or heuristics of the legal domain. For legal knowledge engineering, algorithms are determinable in this scheme of legal epistemology.

The integration of ontology and epistemology is fundamental to human intelligence and artificial intelligence. John von Neumann produced the first binary code computer language in the development of the first electronic computer at the end of the Second World War; the language used one memory for both data (ontology) and instructions on processing the data (epistemology). Like the cogito of Descartes (1637, 1644), who was a lawyer, integration of ontology and epistemology may be an indication that thought is at the deepest level that it can be. The Cartesian cogito, sometimes called the Cartesian circle, represents the interface of body and mind: 'I think' (ontology required for epistemology) 'therefore I am' (epistemology that establishes ontology). In other words, I conclude that there is existence because I exist. The one certain instance, a rational posit, founds the general as an abstraction of the many. At this deepest level of mirrored entrances to ontology and epistemology is where the foundations of a deep conceptual model of legal intelligence might be laid; integrated legal ontologies and legal epistemologies are the basis of the deep conceptual model for legal knowledge engineering.

The cogito was initially posed by Parmenides (c.540-480 BC), who was regarded by Russell (1961, p.66) as the inventor of logic because in one short poem, On Nature, Parmenides shifted the emphasis in metaphysics from existence to truth. Metaphysics expanded from ontology to include epistemology; knowledge was true ontology and epistemology established knowledge. Parmenides distinguished appearance from reality and maintained that thought was a more stable basis for determining harmonious knowledge than perception: ‘for to think is the same as to be’ (Burn, 1960, 1967, p.393).

Law-makers' posits may be compared to the monads of Leibniz (1714), the a priori principles of Kant (1781, 1788), the epistemes of Foucault (1969) and the paradigms of Kuhn (1962, 1970); each suggests some integration of ontology and epistemology. The fusion of ontology and epistemology reconciles legal positivism and analytical jurisprudence in the manner required for legal knowledge engineering, which is deeper and more particular than the analytical positivism of Austin (1832) and Kelsen (1911).

It might be thought also that the application of legal ontology to the real world for some practical purpose, without the intermediary of legal knowledge, falls within the range of legal epistemology (cf Valente, 1995); the thesis raises but is not concerned with this philosophical problem. In the legal domain, it is legal knowledge that is applied to the real world, and this knowledge includes legal ontology, particularly the ontology of legal possibilities which the law affects and effects. However, the logos or plan in physical ontology limits and determines the logos of legal epistemology; legal ontologies may only be created within the limits of physical ontologies. The medieval offence of witchcraft was eventually discredited as it contravened this jurisprudential meta-rule of legal epistemology.

1.1.3 Ontology of legal possibilities

Statements about what exists are ontological statements. Ontologies are matters that exist, including the conceptual existences created by humans, such as laws; an ontological statement of law is an assertion that the law exists and not that the content of the law necessarily exists; legal ontologies are the existences acknowledged in the content of laws or created by laws. An ontological statement may be made also about legal practice.

There is a circularity about ontology and epistemology that defies origin like the problem of what came first, the chicken or the egg. As in the Cartesian cogito, what exists presupposes existence and presupposition is epistemological. Epistemology might be regarded itself as an ontology because it exists, and ontology may be discovered by an epistemology; this circularity may indicate infinity as the boundary of thought. Recursion is also a phenomenon of many intelligent programs. In a sense, the circularity may be regarded as a good basis for using the two approaches in thinking, since they presuppose each other only and nothing else: on the basis of the circularity or fusion, both the egg and the chicken are food for thought and, even though they may be integrated, both legal ontologies and legal epistemologies must be specified for automation.

In its meaning, law is concerned with what will happen if a situation or case exists; that is the nature of a rule because it has the conditional proposition form, 'if (antecedent(s)) then (consequent)'. The ontological situations that are explicit in law, might exist; law assumes an ontology of possibilities. Reconfigurations of ontology in express rules of law, may produce a range of hypotheticals (cf. Rissland, 1982, 1983, 1984, 1985); the extent of the hypotheticals used by the legal profession is determined by what the legal consequent is if one or more of the antecedents in a rule of law do not exist, which is possible, or are given additions, which are realised potentialities. For example, a Statement of Claim which pleads material facts to establish antecedents in the rules relied on, presents a configuration of legal ontologies, which may be followed by a Defence that reconfigures the antecedents to suit alternative rules that are relied on; in the defence there may be admitted material facts interspersed with denied material facts, and sometimes additional material facts, all of which amount to a reconfiguration of the antecedents relied on in the Statement of Claim.

When the first programmer, Ada Augusta Lovelace (1815-1852), who collaborated with Charles Babbage (1792-1871), was inspired by the wonderful patterns in cloth produced by the weaving machines of the industrial revolution, she saw how the patterns of human intelligence might be produced by machine programming. She developed binary mathematics to program the Analytical Engine designed by Babbage to calculate mathematical tables. A lecture given by Babbage at Turin, Italy, in 1840 described the Analytical Engine; L.F. Menabrea, an engineer who attended the lecture, wrote a report of the presentation which was published in Bibliothèque Universelle de Genève No.82, October, 1842 as Sketch of the Analytical Engine invented by Charles Babbage, Esq. Lovelace translated Menabrea's paper and added her own notes on the programming scheme she designed. The translated paper with her notes is republished as an Appendix in Bowden (ed., 1953). Augustus De Morgan (1806-71), one of the major founders of modern mathematical logic, including relational logic (De Morgan, 1847), and whose father-in-law had tutored Ada's mother, was Ada's private tutor; as a woman, Ada had been precluded from a university education due to sex discrimination policies on admission to candidature at the time. In her extensive notes on the Analytical Engine, Lovelace (Bowden (ed.), 1953, p.400) explains how the Engine might also execute Combinatorial Analysis:

The methods in Arbogast's Calcul des Derivations are peculiarly fitted for the notation and the processes of the engine. Likewise the whole of the Combinatorial Analysis, which consists first in a purely numerical calculation of indices, and secondly in the distribution and combination of the quantities according to laws prescribed by these indices.

The programming epistemology of Lovelace distinguished instructions from calculations, and founded the computer components of memory and central processing unit which, together with input and output devices, constitute the design of modern computers. Machine code is mathematical and its words are now made up of 0s and 1s in various patterns and lengths; it is comparable to Morse code rather than to decimal numbers, which are based on the ten fingers of a person and were developed in the middle ages by Hindu mathematicians. Binary symbolism is rather semiotic (cf. Kevelson, 1987 and 1988); interpreters are required to translate machine words into human words and vice versa. The automation occurs at machine code level; programming may occur at a higher level if appropriate interpreters are available to compile the program into machine code or vice versa. Like the indices of Combinatorial Analysis, adversarial reconfigurations of legal ontologies are also given rules for their processing.

Reconfiguration of ontology is a part of legal epistemology that was adapted for scientific method by Lord Chancellor Francis Bacon (1620) in the Second Book of his Novum Organum. His system of four Tables, illustrated by the study of heat, allows consideration of (1) the attributes of heat through a range of instances of heat, (2) the attributes of a lack of heat through a range of instances of a lack of heat, (3) degrees or comparative instances of heat and lack of heat with causal observations on increasing and diminishing heat, then (4) the attributes of a lack of heat that are excluded from the attributes of heat. The pattern in the Tables is comparable to pleadings in a court case; the Novum Organum, which was proposed to replace Aristotle's work on ontology and logic, the Organon, was written just prior to Bacon's dismissal from office for taking bribes. He died a few years later from a chill suffered during his study of cold. Bacon (1620) explains his system as follows:

The investigation of the forms proceeds thus: a nature being given, we must first present to the understanding all the known instances which agree in the same nature, although the subject matter be considerably diversified. And this collection must be made as a mere history, and without any premature reflection, or too great degree of refinement. ...

Negatives, therefore, must be classed under the affirmatives, and the want of the given nature must be inquired into more particularly ... (p.141)

In the third place we must exhibit to the understanding the instances in which that nature, which is the object of our inquiries, is present in a greater or less degree, either by comparing its increase and decrease in the same object, or its degree in different objects... no nature can be considered a real form which does not uniformly diminish and increase with the given nature. (p.145)

For on an individual review of all the instances a nature is to be found ... man ... is only allowed to proceed first by negatives, and then to conclude with affirmatives, after every species of exclusion.

We must now offer an example of the exclusion or rejection of natures found by the tables of review, not to be of the form of heat; first premising that not only each table is sufficient for the rejection of any nature, but even in each single instance contained in them. For it is clear from what has been said that every contradictory instance destroys an hypothesis as to form. Still, however, for the sake of clearness, and in order to show more plainly the use of the tables, we redouble or repeat the exclusive. (p.149)

In the exclusive table are laid the foundations of true induction, which is not, however, completed until the affirmative be attained. ... And, indeed, in the interpretation of nature the mind is to be so prepared and formed, as to rest itself on proper degrees of certainty, and yet to remember (especially at first) that what is present depends much upon what remains behind. (p.150)

William Harvey (1578-1657), the physician who discovered the circulation system of blood, was acquainted with Bacon; he observed that Bacon wrote philosophy like a Lord Chancellor (anon., 1952, p.vi). Bacon's father was also Lord Chancellor in his time, so Francis, who had studied at Cambridge University and at Gray's Inn, was well imbued with legal epistemology. At the outset of adapting legal method to science, he observed:

Although there is a most intimate connection, and almost an identity between the ways of human power and human knowledge, yet, on account of the pernicious and inveterate habit of dwelling upon abstractions, it is by far the safest method to commence and build up the sciences from those foundations which bear a relation to the practical division, and to let them mark out and limit the theoretical. (p.137)

Bacon set out his method for science to 'superinduce' (p.137) knowledge. Scientific knowledge must look to its inductive instances as the source of truth that can be carried through to establish its Major deductive premises; whereas law looks to law-making power for the 'truth' of its Major deductive premises which then determine the scope of its inductive instances in cases (cf. Ashley, 1990).

In common law countries, a case is now pleaded in a variable Statement of Claim as one or more form(s) of action; this requires a statement of how the case facts of an action satisfy the relevant rules (cf. McCarty, 1997). Material facts of the case must particularise the antecedents in the relevant rules and state the Final consequent of those rules in terms of the claim, as well as the orders that thereby are sought. Where several connected rules are relied on, the interim conclusions that connect the rules must be set out in the Statement as matters that are particularised by the facts of the case. Where there are no rules to rely on, an action on the case may be pleaded, with facts suggesting new rules, or a certain exercise of discretion by reference to relevant factors.

Issues of fact and law are resolved through the further pleadings, namely the Defence and Counterclaim, and Reply, if any. The Defence will indicate which facts in the Statement of Claim are denied, and, effectively, which rules or parts of rules in a Statement of Claim are joined in issue by the defendant; the defence relies on contradictions of the facts pleaded by the Plaintiff, and the rules that deal with such failures to establish a claim. The Defence may plead further facts. If the further facts pleaded by the defendant amount to a claim against the plaintiff, then they must be pleaded as a Counterclaim, which is like a Statement of Claim by the defendant. Only pleaded matters may be raised and relied on at the trial; the parties are confined to these matters and issues.

Solicitors who specialise in a limited field are generally able to use prototype or precedent pleadings such as a Statement of Claim, so that it is not a difficult task to reconcile rules and case facts. Difficult cases require settlement of pleadings by a barrister who is skilled in reconciling rules and cases, and in pleading several forms of action, either coherently or in the alternative, in the one Statement of Claim. Litigation is conducted on the basis of the pleadings. Further and better particulars of a claim and written interrogation of the parties, called interrogatories, are available to elicit the evidence that can be expected at the trial. The judgment of the court usually settles issues of fact and law, with some commentary on the rules and case authorities that have been relied upon to settle any issues of law or found the action or defence. Any precedent cases relied upon in legal argument would have come into being initially as a fact situation pleaded as satisfying the rule(s) within which the case falls. Each court has its procedural Rules about pleadings.

In the legal domain, there may be different ways of establishing the same consequent i.e. there may be rules in the alternative or disjunctions; or there may be different ways of establishing the contradictory consequent. Disjunctions and adversarial or contradictory rules are all laid down by law-making authorities in a system of rules of law, although what is explicitly laid down may not be logically complete with every alternative. The more antecedents there are in a rule of law, joined by conjunction, the more disjunctions there are to consider where one or more of the antecedents do not exist. Alternative combinations of antecedents and their contradictories are legal possibilities. In the legal domain, possibilities are extended by conjunctions, disjunctions and failures. Consequents in the range of legal possibilities must be consistent with the express black letter rules. The ontology of legal possibilities is opened up by the application to express law, of the logical concepts of consistent contraries and contradictories.

The epistemology of the legal domain uses the ontology of legal possibilities that are implicit in law, as a means of ensuring consistency between the contrary and contradictory rules of law relied on by opponents in litigation. Rules that are not rules of law are excluded from legal epistemology; these rules produce invalid legal arguments. The ontology of legal possibilities in law is not concerned with all possible rules; it is concerned with all possible cases within the scope of the rules of law.

The assumption of an ontology of legal possibilities is an epistemological assumption; the range of legal possibilities is established, as a matter of legal knowledge, by logical derivation from express black letter law that is known. The process of derivation is an epistemological process, like the formulation of a truth table in logic (cf. Wittgenstein, 1922); it provides for possible cases within the scope of the express black letter law and the valid legal arguments that apply to those cases.

In his work on deontic logic, von Wright (1951) recognised that the logic of necessity, namely deduction, could be extended by a logic of possibility. The legal 'truth table' provides a taxonomy of valid legal arguments as an epistemological structure for the categorisation of any possible case and the determination of its legal outcome. Arguments that are not in the 'table' are not established valid legal arguments. It is the scope of valid legal arguments that determines the limits of the ontology of legal possibilities.

In the legal truth table, the Major deductive premises, namely the rules of law, are always true; to say they are false is to say that they are not rules of law. Minor premises which are established as true by the evidence in a case, must be true for the law to apply; truth tables in law assume the Minor deductive premise is also true. The denial of a Minor premise may be pleaded in order to invoke a different rule. It is not logically valid to extend a rule of law to its adversarial form. Only the establishment of a contradictory ontology can provide the basis for an opponent's argument in litigation. It is the ontology of legal possibilities that is represented in the legal truth table of valid arguments.
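The legal 'truth table' may be sketched as an enumeration of the possible combinations of Minor premises while the Major premise, the rule of law, is held true. The Python fragment below is an illustrative sketch only; the contract-law antecedents are assumed for the example, and, in keeping with the text, a failed combination does not negate the consequent but merely fails to establish it by this rule.

```python
# A sketch of a legal 'truth table': the Major premise (the rule of law) is
# held true; only the Minor premises (the case facts) vary. The contract-law
# antecedent names are illustrative assumptions.
from itertools import product

antecedents = ["offer", "acceptance", "consideration"]
consequent = "valid_contract"

table = []
for values in product([True, False], repeat=len(antecedents)):
    case = dict(zip(antecedents, values))
    established = all(values)  # the rule applies only when every antecedent holds
    table.append((case, established))
    if established:
        print(case, "->", consequent)
    else:
        # NB: the consequent is not thereby negated; denying the antecedent
        # is invalid. This rule simply fails to establish it.
        print(case, "->", consequent, "not established by this rule")
```

Each row of the enumeration corresponds to one possible case within the scope of the rule; only the row in which every antecedent is established yields the consequent by valid deduction.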

Thus, if there is a rule 'if a then c', it is not thereby logically valid to assume 'if not a then not c'. There may be ways other than a to establish c. However, if the rule 'if not a then not c' is established ontologically by law-makers, expressly or impliedly, then there is an adversarial provision that is part of the ontology of legal possibilities. The adversarial contradictory will be established from the meaning of the law-maker's language in laying down the express rule; if the antecedents are referred to in terms that they must be established, then this will produce an adversarial contradictory.

Of course, law-making authorities may not lay down the adversarial contradictory rule; instead they may lay down a disjunction: 'if not a then c'. A disjunction of mutually exclusive contradictory antecedents occurs with some qualification in the Australian Spam Act 2003; a message which is not a commercial electronic message is not prohibited and a commercial electronic message which complies with certain conditions also is not prohibited. In legal epistemology, not-a implies not-c may have ontological validity, but not logical validity, as an authoritative assertion of law; epistemological rules may override the meta-rules of logic. Contradictories may be common points for both adversaries; it cannot be assumed that the contradictory of one party's points is the same as a point for the opponent's case. Authoritative legal ontologies must be considered for each case, as well as the burdens of proof.

Even though the ontology of the adversarial contradictory may be implied, and extended deduction justifies forward chaining in the direction indicated by the inference arrow that represents 'then' or 'implies' in the conditional proposition, this does not authorise backward inference; the conditional proposition that is a rule of law is a biconditional equivalence, rather than a mere material implication represented by the horseshoe, the reversed C of Peano (1858-1932) (Sanford, 1989, 1992, p.51), only if the law-maker designates it as such, and usually this does not happen unless there is a legal presumption. Prima facie, a consequent in a rule of law does not logically establish its antecedents; antecedents must be established, directly or indirectly, by evidence of material facts in order to establish a consequent.
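This asymmetry between forward and backward inference can be checked mechanically: under material implication, modus ponens is valid, while inferring the antecedent from the consequent (affirming the consequent) is not, unless the proposition is a biconditional. The Python sketch below verifies each argument form by brute-force enumeration of truth assignments; it illustrates the logical point only and is not drawn from any cited system.

```python
# Validity checking by truth-table enumeration: an argument is valid iff no
# assignment makes the premises true and the conclusion false.
from itertools import product

def implies(p, q):
    return (not p) or q

def valid(premises, conclusion):
    return all(
        conclusion(a, c)
        for a, c in product([True, False], repeat=2)
        if premises(a, c)
    )

# Modus ponens: from (a -> c) and a, infer c.  Valid.
print(valid(lambda a, c: implies(a, c) and a, lambda a, c: c))   # True

# Affirming the consequent: from (a -> c) and c, infer a.  Invalid.
print(valid(lambda a, c: implies(a, c) and c, lambda a, c: a))   # False

# Under a biconditional (a <-> c) and c, inferring a IS valid:
# backward inference requires the law-maker to designate equivalence.
print(valid(lambda a, c: (a == c) and c, lambda a, c: a))        # True
```

The third check corresponds to the legal presumption mentioned above: only when the rule is designated as an equivalence does the consequent establish its antecedent.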

Usually legal experts carry out the construction of the ontology of legal possibilities in a limited way for particular cases as they arise; however, artificial legal intelligence must be given the means to carry out this analysis for all possible cases. Furthermore, users of legal expert systems may wish to explore the ontology of legal possibilities and the taxonomy of valid arguments, in order to understand the justice that applies to their case, or to take the benefit of the scheme of justice in the ontology of legal possibilities and the taxonomies of valid legal arguments, from which invalid legal arguments may be inferred, evaluated and contravened.

Ontological posits and legal truth tables also solve the problem of semantic invalidities in logical form. Semantic invalidation of a modus ponens syllogism which is used in applying law to a case, is described by Waller (1995, p.170), in his first year law text, in the following way:

1. Every sentence containing six words is true.

2. This sentence contains six words.

3. Therefore it is true.

In large scale legal knowledge engineering, the task of iterating each legal possibility is formidable. A deep model of legal expertise allows algorithms to be devised for a legal expert system, to minimise and manage this task. This was the approach taken by Allen Newell and Herbert Simon in their development of the first intelligent program, Logic Theorist (Waldrop, 1987, p.20-33). Heuristics of logic were devised as program instructions; Logic Theorist discovered a new and improved proof of a theorem. Such algorithms may be implicit in the deep model of legal expertise, even though they have not been developed or used in legal practice or in jurisprudence. Legal knowledge engineering methodology may enhance legal practice and further develop jurisprudence.

The scope of legal possibilities is limited by the ontologies that are found in express black letter law; on the basis of express ontologies, logic is used to determine implicit ontologies as well. Complete legal logic requires contradictory legal possibilities, and reality requires legal possibilities of uncertain ontologies. The totality of these possibilities provides data that is processed by the algorithms of deductive legal logic with meta-rules of jurisprudential system control. In accordance with this legal epistemology, algorithms of a deep structure of legal expertise may be determined and specified; the meaning of law is logically complete with the ontology of legal possibilities and legal algorithms may be determined on this basis.

Four steps (cf. Bacon, 1620) are required to systematically ascertain the full extent of the ontology of legal possibilities: (1) the determination of the positive extended deductive order of express deductive posits, (2) the determination of contradictories and uncertainties of express deductive posits in extended deductive order, (3) the determination of inductive posits, their contradictories and uncertainties, and (4) the determination of abductive posits, their contradictories and uncertainties.

(1) Positive extended deductive order

To establish the possible cases within the scope of the express black letter law, that are the extent of possible legal ontologies, the express legal ontologies of black letter law are initially formalised as the antecedents and/or consequents of the system of rules of law that permit extended deduction; every formalised rule is a Major deductive premise in an extended deductive order whereby rules become linked continuously. Susskind (1987, p.146), the champion of rule-based systems, pointed out the crucial nature of this linking:

... the consequents of some rules function as the antecedents of others.

Thus, if a consequent of one rule is established when all its antecedents are established by the facts of a case, that consequent may be used as an established antecedent in a second rule, to establish, along with further facts of a case that establish any other antecedents in the second rule, the consequent of that second rule, and so on, in a sequence of extended deduction. This phenomenon produces rule hierarchies which prima facie have mixed components of law and fact that may raise issues of fact, issues of law or mixed issues of law and fact in a particular case; the mix of components is evident in the tree of Popple (1996, p. 71), that is set out in Figure 1.1, which he calls a directed acyclical graph. In this diagram, the circles may raise issues of law and the squares may raise issues of fact; in some cases, both a circle and square may be in issue as a mixed issue of law and fact. The hierarchy of extended deduction produces the mix of potential issues in a case.
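In computational terms, this linking of consequents to antecedents is naive forward chaining. The sketch below illustrates extended deduction over a small rule system; the contract-law rule and fact names are assumptions made for the example, not drawn from any cited system.

```python
# A sketch of extended deduction: consequents of some rules function as
# antecedents of others, implemented as naive forward chaining.
# Rule and fact names are illustrative assumptions.

rules = [
    ({"offer", "acceptance"}, "agreement"),
    ({"agreement", "consideration", "intention"}, "valid_contract"),
    ({"valid_contract", "breach"}, "right_to_damages"),
]

def forward_chain(rules, facts):
    """Repeatedly fire rules whose antecedents are all established."""
    established = set(facts)
    changed = True
    while changed:
        changed = False
        for antecedents, consequent in rules:
            if consequent not in established and antecedents <= established:
                established.add(consequent)
                changed = True
    return established

facts = {"offer", "acceptance", "consideration", "intention", "breach"}
print(forward_chain(rules, facts))
# 'agreement', 'valid_contract' and 'right_to_damages' are derived in turn
```

Each derived consequent ('agreement', then 'valid_contract') serves as an established antecedent of the next rule, reproducing the hierarchy of extended deduction described above.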

Hart (1961) observed that the natural language which was used to express law, brought with it some concepts that were semantically indeterminate; he described these terms which might be called abstractions, as open-textured. In contract law, the concept of consideration is open-textured; it may be satisfied by an indeterminate number of instances. The concept of consideration allows for new transactions of commercial exchange to arise. Popple (p. 70-1) explains his directed acyclical graph in terms of the open-texture of some concepts in an area of law:

The circles are parent nodes, representing open-textured concepts; the squares are leaf nodes, representing concepts which are considered to be fully defined (i.e. answerable by the user). The top level parent node is called the root node.

Figure 1.1: J. Popple's directed acyclical graph (Figure 2.1 in A Pragmatic Legal Expert System, 1995, p.71)

When it comes to proving a case, law applies not just with the necessity of deduction, but, significantly, with the necessity of extended deduction based on the overlap of common components from different rules. Legal ontologies include the open-textured concepts of the epistemological hierarchy of the rules of law that are Major premises for extended deduction; the hierarchy is the epistemological structure that orders the ontology for extended deductive processing of its components by way of application to a client case.

The open texture of some antecedents in law makes extended deduction inevitable. Detailing of antecedents in some rules, with further rules, is the inherent structure of the hierarchy of the rule systems in law. Material facts in cases remain the inductive instances of antecedents, as distinct from rules of finer granularity, even if finer rules have only one antecedent. Finer rules are an opportunity to require several antecedents, not just one, and to add a further hierarchy of requirements, not just instances of a legal concept.

The logical interpretation of black letter law that produces the ontology of legal possibilities may be further expanded where there are alternative interpretations due to ambiguity and gaps; this was regarded by Allen (Allen, 1980; Allen and Saxon, 1991) as a major difficulty in logic programming of legal expert systems. Allen might be regarded as the father of legal logic for legal knowledge engineering. By contrast, Du Feu (1980) might be regarded as the father of pragmatic legal knowledge engineering. Pragmatic processing of legal ontologies allows the meaning of the law-making posits of law to be given any necessary logical procedures of their own, or their own ad hoc processing procedure that accommodates alternative meanings. Computers require logic or ad hoc procedure; logic procedure may simplify ad hoc or pragmatic procedure that is deemed logic procedure, or vice versa. Both logic procedure and deemed logic procedure may provide semantic accuracy that is integral to law as stated. Multiple interpretations may require special interpretive procedure that is pragmatic; thus, if the wording of a rule, such as 'a person who is found using a drug', might mean 'using a drug at the time of being found' or 'a person found out to have been using a drug', then an expert system would require procedures to accommodate both these alternatives, as different evidence is required to prove each.
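Such alternative interpretations might be accommodated by encoding each reading as a separate formal rule with its own evidential antecedents. The Python fragment below is a hypothetical illustration only; the readings and evidence labels are invented for the example.

```python
# A sketch of pragmatic handling of an ambiguous rule: each reading of
# 'a person who is found using a drug' becomes its own formal rule, with the
# distinct evidence each reading requires. All names are hypothetical.

interpretations = {
    "using_when_found": {"found_in_the_act", "drug_identified"},
    "found_to_have_used": {"forensic_evidence_of_past_use"},
}

def offence_established(evidence):
    """The offence is made out if the evidence satisfies any one reading."""
    return [
        reading
        for reading, required in interpretations.items()
        if required <= evidence
    ]

print(offence_established({"found_in_the_act", "drug_identified"}))
# -> ['using_when_found']
```

Because each interpretation carries its own antecedents, the system can ask for, and reason from, the different evidence that each reading requires, rather than collapsing the ambiguity into a single rule.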

(2) Contradictories and uncertainties

To complete the ontology of deductive legal possibilities in a legal expert system, the extended deductive antecedents and consequents that are formalised from black letter law are expanded by their contradictories and uncertainties, including the contradictories and uncertainties of open-textured ontologies; these additional antecedents and consequents are also structured as rules, further expanding the system of formalised rules that are within the scope of the black letter law. It is possible that a case may occur with the contradictory or uncertainty of an antecedent or consequent that is specified in a rule of law; the effect of this must be clarified, especially where disjunctions produce alternative rules with the same consequent.

In the legal domain, the contradictories in the ontology of legal possibilities are treated as ontologies, even if they denote, not some antonym, but the absence of some fact or condition; since rules of law may require the absence of certain antecedents, such absence of an existence is treated as legal ontology. For instance, the absence of rejection of a contractual offer is one of the necessary and sufficient conditions to establish a valid contract. Uncertainties are given legal consequents, according to the rules of burden of proof, so they too are part of the ontology of legal possibilities used by legal experts, pending resolution of uncertainties in a judgment.
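The assignment of legal consequents to uncertainties by the rules of burden of proof can be sketched as a resolution from a three-valued finding to a binary outcome; the party names and the resolution convention below are illustrative assumptions, not a statement of any jurisdiction's evidentiary rules:

```python
def resolve(antecedent_status, burden_on):
    """Map a three-valued finding ('proved', 'disproved', 'uncertain') to a
    legal outcome: an antecedent left uncertain is resolved against the
    party who bears the burden of proving it."""
    if antecedent_status == "proved":
        return "established"
    # Both 'disproved' and 'uncertain' defeat the burdened party's claim.
    return "not established (resolved against " + burden_on + ")"

outcome = resolve("uncertain", "plaintiff")
```

On this sketch, an uncertainty does not stall the system: it simply flows into a consequent, pending resolution in a judgment, as the text describes.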

The elements of ontological possibilities in the law are not completed by the deductive contradictories and uncertainties; however, these elements of ontological possibilities are contained in the epistemological structures of rules, as antecedents and/or consequents in an extended deductive structure. Combinatorial explosion of the alternative possible combinations of initial antecedents and consequents, their contradictories and uncertainties, determines the full scope of alternative, consistent, valid deductive legal arguments in the totality of the epistemological system of rules, and all possible case pathways through the full extent of the legal ontology of deductive antecedents and consequents.
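The combinatorial explosion can be illustrated by enumerating case configurations: if each of n antecedents may be affirmed, contradicted, or left uncertain in a case, there are 3 to the power n possible configurations. A minimal sketch, with invented antecedent names:

```python
from itertools import product

def case_space(antecedents):
    """Enumerate every assignment of proved/negated/uncertain to the
    antecedents: the full space of possible case pathways through the
    legal ontology of those antecedents."""
    return list(product(["proved", "negated", "uncertain"],
                        repeat=len(antecedents)))

space = case_space(["offer", "acceptance", "consideration"])
# Just three antecedents already yield 3**3 = 27 case configurations.
```

The exponential growth in this space is what makes an express codification of every possible case infeasible, and motivates the heuristic reuse of ontologies discussed later in the chapter.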

Prior analytics, which is one of the six parts of Aristotle's work on logic, the Organon, deals systematically with the formalisation of premises for valid deductive inferences and the extent of necessary logical conclusions in syllogisms; invalid conclusions are also considered within the scheme of invalid inferencing and fallacies. A prior analytics of black letter law is posed in this thesis as part of the first step in legal knowledge engineering methodology, namely the acquisition of the expert legal knowledge.

Acquisition of legal expert knowledge must take into account both the black letter law and the further rules to accommodate the contradictories and uncertainties of antecedents and consequents, as a matter of logical completeness; these further rules are the implied rules of contradictories and uncertainties that may be relied on by opponents in litigation. Such prior analytics may formalise and shape rule hierarchies as continuous Major deductive premises for application by extended deduction to possible cases argued in litigation; the logical extension of the rules of law, and the hierarchies of Major deductive premises, are matters of legal epistemology.

Express law may state a mixture of rules for opposing parties, but extended deductive premises must be streamlined for one party or the other. It may be necessary to use a rule or its contradictory form to complete the streamlining for one side in litigation.
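One simple reading of using 'a rule or its contradictory form' to streamline premises for one side is the contrapositive transformation, sketched below; the representation of negation by a 'not ' prefix is an assumption for illustration only:

```python
def negate(p):
    """Toggle a 'not ' prefix as a crude stand-in for contradiction."""
    return p[4:] if p.startswith("not ") else "not " + p

def for_other_side(rule):
    """Turn 'if a then b' into 'if not b then not a' (the contrapositive),
    so that a rule stated for one party can be run in the direction that
    suits the opposing party's argument."""
    a, b = rule
    return (negate(b), negate(a))

flipped = for_other_side(("offer accepted", "contract formed"))
```

This is only one mechanical device among the various forms streamlining may take; the thesis's point is the epistemological requirement, not this particular transformation.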

The Organon is a rich source for further development of the Jurisprudence of Legal Knowledge Engineering. Other parts of the Organon, such as the Categories and On Interpretation, are pertinent to the formulation of legal ontologies, as units for ordering, preliminary to the premise formalisations of the prior analytics.

A full examination of the relationship between ontology and logic is outside the scope of this thesis. Such a study would have to begin with Aristotle's science of being as being in his Metaphysics, and his formalisation of ontologies for logical argument in his Organon, especially the Categories, On Interpretation and the Prior Analytics, and then take account of relevant literature through the ages. Bradley and Swartz (1979) used the notion of possible worlds to introduce the study of logic and its philosophy; possible ontologies also have been the subject of recent study (Augustynek and Jadacki, 1993). However, these recent studies are not directly pertinent to legal epistemology. They use logic to determine the truth of possible worlds. In the legal domain, truth is determined by power, although it is constrained by the realities of physical ontology and may be guided by expert witnesses and their knowledge of the truth of possible worlds.

(3) Inductive posits

Further ontological possibilities in regard to selected black letter law arise from the inductive instances which particularise the deductive antecedents and consequents of the system of rules within the scope of the black letter law (cf. Popple, 1996, p.68); these are likely to be factual instances of initial antecedents, their contradictories and their uncertainties. Inductive instances, which are existential or definitional in nature, may be iterative, or analogous to each other; they may be devised by reference to dictionary definition, synonyms, facts or dicta of precedent cases, expert evidence, common knowledge or common sense. Thus, further epistemological treatment of ontologies in rules, by induction, further expands the possible ontologies for legal argument or for the goal attainment of legal strategies. The instances are induced ontologies, to be applied through rules, and not directly to case facts. In the legal domain, rules of law are enforced, pursuant to the rule of law, not pursuant to the rule of inductive ontologies or a rule of the functional ontologies of Valente (1995). Inductive errors may be corrected without the need to correct the deductive rules of law. The contingent nature of rules, as conditional propositions, acts as fair warning of enforcement to subjects of the law.
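The role of inductive instances as particularisations that are applied through rules, and are correctable without touching the rules themselves, can be sketched as follows; the antecedent and its instance list are invented examples, not authoritative law:

```python
# Invented instance list for an antecedent such as the 'vehicle' in a
# hypothetical 'no vehicles in the park' rule. Amending this list corrects
# an inductive error without altering the deductive rule that uses it.
INSTANCES = {
    "vehicle": ["car", "truck", "motorcycle"],
}

def satisfies(antecedent, case_fact):
    """A case fact satisfies an antecedent if it is a recognised inductive
    instance of that antecedent; otherwise its status remains open to
    argument by analogy, dicta, expert evidence, and so on."""
    return case_fact in INSTANCES.get(antecedent, [])

assert satisfies("vehicle", "truck")
```

The separation of the instance table from the rule base mirrors the text's point that inductive ontologies are applied through rules, not enforced as rules.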

The components of a case situation are ontological; the situation itself is also ontological. The configuration of the components of a case situation can be mapped to the configuration of the ontology of the relevant legal possibility implicit in black letter law, or vice versa. The configurations of the ontological possibilities implicit in black letter law, and the processes of matching these to the ontologies of cases are matters of legal epistemology; that is the nature of the rule of law.

(4) Abductive posits

A final expansion of possible legal ontologies pertains to the ontologies to be found in abductive premises used in legal argument. These ontologies may also expand and contract as further circumstances come to hand and potentialities for legal invention or law-making are realised. Abductive premises, their contradictories and uncertainties, may provide strong or weak support for rules of law. They are usually reasons for rules, such as the biblical commandment of love thy neighbour which was given as the basis for the duty of care in an action for negligence; abductive reasons may be the deeply rooted customs of moral action referred to by Buchler (1961, p.159). Abduction usually takes the following form, which is a fallacy:

If a then b
b
Therefore a

For example:

Good law if based on biblical commandment
Negligence law is based on biblical commandment
Therefore Negligence law is good law
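The contrast between valid modus ponens and the abductive form can be sketched in code; the propositions are the example from the text, reduced to strings, and the functions are illustrative only:

```python
def modus_ponens(rule, fact):
    """rule is (a, b), meaning 'if a then b'; from a, conclude b by
    deductive necessity."""
    a, b = rule
    return b if fact == a else None

def abduce(rule, observation):
    """From 'if a then b' and an observation of b, propose a as a mere
    hypothesis: affirming the consequent is a fallacy, but the hypothesis
    may still lend strong or weak support in argument."""
    a, b = rule
    return ("hypothesis", a) if observation == b else None

rule = ("based on biblical commandment", "good law")
deductive = modus_ponens(rule, "based on biblical commandment")
abductive = abduce(rule, "good law")
```

The tagged "hypothesis" result keeps the abductive conclusion distinct from a deductive one, matching the text's insistence that abductive strands be kept separate from extended deduction.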

Legal abduction may take various forms, and even a fallacy may provide strong or weak support for a rule of law. In legislation and explanatory memoranda, abductive information may be available. Where case facts are brought to rules as the inductive instances of antecedents, or case dicta establish rules or parts of rules, as envisaged by Branting (1991), reasons for rules or parts of rules might also be given, abductively to the decision in the case (cf. Atkinson, Bench-Capon and McBurney, 2005). Abduction may be a meta-ratio for a ratio decidendi.

Abductive posits may have their own separate epistemology or patterns of reasoning, some of which might be a modus ponens form of deduction or similar extended deduction, in its own abductive context. For example: a legal duty based on a biblical commandment is a good law; love thy neighbour is a biblical commandment; all people who love their neighbour, take care not to harm their neighbour; therefore such a duty of care is good law. This abductive reasoning is a reason that supports the duty of care, but is not in itself law; only the rules which give effect to the duty of care are law.

Historically, inductive and abductive annotations were made to codes of law as glosses; in modern times, margin notes are customary in statutes, but are treated as extraneous to the statutory law. Patterns of glossing that ex facie appear to be epistemological can be seen in the medieval annotations by Maimonides (1135-1204) of the Jewish Code of Laws with Aristotelian and other commentary (c.1180). A page from this work, shown as Figure 1.2, indicates some sort of stratification in reasoning around central ideas (See: www.ucalgary/~elsegal/TalmudMap/Maimonides.html). Maimonides' work may permit a richer study of legal abduction; such a study is not made in this thesis due to the constraints of the main line of investigation. The extraordinary form of Maimonides' glosses, which varies from page to page, does indicate some epistemological structure that is an advance in complexity, following the work of the glossators of Bologna, which began a century before; the Bologna glosses of the Roman Code of Laws were confined to the four simple margins around the text, without stratification. The Bologna annotations included inductive and abductive commentary on the text of the laws; a translation of the medieval Hebrew of Maimonides may also reveal inductive as well as abductive commentary, in the stratified structure. It would be difficult to provide Aristotelian commentary without retaining its epistemology; there may be transcendent rationes for meta-rationes. The reconciliation of the Jewish Code and the works of Aristotle might also be a precedent for the comparative study of laws of different countries, including a study of their logical structure differences as well as their substantive differences, for the purposes of reconciling the differences as common international law, or as the basis for conflict resolution.

Figure 1.2: Page from M. Maimonides, Mishneh Thorah (c.1180) - last paragraphs of Treatise 12 and first paragraph of Treatise 13, Book 12 (on frauds). Annotation of Jewish Code of Laws with Aristotelian and other commentary, D. Pizzighettone and A. Dayyan (eds), C. Adelkind for M.A. Giustiniani, Venice, 1550.

Where induction and abduction are located like glosses, by reference to the components of extended deduction, strands of annotative reasoning should be kept separate from the strands of extended deductive reasoning. Otherwise the sequence of reasoning may appear non-monotonic. Ontologies that are deemed by law-making authorities to apply to cases by necessity, should not be confused with abductive ontologies that play a different role in legal argument. Deemed deductive ontologies may be regarded as monotonic pragmatism. However, one deductive rule does not support or challenge the remainder of the rule system, although their abductive reasons may.

Legal epistemology finds the ontology of legal possibilities and its logical use structures for processing; then it provides the logical processing methods for applying the legal ontology to real world ontology. Epistemology provides the structures and processes for chaining through ontologies of legal possibilities.

Even a scheme of semantic retrieval is epistemological in the sense that it is a plan of means to retrieve specified or targeted information. In information science, ontology is used as a domain epistemology to acquire vocabulary with meaning mechanisms; an embellished Porphyry's tree (Horrock, 2005), which is an epistemological structure that was devised by Porphyry (c.232-304) to represent Aristotle's ontology of substance, shows the ontological categorisation that is useful in information science. Horrock's Porphyry tree is shown in Figure 1.3; Porphyry's tree is a knowledge structure that founds predicate logic. It explains the syllogism referred to by Waller (1995, p.170):

All men are mortal
Socrates is a man
Therefore Socrates is mortal
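A Porphyry-style tree and the Socrates syllogism can be sketched as a chain of genus links climbed by a membership test; the tree fragment below is the familiar textbook simplification, not Horrock's full diagram:

```python
# Simplified genus chain: each node points to the category above it.
PARENT = {"Socrates": "man", "man": "animal", "animal": "mortal"}

def is_a(individual, category):
    """Climb the tree from an individual towards ever wider categories;
    the syllogism's conclusion follows from reaching the category named
    in the Major premise."""
    node = individual
    while node in PARENT:
        node = PARENT[node]
        if node == category:
            return True
    return False

# All men are mortal; Socrates is a man; therefore Socrates is mortal.
conclusion = is_a("Socrates", "mortal")
```

The upward walk makes visible why the tree is said to found predicate reasoning: subsumption in the hierarchy is exactly what the Major premise asserts.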

Figure 1.3: Horrock's Porphyry tree (2005). See - http://www.epsg.org.uk/pub/needham2005/

The conclusion takes the reasoning back to the Major premise so that the syllogism is, in this sense, cyclical. Circularity provides identification of something as a sequence of presuppositions that may invalidate the reasoning, due to its self-referential nature; in an intelligent computer program, unbreakable recursion prevents a conclusion from being reached. This circularity and its implications for the validity of hierarchical reasoning are illustrated by Korzybski (1933, 1941), the founder of general semantics, who posed non-Aristotelian semantic systems. His anti-Aristotelian model of reasoning is shown in Figures 1.4 and 1.5. Korzybski constructed his model of reasoning as an object which he used in teaching; a photograph of the object is shown in Figure 1.4. The model compares human reasoning, which has many levels of abstraction, vertical and horizontal, with the understanding of an animal such as a dog. A Structural Differential in the photograph indicates the comparison of human reasoning to a dog's comprehension. A similar comparison could be made between computer and human reasoning: legal domain epistemology might be compared to computational legal epistemologies, program epistemology, programming epistemology, and application epistemology. The Korzybski model in Figure 1.4 is explained as follows:

For the event we have a parabola in relief (E), broken off to indicate its limitless extension. The disk (Oh) symbolises the human object; the disk (Oa) represents the animal object. The label (L) represents the higher abstraction called a name (with its meaning given by a definition). The lines (An) in the relief diagram are hanging strings which are tied to pegs. They indicate the process of abstracting. The free hanging strings (Bn) indicate the most important characteristics left out, neglected, or forgotten in the abstracting. The Structural Differentials are provided with a number of separate Labels attached to pegs. These are hung, one to the other, in a series, and the last one may be attached by a long peg to the event, to indicate that the characteristics of the event represent the highest abstractions we have produced... (Korzybski, p.399)

Figure 1.5 is also explained:

The diagram is used in two distinct ways. One is by showing the abstracting from the event to the object, and the applying of a name to the object. The other is by illustrating the level of statements which can be made about statements. If we have different objects, and label them with different names, say, A1 A2 A3 ... An, we still have no proposition. To make a proposition, we have to accept some undefined relational term, by which we relate one object to the other. The use of this diagram to illustrate the levels or orders of statements implies that we have selected some metaphysics as expressed in our undefined relational terms. We should be fully aware of the difference between these two uses of the one diagram for the structural illustration of the two aspects of one process. (Korzybski, p.397)

In regard to the semantic web, the term ontology is used as the equivalent of a knowledge base. The W3C (World Wide Web Consortium) deals with standards related to the web. The consortium has recommended the Web Ontology Language, OWL, as its official standard. OWL DL is based on description logics, which draw on taxonomic structures like Porphyry's tree, first order semantics, and other reasoning algorithms. Use of an ontology requires epistemology; the use structure of ontology is required for selection of appropriate logic. OWL has a simple but expanding epistemology.

As a solution to the epistemological shortcomings of rule-based and case-based systems, Valente (1995) added ontology to the repertoire of legal knowledge engineering methodology. He recognised that legal ontologies could be extracted from black letter law and modelled in various ways as functional ontologies for legal expert systems. His modelling of extracted ontologies was to be in accordance with models of legal practice ontologies that focussed on tasks, goals and methods. However, he did not consider that such modelling and models were, ipso facto, epistemological; nor did he consider the requirement that the functional ontology be in accordance with sound legal epistemology that had to be found in the legal practice ontologies. Sound legal epistemology plays a role in determining the ontology of legal possibilities for logical processing.

Figure 1.4: Photograph of model of reasoning of Korzybski, A., Science and Sanity: An Introduction to Non-Aristotelian Systems and General Semantics, The International Non-Aristotelian Library Publishing Company, Lakeville, Connecticut, 2nd ed. 1941, originally published in 1933, p.398.

Figure 1.5: Model of reasoning of Korzybski, A., Science and Sanity: An Introduction to Non-Aristotelian Systems and General Semantics, The International Non-Aristotelian Library Publishing Company, Lakeville, Connecticut, 2nd ed. 1941, originally published in 1933, p.396.

It was suggested by Valente also that the modelling of functional ontologies would remedy the epistemological shortcomings of earlier legal knowledge engineering to be found in logic systems. Inevitable ontological 'commitments' embedded in logic formalisms were identified by Valente, but he did not go on to find the ontology of legal possibilities embedded in, or implied by, the express conditional propositions of law:

With regard to their role as a representation tool for legal knowledge, the basic problem is that most of the proposed formalisms (which means basically deontic logics) fail to keep track of the epistemological aspects they necessarily involve, i.e. of the (inevitable) ontological commitments embedded in the formalism. (Valente, 1995, p.17.)

However, Valente did acknowledge that functional ontologies modelled to accord with tasks, goals and methods would require verification by a legal expert; how a legal expert who cannot impart expert knowledge for acquisition by a knowledge engineer might provide verification of the validity of a program is not clear. In his criticism of Valente's work, Aikenhead (1996) referred to the oft-quoted point made by Susskind (1987, p.20) in regard to legal expert systems:

It is beyond argument, however, that all expert systems must conform to some jurisprudential theory because all expert systems in law necessarily make assumptions about the nature of law and legal reasoning.

Niblett (1988, p.33) took issue with Susskind on this point, with the criticism that Susskind placed greater significance on jurisprudence in the design of legal expert systems than it deserved. This thesis maintains that legal knowledge engineering requires the development of its own Jurisprudence of Legal Knowledge Engineering, and should not be limited to jurisprudential works that precede the emergence of the technology; accordingly, the thesis continues the development of Technological Jurisprudence (Gray, 1997, p.171). Susskind built the bridge from pre-computer jurisprudence to Technological Jurisprudence.

Nothing in pre-computer jurisprudence posed or addressed legal ontologies as such, although there were jurisprudential studies of what law is and what legal method and legal logic are. It might be thought that the rule of law could not be replaced by the rule of functional legal ontologies; people could not be gaoled, their property seized, or behaviour restricted, on the basis of functional legal ontologies. Laws are recognised warnings of law enforcement; ontologies are not, even though the familiar municipal no dogs icon, road speed limit icons and similar signs might qualify as such. Known legal icons occur as legal ontologies when their meaning is understood. A paradigm shift in social organisation from rules to icon ontologies would be difficult to impose, especially where complex rules could not be captured in a simple icon the meaning of which was commonly understood.

Nevertheless, in some areas of legal practice, such as the formulation of terms in a building contract or terms in a cohabitation contract, for minimisation of risk of conflict in the transaction or relationship, the legal practitioner ontologies of conflict prevention that avoid legal possibilities of litigation, may be very useful; a building ontology and a cohabitation ontology can be schematized for effective selection of a set of terms. It is established legal practice to cut and paste terms from precedent documents that are based on such conflict prevention ontologies.

Even in the management of evidence, constructed ontologies shaped by the requirements of rules, may be of assistance. Wigmore (1913, 1931, 1937) developed a taxonomy of witness reliability and evidential value as a judicial aid for deriving findings of facts; this taxonomy, which is epistemological, rested on an ontology of witness and evidence types. It was schematized graphically as deductive premises to suit the deductive structure of rules of law.

More significantly, for the purposes of this thesis, Valente's work filled an important gap in legal knowledge engineering methodology that was not appreciated by Aikenhead. The extraction of a legal ontology and its remodelling explains the process of formalising rules of law as Major deductive premises; it is the process of prior analytics. Moreover, the reuse of ontology posed by Valente is required if a deductive antecedent or consequent in a rule of law is varied in accordance with black letter law, for the sake of logical completeness. Valente was not an experienced legal practitioner; he could see no further than the epistemological remodelling of legal ontologies that legal practitioners must do to account for possible case variations of antecedents and consequents in rules of law, but he illuminates precisely a step in the reasoning of legal practitioners, not before exposed.

Inevitably, legal ontologies are reused in the construction of the combinatorial explosion of possible cases implicit in a rule of law. Systematic reuse in this construction indicates simple heuristics for processing express black letter law, so that an express codification of the entire legal and non-legal possibilities is not necessary in a program epistemology.

Legal epistemology has various algorithms, each of which has its own logic and its own heuristics of jurisprudential system control. Legal expertise is a system of intelligence with sub-systems for certain differentiated purposes; although the forms of logic, deduction, induction and abduction are used in the legal domain, they are used specifically in a jurisprudential system of logic. It is the task of this thesis to determine the nature of that system, as a matter of sound epistemology, at each stage in the specific meta-epistemological method, so that it is feasible to construct large scale legal expert systems.

The prior analytics of the ontology of legal possibilities is a prerequisite for determining the deductive, inductive and abductive premises to be processed as legal data by the appropriate algorithms. The logical purposes, the use structures, not just the substantive purposes, must be considered in determining posited legal ontologies of authorised law-makers; authorities provide the purposes and structures which are epistemological, when they lay down the law. Legal ontologies may only be found in their identified jurisprudential posits, just as data can only be found at its identified computer address or by its data structure.

1.2 A SPECIFIC META-EPISTEMOLOGICAL METHOD

1.2.1 Five procedural steps and five epistemological stages

The ontology of legal possibilities, with its integral logic structures for processes of legal epistemology, provides a deep model of legal expertise for legal knowledge engineering and for particularisation of the five steps of knowledge engineering set out in Hayes-Roth, Waterman and Lenat (1983) and adopted by Capper and Susskind (1988):

1. Acquisition of the knowledge
2. Representation of the knowledge
3. Design of the program
4. Construction of the program
5. Testing the program

There are also five stages in the specific meta-epistemological method which are similar in their sequence to the five steps of knowledge engineering:

1. Domain epistemology;
2. Computational domain epistemology;
3. Shell epistemology;
4. Programming epistemology; and
5. Application epistemology.

The specific meta-epistemological method provides a particular focus of epistemology, and continuity of this, for the tasks of the knowledge engineering method. The deep model of legal domain epistemology provides a methodology for Step 1, acquiring legal expertise for legal knowledge engineering; the deep model itself is an acquisition of generic legal expertise, as well as providing for the acquisition of substantive knowledge of law for an application.

Thus, the deep model can clear Feigenbaum's bottleneck (Feigenbaum, 1981, p.226; Gillies, 1996, pp.25-31). Feigenbaum was one of the authors of the first expert system, DENDRAL, a chemical analysis system, which was produced in 1965 at Stanford University where John McCarthy had established an artificial intelligence group. John McCarthy and Steve Russell developed LISP (LISt Processor) in 1958, as a computer language which processed systems of data lists that could be lists of logical premises, lists of rules, or lists of knowledge assertions. List systems are epistemological phenomena; the knowledge structures of lists are processed by the list system heuristics. DENDRAL was the first rule-based expert system and it was produced by Edward Feigenbaum, Bruce Buchanan and Joshua Lederberg (Buchanan, Sutherland and Feigenbaum, 1969; Lindsay, Buchanan, Feigenbaum and Lederberg, 1980). Following this work, Buchanan (Buchanan and Headrick, 1970, p.45) suggested that a rule-based system could be suitable for the legal domain because legal problems can be broken down into sub-problems, each of which requires a series of decisions governed by decision rules. Feigenbaum used the metaphor of the bottleneck to identify the difficulty of acquisition of expertise for automation; the difficulty applies in the legal domain, due to legal logic practices not being clear, even to the lawyers who use them. Gillies (1996, p.29), who explains the problem, cites the view given by Quinlan:

Part of the bottleneck is perhaps due to the fact that the expert is called upon to perform tasks that he does not ordinarily do, such as setting down a comprehensive roadmap of some subject. (Quinlan, 1979, p.168)

Newell (1982) also considered that the knowledge level of expert systems required closer investigation if useful expert systems were to be constructed. The expert systems literature became increasingly critical of the epistemological shortcomings of the technology. This thesis seeks to redress Feigenbaum's bottleneck, through the specification of legal epistemology in a computational form for automation, the design of an expert-friendly, transparent shell, and production of a substantive application of the shell.

The deep model is as deep as required for the transformation from real to artificial legal intelligence. The specific meta-epistemological method proposed by this thesis distinguishes several stages in the epistemological transformation required to produce a legal expert system from an identified domain epistemology; in each of the epistemological stages that correspond to the five steps of knowledge engineering, different epistemological constraints are applicable. The deep domain model has jurisprudential constraints and these must be transposed to computational constraints that have knowledge structures and algorithms for their processing; the constraints of computation permit a program design, and then the construction of the program according to the epistemological constraints of some programming language(s). Substantive law constraints required by an application must fit the constructed shell program. As the deep model passes through the stages of epistemological transformation in the course of the specific meta-epistemological method, the ontology of legal possibilities and its logical structures and division of premises are first identified in Step 1/Stage 1, then represented visually in Step 2/Stage 2, namely in the knowledge representation of the epistemology of 3d legal logic.

On the basis of the Step 2/Stage 2 representation, legal possibilities are located in their integral logic structures of extended deduction, inductive spectra and abductive strata, as an indication of the specific knowledge representation that sound legal epistemology requires; then, for the purposes of the eGanges shell design, in Step 3/Stage 3, further legal domain epistemology is drawn on to embed the deep domain model in an appropriate interface and communication system. The logical processes of legal epistemology are implemented in the input-output processing that is based on the lawyer-client interaction. The cognition, justice and communication system of the legal domain are also part of its epistemology.

Following the shell program design is the Step 4/Stage 4 programming of the shell, in accordance with the programming epistemology of the web language, Java, which is not explained in this thesis. The shell is tested in Step 5/Stage 5, with the partial construction of a large scale expert system in the field of the Vienna Convention. The substantive epistemology of this Convention law is shown to fit the generic epistemology of the shell. Other testing is not devised, due to the limits of the thesis.

When applied to the legal domain, the five stages of the specific meta-epistemological method demarcate a progressive development of a legal expert system that involves the development of a shell. A shell has some advantages: it is reusable, provides the familiarity of standardisation, and facilitates maintenance as law changes. If a shell is to be produced in a legal knowledge engineering project, providing epistemological structures and processes common to various fields of substantive law, a transformation is required from each stage of the specific meta-epistemological method to the next, as new constraints in each stage reshape the epistemology.

If there is to be no shell, a particular substantive program is designed in the third stage of the specific meta-epistemological method, rather than a shell program which is generic to various substantive fields of law; application epistemology is a consideration in this particular substantive design, rather than constituting a fifth stage following the shell programming. The specific meta-epistemological method provides a framework for the transformation required to meet changing epistemological constraints where a shell is to be developed; it might be adapted where no shell is to be developed. Testing may be applied either to a specific expert system or to an application of a shell.

The specific meta-epistemological method addresses a continuity in the process of transforming acquired legal expertise into a legal expert system. This continuity and transformation process has not before been adequately dealt with in legal knowledge engineering, as the available tools and techniques of the field of artificial intelligence have simply been followed; an adequate acquisition of legal expertise has not provided the foundation for a sound epistemological transformation. This thesis sets out the relevant domain epistemology as an adequate acquisition of generic legal expertise, and demonstrates the continuity in its sound transformation to a legal expert system, namely the Vienna Convention application of eGanges. A return to the legal domain epistemology is required to acquire knowledge of the Vienna Convention and carry out prior analytics to transform its rules to the eGanges program epistemology.

The system of the specific meta-epistemological method is illustrated in Figure 1.6. The methodology itself is a thinking system, as well as a procedural framework. The work of Whitehead (1929) explored the systems methodology required for determining organic systems; this was a precursor to the work of Bertalanffy (1968) which established systems science. Systems that are operational require epistemological methodology that is operational. The thinking system of the methodology in Figure 1.6 begins with the domain epistemology, where it enters a process of investigation. Legal experts use an epistemology that is partly explicit and partly implicit to determine how a client’s case is categorised by reference to the body of law; sometimes the implicit part of their epistemology seems mystical or unclarified. The client case categorisation process amounts to legal argument(s), and is the basis of many legal services. To automate this domain epistemology, it must initially be transformed substantially into a generic computational legal epistemology that is precise in its system of knowledge structures and the processing of such structures. Once the investigation produces realisation of the knowledge structures and processing that are generic to the many areas of substantive law, there is the basis for designing a shell that can be used for small and large scale applications in various fields of law; the design commences a period of instantiation where preceding realisations are implemented. This transformation from investigation through realisation to instantiation significantly clarifies mystical aspects of legal epistemology.

Figure 1.6: Gray's Specific Meta-epistemological Method

The computational epistemology that is a suitable basis for a shell, must be further adapted as a program design, with an interface through which the program communicates with the user. If the computational epistemology does not include the whole of the expert domain epistemology, then further domain epistemology may be accounted for in the shell epistemology or application epistemology. This requires a process of retroduction (Peirce, 1931, p.28) or recursion, i.e. going back for more information before proceeding, and then proceeding on the basis of the expanded information. The epistemology of the shell design reveals the missing information about the interface design; there is a realisation that further investigation of the legal domain is required to determine the communication system between lawyer and client, and related matters of cognition and justice, before the instantiation can proceed.

Once there is a complete shell epistemology, programming must be carried out to implement the design as an actual computer program; thus there is a further transformation of the program epistemology of the shell to a programming epistemology whereby the shell is actualised. A programmer is constrained by the one or more computer languages that are selected and used to create the specified shell; the programming epistemology used in the construction of eGanges is object-oriented programming.

Finally, application epistemology must be configured to the computational and shell epistemologies, and be given effect by the programming. Application epistemology arises from the socio-legal ontology of a particular substantive field of law. For instance, the data that is the Vienna Convention contains ontological concepts such as the sphere of the Convention, concluded contract, and remedies; these are legal concepts that have a social reality, here termed socio-legal ontologies. The epistemological structures in substantive ontologies place their components in an order relative to each other, whereby they may be processed to ascertain how remedies or solutions are obtained or not available in any particular case. The social epistemologies of law-makers, especially legislators, who expect certain social outcomes from their socio-legal ontologies, attain their outcomes through the structures and processes of legal epistemologies that are based on logic. Judicial interpretation gives effect to the social purposes of legislation.

The generic shell must be epistemologically sound for the configuration of its substantive application epistemologies. The foundation of domain epistemology and its transformations through the specific meta-epistemological stages can ensure this epistemological soundness. The application epistemology of the Vienna Convention matches the shell epistemology; the socio-legal ontologies of the Convention are placed in an order relative to each other, whereby they may be processed to ascertain how remedies or solutions are obtained or not available in any particular case; thereby, the expected social outcomes of the Convention are instantiated according to sound legal epistemology.

Enhancements of legal domain epistemology may be incorporated at any of the five stages of the specific meta-epistemological method, as domain evolution. The legal system is operational homeostatically, as a stable, repetitive system, and heterostatically, as an evolving, creative system. The technological precision of legal knowledge engineering may dispel or isolate fuzziness and uncertainty, as a matter of legal knowledge engineering jurisprudence or a legal science; greater precision can be given to juristic communication and justice itself. The prototypes posed in the thesis by way of demonstration of the specific meta-epistemological method, contain clarification and enhancements which illustrate this Jurisprudence of Legal Knowledge Engineering. Also, the prototypes further develop the science of legal choice that was established in the candidate’s Master’s dissertation (Gray, 1990, 1997). The specific meta-epistemological method is a framework for systematic specification and development of legal choice in large scale legal expert systems.

1.2.2 Deep model of legal domain epistemology

Knowledge processing, ipso facto, involves epistemology; accordingly, it advances information retrieval to epistemological processing. Like data retrieval, epistemological processing requires data in knowledge structures and retrieval processes that effect algorithms of knowledge processing. The ontology of legal possibilities has its integral logic structures for processes of legal epistemology; its detailing in a generic representation, a shell design, and a substantive application, demonstrates continuity of a deep and precise model of legal expertise at each stage of transformation in legal knowledge engineering.

The field of legal knowledge engineering has made progress in developing its deep model of integral ontologies and epistemologies, as knowledge structures and their knowledge processing, but has not yet satisfied the legal profession of its soundness and efficiency. It falls short in the areas which this thesis addresses, particularly in devising a transparent, efficient, expert-friendly interface for construction and consultation, and the heuristics that, firstly, minimise the specification of the combinatorial explosion of legal possibilities implicit in the rules of law, and, secondly, distinguish in accessible locations the non-monotonicity of their inductive and abductive additions. The problem of non-monotonic sequences of reasoning is resolved in the deep legal domain model of this thesis by distinguishing and treating severally the premises that are used in each of the three forms of logic: deduction, induction and abduction. In the legal domain, each unit of legal information is laid down authoritatively and carefully with its specific logical attributes; it must be learned, treated, and applied in that way. Justice engineering in a complex social organisation is a fine art, and must remain so in its automation.

Given the integration of ontology and epistemology both in law-making and in computation, this analogy may be assumed in proceeding through the transformations of the deep legal domain model of this thesis to the development of large scale legal expert systems. The integration of ontology and epistemology as the basis of the deep legal domain model is to be transformed to the integration of ontology and epistemology in the large scale legal expert system software. In the deep legal domain model, algorithms direct the processing of detailed data in knowledge posits; these aspects of the deep legal domain model can be used to derive program instructions from algorithm specifications. In turn, algorithm specifications require data structures drawn from the knowledge posits of the deep legal domain model; for the software, knowledge structures are required as data structures or program posits on which processing algorithms work. Advances that have been made in the field of legal knowledge engineering may be considered as advances in the determination of appropriate data structures, as legal knowledge structures, and advances in the determination of algorithms to process these knowledge structures. Many of these advances have occurred through use of artificial intelligence tools and techniques.

At the outset of legal knowledge engineering technology after the Second World War, the existence of a deep model of computational legal expertise was grasped by Allen, Caldwell, Meldman and Stamper, and investigated as such by McCarty, Smith, and Deedman. Many inroads into the knowledge structures of legal knowledge and their processing were made, investigating semantic networks in law, frame systems, a computer language for law, rule-base systems, case-base systems, neural network processing of law, ontological modelling of law and legal tasks, goals and methods, and communication between lawyers and their clients. The advances of these investigations will be considered briefly in relation to the work of this thesis under the following headings:

· Lists, frames and rule-base systems
· Semantic nets and command language
· Hybrid rule-base and case-base systems

Neural networks are concerned with simulating reinforced human learning, and do not purport to be epistemological. Ontological modelling has already been considered in the development of the ontology of legal possibilities.

● Lists, frames and rule-base systems

Early searches for a deep model of human intelligence, in the field of artificial intelligence, began with a deep model of logic that used lists of data as knowledge structures, and pointers as processing instructions. During the 1950s, Allen Newell and Herbert Simon developed IPL (Information Processing Language), the first list processing language; lists of data, which might be lists of antecedents or rules, lists of inductive instances or existential statements, or lists of argument premises, could be manipulated in relation to other lists of data, such as lists of Minor premises, or lists of conclusions, in accordance with processing algorithms that incorporated pointers. At the end of each item in the list, a device pointed to the next item for processing; pointers could be varied and might point to an item in another list. In 1956, Newell and Simon used IPL to produce the first intelligent program, Logic Theorist. The discovery of a new, improved proof of a logic theorem by Logic Theorist was due to the new heuristics or procedures of logic inferencing which the program design incorporated; this deep model devised by Newell and Simon made possible the new intelligence of the program. Knowledge engineering may improve human intelligence.
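The list-and-pointer mechanism described above can be sketched in modern terms. This is an illustrative reconstruction, not IPL itself; the `Cell` class and the sample premises are invented for the example:

```python
class Cell:
    """A list cell: a data item plus a pointer to the next cell,
    which may belong to a different list."""
    def __init__(self, item, next_cell=None):
        self.item = item
        self.next = next_cell

def traverse(cell):
    """Follow pointers from cell to cell, collecting items in order."""
    items = []
    while cell is not None:
        items.append(cell.item)
        cell = cell.next
    return items

# A list of antecedents whose final pointer is redirected into
# a list of conclusions, as IPL-style variable pointers allowed.
conclusion = Cell("then: contract is valid")
antecedents = Cell("if: offer made", Cell("if: offer accepted"))
antecedents.next.next = conclusion  # pointer varied to point into another list

print(traverse(antecedents))
```

The point of the sketch is that processing order lives in the pointers, not in the data items, so varying a single pointer redirects an entire chain of reasoning.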

Next, Newell and Simon, with programmer John C. Shaw, produced GPS (General Problem Solver), which searched for a problem solution by a procedure known as 'depth-first'. The problem was seen as hierarchical; it could be broken down into sub-parts, so that a search deeper and deeper into the sub-parts could proceed systematically until a solution was reached. Deep models of the problem could employ depth-first search. Subsequently, best-first search procedures were developed with breadth-first options. Large scale legal expert systems have very deep problems that require very deep models and variable search directions.
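Depth-first decomposition of a hierarchical problem into sub-parts can be sketched as follows; the problem tree and goal test here are invented for illustration, not drawn from GPS:

```python
def depth_first_search(node, is_solution, children):
    """Search deeper and deeper into sub-parts until a solution is reached."""
    if is_solution(node):
        return node
    for child in children.get(node, []):
        found = depth_first_search(child, is_solution, children)
        if found is not None:
            return found
    return None

# Hypothetical hierarchy: a problem broken down into sub-parts.
tree = {
    "problem": ["sub-part A", "sub-part B"],
    "sub-part A": ["A1", "A2"],
    "sub-part B": ["B1"],
}
result = depth_first_search("problem", lambda n: n == "A2", tree)
print(result)  # → A2
```

A breadth-first variant would instead examine all sub-parts at one level before descending, which is the trade-off noted above for very deep problem models.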

McCarty's Taxman Project began by using a list processing language, Micro-PLANNER, and then used a frame language, AIMDS. Frames were invented by Minsky (1974, 1981) to partition lists so that they could be processed in relation to each other, and so that, within a list, there might be sub-partitions to assist representation and internal processing of the list. In a schema of frames, different frames may have common components. Frames may be related by virtue of common components or different components. Relationships between frames may take the form of a tree.

Following the work on his early programs, Taxman I (1972-7) and Taxman II (1978-85), McCarty, who is regarded by Susskind (1989, p.29) as the Father of AI and law, concluded that the most critical task for the development of intelligent legal information systems was the construction of deep conceptual models of the relevant legal domain. For thirteen years McCarty attempted to create a legal expert system in the field of corporate tax law, using legislation and case law. He recognised the epistemological limitations of the computer languages he relied upon, and posed, as a deep model solution, the development of a computer language suited to the legal domain, LLD (Language for Legal Discourse) (McCarty, 1984, 1989, 1990, 1991), as syntax that surfaced rules.

In the development of LLD, McCarty posed generic surface categories reminiscent of the categories that were posed by Aristotle for converting ontologies into knowledge structures that could be subject to logic processing. McCarty (1991, pp.186-7) saw common sense categories such as 'count terms, mass terms, states, events, actions, and modalities such as permissions and obligation over actions', as appropriate knowledge structures for processing significant semantic relationships in law by syntactic transformations; he also added purpose, intention, knowledge and belief. In his Categories in the Organon (Hutchins ed., 1952, p.5), Aristotle chose substance, quality, quantity, relation, place, time, position, state, action and affection, as the categories which are not by themselves composite, but, when combined, may be true or false assertions. McCarty realised the potential of LLD categories to check coherence or consistency in the rules of law (McCarty, 1997).

A deep model of legal epistemology must take account of the nature of legal language and its representation of legal ontologies, for the purposes of establishing generic deep related categories, as knowledge structures with attendant processing, and also to establish whether or not legal language is predisposed to logic (cf. Wittgenstein, 1958). Natural language is suited to knowledge hierarchies that can represent relationships between the semantic components of the hierarchy, as shown in the Porphyry tree in Figure 1.3. A sequence of relationships permits a logical process to connect the first and the last in the sequence. As allowed by Korzybski, pointers may establish identified relationships between categories in the tree, such as Socrates is not a beast.

A vocabulary of legal terms was developed from the use of Latin and Norman French in the legal system during the Middle Ages, prior to the development of the orthography of the English language (Gray, 1977, pp.139-40); during the reign of Edward III (1327-1377), English replaced Norman-French as the spoken language of the legal system, and legislation in 1730 and 1732 introduced English as the official written language of the legal system: 4 Geo. 2 c.26 and 6 Geo. 2 c.14. The adopted foreign terms, such as debt, contract, trespass, felony, and larceny, permitted discussion that presupposed a set of rules and details; this facilitated communication between legal experts. In legal knowledge engineering these concepts are regarded as open-textured, i.e. subject to further particularisation, as illustrated by Popple's tree in Figure 1.1 (cf. Hart, 1961). Open-texture is to be distinguished from gaps in the law, where the rules of law do not provide for a new case situation; the rules are said to run out and there are no details unless inductive instances are available from the material facts of precedent cases.

Convenient open-textured legal terms introduced the requirement for extended deduction in legal reasoning. The hierarchy of Porphyry's tree might arrive at the evidence in a case via the increasingly finer definition of a legal abstraction. Some legal concepts, such as property, might be hierarchically divisible, with various interests, as a Porphyry tree; but mostly law is in the form of rules. Scientific philosophers from the time of Ockham (c.1285-1347), including Russell and Whitehead (1910), sought to remove unnecessary abstraction levels of knowledge and argument, but in legal epistemology the abstractions were also used to connect contrary disjunctive paths with a common solution, or to connect a spectrum of analogous inductive instances for the sake of consistency. To remove an open-textured concept in a rule of law required law-making authority, and would mean removing an antecedent which allows its finer definition to enter the rule; the finer definition is also removed unless there is some means of it taking the place of the antecedent.

The real problem is that rule logic is not predicate logic, although both use open-textured concepts; the Porphyry tree, ipso facto, defines its open-textured concepts through its hierarchy of categories, but this is not so for propositional logic or the hypothetical syllogisms which are the form of rule logic. An open-textured concept in a rule may rely, not on sub-categories, but on finer definition by another rule, and this may produce a hierarchy of rules, rather than a hierarchy of categories. Sometimes, there may be more than one definitional rule, creating alternative definitions which are logical disjunctions; this may complicate the formulation of the ontology of legal possibilities and its processing. To give effect to disjunctions, there must be pro tem processing until all alternatives are exhausted, before the conclusion can be settled. Pro tem processing may be required at various levels of a rule hierarchy where disjunctions occur. In a hierarchy of extended Major deductive premises, the failure of all alternatives may chain down the hierarchy to fail all the consequents and their overlapping antecedents, until a new set of alternatives allows the reasoning to proceed. It may be necessary to introduce choice points as knowledge structures, where there is no open-textured label in the express law for the common consequent of the disjunction. Where definitional or particularising rules are not used, there may instead be a gradient or matrix of deductive instances that are examples of the open-textured concept.
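The pro tem treatment of disjunctions described above can be sketched: a disjunctive antecedent is settled as favourable as soon as one alternative succeeds, but is settled as a failure only once every alternative is exhausted. The alternative definitions below are invented for illustration:

```python
def evaluate_disjunction(alternatives, established):
    """Pro tem processing of a disjunctive antecedent: succeed as soon
    as one alternative is established; fail only when all alternatives
    have been exhausted without success."""
    for alt in alternatives:
        if established.get(alt):
            return True
    return False

# Hypothetical definitional rules: two alternative definitions of an
# open-textured antecedent, "writing".
alternatives = ["signed document", "electronic record"]
print(evaluate_disjunction(alternatives,
                           {"signed document": False, "electronic record": True}))   # True
print(evaluate_disjunction(alternatives,
                           {"signed document": False, "electronic record": False}))  # False
```

In a rule hierarchy, the `False` case is what may chain down to fail the consequents that depend on the disjunction, until another set of alternatives allows the reasoning to proceed.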

In considering the ontology of legal possibilities, deformations of a rule to establish the combinatorial explosion of possible cases implied by the rule, for the sake of consistent application of the rules, may be complicated where there are open-textured concepts. An antecedent will not be fully deformed as its contradictory unless its inductive instances are also deformed correspondingly. Deformation of an antecedent is not required in processing a user's case if its inductive instances cover both opponents' alternative cases. McCarty's LLD potentially added inductive controls to the processing of rules, as surface syntax knowledge structures. However, their use was not fully explained.

Logically primitive natural language may have arisen because language was originally concerned with communication rather than logic; ad hoc, identifiable oral noises were agreed upon to communicate meaning. The communication might concern a matter of ontology only, or it might contain epistemological information, some of which might be logical. Prior to the evolution of writing, the rhyming of the bards was used to memorise large oral tracts such as laws. Initially, in the legal domain, emphasis was placed firstly on memory and then on literary skills, rather than language suited to logic representation. Legal logic developed covertly. Even when writing removed the problem of memory retrieval of law, the development of natural language was not turned to serve the hypothetical logic structures of law with their open-textured components. With the advent of writing, logic structures themselves still were evolving to suit an adversarial world; securing this evolution in a science of legal choice is a major concern of this thesis, as deductive, inductive and abductive disjunctions are the basis of freedom and informed decision-making. Logical consistency is also a requirement of principles of justice such as 'all people are equal before the law' and 'equity prevails over inconsistent common law', as well as serving to anticipate the range of possible cases of users of legal expert systems.

It may be that, historically, vested interests favoured the status quo of covert legal logic, which allowed law enforcement power to be exercised with greater protection when its logic was not transparent; covert reason was not readily available for critical review or disruption. The covert nature of legal logic was retained in various ways throughout the life cycle of the English legal system to its present stage: priestly procedural secrets, the foreign languages of the common law, the indeterminacy of legal theory, and the massive detail of casuistry (Gray, 1997).

Common natural language was adopted in the legal system to make the legal system more user-friendly and thus more effective. However, ordinary communication may have to evolve to better suit the adversarial world of computational legal epistemology. Legal logic knots in the natural language of black letter law may be as dense and tight as the psychological knots of psychotic people identified by Laing (1970); when they are unravelled, they might be seen to involve untenable information or untenable links between information. For the unknotting and streamlining of legal logic, the thesis poses, for the formalisation of rules of law as a system of extended deductive Major premises, a nested graphical representation with a variable tributary structure, like a River, the eGanges River; this epistemological streamlining is shown to be an expert-friendly knowledge structure with an integral flow for processing. A River was initially identified in quality control management as a fishbone. The River formalism may have a hierarchy of disjunctive structures, so that it is not a simple linear knowledge structure; it is not limited in complexity or extensiveness. In eGanges the tributary structure may be broken up and nested to suit the limits of the computer screen and human cognition. The knowledge structure of an eGanges River is suited to the propositional logic of rules of law, just as the Porphyry tree is a suitable knowledge structure for predicate logic.

Part of the work of this thesis shows how legal epistemology can be made transparent and user-friendly in large scale legal expert systems, through the use of epistemological ideographs such as eGanges River maps; in the Chinese culture, ideograms, which could be regarded as icons, make up the form of written language. The formalised logic of eGanges ideographs may introduce a new form of communication, especially where the logic maps are drawn as cognitive art in a memorable way; the logic maps themselves might be treated as icons for complex, efficient communication, and also as new legal ontology in Technological Jurisprudence. As ontological icons of geometric formal logic, legal language may evolve as a computational language for legal discourse, which was first envisaged as LLD by McCarty (1989).

Both natural language and algebraic formal logic are problematic for the construction of large scale legal expert systems, as legal experts who wish to automate their expertise have to separate extensive complex strands of reasoning, and then order them continuously for automated choice and application; it is difficult for experts to manage the coherence required in this extensiveness and complexity, and more difficult, even if they could do so, to set it out in programming code or pseudo code that would produce an epistemologically sound legal expert system.

LLD followed McCarty's conception of transformations and deformations as an internal requirement in legal expert system design (McCarty, 1977). The specific meta-epistemological method in this thesis poses epistemological transformations, externally, from domain to program, required for the construction of an epistemologically sound large scale legal expert system; the internal constancy and use of legal ontologies plays a key role in maintaining consistency through these external transformations. An ontology of legal possibilities, which comprehensively and systematically provides for deformations of express rules of law to process possible user's cases, requires computational logic to provide the epistemological configurations of domain ontologies for transformation to other epistemological configurations of software that may include the same domain ontologies.

Commercial shells for knowledge engineering became available during the 1970s and 1980s, but they were not epistemologically suited to the deep model of legal expertise. The earliest rule based expert system, DENDRAL, in the field of chemical analysis, led the way; it was followed by MYCIN, in the field of medicine. MYCIN was stripped of its medical rules to produce a generic shell, EMYCIN. These shells consisted of an inference engine which could process a formalised rule base during a user consultation. Some commercial shells were enhanced to include other modules, such as a data-base system and tree visualisation, to assist easy construction of an application.

Rule bases are lists of rules that are schematised syntactically, so that pointers occur at syntax structures. A syntax is a knowledge structure; for a rule base system, the formalism of a conditional proposition is used as the syntax structure for arranging pointers and instructions on using them. An inference engine uses a set of processing instructions that refer to syntax pointers. The processing instructions of an inference engine produce a sequence of steps as if reasoning had occurred; a reasoned conclusion is produced. The deep structure of legal epistemology, such as the conditional propositions of rules of law, can be used to design knowledge structures and their processing for a shell suited to the legal domain. Legal knowledge engineering is concerned with all stages of the process of automating legal expertise, from the determination of the deep structure of the domain epistemology to the final outcome of a legal expert system. It is concerned with data that is expert knowledge, and the expert processing of that data; knowledge and its processing are matters of epistemology. In an autopsy of legal expertise, there are three areas to be examined in particular, to determine the domain epistemology relevant to legal knowledge engineering: knowledge structures, legal logic and jurisprudential systems control. Kowalski's formula is a sound basis for determination of a deep model of legal expertise for legal knowledge engineering if the ontology of legal possibilities provides the knowledge structures which legal domain algorithms are to process.
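A minimal inference engine of the kind described, chaining over a formalised rule base, might look like this sketch; the rules are invented, and real shells such as EMYCIN were far richer:

```python
def forward_chain(rules, facts):
    """Repeatedly fire any rule whose antecedents are all established,
    adding its consequent to the facts, until nothing new follows."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for antecedents, consequent in rules:
            if consequent not in facts and all(a in facts for a in antecedents):
                facts.add(consequent)
                changed = True
    return facts

# Overlapping rules: the consequent of one is an antecedent of another,
# which is what permits chaining through the rule base.
rules = [
    (["offer", "acceptance"], "agreement"),
    (["agreement", "consideration"], "contract"),
]
facts = forward_chain(rules, ["offer", "acceptance", "consideration"])
print("contract" in facts)  # → True
```

The engine itself knows nothing of contract law; the sequence of steps "as if reasoning had occurred" comes entirely from the pointers between rule antecedents and consequents.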

The rules in applications of shells constitute the knowledge base to be processed by an inference engine. The syntax of the rules in the early shells took the form of Horn clauses, as knowledge structures that compressed the Major and Minor premises of modus ponens syllogisms of legal deduction. In a Horn clause, there is a conditional proposition, but the antecedent(s) of the Major premise are given value(s) that must be established for the conclusion to apply; the consequent of the Major premise, which is also given a value, is the conclusion. For example, the following Horn clause is taken from the knowledge base or rule base of CLIMS (Contract Law Information Management System), which was constructed in 1987 by the candidate and Carl Jackson as programmer (Gray, 1988), using the small public domain shell, ESIE (Expert System Inference Engine):

if offercom is ok
and expired is ok
and consideration is ok
and rejection is ok
and revocation is ok
and request is ok1
and reasonable is ok
then contract status is NO INVALIDITY FOUND

(Gray, 1997, p.316)

In the legal domain, rules of law are laid down as Major premises for deduction, and Minor premises are based on the material facts in a case to which the Major premise is applied. Rules in Horn clause form seek evidence of the Minor premise and assume the Major premise. Potentially, open-textured terms might be excluded; otherwise they may remain in the rule base as links in the chaining to an outcome. However, in a rule base system, if the expert system is to be epistemologically sound, Horn clauses must particularise severally all deformations of rules to process all possible user cases, i.e. the combinatorial explosion of the ontology of legal possibilities. This is not feasible as a legal knowledge engineering methodology for the construction of large scale systems.
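The way an inference engine evaluates a Horn clause of the CLIMS kind can be sketched as follows; the clause mirrors the one quoted from the CLIMS rule base, and the dictionary of user answers is hypothetical:

```python
# Antecedent values that must all be established, mirroring the
# CLIMS Horn clause quoted above (values kept verbatim, including "ok1").
clause = {
    "antecedents": {"offercom": "ok", "expired": "ok", "consideration": "ok",
                    "rejection": "ok", "revocation": "ok", "request": "ok1",
                    "reasonable": "ok"},
    "consequent": ("contract status", "NO INVALIDITY FOUND"),
}

def apply_horn_clause(clause, answers):
    """The conclusion applies only if every antecedent has its required value."""
    if all(answers.get(k) == v for k, v in clause["antecedents"].items()):
        return clause["consequent"]
    return None

# Hypothetical user answers establishing every antecedent.
answers = {"offercom": "ok", "expired": "ok", "consideration": "ok",
           "rejection": "ok", "revocation": "ok", "request": "ok1",
           "reasonable": "ok"}
print(apply_horn_clause(clause, answers))
```

The combinatorial problem noted above follows directly: to remain sound, a rule base of this form would need a separate clause for every deformation of every antecedent, which does not scale.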

It is the overlap of deductive rules in the legal domain that permits the chaining of an inference engine through the rule base of a legal expert system, as illustrated in the rule base of the first CLIMS Pilot (Gray, 1988, 1990 and 1997, pp.230-58). The small CLIMS rule base (Gray, 1997, pp.316-26) included the contradictories and uncertainties of the antecedents in the rules of contract law that were used, so that the system could process the combinatorial explosion of possible cases within the stated law. However, it was not feasible to scale up to a full contract law system.

The concern of this thesis is the knowledge structures and processes that amount to sound legal epistemology for large scale legal knowledge engineering; the rules of law as extended deductive Major premises are preferred to Horn clauses as the domain knowledge structures, with inherent deductive processing provided by the chaining, as user input of Minor premises satisfies the antecedents in the Major premises. In the prototype shell, eGanges, the Major premises of law, in their extended deductive structures, are represented as an interactive visualisation of nested River maps, separately from the Minor premises of the legal deduction, which are shown as feedback lists in adversarial windows for each opponent case. Heuristics are developed through the communication system between the program and the user, to take and process instructions as input on whether or not the antecedents in the Major premise are satisfied. At the same time, the user is advised, for each question that establishes an antecedent, of the three possible answer inputs, which may be varied to suit the natural language of the interrogation, and the effect each answer will have on reaching the Final consequent of the Major premises structure of extended deduction. Some users will seek the Final consequent, shown in the interactive River visualisation as the end of the tributary structure of extended deduction, and some will seek its contradictory, or an uncertain result that will effect a contradictory result, given the burdens of proof.

An application of eGanges requires the easy construction of the River map, which may be nested as far as required, given the extent and complexity of the application law, and may be navigated freely; each stream in the tributary structure represents a formalised rule, with a node for each antecedent and an inference arrow following the last antecedent in the rule before the consequent. The inference arrow indicates the direction of flow of the River; where a consequent of one rule is the antecedent of another rule, there is a tributary confluence and a flow-on of the inferencing into the rule that is entered, according to the direction of its inference arrow. No hardcoding of a knowledge base is required. The River that is constructed by the application builder is the River that is consulted by the application user.
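The tributary structure described above can be sketched as a small data structure. This is an illustrative reconstruction, not eGanges source code; the class names and rule labels are invented for the example:

```python
# Hypothetical sketch of a River map as linked rule "streams" (not eGanges code).
from dataclasses import dataclass, field

@dataclass
class Rule:
    antecedents: list      # node labels, in stream order
    consequent: str        # reached via the inference arrow

@dataclass
class River:
    rules: list = field(default_factory=list)

    def confluences(self):
        """Pairs (upstream consequent, downstream consequent) where one rule's
        consequent is an antecedent of another rule: a tributary joining the flow."""
        pairs = []
        for up in self.rules:
            for down in self.rules:
                if up.consequent in down.antecedents:
                    pairs.append((up.consequent, down.consequent))
        return pairs

# Toy contract-law fragment (illustrative antecedents only)
river = River([
    Rule(["offer", "acceptance"], "agreement"),
    Rule(["agreement", "consideration", "intention"], "contract"),
])
print(river.confluences())   # [('agreement', 'contract')]
```

The single confluence found here is the point where the "agreement" stream flows into the "contract" stream, in the direction of its inference arrow.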

As input establishes the user's Minor premises during a consultation of the eGanges application, the antecedent nodes in the River are coloured to record the value of the input for reaching the Conclusion of the River; blue is favourable, red is unfavourable and uncertain is recorded as yellow, which is prima facie unfavourable. If there is an open-textured antecedent with associated inductive information to assist the selection of the Minor premise input, then this is made available as a gloss spectrum which shows the gradient of analogous or authoritatively iterated inductive instances. If there is abductive information available to explain or justify an antecedent or the rule it is in, then an abductive gloss is also available for the user, to assist an informed selection of Minor premises. An authority gloss is also available to show the source of the law, and links to the relevant black letter law in web databases, such as the AUSTLII databases, can be established. An eGanges application may provide a logic front end for AUSTLII.
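The three-valued colouring of input can be sketched as follows; the answer labels and the treatment of uncertainty as prima facie unfavourable follow the description above, but the function names are hypothetical:

```python
# Hypothetical sketch of eGanges-style answer colouring (names invented).
COLOUR = {"yes": "blue", "no": "red", "uncertain": "yellow"}

def node_colour(answer):
    """Map a user's answer for an antecedent node to its map colour."""
    return COLOUR[answer]

def favourable(answer):
    """Uncertain input is prima facie unfavourable, like a negative answer."""
    return answer == "yes"

print(node_colour("uncertain"), favourable("uncertain"))  # yellow False
```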

As a general rule in the legal domain, it is considered fair to advise a witness of the consequence of evidence before it is given, if that evidence may be particularly disadvantageous to the witness; this is especially so if the evidence incriminates the witness and might lead to conviction of the witness for a crime. It has been customary to provide a facility in legal expert systems to allow the user to ask why a question is being asked. In eGanges there is no need to ask, because the information is given with every question; answer labels advise the significance of the answers, and there may be notes in the Notes window to assist. Questions are asked in the context of the whole map of questions. Pragmatically, advice on the effects of an answer in a legal expert system may be useful to the user, as it minimises the need to try changes, and provides transparency in the application of law to the user's case. The advice on possible input in eGanges shows the reason for seeking the user input, in the full hierarchical context of the rules of law, as represented in the freely navigable River visualisation.

As client instructions, user input may be compared to Kant's concepts of the Hypothetical Imperative or the Categorical Imperative (Kant, 1788), in his early construction of the system of the human psyche and reasoning. Once input is given, the deductive syllogism is automated, and, to the extent possible, the extended deduction that follows is automated; when all necessary and sufficient conditions are satisfied, the final conclusion is produced as the Current result. An interim Current result may be obtained at any point in a consultation.

The dominance of the rule system in legal epistemology as extended deduction, and the subordination of single inductive instances to the rule system, make the necessity of extended deduction the dominant form of legal reasoning. The modus ponens syllogism, simply or in extended deduction sequences, attracts the least risk of errors in logic, and minimises the risk of fallacies, of the biases and corruption that are the concern of the Realist School of jurisprudence in the USA, and of what Bacon (1620) called, in the First Book of his Novum Organum, the idols that invalidate human reasoning and science. The automatic deductive application of rules to cases was called mechanical jurisprudence by Pound (1908).

The searches for deep structures in the legal domain by McCarty, Deedman and Smith were not founded on the algorithm formula of Kowalski (1979a), who assisted in the development and demonstration of the logic programming language PROLOG (PROgramming in LOGic); Kowalski and Sergot (1985) constructed a legal expert system, in the legislative field of the British Nationality Act, using the predicate calculus of PROLOG, which took the form of Horn clauses. PROLOG has since been extended to include the sort of surface syntax that was proposed by McCarty. For example, ANSPROLOG (PROgramming in LOGic with ANSwer sets) was developed to provide for non-monotonic logic programming (Baral, 2003).
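The Horn-clause style of rule used by Kowalski and Sergot can be illustrated outside PROLOG with a minimal forward-chaining loop; the nationality predicates below are invented for the sketch and do not reproduce the actual provisions of the Act:

```python
# Forward chaining over Horn clauses: each rule is (body of conditions, head).
# A head is asserted once every condition in its body is an established fact.
rules = [
    ({"born_in_uk", "parent_is_citizen"}, "citizen_by_birth"),
    ({"citizen_by_birth"}, "entitled_to_passport"),
]
facts = {"born_in_uk", "parent_is_citizen"}

changed = True
while changed:
    changed = False
    for body, head in rules:
        if body <= facts and head not in facts:
            facts.add(head)
            changed = True

print("entitled_to_passport" in facts)  # True
```

The second rule fires only after the first has added its head to the facts, which is the chaining through overlapping rules described above.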

Even before Kowalski's formula, the search for deep structure in legal logic for automation was undertaken by Allen (1957; Allen and Caldwell, 1963). Allen (1961) designed a logic game called WFF 'N Proof. He was the foremost proponent of domain specific legal logic, and was concerned to introduce Hohfeld's categories (Hohfeld, 1913) of jural relations into his deep generic model, rather than leave them to the substantive epistemology, or the semantic components of legal ontologies, which are the approaches taken in this thesis. By 1997, with Saxon, Allen had developed a computer language for the legal domain, based on Hohfeld's categories, including the correlative and opposite relationships between them (Allen and Saxon, 1986, 1991, 1993, 1994, 1995 and 1997; Allen, 1968, 1974, 1982, 1983, 1995, 1996).

Non-monotonic logic in legal reasoning could not be automated by the necessary reasoning of Horn clauses or other deductive forms of logic. When inductive and abductive components were interspersed in deductive argument, the priority of the components had to be specified; this was especially so for the abductive component, which might over-ride the inductive or deductive parts, as strong or weak argument in support of or opposing a rule. Specific modelling of non-monotonic sequences was required, comparable to a state machine; such modelling was not regarded as artificial intelligence, and was difficult to program and maintain owing to the volume of detailed coding required. Computation of non-monotonic modelling in the legal domain was explored (Prakken, 1993; Sartor, 1994; Gordon, 1994). Theoretical explanations had recourse to the concept of rhetoric used in early Greek legal practice; the term for lawyer was pragmaticus, and classical Greek rhetoric encompassed alternative realities and variable sequences of deductive, inductive and abductive components, to suit the purposes of the client of the pragmaticus. Rhetoric could be presented with oratorical skills of persuasion, as noted above by Waller.

Valente (1995, p. 17) observed that the use of logic in legal knowledge engineering was not epistemologically sound for large scale generic use:

With respect to the role of logics as a reasoning tool, it is doubtful whether any logical model of legal reasoning can be an appropriate one. All these formalisms propose a basic logical inferencing scheme such as a non-monotonic inferencing system which is to be used homogeneously. However, no matter how these inferencing systems may model correctly a small number of sentences, description of actual problem-solving requires a different emphasis and a different language, which is that of tasks, goals, and methods [Chandrasekaran et al., 1992]. One such inferencing engine may be good in solving some tasks, but not all of them. That is, these reasoning schemes are too general to be efficient or adequate in practice, and a legal knowledge-based system is likely to require a range of different engines in combination to use the knowledge available in the best possible way.

CLIMS Pilot No.2 (Gray, 1997, pp.258-9) was constructed with a system of inference engines. It was designed as a prototype expert system program, not a shell, by Tim Flannagan and the candidate in 1989, and programmed in PROLOG, as part of the first ESPRIT legal project, Foundations of Legal Reasoning, which was led by Flannagan's company, Machine Intelligence, in Cambridge, England. The rule base of the first CLIMS Pilot was rewritten as PROLOG code for CLIMS Pilot No.2. The system of inference engines could apply rules of contract law to the user's case in three different ways, depending on whether the user wanted the quick solution, namely the Final conclusion simpliciter, advice of any uncertainties, or the full strength of a case; reasons for the conclusion of any of the engines were also available. The user could select the required engine. A user's case could be reworked by changes of user input that would amount to deformations of the user case, requiring deformations of rules in the logic programming of the ontology of legal possibilities. The early version of PROLOG did not manage deformations requiring double negation, and it was part of the Project to investigate this problem; Flannagan had worked previously with the company Logica on the Alvey Diamond Project, which also addressed this problem (cf. Poole, 1988; Reiter, 1980).

Difficulty was found in programming a River, as distinct from a tree, because, subject to disjunction, the antecedents in all tributaries were prima facie necessary and sufficient conditions for reaching the Final consequent. Decision trees for programming require only one pathway through the branching. Extended legal deduction, with a hierarchy of disjunctions, consists of alternative, overlapping sets of necessary and sufficient conditions; algorithms for testing the tributaries of the River object are required as logic procedure. For large scale systems, such algorithms may be provided generically, no matter how extensive or complex the application River is. The computational epistemology of 3d legal logic is the basis for determining the jurisprudential system control for these algorithms.
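One generic way such tributary-testing algorithms can work is to treat each disjunctive alternative as a set of conditions that is jointly necessary and sufficient: the Final consequent is reached if any alternative set is fully established. A minimal sketch, with invented condition labels:

```python
# Sketch: with a hierarchy of disjunctions, the Final consequent is reached
# if every antecedent in at least one alternative set is established.
def final_reached(alternative_sets, established):
    return any(s <= established for s in alternative_sets)

# Two overlapping alternatives sharing conditions a and b (hypothetical labels)
alts = [{"a", "b", "c"}, {"a", "b", "d"}]
print(final_reached(alts, {"a", "b", "d"}))  # True
print(final_reached(alts, {"a", "c"}))       # False
```

The overlap between the alternative sets is what distinguishes the River from a decision tree, where only one pathway through the branching is tested.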

The duplication of the CLIMS Pilot No.2 prototype in another substantive area of law would be costly to construct and maintain, and would require PROLOG programming expertise. Systems of inference engines may be just as difficult to program as ad hoc modelling of legal logic for state machine systems that proceed from one prescribed information state to the next. The model of legal reasoning on which CLIMS Pilot No.2 was designed, prior to the candidate's discovery of the visualisation of 3d legal logic shortly afterwards, was too shallow to provide all the design solutions for a user-friendly, versatile shell. The design of a system of inference engines, or a system of algorithms, is only epistemologically sound if it accords with a deep model of legal expertise; the computational epistemology of 3d legal logic is part of that deep model, as it provides the holistic knowledge representation structure of legal logic for the management of the ontology of legal possibilities. Negation must be dealt with in accordance with these implied knowledge structures of the judicial method for handling combinatorial explosion, the deformations of the ontology of legal possibilities.

Formal logic, which communicates, algebraically, clear alternatives and requirements for consequents, is not expert-friendly for the legal domain, and not user-friendly as a trace. The unfriendly code of the first CLIMS expert system could be accessed during a consultation as a trace, to monitor the processing of the inference engine. When the CLIMS rule base was rewritten in the logic language PROLOG, as part of CLIMS No.2, a prototype system of inference engines, there was no trace available, but reasons were given for the conclusion reached in a consultation, in natural language that set out the extended deductive path for the user's case. Interestingly, at an ESPRIT Project seminar at St John's College, Cambridge University, in 1989, a member of the English legal profession who saw and compared the first and second CLIMS programs preferred the first ESIE CLIMS to the more sophisticated PROLOG version, because it offered trace transparency. The legal profession is not at liberty to assign its responsibilities to clients without a trace. Technological Jurisprudence must provide the user-friendliness of transparency in the interface of a legal expert system, both in the construction and in the consultation of an application; the transparency must be suitable for large scale systems.

In this thesis, it will be shown that legal epistemology has an idiosyncratic pattern of logic and system control meta-rules, in order to process its ontology of legal possibilities, and these may be given effect through an interface that conforms to the requirements of legal epistemology. This is the basis for determining a program design with a system of processing algorithms for the construction of a transparent, expert-friendly, legal knowledge engineering shell.

● Semantic nets and command language

List languages showed how semantic networks (Quillian, 1968), relational databases (Codd, 1970), frame-based systems (Minsky, 1981) and agenda systems (called blackboard systems), first produced by DARPA (Defence Advanced Research Projects Agency), might be constructed.

Formal logic programming per se did not raise concern for symbolic meaning; however, the semantic content of formal logic code could invalidate the logical assertions, as illustrated by Waller, above. After the General Semantics of Korzybski freed up the modelling of conceptual relationships for processible meaning, from Porphyry hierarchies to irregular graphical representations of the relationships between objects of meaning, Quillian (1968) developed the concept of semantic networks. An early approach to artificial intelligence was to model the human brain as neural networks (Minsky and Papert, 1972). Quillian departed from this approach by modelling the metaphysics of the mind through language use. His semantic networks were flowcharts of meaning; they were a graphical representation of the related components that constitute an object which might be a thing, matter, circumstance or situation. A semantic network consisted of nodes containing information linked to other nodes containing information; links were given relationship labels which might be 'is', 'has', 'holds', or something else. Processing of a semantic network proceeded from one node of information via the relationship link to another node of information; a user might interrogate the semantic network to find out the information in nodes and how these posits of information were related.
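A Quillian-style network of labelled links can be sketched very simply; the nodes and relationship labels below are invented for illustration:

```python
# Sketch of a Quillian-style semantic network: nodes joined by labelled links.
# Each (node, relation) pair points to another node of information.
net = {
    ("contract", "is"): "agreement",
    ("contract", "has"): "consideration",
    ("agreement", "has"): "offer",
}

def query(node, relation):
    """Follow a labelled link from one node to another, as a user
    interrogating the network would; None if no such link exists."""
    return net.get((node, relation))

print(query("contract", "is"))   # agreement
print(query("offer", "is"))      # None
```

Processing proceeds from one node of information, via the relationship link, to another, which is the traversal model described above.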

Semantic networks were used as legal knowledge structures, with integral processing links, in another early legal knowledge engineering project, the LEGOL Project. Beginning in 1972, the LEGOL Project (Stamper, Tagg, Mason, Cook and Marks, 1982; see also Cook, Hafner, McCarty, Meldman, Peterson, Sprowl, Sridharan, and Waterman, 1981) from the outset sought to develop a computer language for law, using semantic networks to represent rules of law (Stamper, 1980). Meldman (1975, 1977) also used semantic networks, as well as logic, for his expert system model.

The formalisation of the extended deductive order of overlapping rules may produce an expression of rules different to the black letter law expression; however, the meaning must be the same, given the logical interpretation of contradictories and uncertainties. Depending on the language of the black letter law, this may be easy or hard to achieve (cf. Gardner, 1987). Natural language is sometimes logically primitive, so that it may be ambiguous, and sometimes does not clearly communicate the choices available in a rule system. If the natural language of a black letter law suggests alternative logical interpretations, then they must be accounted for, as such, in the legal expert system. Detachment of antecedents in black letter rules as ontology permits a reconstruction of the rules as an extended deductive system with the same meaning as the black letter rules; this is an epistemological use of semantics and ontology.

Semantic networks produced irregular knowledge structures, depending upon the language of the rule. The linking of the semantic networks as rules for extended deduction, problems of open-texture, and the coding of the ontology of legal possibilities as consistent deformations, remained problematic. In the legal domain, a rule such as if a, b and d then c might be consistent with the rule if not a, b and d then not c, so that the duplication of b and d in rules with contradictory consequents had to be accounted for in the processing of the semantic nets that represented the two rules.
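The processing problem described above, where rules with contradictory consequents share the antecedents b and d, can be sketched by keying the conclusion on which of the contradictory antecedents is established; the encoding is an assumption for illustration:

```python
# Sketch: two rules sharing antecedents b, d but with contradictory
# consequents c / not-c, keyed on whether a or not-a is established.
rules = [
    ({"a", "b", "d"}, ("c", True)),       # if a, b and d then c
    ({"not_a", "b", "d"}, ("c", False)),  # if not a, b and d then not c
]

def conclude(facts):
    """Return (consequent, truth value) of the first rule whose
    antecedents are all established, or None if no rule fires."""
    for body, (consequent, value) in rules:
        if body <= facts:
            return consequent, value
    return None

print(conclude({"a", "b", "d"}))       # ('c', True)
print(conclude({"not_a", "b", "d"}))   # ('c', False)
```

The shared antecedents b and d are established only once, yet contribute to either contradictory consequent, which is the duplication the semantic-net processing had to account for.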

The potential invalidity of meaning in logical forms and logic processing might be cured by semantic networks, especially where inductive semantic networks link to open textured nodes in the rule semantic network, and justificatory semantic networks link as abductive adjuncts to the system of connected semantic networks. However, semantic networks also raised the problem of detailing when applied to the legal domain: what is the detail requirement of a deep model of legal expertise, and at what level of detail is there a workable common knowledge structure for common algorithmic processing?

The generic deep model of epistemology in the legal domain must certainly allow for the substantive detailing of legal information in knowledge structures, to the full extent required by legal epistemology, as eGanges does. Knowledge requires memory, and legal epistemology provides for knowledge storage, processing and retrieval to be systematic. Program epistemology may trim storage requirements to efficient levels, as it is not limited to logic; the eGanges shell epistemology minimises the requirement for detail specification, to ensure a shell and applications that require little memory. eGanges requires less than 250 KB, plus Java of about 16 MB. Program epistemology can also deal with the means of ensuring semantic validity in logical forms, and processing within its efficient data storage system. The freely navigable interactive visualisation of the extended deductive rule structure, the nested River system, in eGanges may be treated as a storage system for inductive material facts of cases and abductive glosses of the rules, as well as a procedure for categorisation of client cases, in order to determine legal outcomes and provide a basis with scope for new cases.

The distinction between an antecedent in a rule and a fact in a case that satisfies that antecedent is primarily the distinction between law and a material fact that establishes the antecedent. However, sometimes the material fact becomes an antecedent in a new rule, with open-texture; for instance, the term 'offer' in contract law is an antecedent, but evidence in terms that an 'offer' was made is acceptable also as fact. The facts of an 'offer' being made will nevertheless be considered to see if the antecedent is satisfied. According to the doctrine of precedent, it is the rule(s) identified in the judgment(s) of a case as the reason(s) for the order(s) made, namely the ratio(nes) decidendi, that are binding as rules of law, not the material facts of the case. The precedent is an authority for the deductive rule, how the rule applies in the inductive fact situation of the case, and the abductive reason for the rule; the material facts are the inductive instances of rule antecedents. As a deep model of legal epistemology, a computer language for law would have to effect this model through its language categories, as well as be reconciled with machine code. The knowledge structures of rules are separate from the knowledge structures of material facts; each is processed by distinct but related algorithms. The algorithms of extended deduction process rules that have extended deductive structures; induction algorithms process material facts to establish which antecedents in rules are established as Minor deductive premises. Users may select their appropriate material facts from an inductive list in gradient order of analogy, to activate the extended deductive algorithm.

During the 1960s and 1970s, computer languages higher than machine code, such as ANSI-COBOL (American National Standards Institute-Common Business Oriented Language), BASIC (Beginners All-purpose Symbolic Instruction Code) and PROLOG, were developed with some natural language to effect processing controls. The simplest controls were conceived in terms of commands such as go to, get, match, sort, list, run, break, cont, print and save; these were translated by an interpreter or compiler into appropriate machine code, made up of a predetermined pattern of 0s and 1s, such as 0001001110, that operated as processing instructions. Processing instructions depend on this sort of command, so that epistemological algorithms in the legal domain must be transformable into such commands.

In early computer languages that built on machine code, including Assembler languages that are one generation above machine code, and third generation languages, if ... then clauses, which are processing rules, are building-block branching instructions that perform by jumping from one instruction point to another. The fundamental rule paradigm of programming is comparable to the fundamental rule paradigm of the legal domain; in legal knowledge engineering, these fundamental paradigms have to be matched up in detail at the appropriate deep level. With the development of higher languages, algorithms could be implemented by a sequence of commands and branching instructions.

Although McCarty, beginning with rules as lists, and Allen, beginning with legal logic, had both posed a language for law as a deep domain model, and the LEGOL project had produced semantic networks of rules, Du Feu and Adler had already shown by 1975 that ANSI-COBOL, a third generation computer language suited to commercial domains, could be used to program an effective legal expert system (Du Feu, 1980). Their system, which provided welfare benefits advice, was so successful when it was made available to the public in Edinburgh for six weeks that it significantly increased the number of welfare claims; it was then closed down.

● Hybrid rule-base and case-base systems

Smith and Deedman (Deedman, 1987; Smith and Deedman, 1987) approached the development of a deep model for the legal domain through the reconciliation of rule-based and case-based reasoning. This reconciliation is a major problem in the legal domain, and an integrative structure or process has not yet been settled in jurisprudential theory. The problem arises because casuistry has been used to particularise and develop rule systems since case reports became popular and reliable, following the development of the printing press. Smith, like McCarty and Allen, also eventually sought a computer language for law solution.

Legal practice has evolved through four stages, and the dominance of rule systems and then casuistry belong to the last two stages respectively (Gray, 1997). This transition from the third stage to the fourth stage has not been fully rationalised. In the first stage, the ritual of spoken formulae, whereby oaths and oath-helpers could prove a case, was dominant until it was diminished with the abolition of the substantive significance of the formulaic oath; evidence is still given under an oath or ritual affirmation to tell the truth, and officers of the legal system, such as legal practitioners and judges, take an oath to mark the commencement of their appointment. The second stage was dominated by forms of action, in which a case was pleaded in terms of the applicable writ or written form of action; this is still required in a more flexible way in modern pleadings, which have some bearing on the use of rules in relation to cases.

It is through judicial judgments, and judicially approved juristic works, that the legal profession is able to glean the nature of the rule systems and how they are formulated through the reconciliation of precedent cases, or otherwise. In judgments, commentary is restricted to relevant matters, as obiter dicta are not binding; overall, in judgments, there is a reluctance to closely and holistically specify the system of rules and the process of reconciliation of rules and cases, in order to leave flexibility for further clarification in the light of new cases and social changes.

McCarty saw the transformation from a case situation, back to precedent cases, and then a return to the current case with a resolution and explanation of differences, as requiring deformations of the rules in the precedent cases to suit the case situation before the court. He closely examined the process of deformations in a line of decisions from one case, Eisner v. Macomber, 252 U.S. 189 (1920) (McCarty, 1995) and arrived at the following conclusion:

Legal reasoning is a form of theory construction....Legal theories should be consistent with past cases, and they should give acceptable results on future cases.... One term that is sometimes used to describe these requirements is "coherence". Although difficult to specify precisely, the link between coherent legal theories and persuasive legal arguments seems to be quite strong. (McCarty, 1997, p.221)

The deep model posed in this thesis provides knowledge structures and processing as a means of legal theory construction. The ontology of legal possibilities is a framework for coherence of all established rules, and the possible cases that fall within their ambit, as well as a basis for incorporating future cases coherently, as potentialities realised in new cases.

Rules of law are theoretically connected in various ways, even in legislation, but they are applied deductively, and threads of rule systems are to be found in precedent case authorities. A linking of rules of law for extended deduction sometimes fragments the case threads of these rule systems. The holistic extended deduction system prevails over the coherence of a case. Sometimes the coherent structure of a case may be mapped to the rule system as a constellation of points, or by an irregular structure that, in parts, stretches above points in the extended deductive rule structure to various other points in the extended deductive rule structure where it connects. It is very difficult to construct a holistic rule system from coherent case structures that must be fragmented or stretched, discontinuously to the extended deductive rule structure, to points in the rule system that they support. Concern is warranted as to the implications of these distortions for the justice of the case authority; it is assumed that coherent justice arises from the holistic rule structure, because each component provided by a case is founded in justice. Justice does not spring from one case alone, but from a system of rules that applies to all cases consistently and coherently, albeit sometimes discontinuously in regard to the holistic rule structure.

Rule systems were developed prior to the Casuistry Stage of the legal system, as a way of extending the law by theoretical necessity; in both the times of monarchical dominance and times of rising democracy, the judiciary could 'find' law, but not make it. Rule systems that were drawn from written forms of action, retained their prevalence in the Casuistry Stage because they brought coherence to the chaos of casuistry. However, cases arose profusely with diverse minutiae stretching the cohesion of rule systems; they did not arise in an order to suit incremental extensions of the rule systems, and some fragments of new rules were established in isolation, pro tem.

In the legal domain, what is chaos and a lack of scientific precision passes as an acknowledgement of what is required for justice continuously over time. Justice was not pre-empted by theoretical structure, but arose from established custom and the empirical base of case facts as they occurred. Like keyhole surgery, justice is usually achieved in litigation; cases are pleaded for the relevant keyhole, and the same Gray's anatomy of law is presupposed for the many keyholes required. Otherwise, the practice of law would appear to be an intelligence of intuitive mystique; intuitive justice is a gift that cannot be learned. When people cannot learn, access or anticipate their law, they are alienated from their legal system; this is inappropriate in an information age. The weight of the recent information explosion in the legal domain has largely killed off the availability of the law to people, so an autopsy of the legal domain is now appropriate, to discover the deep structure of legal intelligence which could provide vitality to artificial legal intelligence as Stage 5 legal practice that follows the Casuistry Stage in the life cycle of the legal system (Gray, 1997, p.135).

Shannon and Golshani (1988, p.311) defined deep models as ones that model meaning and not just words. For legal knowledge engineering, full meaning models, such as the ontology of legal possibilities, with its associated comprehensive logic of deduction, induction and abduction, have to be suited to translation into a computer language which, at its deepest level, is binary code. The keyhole of a legal expert system consultation must reveal the relevant part of the anatomy of law, operated by binary level processing. The developments in AI and law have followed progress made in the field of artificial intelligence, as programming expertise is required to translate between binary code and the higher levels of human language. It was assumed that the generic intelligence that was automated as artificial intelligence included legal intelligence. Computer data can symbolise human language just as any writing can do; intelligent use of those symbols requires a deep model of intelligence, according to which a computer can be provided with instructions on what to do with those symbols. For artificial intelligence, the deep model must be computational, and for artificial legal intelligence the deep model must be a computational legal epistemology.

A student of Smith and Deedman, Andrzej Kowalski (1991), posed, as a deep model solution, a cuboid of cases as the knowledge structure, with depth and breadth searches through the cuboid; this is reminiscent of the filing cabinet model used in the design of early databases (cf. Hollis and Hollis, 1969). Figure 1.7 shows the Hollis and Hollis filing cabinet model and its development; Figure 1.8 shows Kowalski's case retrieval system.

In order to provide integration of rule and case systems, as a deep model of legal expertise, Smith (1997) ultimately resorted to a language solution, namely the postmodernist view of language of Lacan (1977, 1988, 1993) and Derrida (1976, 1978), to establish a 'teleo-analytic' jurisprudence that combined goal structures of law with dynamics of decision making. Signifiers and chains of signification in legal doctrine were to be matched to the signified material facts of life.

Figure 1.7: Development of the filing cabinet 3-D scheme for data retrieval – J. W. Hollis and L. U. Hollis (1969), Personalizing information processes, the Macmillan Company, London, Figure 7-1, p. 137 and Figure 7-2, p. 144

Figure 1.8: Case retrieval cuboid system of A. Kowalski (1991) – Figure 1 in Case-based reasoning and the deep structure approach to knowledge representation, in Proceedings of the Third International Conference on Artificial Intelligence and Law, ACM, NY

The early techniques of artificial intelligence are not necessarily adaptable to the legal domain; they may fall short of the deep model of human intelligence that is used in the legal domain. Legal epistemology may supplement these shortcomings and, by doing so, it may throw new light on generic human intelligence. Even if existing techniques of artificial intelligence could be used in legal knowledge engineering, this technology does not allow for quick and easy construction of large scale legal expert systems by legal experts with limited understanding of computer programming; to be effective, an epistemologically sound methodology for such construction is essential. In this thesis, solving the quick construction problem solves the problem of providing a deep model that is computational and epistemologically sound. It is not so much that a language for legal discourse must be superimposed on rule processing, but that a communication system based on lawyer-client dialogue must drive the logic system of extended deduction, induction and abduction, through the ontology of legal possibilities of an application, as it does in legal practice.

The development of the deep model of legal epistemology in this thesis is a progression of the work of legal logician, Fraunce (1588), an Elizabethan lawyer of Gray's Inn, who is regarded in this thesis as the first legal knowledge engineer, long before computer technology established the field. Fraunce provided the first graphical representations of case reasoning. His graphical analysis of the arguments and judgement in the Earl of Northumberland's Case (1567), according to the Ramist method (Ramus, 1543), is discussed in Chapter Four.

The logic method of Ramus (1515-1572), a scholastic logician in Paris who saw the transition from the Renaissance to the Reformation, was a systematisation of Aristotle's Organon so that the various techniques of reasoning could be used more readily in appropriate sequences, non-monotonically if required; it was popular learning in academic circles in England during the time of Fraunce. In Chapter Four, the thesis also draws on the Tree of Porphyry, from the Isagoge, Porphyry's introduction to Aristotle's work, which accompanied the Organon through the Middle Ages; it represented Aristotle's ontology of substance, showing its integral deductive structure. As a resource, Aristotle's Organon may be approached as a study of human reasoning which might assist automation. Although such a comprehensive study is not within the scope of this thesis, some few inroads are made. The Tree of Porphyry is the original paradigm of the decision tree that is now a common knowledge engineering method of representing domain knowledge. This thesis explains the River in contrast to the tree.
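The decision-tree paradigm attributed here to the Tree of Porphyry can be sketched minimally in code: each node splits a genus by one differentia into two branches, so classification is a chain of yes/no questions. The node structure, the `classify` function and the abridged genus/species chain below are illustrative assumptions, not a reconstruction of Porphyry's full ontology or of any system in this thesis.

```python
# A dichotomous decision tree in the style of the Tree of Porphyry:
# each internal node tests one differentia; leaves name a species.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    question: str                    # differentia tested at this node
    yes: Optional["Node"] = None     # branch taken if the differentia holds
    no: Optional["Node"] = None      # branch taken if it does not
    label: Optional[str] = None      # species name, set only at a leaf

def classify(node: Node, facts: set[str]) -> str:
    """Walk the chain of dichotomies until a leaf (species) is reached."""
    while node.label is None:
        node = node.yes if node.question in facts else node.no
    return node.label

# Substance -> corporeal? -> animate? -> rational? (an abridged chain)
tree = Node("corporeal",
            yes=Node("animate",
                     yes=Node("rational",
                              yes=Node("", label="human"),
                              no=Node("", label="beast")),
                     no=Node("", label="mineral")),
            no=Node("", label="spirit"))

print(classify(tree, {"corporeal", "animate", "rational"}))  # -> human
```

A River structure, by contrast, would not force every distinction into a single branching chain; the sketch shows only the tree paradigm being contrasted.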

The adversarial patterns of legal argument revealed in Fraunce's book, Lawiers Logike (1588), may have contributed to Lord Chancellor Francis Bacon's Novum Organum (1620), the New Organon, which suggested that patterns discovered by empirical epistemology through systematic consideration of negation in the patterns might be a basis for reasoning that would lead to discoveries of new knowledge. Fraunce and Bacon were contemporaries and both were at Gray's Inn.

The works of Ramus were brought to New England by the Pilgrim Fathers, and influenced scholars at Harvard University (Leith, 1990, p.74), where early computers were designed and built after World War II. Ong (1958, 1983, p.viii), who made an extensive study of the works of Ramus, later observed a connection between the Ramist systematisation of logic and computer technology:

One connection that would have to be brought out would be the resemblance of Ramus' binary dichotomised charts ... to digital computer programs. Like computer programs, the Ramist dichotomies were designed to be heuristic: ... The quantifying drives inherited from medieval logic were producing computer programs in Ramus' active mind some four hundred years before the computer itself came into being.

It may be that the teachings of Ramus influenced Pascal (1623-1662), who built the first simple digital calculating machine in 1642, and subsequently Leibniz (1646-1716), who constructed a stepped wheel system in 1673 and exhibited it before the Royal Society in London, as well as setting out computational theory which advocated the binary number system of the ancient Chinese philosopher Won-Sang (1182-1135 BC), who wrote The Book of Permutations (Bowden, 1953, p.33). In turn, Ramus may have been influenced by the work of Abelard (1079-1142), Sic et Non, which illustrated inconsistent, contradictory arguments in theology; and in turn, Abelard may have been influenced by the new approach to teaching called the computus, which was developed by Alcuin of York (c.735-804 AD) during the Carolingian renaissance. Charlemagne, the first Holy Roman Emperor, employed Alcuin as an educational adviser; Alcuin reintroduced classical dialectic epistemology to school instruction and established scholasticism within the Roman Catholic Church. Schools were ordered by Charlemagne's Capitulary of 805 to teach the computus (Deanesly, 1929, p.774).

Leith (1990, p.75) agreed with Ong:

Ramus was, indeed, an early programmer. His whole attitude toward algorithmic method (in fact it could be said that he invented the term method as we now know it – see also (Gilbert, 1960)), knowledge and handling information is strikingly modern.

A method common to the expert domain and to the programming of a legal expert system facilitates the transformation from one epistemological configuration to another in the development of a large scale legal expert system. It may be that a study of the history of philosophical dichotomies would produce a rich source of material for the development of artificial intelligence or its domain applications, but such a study is outside the scope of this thesis. Legal reasoning certainly pivots on inductive distinctions which are significant and can be fine. The trichotomy of the legal practice lawyer-client communication system, namely goal-positive, goal-negative and goal-uncertain, which requires a three-dimensional view of the rules of law that can be derived from the express black letter law, is translated into binary language in the eGanges program.
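The trichotomy just described can be sketched as a three-valued conjunction over the antecedents of a rule. The combination rules below (one negative antecedent defeats the goal; otherwise any unresolved antecedent leaves the goal uncertain) are a plausible reading of the text, not the actual eGanges algorithm.

```python
# Three-valued evaluation of a goal from its antecedents:
# goal-positive, goal-negative, goal-uncertain.
from enum import Enum

class V(Enum):
    POSITIVE = 1    # antecedent satisfied (favours the goal)
    NEGATIVE = 0    # antecedent contradicted (defeats the goal)
    UNCERTAIN = -1  # antecedent not yet resolved

def goal_result(antecedents: list[V]) -> V:
    """Conjoin antecedent values: one negative defeats the goal;
    otherwise any uncertainty leaves the goal uncertain."""
    if V.NEGATIVE in antecedents:
        return V.NEGATIVE
    if V.UNCERTAIN in antecedents:
        return V.UNCERTAIN
    return V.POSITIVE

print(goal_result([V.POSITIVE, V.UNCERTAIN, V.POSITIVE]))  # V.UNCERTAIN
```

Each of the three values, and the conjunction over them, is representable in binary code, which is the sense in which a trichotomy can be "translated into binary language".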

In philosophy there is an evolution of major epistemological dichotomies from the pre-Socratic dichotomy of existence and non-existence through the dichotomy of existence and appearance developed in rhetoric, the dichotomy of appearance and truth of Socrates, the dichotomy of form and reality of Plato, the dichotomy of valid and invalid arguments of Aristotle, to the dichotomy of inconsistent valid arguments of Abelard, and beyond to modern binary code. The dichotomy of consistent valid arguments with contradictory final results that is the adversarial reasoning of legal epistemology, is clarified in this thesis.

Figure 1.9: The dialectic method of Peter Ramus – source: P. Miller (1954), The New England Mind, Harvard University Press, Cambridge MA, p. 126

The dichotomy of Ramus is shown in the graphical structure of his method which is set out in Figure 1.9. He distinguishes the reasoning of invention and the reasoning of judgment. This is an important distinction in the legal domain. Applying the law without question is judgment; arguing for a change to, or adaptation of, the law is invention. Both judgment and invention, in legal epistemology, may use rules and cases; as shown by Fraunce, both judgment and invention may be required in a case. In this thesis, the scope of legal expert systems is limited to judgment, but such a legal expert system establishes the basis for invention; indeed an understanding of established law is a prerequisite for inventive argument. Whitehead also understood this distinction, as noted by Buchler (1961, p.159) in his study of method:

It is not easy to understand what Whitehead means by “a method of novelty” and a “method of repetition.” The gross import of the context seems clear enough. Methods can deteriorate from inventive processes into sterile rituals or habits. But an inventive method does not deserve so poor an appellation as “a method of novelty.” Is such a method one that is somehow in behalf of novelty? Is it a method that is not necessarily in behalf of a novel end but novel only in its use of means? Is it a method that yields results each of which is novel or only the type of which is novel? And what is “a method of repetition”? All methods, inventive or routine, good or bad, are repeatable as methods. Does it make sense to say that some methods aim solely to accomplish repetition? But repetition must be either of some process or of some result. Are some processes repeated in total disregard of an end? If no end is sought, can we speak of a method at all? If a given end is sought repeatedly – if similarity of results is what motivates the method – then the value of the method is to be determined not by the mere occurrence of repetitions, but by the kind of repetitions that occur, and by the perspectives in which they occur. Both repetition and novelty are compatible either with uninventive method or with method informed by query. Novelty by itself is not enough to introduce the interrogative temper. On the other hand, the utmost regularity in method may serve as a vehicle of query. Thus ... the most deeply rooted of customs in moral action, may be instrumental to invention.

In legal knowledge engineering of case-based systems which retrieve the nearest case(s) to the user's case, the case 'factors' that are regarded as significant are, jurisprudentially, a hybrid of rule antecedents and material facts. The difficulty of the distinction remains. The factors may have a place in the hierarchy of rules and also in the spectrum of inductive instances of abstract antecedents, both of which must be taken into account in determining the applicability of the Final conclusion in the user's situation.

Reconciling available cases without reference to the hierarchy of rules stated in their judgment(s) presents the task of putting together the hierarchy of rules like pieces in a jig-saw puzzle, where there are irregular overlaps in the jig-saw pieces, so that, as separate pieces, the pieces do not fit together as matching jig-saw pieces. This is a task that must be undertaken by lawyers where a new hierarchy of rules is emerging and there is no available system of rules. Retrieval of relevant cases may assist the jig-saw construction of the system of rules, so that pieces of the puzzle can be allotted to the relevant case authorities. Just as formalised rules overlap to produce a system of rules, sometimes cases overlap as a system of case authorities; sometimes case pieces are fragmented over the whole constructed jig-saw of the rule system, even though the case points can come together again when separated from the jig-saw of the rule system.

The use of semantic nets in legal knowledge engineering, which brought with it the problems of the computation of natural language, and legal issues of the distinction between antecedents in a rule of law and material facts in a case situation, also highlighted the role of language in the jig-saw problem.

Overlaps in the jig-saw do not always occur in adjacent pieces. Branting recognised that retrieval of relevant cases should pertain to potential or actual issues in litigation, and that it was not necessary to find precedents that were similar in all respects in regard to the extended deduction pathway of a case and the inductive instances in all the antecedents along the pathway. In his system, GREBE (GeneratoR of Exemplar-Based Explanations), Branting (1991) refines the granularity of precedents for retrieval by identifying precedent constituents that permit multiple partial matches of cases to be strong authorities, not weakened by differences. Overlaps in the pieces of the jig-saw, where more than one case is an authority for that piece of the jig-saw, may resolve an issue of whether or not an inductive instance satisfies the rule antecedent. The relevant pieces of several cases also may indicate the gradient of analogy or iteration to allow argument of the scope of the range, and where the sector boundaries might fall in the range of instances that support an antecedent, its contradictory and its uncertainty. Pieces from contradictory cases may also be useful in this determination of argument. However, there is still the requirement to consider the hierarchy of rules before determining the appropriate final conclusion of the rule system applicable to the user's case.
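Matching at the granularity of precedent constituents can be sketched as follows. This is a toy illustration of the idea, not Branting's GREBE implementation; the case names, fact labels and simple containment test are invented for the example.

```python
# Partial matching of precedent constituents: each precedent is split
# into constituent fact-clusters, and a new case can partially match
# several precedents, each match counting as authority on the issue
# that constituent covers, not weakened by differences elsewhere.
def constituent_matches(case_facts: set[str],
                        precedents: dict[str, list[set[str]]]) -> dict[str, int]:
    """Count, for each precedent, how many of its constituent
    fact-clusters are wholly contained in the new case's facts."""
    return {name: sum(1 for cluster in constituents if cluster <= case_facts)
            for name, constituents in precedents.items()}

precedents = {
    "Case A": [{"disclosure", "confidence"}, {"employee", "memorised"}],
    "Case B": [{"disclosure", "public"}],
}
case = {"disclosure", "confidence", "employee"}
print(constituent_matches(case, precedents))  # {'Case A': 1, 'Case B': 0}
```

Case A remains a strong authority on the disclosure-in-confidence issue even though its second constituent does not match, which is the point of refining the granularity of retrieval.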

The leading case-based intelligent retrieval system, HYPO (Ashley, 1990), deals with an area of law, trade secret protection, which has a rule system, but, like equitable discretion, requires the relative weighing up of a number of factors in order to establish a material fact situation that is one of the antecedents, namely a trade secret. The facts that are weighed up in regard to one factor may affect the way in which other factors are weighed up. It is possible to enumerate the facts that contribute to the weighting of each factor with a view to eliciting some rule in a case about how the discretion will be exercised. Such rules are not deductive rules of law; at best they are rules of exercise of discretion or power. A subsequent exercise of discretion that has an inconsistent result is not a matter where the law has not been followed. The law is that there is a discretion to be exercised in each case, and there can be no fettering of the exercise of the discretion by reference to an earlier decision. Nevertheless, precedent cases are helpful to arguments about how the discretion should be exercised; discretionary weights given to each factor in judgments of the court are not numeric, are difficult to deduce, and difficult to determine precisely in relation to each other. Legal epistemology does not include fuzzy logic or statistical assessment of the proximity or similarity of cases.

In the early jurisdiction of equity, Lord Chancellor Bacon presided over the formulation of substantive rules of equity that were integrated into common law rule systems which they qualified. Following the amalgamation of the courts of equity and common law in the nineteenth century, discretionary elements have been introduced to new common law rule systems; the reasonable person test which is an antecedent in the system of negligence rules, is really a matter of equitable discretion. Equitable discretion remains in granting equity orders of injunction and declaration; it is subject to a list of factors such as a consideration of the delays of the parties in bringing the application before the court, and whether the applicant has 'clean hands' in the matter. The legislature will set up discretionary powers where the formulation of a system of rules is too contentious, such as in the settlement of property disputes in divorce proceedings.

As pointed out by Popple (1996, p.118), Berman considered that the representation of rules of law falls far short of fully representing legal knowledge. Berman asserted the value of a legal expert system if it was useful. HYPO, which retrieves cases for argument, is appropriate and very useful in discretionary areas of law. This thesis is not concerned with discretionary fields of law and the retrieval of the most useful cases by methods that fall outside legal epistemology. It is predominantly concerned with the development of epistemologically sound large scale legal expert systems which locate case authorities among the inductive instances of antecedents, in deductive rule systems. The jig-saw problem is managed in this way. Case pathways through the ontology of legal possibilities allow case coherence through the rule structure. Coherence and consistency between cases is required for justice (cf. McCarty, 1997): all people are equal before the law means that people in the same situation are treated in the same way. The ontology of legal possibilities sets out the field for assessing coherence and consistency between cases. The investigation of whether or not there could be an ontology of legal possibilities derived from the collective of precedent cases in an area of equitable discretion, is outside the scope of this thesis. Filling the open-texture of discretionary factors may be a matter of relative induction.

It is to be noted that sometimes cases can be matched by abductive authority as well as deductive and inductive factors. Equity was originally to be found in an abductive rule system that qualified the common law rule system. If the equity sub-system embedded in a common law system of deductive rules, is discretionary, to that extent it makes the common law system discretionary.

In equity, discretion exercise rules are not of a deductive nature; they establish inductive clusters of instances with a collective effect of relevant factors. Contradictories and uncertainties of each of the factors in the cluster may effect a different result. Clusters of material facts that affect each other may affect satisfaction of discretionary factors in unstable ways. That the whole is greater than the parts is a system argument; in this sense, alternative, overlapping cluster systems replace the alternative, overlapping pathways of extended deductive Major premises. Fragments of the cluster factors, with their relationship to other factors in their cluster, become the discretionary jig-saw puzzle. Only fragments with certain relationships attached can be the new antecedents for argument; a form of relational logic is required for deduction. Discretionary instability in the systems is not an area of legal epistemology fully accommodated in the limitations of this thesis. The discretion exercise rules that might be determined may or may not provide deemed extended deductive premises for a full ontology of legal possibilities in the relational logic that could be treated as a rule system; but even so the problem of fettering discretion remains.

The ontology of legal possibilities in equitable discretion allows the combinatorial effects of possible combinations of material facts to be a level of meta-possibilities not contained by rules but accessed and managed by discretion. It is the material facts that combine, not antecedents, that delineate the scope of the discretion. Even in assisting case arguments, it is the possible combinations of instances of material facts that would be difficult to process heuristically. The thesis does not venture this far, due to its major focus, but the methods used in the thesis indicate how the inductive heuristics of legal epistemology might be specified.

In a sound legal epistemology, legal ontologies, rules and cases are given a place relative to each other for use in legal practice, so that the law may be applied to clients' cases and purposes. In its prototypes, this thesis poses and implements such an epistemology.

A deep model that integrated rule-base and case-base reasoning through the structure of an agenda was posed by Skalak and Rissland (1991) in a blackboard system called CABARET. The agenda required epistemological soundness as it drew on the case-based module and the rule-based module where required. Both modules failed to capture the ontology of legal possibilities. In the case module, like HYPO, only decided precedents were available, and in the rule module, only express black letter law rules were included. Processing of the agenda did not incorporate any algorithm to implement the ontology of possible cases. The premises for legal reasoning require categorisation as deductive, inductive and abductive, so that their relative significance in legal argument can be understood. The design of these systems is derived from technology precedents, not sound legal epistemology, in accordance with the prevailing practice in the field of artificial intelligence and law to follow, rather than lead, the advances in artificial intelligence; the component modules, and an agenda for using them in one system, do not provide for the generic mosaic of logic forms used in the teleological epistemology of lawyers, and thus the applications, in specific substantive areas of law, lack sound epistemological structure and processes. To be epistemologically sound, legal expert systems that provide access to rule-base modules, case-base modules, and databases co-ordinated through a blackboard agenda, must have an epistemologically sound agenda with epistemologically sound modules. Combination per se does not produce an epistemologically sound deep model for the legal domain.

In the meaning and expert practice of law that may be stated ontologically as data, there is an ostensible authoritative meta-rule that requires the application of law to cases by deductive necessity: when the facts of a case satisfy the requirements or antecedents of a rule of law, then the rule applies to determine the outcome of the case. Thus, the rules of law are the Major deductive premises in legal reasoning, and the facts of a case provide the Minor deductive premises. Once the Minor deductive premise(s) of a case are established by its facts, the consequent of the Major premise applies necessarily or automatically as the legal conclusion. It is this deductive necessity in the application of law to cases that determines the epistemological structures in the meaning of ontological statements of law; the consequent in the Major deductive premise may be adopted as the client's goal. Logic is a matter of epistemology that can be used as teleology in legal epistemology.
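The deductive meta-rule just stated, with interim conclusions in turn satisfying the antecedents of further rules (extended deduction), can be sketched as simple forward chaining. The rules and fact labels below are illustrative assumptions for the sketch, not statements of actual law or of any system in this thesis.

```python
# Forward chaining over Major premises: when a case's facts satisfy
# all antecedents of a rule of law, its consequent follows
# automatically, and interim conclusions can satisfy the antecedents
# of further rules until a final legal conclusion is reached.
def apply_rules(facts: set[str],
                rules: list[tuple[set[str], str]]) -> set[str]:
    """Repeatedly fire any rule whose antecedents are all established,
    adding its consequent, until no new conclusions follow."""
    established = set(facts)
    changed = True
    while changed:
        changed = False
        for antecedents, consequent in rules:
            if antecedents <= established and consequent not in established:
                established.add(consequent)
                changed = True
    return established

rules = [({"duty of care", "breach", "damage"}, "negligence"),
         ({"proximity"}, "duty of care")]
print(apply_rules({"proximity", "breach", "damage"}, rules))
# the result includes "negligence", reached via the interim
# conclusion "duty of care"
```

The consequent of the top rule here plays the role of the client's goal; the chaining shows the "necessary or automatic" character of the conclusion once the Minor premises are established.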

Hence, legal experts apply law by first devising, in the meaning of ontological statements of law, the logical structures that permit deductive application. This determination of logical structures is particularly important when many rules are relevant to a case and complex extended deduction is required. These are the concerns of large scale legal knowledge engineering. It is the logical structure in the meaning of ontological statements of laws, i.e. their epistemological structure, which is the knowledge that experts apply in determining the interim and concluding legal consequents or categorisation of a client's case.

In addition, case law may provide inductive instances that expand on antecedents in the rules of law, to better facilitate extended legal deduction. Reasons for rules or other information may also provide a strengthening or weakening of a rule in the deductive or inductive argument; thus abductive premises may be woven into deductive and inductive legal argument.

Along with deductive rules of law and their inductive expansion by case instances, abductive premises may also be established by law-making authority. The penumbra of principles identified by Dworkin approximates these abductive premises. For instance, in Donoghue v Stevenson [1932] AC 562; [1932] All ER 1, Lord Atkin established the duty of care as a consequent of the abductive commandment 'Love thy neighbour.' If this commandment were adopted as a deductive antecedent, then it would be a defence for the defendant to allege that the plaintiff was loved by the defendant. As this is not a defence in negligence, the commandment is abductive only, providing support for the deductive antecedent, the duty of care; the defendant's subjective emotional state of love would be difficult to counter, as it can co-exist with negligence toward the plaintiff. As only an abductive support for the duty of care, the commandment does not prevent actions or omissions from speaking louder than words or emotional states of love, in the deductive and inductive detailing of a breach of the standard of care.

Case facts include material facts and mere facts that are wholly or partially immaterial to the requirements of the rules; for instance the colour red of a traffic light is an antecedent in a traffic regulation, whereas it may be a mere fact that a murder victim had brown skin, albeit a mere fact that might be significant evidence for the purposes of proof of the crime.

The approach taken in this thesis is that, in legal domain epistemology, the meaning of black letter law may be completed with the assistance of ontology and logic; system controls that ensure the application of that meaning add to the deep model.

This thesis takes the approach of distinguishing the legal domain epistemology from the epistemologies of artificial intelligence, so that the requirements of the legal domain epistemology can be specified and then given effect in the design and programming of a generic shell that suits the legal domain. This also distinguishes the generic system in the legal epistemology as a deep model, from the deep model of meaning in the substantive field of an application; the generic epistemology must accord with the substantive form and vice versa.

In the specific meta-epistemological method, the initial determinations of legal knowledge, legal logic and matters of jurisprudential system control in legal epistemology, are subsequently transformed through a series of further epistemological constraints of computer science which ultimately produce an epistemologically sound legal expert system that may be large scale. The legal domain has idiosyncratic knowledge structures, logic patterns, and system controls that may be the infrastructure of artificial legal intelligence through the requisite epistemological transformations. The prototypes explained in this thesis illustrate the transformations required.

Meta-epistemological methodology in legal knowledge engineering provides for the prior analytics of black letter law that determine its scope of legal ontologies, or possible cases within its range, and the formalisation of rules of law as Major premises for application to cases by way of extended deduction; on this basis, knowledge representation, interrogation, communication and computational heuristics for a legal expert system may be determined for automation. As a method, the meta-epistemological methodology may be as rich as the dichotomies specified by Buchler (1961, p.159): it may be both creative and repetitive.

In the legal domain, there is presently a knowledge acquisition bottleneck (Feigenbaum, 1981, p.226) due to a lack of knowledge acquisition methodology. If there is to be a computerised codification of law, with the vitality of intelligence, requiring large scale legal expert systems, as a new stage in the life cycle of the legal system, then it is unlikely that legal logic will remain covert. Technological Jurisprudence will require precise computational legal logic.

Waller (p.181) maintains a systems view of legal logic by reference to Wisdom (1951, p.195):

Professor Wisdom made a penetrating remark: he proposed that lawyers' arguments “are like the legs of a chair, not like links in a chain”. Common sense, history, analogy and so on, support one another if the issue is at all complex. This is the type of logic that the ancients knew well and valued highly under the name of rhetoric. It was extensively used in medieval times for practical judgments. Only in the last three years did logic – in a vain effort to make thinking mechanical and perfect – come to include only formal logic. But throughout these centuries lawyers have gone ahead using rhetorical reasoning with excellent results. (“Rhetorical” here is not to be confused with fulsome oratory, unfair appeals to emotions and extravagant language.)

For large scale legal knowledge engineering, this thesis also poses a metaphorical chair of artificial legal intelligence, with four legs being, respectively, the ontology of legal possibilities, deduction, induction, and abduction. The rule system of extended deduction, which is produced from the prior analytics that gives effect to the ontology of deductive possibilities, provides the chaining for automation that may bring with it strings of inductive possibilities and chunks of abductive possibilities. The seat of the chair is the communication system of the interface, and the back of the chair is the heuristics or system controls that give effect to the operations. Metaphorically, the deep model of artificial legal intelligence is the system of this chair; it is useful.

The thesis takes issue with Popple's (1996, p.50) conclusion:

Deep conceptual models of legal reasoning are inappropriate for legal expert systems. They contribute to the difficulty of knowledge acquisition. They also operate at the same level of abstraction as jurisprudence, so they have little relevance to the pragmatic level of abstraction at which lawyers operate. A conceptual model of legal reasoning attempts to model precisely a process which is not fully understood. Hence, developers of these deep models confuse precision with accuracy.

It is shown in the thesis that its deep conceptual model of legal reasoning is appropriate for large scale legal expert system development. The knowledge structures of the deep model allow ease of knowledge acquisition and operate at both a jurisprudential level and at the level of legal practice. The model is precise and based on a deep understanding of legal practice; it is also accurate. The pragmatism of lawyers limits the apprehension of possible cases to the case at hand; when it is fully expounded, in a communication system with the client, it can be used to design a legal expert system with sound legal epistemology.

1.2.3 Epistemological standard in legal knowledge engineering

The thesis addresses the epistemological requirements for a legal expert system as a matter for clarification and development of legal knowledge engineering; this is necessary for small and large scale systems, but it is essential for large scale legal knowledge engineering, as efficient epistemological design may provide an information management system for feasibility of construction. Repetition may be identified as transparency, and adds to the ease of construction that is required for large scale construction. The specific meta-epistemological method permits the development of epistemologically sound large scale legal expert systems, in accordance with the standards of the legal profession, rather than epistemologically adequate legal expert systems in accordance with the standards of artificial intelligence.

Soon after the field of artificial intelligence was established, with the first logic heuristic program, Logic Theorist, by Allen Newell and Herbert Simon in 1956 (Simon, 1966), and the earliest expert systems had been developed, epistemological adequacy was prescribed by McCarthy and Hayes (1969) as one of the requirements for validity of an intelligent program, along with heuristic adequacy and metaphysical adequacy; the following descriptions were given:

The epistemological part is the representation of the world in such a form that the solution of problems follows from the facts expressed in the representation. The heuristic part is the mechanism that on the basis of the information solves the problem and decides what to do. (McCarthy and Hayes, 1969, p.466)

A representation is called metaphysically adequate if the world could have that form without contradicting the facts of the aspect of reality that interests us. (McCarthy and Hayes, 1969, p.469)

In the same year that McCarthy and Hayes published their standard of epistemological adequacy, Foucault (1969) pointed out that ‘epistemes’, which operated unconsciously, varied from time to time, and determined how knowledge was formulated; he carried out an ‘archaeology’ to expose the layers of epistemes in human history. Modernism so increased the diversity of ‘epistemes’ of individuals and groups, that contrasts were more apparent for evaluation. The multiculturalism of postmodernism further intensified this diversity of modes of epistemology so that meta-epistemology may have an expanded use in the humanities and social sciences, beyond legal knowledge engineering.

One of the pioneers of legal expert system epistemology, Susskind (1987, p.48), noted the standard of epistemological soundness in MYCIN, required by the medical profession. He posed black letter rules of law as the suitable starting point for legal expert systems of limited use. However, he did allow for semantic variations in the automatable rules of law, to accommodate variations in the Analytical School of jurisprudence. As a justification for commencing with positivist rules, he explored jurisprudential variations that had in common a positivist basis. For this, the soundness of his legal epistemology was criticised by Leith and Moles (Popple, 1996, pp.16-18), and his approach ran counter to the anti-formalist schools in the U.S.A. (Peritz, 1990). However, for legal practitioners in common law countries, especially advocates, there is an ever-present ring in their minds of the judges' words, 'But what is the law on this?', like the conditioned stimuli given to Pavlov's dogs. Another ringtone is, 'Do you have any authority for that?', meaning: what is the law-making authority for the proposition of law or outcome being argued?

As legal expert systems perform the tasks of legal practitioners, it may be that the pre-computer schools of jurisprudence do not provide the epistemology for these computer tasks. Acquisition and representation of a practitioner's expertise may not be obtained soundly from jurisprudence, but from practitioners directly. The starting point of this thesis is the practitioners' expertise, but assistance is drawn from jurisprudence and other sources where it increases soundness. Knowledge of the rules of law and their application primarily constitutes the epistemology of legal practitioners, who are distinguished on that basis as legal experts.

Extended deductive application of systems of rules of law to case facts is ostensibly the prime automatable form of legal reasoning. Inductive and abductive embellishments of the Major deductive premises in a system of rules of law may be added to a deductive argument where they arise in the system; but the deductive necessity is only embellished, not invalidated, by this non-monotonicity, unless some change in the system of rules is argued on the basis of a new factor in a case, or on abductive cogency such as the avoidance of an absurd or unjust result, or inherent inconsistency in the rule system. Since there is no law-making authority for legal expert systems to change the law or produce new law, in this thesis they are limited to providing only assistance in this process. This limitation also demarcates the epistemological soundness required; it might be otherwise for a judicial expert system (cf. Pethe, Rippey and Kale, 1989).

Rule application in common law countries was entrenched in the nineteenth century, when the judiciary developed the doctrine of precedent to control the expanding casuistry. Prior to this, precedent cases were treated as examples of the use of a rule of law. The doctrine of precedent asserted that it was the rule, not the facts, in a precedent case that might be the binding or persuasive part in legal argument. This common law clarification of the doctrine of precedent was not binding in the U.S.A., which had established its independence from Britain a century before. In the U.S.A., early in the twentieth century, Harvard Law School developed the case method that came to prevail in the training of legal practitioners. By the middle of the twentieth century, Australian Law Schools were able to offer the case book method in determining doctrine of precedent argument; the two were synthesised for a sound legal epistemology. In considering a case authority, it was necessary to look at what was said in the judgments: what was expressed as a binding or persuasive rule, and what antecedents might be distinguished as deciding factors in a decision. If a rule had been established, and subsequently, in another case, one antecedent in that rule had not been established, that failure of a necessary and sufficient condition was a deciding factor of a contradictory result; this did not mean that the original rule changed, or that the significance of the antecedents in the original rule changed. There could be deformations at the point of deciding factors that changed the outcome of a case. Jurists who specialised in a field of law could construct extensive complex systems of rules by reference to the deciding factors of precedent cases; possible cases could be anticipated.

The legal epistemology of case reasoning in the U.S.A., without the constraints of the common law doctrine of precedent that treated cases as authorities for rules, or for their inductive particularisation, provided a jurisprudential framework for the Realist School of jurisprudence in that country. This in turn eroded the rule of law, so that law came to be regarded as perhaps something other than rules, namely a system of power; without rules as constraints on law-making authority, power was left naked and a means for corruption.

Holmes (1897, 1899), who sought to develop the science of law, extolled experience over logic for sound legal epistemology, although a science of law requires both the empirical material facts of cases and their logical coherence in the systematisation of rules. Case method was a method used in social science. Pound (1921, 1942) may have supported legal case method so that his 'science of social engineering' might better devise law by reference to sociological verification in case argument. He saw sociological jurisprudence as a source of ideas to ensure that 'social facts were noted and analysed in the formulation, interpretation and use of laws' (Curzon, 1979, p.148). Neither Holmes nor Pound rejected the rule of law, but without the rigour of the doctrine of precedent, the juristic emphasis in the U.S.A. was not on the development of a system of rules for extended legal deduction, or on the range of possible cases as the content of the system of rules.

The epistemology of Ashley (1990) rested on the case method of finding the closest case and the basis on which to distinguish cases. He attempted to find a case epistemology for automation, without reference to the complex rule systems established by cases. Primarily, he relied on statistical nearest-neighbour techniques as the method of finding the closest case to follow, distinguishing cases by reference to differences. He used statistical analysis to measure the distance between cases; however, this measure is not sound legal epistemology. There was difficulty in weighting the ontological factors of cases for statistical processing without regard for the hierarchy of the rule system that the collective of cases produced, and without regard for whether a factor was an antecedent in a rule, attracting a necessary deductive application, or an inductive instance of an antecedent in a rule, requiring no change in the rules of law but perhaps an adjustment of the inductive particularisation of an antecedent. Statistical measures of difference for precedent case retrieval were also used by Tyree (1989), whose vector system, FINDER, was criticised after testing (Gray, 1997, pp.39-48); he admitted the unsoundness of the statistical epistemology as legal epistemology, but valued his system as useful.
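The statistical approach criticised here can be illustrated with a minimal sketch, in which cases are reduced to binary factor vectors and compared by distance alone. The case names, factors and scoring below are hypothetical, and the sketch is not Ashley's or Tyree's actual method; it serves only to show why such a measure ignores the hierarchy of the rule system, since every factor is weighted equally, whether it is a necessary antecedent of a rule or merely an inductive instance of one.

```python
# Illustrative sketch of statistical nearest-neighbour case retrieval.
# Cases are flattened to binary factor vectors; legal significance of
# individual factors is lost. All case names and factors are hypothetical.

def hamming_distance(a, b):
    """Count the factors on which two cases differ."""
    return sum(1 for x, y in zip(a, b) if x != y)

def nearest_precedent(instant_case, precedents):
    """Return the precedent whose factor vector is closest to the instant case."""
    return min(precedents, key=lambda name: hamming_distance(instant_case, precedents[name]))

# Hypothetical factors: (offer_made, acceptance_posted, consideration, writing)
precedents = {
    "Case A (plaintiff won)": (1, 1, 1, 0),
    "Case B (defendant won)": (1, 0, 1, 1),
    "Case C (plaintiff won)": (1, 1, 0, 0),
}

instant = (1, 1, 1, 1)
print(nearest_precedent(instant, precedents))
```

Note that the measure cannot distinguish a difference on a decisive antecedent from a difference on an incidental fact, which is the unsoundness identified in the text.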

In the clarification of legal epistemology for the purposes of large scale legal expert systems, the synthesis of case reasoning, as inductive material facts, and rule reasoning, as logical systematisation of rules, offers a sound legal epistemology: rule logic supported by the authority of cases and legislation, with an opportunity for critical evaluation in the abductive information available, which need not be limited to judicial dicta or explanatory memoranda that originate in political policy.

The thrust of the thesis is synthetic. It selects, from available sources of legal practice and jurisprudence, suitable components for refinement and incorporation in large scale legal expert systems. The selection does not imply a rejection of the remaining resources. The scope of the synthesis does not extend to considering all available resources that might be included, and justifying the rejection of some. In any event, in the field of legal knowledge engineering, which involves some intellectual property and protection of software, a full investigation of available resources is not always possible. A limited approach is inevitable, even apart from the limits of doctoral theses in an age of information explosion. The thesis focuses on a justifiable synthesis that is ostensibly effective, as an innovation open to further improvement.

A sound legal epistemology may be equated with the search for a deep model of computational legal reasoning in the legal domain. The need for a deep model was posed by McCarty, who thought that it could be provided by the development of a computer language for law. McCarty saw that the technology required transformations; but he saw these transformations as being within the program itself, rather than as a designer's meta-epistemology.

The anti-formalism factions in the U.S.A. which followed the Realist School of jurisprudence realised that there was no deep model of legal epistemology available for sound legal knowledge engineering. This thesis remedies this shortcoming. It incorporates heuristics and ontology in the scope of legal epistemology, in a way that is suitable for large scale legal expert systems.

1.3 DEMONSTRATION PROTOTYPES

1.3.1 Five Prototypes

Stages 1, 2, 3 and 5 of the specific meta-epistemological method are the foci of the thesis; prototypes are posed in these four Stages to demonstrate the meta-epistemological method and its possible effectiveness. The Stage 5 prototype was programmed by Xenogene Gray, and its programming epistemology is not explained in this thesis.

Stage 1 Prototype – Legal domain epistemology

The Stage 1 domain epistemology that forms the basis of the prototypes is specified as required in the development of the prototypes; it is the acquired legal expertise of the candidate. This is a similar approach to the problem of acquisition of expert knowledge to that taken by Shortliffe in the development of the first medical expert system, MYCIN, from 1975. In the course of acquiring the expert knowledge from medical experts, Shortliffe recognised that it would be sounder to obtain a medical qualification and experience for himself. Accordingly, he undertook medical studies and qualified as a doctor. As noted by Dickerson (1963, p.69):

No one can successfully program a problem for machine solution without developing a deep and detailed understanding of the substance of what he is dealing with.

The legal expertise of the candidate was developed, firstly, through her undergraduate studies of law for her Bachelor of Laws at Melbourne University, where Waller (1995) was her first year law lecturer on legal method; these studies were completed in 1966. Secondly, it was developed through her professional studies of law at Melbourne University in 1967 and at the College of Law in London in 1973, to qualify for her admissions to practice; thirdly, in her postgraduate research for her Master of Laws by thesis at the University of Sydney, completed in 1990; and fourthly, during her experience in legal practice. She was admitted to legal practice in three Australian jurisdictions, namely, Victoria (1968), the High Court of Australia (1968), and the Northern Territory of Australia (1974), as well as in England (1974). Over a period of about fifteen years, she practised law in Australia (1967-1969, 1974-1978, 1981-83) and in England (1970-73). Her practice covered most fields of law, with some areas of specialisation; she was engaged as an employee and as a self-employed practitioner, as an articled clerk, a solicitor, a barrister, and a litigation manager. Matters that she conducted ranged through the hierarchy of courts in Australia and in England, other than the Privy Council and the House of Lords; in many, highly skilled senior counsel were instructed.

The candidate also completed a short course in BASIC programming in 1984 and the Teknowledge knowledge engineering methodology course in 1987.

The development of the candidate’s legal expert epistemology was significantly influenced, firstly, by the ex tempore judgment skills of the late Lord Denning in the Court of Appeal which she witnessed, as litigation manager of the case concerned, River Lea Rowing Club v British Rowing Association (1973) and, secondly, by the civil trial preparation skills of Gerard Brennan QC, as he then was, subsequently Chief Justice of the High Court of Australia, by whom she was directed as instructing solicitor in the case concerned, Budget Rent a Car Systems v B.M. Auto Sales trading as Budget Rent a Car (1976-77) 12 ALR 363; (1951-76) NTJ 919. The candidate’s practice experience is further enhanced by extensive research and teaching of law during her continuous academic employment since 1977. In the articulation of the Stage 1 domain epistemology, the candidate relies on her own legal expertise.

The Stage 1 domain epistemology of the candidate is an English legal system epistemology that may be partly or wholly applicable in other legal systems. It is not posed absolutely, and it is not claimed to be necessarily the only epistemology or the comprehensive epistemology used by all legal practitioners in common law legal systems.

A legal expert’s epistemology is relativistic (cf. Harre and Krausz, 1996, Chapter 3; Lee, 2005). In the collective of the legal profession, each legal expert uses an epistemology of her or his own, that is wholly or partly similar to, or different from, the epistemologies of other practitioners. Some epistemologies may produce more successful legal practice than others, and it might be true to say that practitioners tend to learn with experience and adopt the most successful epistemologies. It is not presupposed that there is a single legal domain epistemology, because this would be inconsistent with the premise of freedom of thought that is tenable and realistic in the protestant-based contemporary English legal system; this premise of freedom of thought could itself be regarded as part of the legal domain epistemology of the candidate.

However, the Stage 1 legal domain epistemology of the candidate relies on a jurisprudential system that is inherent in the legal system, given its ostensible purposes, and the nature of its rules and their application in cases; these matters are generally undisputed among experienced legal practitioners. The candidate’s expert epistemology that is used in the development of the thesis prototypes, is not a complete articulation of her domain epistemology, but it is sufficiently substantial to demonstrate the specific meta-epistemological method; it is put forward to illustrate the meta-epistemology as an alternative to prior legal knowledge engineering methodology, where knowledge engineers have not directly or adequately addressed legal knowledge acquisition as a matter of domain epistemology to be formulated for computation.

Epistemological matters have been dealt with by other authors of legal expert systems, in various terms that may be evaluated by reference to the candidate’s Stage 1 domain epistemology and the prototypes developed in this thesis; the specific meta-epistemological method raises new critical perspectives, and is itself open to critical evaluation and improvement. For example, Susskind (1987) reviewed works of the School of Analytical Jurisprudence and accepted the rule epistemology that was popular in legal knowledge engineering as consistent with their views. Gardner (1987) refined the rule epistemology with her thesis that some case situations involved more difficult applications of rules than others; there were hard cases and easy cases. Ashley’s (1990) analysis of the case reasoning of lawyers is further evaluated in Chapter Three and Chapter Four, by reference to the case epistemology in two of the prototypes posed in this thesis, namely the computational epistemology of 3d legal logic and the eGanges shell epistemology. As will be shown, more so than the work of many other authors of legal expert systems, Ashley’s work is pertinent to the further development and clarification of legal knowledge engineering with which this thesis is concerned.

None of these other authors developed or used a meta-epistemological method for legal knowledge engineering. None explicitly distinguished the epistemological stages in the development of a legal expert system, and recognised the need for retroduction and anticipation of requirements in the sequence of the stages. McCarty sought the deepest integration of the meta-epistemological stages by the development of a computer language for law. Application epistemology and programming epistemology were integrated by the state machine solution of Mowbray's Intest (Greenleaf, Mowbray and Tyree, 1987), and the pragmatic solution of Popple (1995), SHYSTER. Generally, known artificial intelligence programming techniques were used for program design, whereas in the specific meta-epistemological method, the program design determines the programming epistemology. The development of legal knowledge engineering largely proceeded by the application of artificial intelligence techniques to the legal domain; as these techniques emerged, they were adopted in the field of artificial intelligence and law to show how they might work in the legal domain. The approach was from the computer perspective, not from the jurisprudential perspective. The approach of this thesis is from the perspective of jurisprudence for legal knowledge engineering.

Systematic and thorough critical evaluation of existing legal expert systems, and existing legal knowledge engineering methodology, by reference to the specific meta-epistemological method, the candidate’s Stage 1 domain epistemology, and the thesis prototypes, is not within the scope of this thesis. The candidate’s Stage 1 domain epistemology and the thesis prototypes serve to demonstrate the use of the specific meta-epistemological method, to verify its effectiveness. It is not necessary to show the defectiveness of the work of others in order to show the effectiveness of something else; no reliance is placed on systems to the contrary, except to show how some limitations are remedied or evolved. Effectiveness is relative; it may be improved by first developing and clarifying scope for it, which is the undertaking of this thesis. In the realms of legal practice, effectiveness is also relative, and likely to depend on variable contexts and circumstances. A case may be won by different epistemologies.

Throughout all of the Chapters of the thesis, the Stage 1 legal domain epistemology is articulated as it becomes pertinent. The Stage 1 domain epistemology contains some aspects of legal reasoning that have not been clarified before in the field of legal knowledge engineering; these will be identified where they arise. Generally, in the specific meta-epistemological method, after acquired knowledge of legal expertise is provided, it is further reiterated where it is relevant in the knowledge engineering task, as follows: firstly, the computational part of the task, then the design part, and finally, the application part. In this way, focus can be provided seriatim in the task, as an appropriate knowledge acquisition process; for the purposes of the thesis, this sequence is adopted for specifying the Stage 1 domain epistemology.

Stage 2 Prototype – Computational epistemology of 3d legal logic

The Stage 2 prototype is the computational epistemology of 3d legal logic that is a refined version of the visualisation of 3d legal logic, first posed in the candidate’s Master of Laws thesis (Gray, 1990). A revised version of the Master’s thesis was published as a book (Gray, 1997). The original visualisation of 3d legal logic, which is a 3d logic that also applies to some domains other than law, was formulated by the candidate in the course of various legal expert system experiments in each of which the candidate acted as expert and legal knowledge engineer, alone or in collaboration with a different programmer. The possibility of three dimensional computer graphics was shown to the candidate in Cambridge in 1989, by Ian Braid of Three Space, who developed the first three dimensional engineering drawing program, now Generic 3d; soon after, virtual reality and interactive visualisation software followed. The Master's visualisation suggested metaphors of three dimensional objects as the names for various parts of the jurisprudential system.

The epistemological use of metaphors in this way was shown by Lakoff and Johnson (1980, 2003) to constitute a deep model of human intelligence, which appears to be related to the human potential for imprinting revealed by Lorenz (1977); use of metaphors is extensive in the communication of thinking and reasoning, and is especially useful as method at the cutting edge of knowledge, where there are no names established for new systemic discoveries. Metaphors are adopted as legal knowledge engineering methodology for epistemological system identification in the specific meta-epistemological method and its prototypes. In their Afterword, Lakoff and Johnson (2003, p.252) confirmed this suitability for use of metaphors:

We first saw conceptual metaphors as mappings in the mathematical sense, that is, as mappings across conceptual domains. This metaphor proved useful in several respects. It was precise. It specified exact, systematic correspondences. It allowed for the use of source domain inference patterns to reason about the target domain. Finally, it allowed for partial mappings. In short it was a good first approximation.

However ... Mathematical mappings do not create target entities, while conceptual metaphors often do.

As Lakoff and Johnson point out, metaphors can provide gestalt concepts and concepts for a totality of interactions or experience, which otherwise are not named or identified as a single coherent entity; complex coherence may also be sustained in metaphors. Purposes may be achieved through the use of metaphors. Metaphors may overlap, forming a network of connections; they may give meaning to form and produce new meaning.

The prototype computational epistemology of 3d legal logic is itself a metaphorical concept; it also contains other metaphorical concepts, such as Rivers, Poles, Stars, and Spheres. It uses conceptual space as a metaphor for locating, relatively, the ontologies of legal possibilities, like a Universe. In this metaphysical space, ontologies and their epistemological structures and processing can be constructed or represented. Forsyth (1984, p.5) confirmed the early use of a notion of space by pioneers of artificial intelligence design:

They viewed problem solving as a search through a space of potential solutions, guided by heuristic rules which helped direct the search to its destination.

Operations in four dimensional metaphysical space may be determined by epistemological heuristics; processes occur in the space of potential solutions. Four dimensional processing may be viewed, designed or reviewed from the fifth dimension, where meta-epistemology is located. The metaphors of the computational epistemology of 3d legal logic are used in the specific meta-epistemological method of this thesis for precise, exact, systematic mapping from and to the domains that are traversed in the production of a legal expert system, in particular, from the legal domain epistemology to the epistemologies of computer science, and back again to the legal domain epistemology. Five dimensional metaphysical space provides the human intellect with its own cosmos where its intellectual artefacts may be created, used and shared in the four dimensional metaphysical world; the metaphysical cosmos is the seat of human intelligence, which has its own judges and system of freedom and democracy.

Winter (2001) showed the use of metaphors in the development of law. He established that the Supreme Court in the U.S.A. had used metaphors to develop categories founded in precedent cases; the law commonly uses metaphorical concepts such as real property to identify a group of legal rights, and corporations as persons with constitutional rights.

The Vienna Convention itself refers to the 'sphere' of the Convention; in the Appendix of this thesis, this sphere is mapped in detail, and these maps may be understood in the context of the spherical framework of the computational epistemology of 3d legal logic.

The concept of mental space of Fauconnier (1985, 1997; Fauconnier and Turner, 2002), provided for construction of image schemas, mapping and other mental models that might be structured by blending different concepts from conceptual systems. A set-theoretical concept of a category may be given coherent structure and processes by a metaphor. Where categories are open-textured, metaphors may redefine them more systematically and bring to them new consequences and new scope for truth; metaphors can redefine reality (Lakoff and Johnson, 2003, p.122). Hart (1975; 1961) suggested that the solution for open-textured concepts was further defining rules; this might be in addition to inductive instances of material facts of cases that established an open-textured antecedent. Cases may be analogous in the set of antecedents they apply, or in the material facts they contain. Metaphors may assist the identification of the set of properties of an open-textured entity just as maxims may: in verbis non verba sed res et ratio quaerenda est (in interpreting words, one ought to look not only to the words but to the things and meaning behind them).

Ramus suggested that different logical arguments might be connected as a whole; they could be said to be adherent, or, metaphorically, glued together; in transformations and reconfigurations, the glue must be dissolved and then used, perhaps artfully or technologically, to connect other sub-arguments. There may be a regluing of a new configuration. Metaphors keep the internal adherence as coherence.

As the basis for design of a legal expert system, 3d legal logic poses geometric paradigms of logic, using, inter alia, the same form as quality control fishbone graphics (Ishikawa, 1985), instead of the sort of decision tree used by Capper and Susskind (1988). In this doctoral thesis, 3d legal logic is refined as a technological epistemology, by a precise identification of its metaphors in terms of logic. As the basis for design of a shell epistemology, the logic descriptions of the three dimensional graphic representation of legal logic, clarify the computational nature of legal epistemology. The 3d visualisation assists understanding of the coherence of complex legal logic. For the shell design, the system of 3d legal logic, as a graphic representation of data and flows, provides the logic heuristics, which are more clearly understood by reference to the logic descriptions of the computational epistemology.

The knowledge that concerns technological epistemology is the knowledge of the art or skill of the expert. Inevitably, this is a teleological epistemology that is concerned with the purpose for which the knowledge is applied. The art or skill of the legal expert lies in the knowledge of premises, particularly the conditional propositions of law, a knowledge of how these premises are categorised as deductive, inductive or abductive, and how arguments appropriate to a case, are arranged or glued together, and flow, as a composition of deductive, inductive and abductive sequences. It is the composition and flow that are the art. The knowledge of legal experts is a knowledge of the systemic operations for artfully applying the propositions of law to a client’s case.

In the Stage 2 prototype computational epistemology of 3d legal logic, there is a reconciliation of paradigms of knowledge structures, developed in the candidate’s Master’s dissertation, and paradigms of logic. This mix of metaphor and logic provides two integrated perspectives in the Stage 2 prototype: a visualisation perspective and a computation or processing perspective. Thus, the Stage 2 prototype has (1) a three dimensional graphical model of legal logic which has three different structures that contain, respectively, deductive premises, inductive premises and abductive premises, and (2) possible alternative flows in the deductive structure for selection and composition of deductive, inductive and abductive premises for legal argument or the categorisation of cases. The three different graphical structures, which are located relative to each other, make up a spherical structure, and contain the categories of logical premises: they consist of a River system for adversarial deductive premises, spectra for inductive instances, and strata for various abductive information. The possible alternative flows in the deductive structure arise from disjunctions in the premises.

The River system is three dimensional and represents rules of law that have been formalised as conditional propositions; each tributary stream in the three dimensional River system represents a Major deductive premise that is a rule of law, formalised as a conditional proposition. Minor deductive premises arise from the input of facts of a user's particular case, and determine the selection of the relevant Major deductive premises for argument composition. A series of selected Major deductive premises can be used for extended legal deduction.

When correctly formalised as a conditional proposition, each rule of law takes the format: ‘if (antecedent(s)) then (consequent)’. In formal logic, for example a → c, an inference arrow symbolises ‘then’ in the conditional proposition. The inference arrow is adopted in the computational epistemology of 3d legal logic to symbolise ‘then’, and also to indicate the direction and flow of extended deductive argument that is possible in a tributary system of rules linked by the overlap of antecedents and consequents from different rules. The spherical framework of rules is the result of representing the ontology of legal possibilities as alternative, overlapping sets of necessary and sufficient conditions, establishing five alternative possible final results, namely, positive, wholly negative, partially negative, wholly uncertain or partially uncertain.
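The format ‘if (antecedent(s)) then (consequent)’, and the linking of rules by overlapping antecedents and consequents for extended deduction, can be sketched in code. The rule content below is hypothetical, and the simple forward-chaining loop is a generic illustration of extended legal deduction, not the eGanges mechanism.

```python
# A minimal sketch of extended legal deduction, assuming each rule of
# law is formalised as 'if (antecedent(s)) then (consequent)'. Rules
# link where the consequent of one is an antecedent of another.
# All rule content is hypothetical, for illustration only.

rules = [
    ({"offer", "acceptance"}, "agreement"),
    ({"agreement", "consideration", "intention"}, "contract"),
    ({"contract", "breach"}, "remedy"),
]

def extended_deduction(facts, rules):
    """Forward-chain Minor premises (case facts) through Major premises
    (rules) until no further consequent can be inferred."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for antecedents, consequent in rules:
            if antecedents <= derived and consequent not in derived:
                derived.add(consequent)
                changed = True
    return derived

facts = {"offer", "acceptance", "consideration", "intention", "breach"}
print(sorted(extended_deduction(facts, rules)))
```

Here the Minor premises are the input facts of the user's case, and the chain 'agreement' → 'contract' → 'remedy' illustrates how the overlap of antecedents and consequents carries the flow of extended deductive argument.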

In the legal domain, there are systems of rules of law, such as the Vienna Convention. The flows of the formalised conditional propositions in these rule systems conjoin at points where some of the rules share the same antecedent or consequent; an antecedent in one rule may also be a consequent in another rule, or there may be an antecedent or a consequent that appears as such in two different rules. The conjoining of the formalised rules of law in a large scale application is extensive and complex, as illustrated in the Vienna Convention eGanges nested River maps in the Appendix of the thesis. In Chapter Three, Chapter Four and Chapter Five, it is shown how all the conjoined rules in a system have a confluence of rule flows with an hierarchical tributary structure, called a River because it resembles the merging streams of a tributary system. The tributary structure of the three dimensional River system has possible alternative pathways of flows to five Final outcomes. The Appendix shows a two dimensional River extracted from the three dimensional River, as the tributary structure leading to the positive Final result. The River could be called a type of decision tree or a flowchart; however, it is clearly distinguished as a River in Chapter Four.
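The confluence of conjoined rules into a tributary hierarchy can be sketched as a data structure: each consequent collects the antecedent streams that flow into it, and a Final consequent is one that feeds no further rule, sitting at the mouth of the River. The rule content and function names below are hypothetical illustrations, not the eGanges implementation.

```python
# Sketch of the tributary (River) structure of conjoined rules: the
# consequent of one rule may be an antecedent of another, and the
# consequent that is never an antecedent is a Final outcome.
# Rule content is hypothetical, for illustration only.

rules = [
    ({"offer", "acceptance"}, "agreement"),
    ({"agreement", "consideration"}, "contract"),
    ({"contract", "performance due", "non-performance"}, "breach"),
]

def river_mouth(rules):
    """Return Final consequents: consequents that feed no further rule."""
    antecedents = set().union(*(a for a, _ in rules))
    consequents = {c for _, c in rules}
    return consequents - antecedents

def tributaries(rules, node):
    """Return the antecedent streams flowing into a consequent node."""
    return [sorted(a) for a, c in rules if c == node]

print(river_mouth(rules))
print(tributaries(rules, "agreement"))
```

Tracing `tributaries` recursively from the River mouth reproduces the nested hierarchy that the two dimensional River maps display graphically.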

In an adversarial system of rules, there is an hierarchical tributary structure for each opponent in litigation; these are contradictory Rivers. There is also a tributary structure that provides for uncertainty in cases: a third River, the Uncertainty River. As shown in Chapter Three, together these three tributary structures constitute a three dimensional entity which can be further formed as a spherical system of alternative possible valid legal arguments; the sphere of 3d legal logic is the form of a truth table for the legal domain. The truth table sphere is a representation of alternative overlapping sets of necessary and sufficient conditions, which may have some unnecessary and insufficient conditions, together with their Final consequents; the alternative sets constitute possible pathways through the sphere, each of which has a Final consequent and is a valid argument.
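The five alternative Final results may be sketched as a simple classification of the findings on a set of conditions. The following Python fragment is an illustrative assumption about the semantics, not the thesis’s own formulation; in particular, the priority given to negative findings over uncertain findings is an assumption made for the sketch.

```python
# Illustrative sketch of the five alternative Final results:
# positive, wholly negative, partially negative, wholly uncertain,
# partially uncertain. The priority of 'negative' over 'uncertain'
# in mixed cases is an assumption of this sketch.

def final_result(findings):
    """findings: a list of 'positive', 'negative' or 'uncertain'
    values, one for each antecedent in a set of conditions."""
    if all(f == "positive" for f in findings):
        return "positive"
    if all(f == "negative" for f in findings):
        return "wholly negative"
    if all(f == "uncertain" for f in findings):
        return "wholly uncertain"
    if "negative" in findings:
        return "partially negative"
    return "partially uncertain"
```

For example, under these assumptions a case with one positive and one uncertain finding yields the partially uncertain Final result, and any negative finding among otherwise positive findings yields the partially negative Final result.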

Apart from a knowledge of deductive, inductive and abductive premises, legal expertise includes knowledge of unnecessary and insufficient conditions that are antecedents, as well as the consequents of rules as available interim goals or final outcomes, antecedents as choices of means, and relevant information for informed selection of alternative means or pathways of antecedents to interim goals or final outcomes. Conditions which are unnecessary and insufficient, called neutral conditions, may not be pertinent to a Final outcome, but this status must be shown if it is relevant in legal argument. However, neutral conditions may be relevant and important in a practical situation; they may be part of the choice available. In practical reasoning, the focus is on understanding choices, i.e. the alternatives from which to make a selection, given the adopted priority of goals and relevant information for informed decision-making. An enquiry by an offeree as to the meaning of a contractual offer is a neutral condition in contract law that is nevertheless a choice which could avoid a path to conflict and litigation.

The metaphor or paradigm of the sphere of an adversarial River system permits visualisation, and thus management, of a large and complicated system of legal rules. Use of such imagery was posed by Buzan and Buzan (1995) in the development of mind maps; an aerial view of the Colorado River is reproduced in their work to illustrate a potentially useful representation. As noted in the candidate’s Masters work, human intelligence may have developed sometimes by an imitation of a natural phenomenon. For instance, the sail paradigm may have been derived from a leaf blown across a puddle; the world wide web takes its name, metaphorically, from the spider’s web. It may be that some capacity for a form of psychological imprint, as suggested by Lorenz (1977), is naturally selected as a survival solution.

Inductive instances of an antecedent in a rule are derived from case law; usually they are part of, or based on, the facts of a case. In the Stage 2 prototype computational epistemology of 3d legal logic, these instances are arranged in spectral gradations according to the degree of difference between them; the spectrum is derived from the rainbow paradigm with its three primary colours, namely red, blue and yellow. The inductive spectrum has three adversarial sectors that may sometimes be demarcated by fine lines of distinction. The middle sector contains the instances of a particular antecedent (blue); above it is a sector that lists the instances of the contradictory of that particular antecedent (red); the remaining sector, below the middle sector, lists possible instances about which the law is not clear, i.e. uncertain instances that establish the uncertain antecedent (yellow). Inductive spectra may systematically expand on antecedents in an adversarial deductive River system. They are also called adversarial triads and link the antecedents of the three tributary structures, so that contradictory antecedents and their uncertainty are illustrated continuously by their spectrum of inductive instances. Spectra are explained further in Chapter Three.
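An inductive spectrum for a single antecedent may be sketched as a mapping from the three adversarial colours to lists of instances. The following Python fragment is a minimal illustration only; the contract-law instances listed are hypothetical, not drawn from the thesis.

```python
# Minimal sketch of an inductive spectrum for one antecedent
# ('acceptance of an offer'), with the three adversarial sectors
# described in the text. The instances are hypothetical examples.

spectrum = {
    "blue":   ["signed written acceptance",     # instances of the antecedent
               "spoken words of acceptance"],
    "red":    ["express rejection",             # contradictory instances
               "counter-offer"],
    "yellow": ["silence after receipt"],        # legally uncertain instances
}

def classify(instance):
    """Return the adversarial colour of a case instance, or None
    where the spectrum does not yet list the instance."""
    for colour, instances in spectrum.items():
        if instance in instances:
            return colour
    return None
```

A case fact such as ‘counter-offer’ classifies to the red sector, ‘silence after receipt’ to the yellow sector, and an unlisted instance returns no colour, signalling that the spectrum would need expansion from further case law.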

Apart from the adversarial colour coding of antecedents, according to the inductive spectra, the Stage 2 prototype also has a colour scheme that indicates the position of a rule or Major deductive premise in the hierarchy of rules or extended Major deductive premises that make up a tributary River structure. Thus, the mainstream is blue, the secondary stream is orange, the tertiary stream is green, and so on, so that there is sufficient contrast of colours in the gradation from one level to the next. The hierarchy of colours is illustrated in the Stage 5 application prototype that is explained in Chapter Five, and illustrated in the Appendix. Colours and their structures depict logical distinctions in the Stage 2 prototype, and are the palette and lines for the legal knowledge engineering art and geometric science. The colour coding is used in the Stage 3 shell design.

Finally, there is a strata structure for the representation of different types of abductive commentary and links, which provide either reasons for the rules of law, as strong support, or criticism of the deductive or inductive premises of law. The strata paradigm is geological or archaeological, like layers in the earth's sphere. Each antecedent is represented as having abductive strata below it to solidify the sphere; equally, it could be represented as having strata above it, so that deduction is buried under it, to show how the terrain or topography of available commentary varies from antecedent to antecedent. Abduction may weaken or strengthen the flow of continuity of the deductive River system that permits chains of extended deduction, interspersed with inductive clarification, in the adversarial system of hierarchical rules.

The logic categories of deduction, induction and abduction provide knowledge infrastructure in the nature of partitions of legal information, as well as an indication of the processing permitted for each partition. Deductive River systems may be used for the process of extended deduction; inductive spectra may be used for inductive inferencing to classify Minor deductive premises provided by the case to be categorised; and abductive strata may contribute to these processes further matters that strengthen or weaken the deductive or inductive structures and their processes. The three knowledge structures in the visualisation of 3d legal logic have a fourth dimension of processing, indicated by the directional flow arrows in the deductive adversarial River system: streamlined legal logic flows. Inductive and abductive premises, associated with a deductive antecedent, may be added to that deductive antecedent in the flow of extended deduction.
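The three partitions attached to a single deductive antecedent can be pictured as one data structure. The following Python sketch is purely illustrative; the field names are assumptions for the example, not eGanges terminology.

```python
# Illustrative sketch: one deductive antecedent carrying its three
# knowledge partitions. Field names are assumptions of this sketch.

from dataclasses import dataclass, field

@dataclass
class Antecedent:
    text: str
    spectrum: dict = field(default_factory=dict)  # inductive: colour -> instances
    strata: list = field(default_factory=list)    # abductive: layered commentary
    flows_to: list = field(default_factory=list)  # deductive: downstream consequents

# Example: attach an abductive gloss and a deductive flow to a node.
node = Antecedent("offer accepted")
node.strata.append("supporting authority (hypothetical gloss)")
node.flows_to.append("contract concluded")
```

The point of the sketch is only that the deductive, inductive and abductive partitions remain distinct structures, each attached to the same antecedent, so that each can be processed in the manner permitted for that partition.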

Knowledge of meta-rules for structuring and processing the rules of law, and of the restructured rules themselves, is a substantial part of the Stage 1 legal domain epistemology. The Stage 2 computational epistemology of 3d legal logic is a transformation of part of this Stage 1 epistemology to a generic computational epistemology that is common to various fields of law. This transformation and the Stage 2 computational epistemology of 3d legal logic are set out and explained in detail in Chapter Three. The Stage 2 generic computational epistemology of 3d legal logic is then transformed to the Stage 3 shell epistemology of eGanges, which is suitable for various applications, as shown in Chapters Four and Five.

Stage 3 Prototype – Shell epistemology of eGanges

The Stage 2 prototype is a generic computational legal epistemology that does not include all of the Stage 1 legal domain epistemology; there are non-computational elements of the Stage 1 legal domain epistemology that are not captured in the Stage 2 prototype. Nevertheless, the Stage 2 computational epistemology of 3d legal logic is a sufficient basis upon which to proceed to a shell design that does accommodate further generic parts of the Stage 1 legal domain epistemology.

The Stage 3 prototype, namely the design of the eGanges shell, gives effect to the Stage 2 prototype as well as further aspects of the Stage 1 domain epistemology. A computer program requires an interface for communication with its user. Therefore, it is appropriate to reconsider the Stage 1 legal domain epistemology with a view to determining if it has matters relevant to the design of such an interface. The Stage 2 computational domain epistemology of 3d legal logic is transformed to the Stage 3 prototype in such a way that suits these communication aspects of the Stage 1 legal domain epistemology.

In particular, user-friendliness is identified as a requirement of the Stage 1 legal domain epistemology that is relevant to the design of the Stage 3 prototype. Subjects of the legal system benefit from its rules and must obey them. Accordingly, the interface of the shell must be user-friendly for ordinary people if the legal system is to be effective. In Chapter Three, the Stage 2 computational domain epistemology of 3d legal logic is construed to suit the ordinary cognition of Lord Bowen’s man on the Clapham omnibus (Rogers, 1979, pp.46-7), who is representative of the majority of subjects of the law. Thus there is a transformation of the Stage 2 prototype to the Stage 3 prototype. Aspects of the Stage 3 shell epistemology are different to the Stage 2 computational domain epistemology of 3d legal logic, but only as a transformation, that is, as another form of the Stage 2 computational domain epistemology, one that has Stage 3 constraints.

Sometimes, constraints must be considered in one Stage in anticipation of matters that are required to be addressed in a later Stage, in order to provide for the requirements of the later Stage; and sometimes constraints must be considered retroductively in a Stage, to provide for matters in a previous Stage. The transformation process of the specific meta-epistemological method allows this flexibility; lawyers’ intelligence, especially in equity, as will be explained further in Chapter Five, usually includes the exercise of a mental agility of toing and froing, so that legal expert epistemology is not averse to a transformation process that includes anticipation and retroduction.

The use of the further constraint of user-friendliness from the Stage 1 generic domain epistemology, in the shell design, is an example of a Stage 1 matter being addressed in Stage 3, retroductively. Generally, user-friendliness is accepted as a constraint in program epistemology. According to Peirce (1931, p.28), the concept of retroduction was used by Aristotle, and subsequently interpreted as abduction. Retroduction might be regarded as a form of abduction that indicates going back a stage or two to pick up relevant premises, constraints or information that must be resolved before proceeding further.

Apart from the use of anticipation and retroduction, as forms of abduction, in the meta-epistemological methodology, for the transformation of epistemologies from one stage to the next, the way that abduction is provided for in the Stage 2 computational legal epistemology of 3d legal logic is a major consideration in the Stage 3 design of the shell. The term abduction is used in the Stage 2 computational epistemology of 3d legal logic to indicate a form of reasoning that is parallel, but connected, to deduction; various layers of abduction that allow for different types of abductive information, such as relevant authority, ethical evaluation, strategic guidance, or social implications, are represented as strata logic. In the shell epistemology, abduction is accommodated in its various forms by gloss facilities that permit a menu or strata of different glosses; this is explained more fully in the shell design set out in Chapter Four.

A major feature of the Stage 3 eGanges shell design is its interactive visualisation of two dimensional rule structures, called Rivers, derived from the three dimensional River system of the Stage 2 prototype. The three dimensional tributary structure is potentially too complicated for ordinary human cognition; virtual reality experiments of the candidate, referred to in her revised Master’s thesis (1997, p.229), showed that the three dimensional representation of the adversarial rule system was visually disorienting. Furthermore, virtual reality software requires large memory facilities that, as yet, are not privately available to most people. This is not to say that the cognitive skills for virtual reality law could not be learned as something essential for an evolved civilisation that travels into space in the physical universe. eGanges epistemology requires a streamlining of law according to the paradigm of a River’s tributary structure. The two dimensional tributary object of the Stage 3 shell epistemology presupposes the wholly-formed three dimensional rule mapping of the Stage 2 computational domain epistemology of 3d legal logic. The three dimensionality of law and its application to cases has been a mystical aspect of the system of rules that is law; in the precision of the Stage 2 computational domain epistemology of 3d legal logic, this mystical quality is clarified through the use of three dimensional conceptual mind space. In the eGanges interface, it is simplified for user-friendliness, without any loss of complexity. The three dimensional system of rules is a precise system of legal choices that can be managed through an interface design that is limited to a two dimensional interactive visualisation.

The complexity of the Stage 2 computational domain epistemology of 3d legal logic is transformed to the level of user-friendliness by extraction of the central two dimensional tributary structure (the blue river), which is the equatorial cross-section of the 3d legal logic sphere. An interactive two dimensional visualisation of this central River of blue or positive rules in an application is available in the Rivers window of the Stage 3 prototype interface. The extended deductive process is recorded in the visualisation, in the consultation of an application, by the white nodes in the River changing to either red, blue or yellow, depending on the user’s interrogation input. Further three dimensionality of the Stage 2 prototype is accounted for in the Stage 3 interrogation system and the processing of input that is received from the user; sorting heuristics and feedback structures simplify the complexity, and clarify the logic system so that it can be learned by law students as the game of legal logic.
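The node-colouring feedback of a consultation may be sketched as follows. This Python fragment is a hedged illustration of the behaviour described, not the eGanges code; the answer vocabulary and window names are assumptions of the sketch.

```python
# Illustrative sketch of consultation feedback: white nodes in the
# River take the adversarial colour of the user's answer, and each
# answered node is listed in the matching adversarial window.
# The yes/no/unknown vocabulary is an assumption of this sketch.

ANSWER_COLOUR = {"yes": "blue", "no": "red", "unknown": "yellow"}

def consult(nodes, answers):
    """nodes: question labels in River order; answers: label -> answer.
    Returns the node colours and the three adversarial window lists."""
    colours = {}
    windows = {"blue": [], "red": [], "yellow": []}
    for node in nodes:
        colour = ANSWER_COLOUR.get(answers.get(node), "white")
        colours[node] = colour
        if colour != "white":
            windows[colour].append(node)
    return colours, windows
```

For example, answering ‘yes’ to one node and ‘no’ to another would colour them blue and red respectively, list each in the corresponding window, and leave unanswered nodes white, mirroring the confirmation between the windows and the River visualisation described below.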

The eGanges design retains the three adversarial Rivers in the form of three adversarial windows in which, respectively, are listed the Minor premises of the user as points for the Positive (blue) case, points for the Negative (red) case, or Uncertainties (yellow), which usually favour the Negative case, depending on the burden of proof. Entries in the adversarial windows confirm the colour changes in the nodes of the River visualisation. The epistemological transformation from the Stage 2 computational epistemology of 3d legal logic to the Stage 3 shell epistemology of eGanges, and the design of the Stage 3 eGanges shell epistemology, are set out and explained fully in Chapter Four.

The logic categories of legal information in the Stage 2 computational epistemology of 3d legal logic are represented in the Stage 3 shell design without any requirement for the user to know of deduction, induction and abduction. This adds to its user-friendliness. The urban intelligence of the traveller on the Clapham omnibus, who can understand a street map or a bus route, is all that is required of the user of the eGanges shell. Although worked out by strict adherence to the domain epistemology of law, the appearance of the interface, with its primary colours, contrast colour hierarchy, and various sized window rectangles, seems to have been foreshadowed by the Mondrian genre of twentieth century art, as suggested to the candidate by Professor Daniele Bourcier at the Sorbonne in 2005.

Where appropriate in Chapters Three and Four, as a further development and clarification of legal knowledge engineering, the specific meta-epistemological method is reconciled with systems methodology and legal knowledge engineering methodology. Systems theory, analysis and design are included in the concept of epistemology and in the development of the prototypes. The specific meta-epistemological method itself is an epistemology that may use, in its stages, any of the available epistemological techniques such as systems analysis and design, data-lists, and heuristics.

Two dimensional graphics and logic heuristics help keep the eGanges shell small for the technology of Personal Digital Assistants (PDAs), known as Handhelds. Other aspects of the Stage 3 shell design, particularly the single mouse click communication with the system that triggers logic processing, the navigation of extensive, complex, mnemonically visualised knowledge, and saving of consultations, also make eGanges suitable for PDA use.

Stage 4 Prototype – Programming epistemology of eGanges

Following the candidate’s Stage 3 design of the eGanges shell during her doctoral candidature in 2002, eGanges was programmed in Java in the period 2002-5 by her son, Xenogene Gray, who is a computational physicist; he implemented the candidate’s Stage 3 design of the shell that is posed and particularised in Chapter Three. His Stage 4 programming epistemology is primarily a computing study within the jurisprudential project of the thesis; it is explained only briefly, where it is relevant, in the remainder of this thesis.

The Stage 4 programming epistemology of eGanges is object-oriented programming, derived, retroductively, from the Stage 2 computational epistemology of 3d legal logic; the object-oriented epistemology implements the logic of the computational epistemology of 3d legal logic by reference to its system of notional three dimensional objects. The Stage 2 computational epistemology of 3d legal logic makes object-oriented processing possible because its formalisation produces objects; it provides notional three dimensional objects through its formalisation of rule structures, according to meta-rules for structuring, as well as meta-rules for their processing. Each notional object is constrained in the sphere, which is made up of an extended deductive hierarchy, inductive spectra and abductive strata. Object-oriented programming permits backward, forward, and sideways chaining by reference to the characteristics of the object; the shell is as flexible in its leaps between data as human intelligence requires.
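The flexibility of chaining by reference to the characteristics of an object may be sketched as follows. eGanges itself was written in Java and its code is not disclosed in the thesis, so this Python fragment is only an assumed illustration of the idea; the class and field names are invented.

```python
# Illustrative sketch (names invented): a rule node as an object
# whose links permit movement in any direction through the system -
# backward to antecedents, forward to consequents, and sideways to
# co-antecedents of the same rule.

class RuleNode:
    def __init__(self, label):
        self.label = label
        self.upstream = []    # antecedent nodes (backward chaining)
        self.downstream = []  # consequent nodes (forward chaining)
        self.siblings = []    # co-antecedents of one rule (sideways chaining)

    def link_antecedent(self, child):
        """Make `child` an upstream tributary of this node, so the
        link can be traversed in either direction."""
        self.upstream.append(child)
        child.downstream.append(self)
```

Because each link is recorded on both nodes, a traversal can leap from any node to its antecedents, consequents or siblings as required, which is the sense in which object-oriented representation supports backward, forward and sideways chaining.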

The programming code that implements the Stage 4 prototype is not disclosed in the doctoral work, and the thesis does not attempt to explain or verify the Stage 4 programming epistemology. However, the Stage 3 shell epistemology of eGanges incorporates a Stage 1 constraint of transparency, which allows the user to detect transformation errors in the Stage 4 programming epistemology. The transparency constraint is part of the constraint of user-friendliness, as explained in Chapter Four. Machine justice is not assumed; in using eGanges, it does not appear that justice is being administered by a black box. The interface and feedback systems that are part of the Stage 3 shell epistemology are designed to show, ex facie, if there is an inconsistency in the Stage 4 programming epistemology. The Stage 3 shell epistemology anticipates and provides for Stage 4 difficulties. This is explained in detail in Chapter Four.

The eGanges program was actualised by the Stage 4 programming epistemology that was produced by the transformation process of Stage 4 carried out by the programmer. The programming epistemology includes the constraints of the programming language, Java, that were necessary to produce the shell. The Stage 5 prototype, the Vienna Convention application, assumes the Stage 4 programming epistemology.

Stage 5 Prototype – Vienna Convention application epistemology and ontology

The actual eGanges shell that is the prototype produced in Stage 4 of the specific meta-epistemological method gives effect to the thesis prototypes of Stages 1 to 3 and, in turn, permits the prototype in Stage 5, namely an application of the shell. Chapter Five and the Appendix set out the Stage 5 prototype application epistemology for the Vienna Convention, and illustrate the use of eGanges. Thus, the Stage 5 prototype is made possible by the jurisprudential prototypes in Stages 1-3, and the programming of the Stage 4 shell. As the last prototype in the sequence of the five stages, the Stage 5 prototype demonstrates the effectiveness of all the prototypes in the earlier stages, including the Stage 4 prototype, as well as the cumulative effectiveness of the specific meta-epistemological method itself.

As legal knowledge, the Vienna Convention is part of the legal domain epistemology of Stage 1. In the demonstration of the specific meta-epistemological method, Stage 5 is concerned with this part of the Stage 1 legal domain epistemology. Substantive ontology particular to the Vienna Convention, such as concluded contract, remedy, etc., within the sphere of the Convention, is construed to the generic epistemology of the shell so as to produce, by this transformation, the particular substantive epistemology of the application. Considerable prior analytics were required to formalise the rules of the Convention and link them as extended deductive Major premises in their tributary structure. Thus, the Stage 5 application epistemology, as a prototype, has particular substantive epistemology that is derived from particular substantive ontology, shown to be in accordance with the generic epistemology of the shell. The social epistemology that is implicit in the Vienna Convention is taken into account and given effect, within the constraints and context of the shell epistemology. The transformation that produces the specification of the substantive application ontology and epistemology of the Vienna Convention, according to the generic epistemology of the shell, as the Stage 5 prototype, is described in Chapter Five; it completes the transformation demonstration of the thesis.

In the course of developing an application, it may be necessary to consider, in addition to the particular selection of substantive law, the eight Stage 1 sub-epistemologies of legal experts, as listed above, which go beyond the representation of the law to the use of that knowledge in the provision of legal services; legal experts use a hybrid of pure and practical reasoning that is a system of epistemological segments, including a litigation sub-epistemology of how court orders are obtained.

The eight segments of the legal practitioner's sub-epistemologies may overlap and interact. For instance, law such as the Vienna Convention, produced by law-making authority, is enforced through litigation, and, in commercial practice, legal strategies may be developed to take advantage of the benefits offered by the law, as well as to minimise any costs. In the process of litigation, evidence is required to prove a case for the court orders sought. It is the answers to questions that are put to a witness that prove the material facts to establish applicable rules of law and deductive reasoning. Questions must be designed accordingly, subject to the constraints of the law of evidence; ultimately the open-textured antecedents are established by questions, not definitions, but questions must be based on definitions and inductive instances of material facts. Nodes in the River indicate questions, so that the River map is really a map of the questions, as well as of the formalised rules.

In the Stage 5 prototype, the use of litigation practicalities is added to the substantive provisions of the Vienna Convention, as a matter of legal strategy associated with the Vienna Convention law. Recourse to sub-epistemologies of legal practitioners is a further retroduction to Stage 1 that is required in Stage 5.

Many of the epistemological problems raised in philosophy are settled in the system of the legal practitioner's sub-epistemologies. To some extent, legal process provides epistemological certainty. For instance, in litigation, every witness gives evidence under oath or affirmation of truth. Evidence so given may be accepted or rejected, depending on whether or not the witness is believed. Where necessary, conflicts in the evidence of witnesses are resolved as issues of fact. In a jury trial, the final verdict of guilty or not guilty obviates the need to expressly show findings of fact and justify these findings. It is possible for jurors to arrive at a common verdict by different processes of reasoning, and the law allows for this diversity in the evaluation of evidence. In a non-jury trial, the judiciary makes findings of material facts (Curzon, 1979, p.244), by reference to the evidence, and whether or not witnesses are believed. It is not necessary for the judiciary to claim that these facts are true in any absolute sense; truth is determined relatively, according to the available evidence. Usually, findings of fact may be challenged on appeal.

Whether or not there is a jury, findings must be based on admissible evidence. The body of rules of evidence limits the admissibility of evidence to ensure that empirical and expert witness standards are met. This thesis is not concerned with legal domain epistemology insofar as it deals with evidence. Once the necessary findings of material facts are made, a process of legal reasoning determines the consequences of those material facts; it is this process of reasoning with which this thesis is primarily concerned.

Apart from the practitioner's eight sub-epistemologies, influential works of jurists may be taken into account in formulating the ontology and epistemology of an application. Allen (1982; Allen and Saxon, 1986) and Susskind (1987) were concerned with these sorts of structures in the development of a legal expert system. Sometimes black letter law adopts a juristic structure, such as the Hohfeld categories of jural relations, and sometimes such categories may be useful in formulating the substantive ontology or epistemology of an application. These juristic considerations are taken into account in the schematisation of the Stage 5 prototype application in Chapter Five.

The Stage 5 Vienna Convention application prototype is not complete. Chapter Five sets out the further research required to complete eGanges applications. This entails the development of another part of the generic application epistemology, namely the epistemology of interrogation. The eGanges interrogation facility allows questions to be asked in natural language and the significance of answers to be adjusted to suit the natural language. However, the question/answer logic of epistemological philosophy, such as that of Collingwood (1940), in legal knowledge engineering methodology, requires reconciliation with the law of evidence and with the expertise of client interviewing and taking proofs of evidence; as this would be a major undertaking in itself, it is outside the scope of the doctoral work. Further, interrogation epistemology also might make provision for the burden of proof.

The jurisprudential particularisation of interrogation epistemology, and the further enhancements of the shell suggested in Chapter Five, are not necessary for demonstration of the specific meta-epistemological method and its effectiveness; however, identification of the further developments and how they might be introduced as extensions to the eGanges shell, provides clarification of further areas of legal knowledge engineering methodology, in accordance with the objective of the doctoral thesis.

All of the prototypes result from an implementation of the specific meta-epistemological method. Other implementations might be possible and might be better. The thesis objective of clarification and development of certain areas of legal knowledge engineering is reached by particularising the selected four meta-epistemological stages, Stages 1, 2, 3 and 5, through the prototypical demonstrations.

1.3.2 Teleological epistemology

Available goals in a system of rules of law limit goals that are available in a legal expert system or for a client. Like a legal practitioner, a legal knowledge engineer must obtain instructions on the purposes and requirements for a legal expert system before designing and constructing an eGanges application. Thus, a further step in knowledge engineering is required for legal knowledge engineering methodology, namely, the preliminary step of taking or negotiating instructions on what is required by the client and users, prior to commencing the acquisition of substantive knowledge for an application.

The specific meta-epistemological method is an effective method for epistemologically sound legal knowledge engineering, that is suitable for large scale systems development, because it provides the scope and focus necessary for the discovery and implementation of the deep structure of potential purposes of legal expertise as artificial legal intelligence.

Within the scope of the legal expert system purpose, a user's case must be identified in the range of the ontology of legal possibilities, in order to apply the rule structure in the relevant part of the legal ontology. If a client is seeking a certain legal consequent, advice is required on how it is possible to achieve that; sound legal epistemology is a matter of possibilities, that are encompassed by law, and that are also given legal consequents according to the rules of law that structure them, in relation to client purposes. There is a teleological duality in legal knowledge engineering: both the system purpose and, within that, the user purpose, must be accommodated.

Epistemology was entwined with teleology from its outset. The philosophical concept of epistemology originated in Greece in the sixth century B.C., following the early science of Thales, that employed teleological epistemology. Thales used the scientific methods of speculation, observation, theorisation, and systematisation as epistemology; for instance, he observed the relative locations of objects and formulated geometric generalisations as theory for systematically calculating distances that were not known, from distances that were known. This was useful for commercial shipping; Thales was an entrepreneur in the Eastern Mediterranean Greek city of Miletus, now in Turkey. He developed a speculative ontology of the changing flow of primary natural substance, probably based on the new social paradigm of the flow of state currency in trade; it was modified in different ways by the further speculation of his followers, Anaximander, who drew the first map, Anaximenes, Pythagoras and Heraclitus.

Scientific methods are the epistemology of science; they are used to convert the ontological statements of scientific hypotheses to knowledge. Pythagoras conceived atomism, and the Logos was formulated by Heraclitus (c.540-475 BC) to incorporate the ordering plan or formulae for unity. According to Heraclitus, opposites could be combined harmoniously by the ordering principle of the Logos. A contemporary of Heraclitus, Xenophanes (c.570-480 B.C.), posed the first epistemological questions (Everson, 1990, p.6), for example: what is knowledge and how do we get it?

Logic, the tool of theorisation, became a major epistemology for scientists, philosophers and lawyers. It was a pupil of Xenophanes, Parmenides (c.540-480 BC), who was regarded by Russell (p.66) as the inventor of logic. Grant (1989, p.48) regarded Empedocles (c.493-422 BC) as the inventor of rhetoric, because he posed a plurality of alternative realities. Empedocles was able to devise scientific epistemology to demonstrate the validity of an hypothesis by experimentation; for instance, he showed the existence of air in a tube by holding his finger over it and immersing it in water. Other remarkable hypotheses of Empedocles were that light travelled so fast that it could not be observed, and that only the fittest combinations of atoms survived in the evolutionary process. The relationship of rhetoric and epistemology at this time was summarised by the candidate (Gray, 1997, p.87):

Words recorded observed phenomena. Meanings of words continued to exist irrespective of the changes in the physical world which they could represent. Recollections provided an ongoing existence for past events. Thus existence was to be found in both thought and also in the study of meaning in language and logic. Since appearances were not reliable indicators of harmony, one appearance could be as valid as another. Reasoning skills could be used validly to create any appearance.

The multiple possible valid and invalid arguments of rhetoric allowed the corruption of truth. Plato (429-347 BC) answered this problem by posing true or perfect forms, as distinct from appearances. The Platonic forms provided a basis for the true form of logic of Aristotle (384-322 BC), a student of Plato, particularly in his Organon, which introduced the formalisation of paradigms of valid and invalid arguments, respectively his syllogisms and sophistical fallacies. Truth was a stable reference point for the systematisation of logic. It was assumed that once an ontological statement was proven to be true, it was transformed into knowledge. Logicians provided proofs for extending knowledge.

Plato's forms were posed as static absolutes, whereas Aristotelian logic forms were dynamic, transforming ontologies to knowledge and further knowledge to ontologies. In the legal domain, some ontologies, such as property interests, are static, while others, influenced by the twentieth century behaviourist school (Watson, 1919), such as contractual performance, negligence, and dangerous driving, are dynamic. Law uses static and dynamic legal ontologies, statically and dynamically.

The 'truth table' of legal epistemology is teleological; in it, pure and practical reasoning are simultaneous and continuous. The law is both descriptive and prescriptive. Pure reasoning in the legal domain is concerned with valid legal arguments that arrive at a valid conclusion by categorisation; practical reasoning in the legal domain treats a valid legal argument as the pathway of necessary and sufficient conditions required to reach an objective. In logic, pure reasoning is regarded as predicate logic, which is descriptive, and practical reasoning is accommodated by propositional logic, which is prescriptive. Truth is conceived differently for pure and practical reasoning; pure reasoning is concerned with the truth of a description and true ontology, whereas practical reasoning is concerned with the validity of a predicted outcome and with whether the rule has worked consistently to reach the objective. However, a predicted outcome is described, and a described possible ontology may be predictable. The interchangeability of these forms of logic indicates a versatile truth that is multi-focused, to suit variable user foci; it may be an absolute, constant truth, like the speed of light, that stabilises different perspectives. In Chapter Four, the interchangeability of pure and practical reasoning is illustrated as truth transformations, by reference to the use of Porphyry's taxonomy trees, decision trees and eGanges Rivers in legal knowledge engineering design method.
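The contrast drawn above, between pure reasoning as categorisation and practical reasoning as a pathway of conditions towards an objective, can be sketched computationally. The following is a minimal illustration only, assuming a single conjunctive rule; the rule and its antecedent names (`vacant_possession` and so on) are invented for this example and are not drawn from the thesis prototypes.

```python
# Hypothetical rule: the antecedents are necessary and jointly sufficient
# for the consequent. All names here are invented for illustration.
RULE = ({"vacant_possession", "good_title", "price_paid"}, "completion")

def categorise(facts, rule):
    """Pure reasoning: does this case fall within the rule's category?
    Returns the consequent if every antecedent is satisfied, else None."""
    antecedents, consequent = rule
    return consequent if antecedents <= facts else None

def pathway(facts, rule):
    """Practical reasoning: which conditions remain on the pathway
    to the client's objective?"""
    antecedents, _ = rule
    return antecedents - facts

facts = {"vacant_possession", "good_title"}
print(categorise(facts, RULE))   # the category is not yet satisfied
print(pathway(facts, RULE))      # the condition still to be met
```

The same rule thus serves both a descriptive question (is this a case of completion?) and a prescriptive one (what must the client still do?), which is the interchangeability the text describes.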

Waller (1995, pp.170-1), in a first-year law text, acknowledged the interchangeability of pure and practical reasoning in deductive legal reasoning. However, the interchangeability must be validated by law-making authorities in the legal domain.

Truth tables in logic have regard to the truth of premises. The truth of the rules of law as Major deductive premises is better understood in contrast to the truth of scientific knowledge. Sound scientific epistemology, or method, initially produces scientific knowledge from hypothetical ontologies that may or may not be possible; thereby ontologies may be proven as possible or not possible. Once knowledge statements are established, further epistemology may process this information to derive further knowledge for conversion into ontological statements that are true, being indirectly proven, or into further hypotheses for testing. Scientific expertise assumes the knowledge that arises from ontology established in science as true or proven, and applies it in the real world. A teleological epistemology is required for the application of scientific expertise to the real world; depending on purpose, scientists must know what knowledge to apply, and how to apply it with skill.

In the legal domain, epistemology is conceived differently from science; truth rests on law-making power. Prima facie, law is laid down by law-making authorities as legal knowledge, which assumes legal ontologies. Authority displaces the scientific requirement for proof of a law; this is not to say there are no scientific authorities, but scientific authorities are learned in what has been proven. Proof of authority in the legal domain prima facie establishes the validity of law. Furthermore, as observed by Waller (1995, p.50):

Truth for many legal purposes may be prescribed truth and not discovered truth.

However, the authority for law is not the authority for the range of possibilities that may be derived from the content of that law, the validity of logical interpretations, or the validity of logical interpretations for various client purposes; the range of possibilities and the validity of logical interpretations rest on sound legal epistemology, which is teleological because it has rule structures involving consequents that may be adopted as client goals. The 'table' of valid legal arguments in law is used for practical purposes. The logical extensions of rules in legal epistemology fuse, seamlessly, the truth of authority and the logical completeness of truth in both the pure and practical reasoning of the legal domain.

The determination of particular ontologies of legal possibilities begins with positivist rules expressed by law-making authorities as black letter law; these limited possibilities are then extended by the logical semantics of legal epistemology, to cover the whole field of meaning in its adversarial dimensions. The whole field of meaning requires an expression or account not just of black letter rules, but also of their adversarial contradictories, and of their significance in the reality of uncertainties and the burden of proof in litigation. Logical conjunctions and disjunctions add complex choice to the adversarial rule structure, so that special heuristics are required to exhaust alternatives before arriving at a final consequent. Teleological epistemology derives the full extent of the ontology of legal possibilities implicit in black letter law, extends the truth of express law to its logical completions in contradictories and uncertainties, and then offers the various pathways to the various final outcomes of the total deductive scheme of the ontology of legal possibilities.
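The adversarial rule structure described above, in which each antecedent may be proven, contradicted, or left uncertain under the burden of proof, and antecedents are combined by conjunction and disjunction before a final consequent is reached, can be illustrated by a minimal three-valued evaluator. This is a hedged sketch, not the eGanges implementation; the names `Truth`, `conjoin` and `disjoin` are assumptions of this example.

```python
from enum import Enum

class Truth(Enum):
    PROVEN = 1      # antecedent established by the party bearing the onus
    NEGATED = 2     # the adversarial contradictory is established
    UNCERTAIN = 3   # neither side has discharged the burden of proof

def conjoin(values):
    """A conjunctive rule fails on any negated antecedent, succeeds only
    when every antecedent is proven, and is otherwise uncertain."""
    if any(v is Truth.NEGATED for v in values):
        return Truth.NEGATED
    if all(v is Truth.PROVEN for v in values):
        return Truth.PROVEN
    return Truth.UNCERTAIN

def disjoin(values):
    """A disjunctive choice succeeds on any proven alternative and fails
    only when every alternative has been exhausted and negated."""
    if any(v is Truth.PROVEN for v in values):
        return Truth.PROVEN
    if all(v is Truth.NEGATED for v in values):
        return Truth.NEGATED
    return Truth.UNCERTAIN
```

On this sketch, a disjunction cannot be finally negated until every alternative is exhausted, which is the heuristic role the text assigns to exhausting alternatives before a final consequent is reached.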

For the construction of legal expert systems, extensive prior analytics of the black letter rules of law are required. The whole field of meaning must be given an epistemological structure that permits application of all the relevant adversarial rules by extended deduction, as it is expanded by inductive instances, including iterative and analogous instances, and further expanded by strong or weak abductive support.

Through its specific meta-epistemological methodology, this thesis provides the nested logic scaffolds, the tributary paradigm of legal epistemology, for the prior analytics required for an epistemologically sound legal expert system. Large scale systems are accommodated by this methodology. Moreover, in accordance with that part of legal epistemology that is concerned with cognition, communication and justice, the prototypes ultimately provide a means of producing large scale legal expert systems that require minimal computing expertise and are user-friendly for both legal experts and their clients. The prototype shell, eGanges, serves teleological epistemology.

1.3.3 Technological epistemology

Computers and their software exist; they have been created by scientists according to the constraints of computer epistemology; computing knowledge, its structures and processes constitute the epistemology of computer experts. Legal knowledge engineering must take account of computer epistemology if it is to produce large scale legal expert systems as viable software. Thus, legal expert epistemology must be transformed to accord with computer epistemology; this is a transformation from one epistemology to another epistemology, and thus it falls into the realm of meta-epistemology.

This specific meta-epistemological method transcends the transformation from ontology to knowledge and vice versa. The transformation it carries out may originate in the teleological epistemology of the legal domain, but the method of application of law to the real world is via technology. A fourth new area of epistemology, requiring meta-epistemology, is:

(4) the application of knowledge to the real world for some practical purpose via technology.

The specific meta-epistemological method, posed as legal knowledge engineering methodology in this thesis, provides for the epistemological transformations required to produce large scale legal expert systems. The logic refinements of the prototype computational epistemology of 3d legal logic, which capture the ontology of legal possibilities and the teleological epistemology of the legal domain, facilitate the meta-epistemological transformation, since computer epistemology is intrinsically logical.

However, technological epistemology for epistemologically sound large scale legal expert systems is also constrained by the domain epistemology requirements. For instance, by the twentieth century, reasoning backward from, and forward to, a proposition that is regarded as true as a matter of scientific fact was an established epistemology (Ritchie, 1923, p.10).

In the legal domain, it might be possible to chain backwards and forwards along the extended deductive rules of law, to learn the system of rules; this free navigation is offered in eGanges. However, legal reasoning only proceeds in the direction of the inference arrow, forward, and this constrains technological epistemology in legal knowledge engineering methodology.
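The constraint that inference proceeds only in the direction of the arrow can be sketched as simple forward chaining. The sketch below is a hypothetical illustration; the rule format and the fact names (`offer`, `acceptance` and so on) are invented for the example and do not reproduce the eGanges processing.

```python
# Minimal forward-chaining sketch: inference runs only in the direction
# of the inference arrow, from established antecedents to consequents.
# Hypothetical rule format: (frozenset_of_antecedents, consequent).

def forward_chain(facts, rules):
    """Repeatedly fire any rule whose antecedents are all established,
    adding its consequent, until no further inference is possible."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for antecedents, consequent in rules:
            if antecedents <= facts and consequent not in facts:
                facts.add(consequent)
                changed = True
    return facts

rules = [
    (frozenset({"offer", "acceptance"}), "agreement"),
    (frozenset({"agreement", "consideration"}), "contract"),
]
derived = forward_chain({"offer", "acceptance", "consideration"}, rules)
# "agreement" and then "contract" are added to the derived facts
```

Browsing the rule base in either direction, as eGanges permits for learning the system of rules, is a matter of navigation; the derivation of consequents itself remains one-directional, as above.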

The synthesis that is developed for the prototypes is processed through the constraints of all stages of the specific meta-epistemological method. The holistic complex logic system established in this thesis incorporates the ontology of legal possibilities, rule reasoning as extended deduction, case instances as inductive spectra, abductive reasons for a rule or part thereof, user-friendliness, including the transparency that is required to sustain justice, ad hoc links between antecedents and between parallel sub-systems of rules, and links to any relevant information in databases, websites or other programs.

Development of sound computational legal epistemology, shell epistemology and application epistemology has permitted the development of jurisprudence for legal knowledge engineering to suit technological epistemology. Although epistemology is not used in the legal domain in the same way as it is in the physical sciences, the study of computational legal epistemology is a matter of applied legal science. The jurisprudence of legal knowledge engineering incorporates this applied legal science.

2 CHAPTER TWO:

ARTIFICIAL LEGAL INTELLIGENCE

AND META-EPISTEMOLOGY

2.1 PROBLEM OF ARTIFICIAL LEGAL INTELLIGENCE

The metaphorical chair of artificial legal intelligence may be implemented as real technology by the specific meta-epistemological method posed in Chapter One. The previous postgraduate work of the candidate, which explored legal intelligence and artificial legal intelligence, did not arrive at the metaphorical chair of artificial legal intelligence. The specification of the problem for the specific meta-epistemological method in this thesis produced the metaphor.

2.2 EXTENSIONS OF MASTER’S WORK

The doctoral work significantly extends the Master’s work of the candidate. The revised version of the original Master’s thesis (1990) that was published as a book (Gray, 1997) further developed the Master’s thesis, but is treated as part of the Master’s work in considering how the doctoral thesis extends the Master’s work. The doctoral extensions amount to the development of the meta-epistemological methodology and the experimental prototypes that illustrate it. These developments extend the Master’s work in the following ways:

2.2.1 Legal intelligence and legal epistemology

The revised Master’s thesis (1997) began with the concept of artificial legal intelligence which was defined as:

...the computer simulation of any of the theoretical and practical forms of legal reasoning, or the computer simulation of legal services involving the communication of legal intelligence (Gray 1997, p.3).

Largely, the Master’s work explored the notion of legal intelligence and how it manifests in the field of legal knowledge engineering. The doctoral work begins with the problem of developing legal knowledge engineering methodology for large scale legal expert systems; it poses legal expert epistemology as the intelligence to be specified for computation. A specific meta-epistemological method is developed to transform legal domain epistemology from one prototype to the next until a Vienna Convention application can be demonstrated.

2.2.2 Collective legal intelligence and individual legal expert epistemology

The Master’s work established the concept of the collective legal intelligence, the body of past and present legal expertise, as the subject to be automated by legal knowledge engineering. The collective legal intelligence might be coherent or disparate; it might be evolving from disparate chaos to a coherent system.

The doctoral thesis recognises that individual legal epistemologies are relativistic within the collective legal intelligence: past and present legal experts, who are members of the collective, contribute their own expert epistemologies to the collective legal intelligence. The body of expertise includes all of these epistemologies; legal expertise is one, more, or all of these epistemologies. The doctoral work does not seek to reconcile differences in the collective of legal epistemologies in order to produce for automation a legal epistemology common to all present legal experts; it allows any computational discordance in the relative legal intelligence to remain in the collective.

As the basis for the doctoral automation, the fully-formed individual expert epistemology of the candidate is used in the knowledge acquisition phases of the specific meta-epistemological method for construction of a large-scale legal expert system. Inter alia, the ontological investigation in the Master’s work was taken into account in the formulation of the candidate’s legal expert epistemology.

2.2.3 Human intelligence and computational legal intelligence

A consideration in the Master’s work was the span of early human intelligence from the digital artefacts of money in seventh century B.C. Turkey and the digital conception of Pythagorean atomic and mathematical theory, to pre-Socratic conceptions of law and maps, to Platonic forms, to pre-Socratic mystical notions of the realms of thought as infinite, eternal, indefinite, undifferentiated and indeterminate. It was Anaxagoras (c.500-428 B.C.), building on earlier Greek conceptions, who called the newly found boundless intelligence the nous, an autonomous controlling force, adapted more recently by Teilhard de Chardin (1955) as the noosphere. Huxley (1974) added the mystical uncertainties of Darwin’s evolution by Natural Selection to the scope of human intelligence. The indeterminate aspects of human intelligence, including the indeterminate aspects of legal intelligence, limit its computer simulation. Rather than seeking a simulation of the indeterminate, the doctoral work isolates what, in the span of legal intelligence, is computational legal epistemology for a smart aid.

The Master’s work was also concerned with the history of epistemological phenomena as human intelligence, especially those phenomena that were used or could be used in the legal domain, from the early development of ancient Greek rhetoric and logic to the recent casuistry of the English legal system and systems science.

The doctoral work is concerned to identify computational epistemology, howsoever it has arisen in history, to maximise the sector of contemporary legal intelligence that can be automated. Primarily, legal domain epistemology includes knowledge of black letter law and how it is processed to provide legal services; for this reason, the data of black letter legal information, that is, the expert knowledge of the rules of law and their inherent computational epistemology, provides the foundation for the methodology developed in this thesis. It is the extensive and complex black letter law which requires large scale legal expert systems, so that, for large scale methodology, maximisation of automation must be derived from black letter law systems and their epistemology. Legal expertise is identified as a system that has attributes of the computational epistemology of logic. This approach is adopted as a basis from which to specify computational legal epistemology for large scale systems and maximum automation.

An indeterminate human intelligence can be captured in the meaning of digital symbols, and their digital processing. The full span of legal intelligence, including the mystical infinite of human intelligence, is not excluded by the doctoral thesis, but a computational part of legal intelligence is identified and a method for automating it is specified as a framework to funnel fuller spans of meaning. In legal knowledge engineering, technological epistemology uses digitalised symbols, to the extent permitted by legal expert epistemology, and digitally processes those symbols according to the processing of them permitted by legal expert epistemology; this is a digitalisation of meaning as units for the computation of meaning. The expert epistemology and the technological epistemology contain constraints on each other in the processing of meaning. Despite these constraints, meaning may be mystically infinite in an eGanges application if that is the content of its digital symbols of legal information or the significance of their constrained processing.

2.2.4 Deconstruction and reconstruction of legal intelligence

The emphasis in the Master’s work was on determining the components of legal intelligence and artificial legal intelligence, as resources for producing legal expert systems; components were seen to be paradigmatic. Many paradigms of legal intelligence were set out in the Master’s work as components of the collective legal intelligence, potentially for automation. Paradigms of legal intelligence included paradigms of human intelligence, and were shown to emerge over the centuries since the time of primitive legal systems. At the time that the original Master’s thesis was conceived, Kuhn (1962, 1970) had shown that a new paradigm could produce a revolution in a field of expertise. The Master’s study was made in the spirit described by Tarnas (1993, p.397):

The prevalence of the Kuhnian concept of "paradigms" in current discourse is highly characteristic of postmodern thought, reflecting a critical awareness of the mind's fundamentally interpretive nature. This awareness has not only affected the postmodern approach to past cultural world views and the history of changing scientific theories, but has also influenced the postmodern self-understanding itself, encouraging a more sympathetic attitude toward repressed or unorthodox perspectives and a more self-critical view of currently established ones.

The revised Master’s work was divided into three Parts. Part 1 traced the history of artificial intelligence and law as a study of paradigms of jurisprudential systems containing legal choice. Choice was seen as a key to automation; a program might make choices with or without user instructions. Freedom, which is a fundamental premise in the legal system, might be expanded or contracted by legal choice, in a world of natural relative freedom; one person’s freedom secured by law brings legal constraints on that person as well as on others. These jural relations were systematised by Hohfeld (1913); the importance of Hohfeld’s system of jural relations in the design of an expert system was posed by Allen and Saxon (1986).

Then, in Part 2 of the revised Master’s work, for the purposes of deconstructing legal intelligence with a view to further development of artificial legal intelligence through reconstruction, phenomena of legal intelligence were set out as macroparadigms, cyclic paradigms and microparadigms. Part 3 developed new paradigms for the field of artificial legal intelligence: technological jurisprudence, jurisprudential systems science, 3d legal logic, the science of legal choice, and designer legal intelligence. Paradigms from Part 1 and Part 2 were also used in Part 3 by way of reconstruction, as content and as context for the new paradigms.

The doctoral work continues the reconstruction with further new paradigms for legal knowledge engineering methodology: the meta-epistemological methodology and the prototypes used in its demonstration, including the synthetic paradigms of the computational epistemology of 3d legal logic and eGanges. These further new paradigms accommodate and build on many paradigms of legal intelligence identified in the Master’s work. Some further work that could be done following the doctoral dissertation is also identified. The new paradigms allow a specification of legal choice in terms of logic, and the supplementation of logic by antecedent choice points that are not otherwise named but which provide necessary links for extended deduction; thus the doctoral work further advances the science of legal choice established by the Master’s thesis. Choice points give effect to disjunction by bringing it into an extended deductive scheme. They also account for missing language – unlabelled necessary choice points. Lawyers deal with these unnamed points by asking a question. The science of legal choice is epistemology suited to the teleological epistemology of legal knowledge engineering technology.

Whereas Kuhn was concerned with paradigms, Foucault was concerned with epistemes. There are various strands of epistemology that may be referred to as paradigms, and the use of paradigms may be an epistemology. Foucault’s epistemes are paradigms of epistemology. The concept of paradigm is common to ontology and epistemology. Comparing the paradigms of Kuhn and the epistemes of Foucault, they are similar concepts used in different ways, by reference to the main arguments of each author. Largely, Kuhn was concerned that entrenched scientific paradigms unduly restricted the development of scientific ontology; Foucault was largely concerned that entrenched epistemological paradigms unduly restricted the development of understanding. The candidate’s Master’s thesis used the concept of the paradigms of legal intelligence to deconstruct legal intelligence and construct a computational version of it. The doctoral work identifies epistemological paradigms in the Master’s deconstruction for use in the new episteme of the specific meta-epistemology, to meet the standard of epistemological soundness in the doctoral automation.

2.2.5 Reconstructed legal intelligence and intellectual artefacts

Whereas the Master’s thesis was concerned to deconstruct the collective legal intelligence to arrive at a jurisprudential system of legal choice that could be automated, the doctoral thesis is concerned with the paradigms of artificial legal intelligence itself, both in the course of legal knowledge engineering and when there has been a reconstruction in the form of a legal expert system with the attributes of a jurisprudential system of legal choice. Artificial legal intelligence in its eGanges forms has intellectual artefacts that amount to an interactive visualisation of a system of legal choice; these phenomena take their place as new content of the collective legal intelligence that can be evaluated as such by reference to the prevailing standards and requirements of the collective legal intelligence.

2.2.6 Jurisprudential systems of legal choice and their transformation

The Master’s work used the two paradigms of jurisprudential systems and the science of legal choice as themes in a deconstruction of legal intelligence for its reconstruction as artificial legal intelligence. Computerisation of any of the theoretical and practical forms of legal reasoning, or of legal services involving the communication of legal intelligence, requires, first, a determination of the relevant jurisprudential system and the legal choices that it contains. This was advanced by a process of some deconstruction of the legal domain and some reconstruction of it in the form of 3d legal logic as a jurisprudential system of legal choice. Human intelligence is largely a matter of determining choice and making informed selections; legal intelligence is concerned with legal choice and informed selection.

Deconstruction and reconstruction of legal expertise were used in the Master’s work as the knowledge engineering methods for knowledge acquisition and representation of legal choices for informed selection. The deconstruction of legal intelligence was a form of systems analysis and the reconstruction involved a determination of the system of legal choice, as clarification of the inherent design in legal intelligence.

The doctoral work recognises that the jurisprudential system of 3d legal logic, which is a paradigm for the formalisation and representation of legal choices, is also a computational epistemology; accordingly, in the doctoral work, 3d legal logic is developed as an epistemology by identifying its structures and processing in terms of logic. Thus the science of legal choice posed in the Master’s work is further developed in the doctoral work by the precise logic descriptions of legal choice; the doctoral work founds the logic of legal choice. The resulting computational epistemology of 3d legal logic is seen as the completion of the transformation of a core of legal domain epistemology that is suitable for computation. The process of deconstruction to extract, for computational reconstruction, a core of the legal domain epistemology, amounts to an identification of the entity for metamorphosis or transformation from one form of legal epistemology to another form of computational legal epistemology. Systems analysis is required to preserve the core jurisprudential system of legal choice in its metamorphosis to meet different constraints; by this systems analysis, deconstruction and reconstruction are managed.

Next, the doctoral work recognises that further transformations of epistemology are required to incorporate more of the legal domain and produce actual legal expert systems in the various fields of law. Thus the doctoral study of technological method in legal knowledge engineering is also an epistemological study of epistemologies; this is the realm of meta-epistemology, where the relative nature of different epistemologies and their consistencies are accommodated. Since computer technology and the legal domain have their own established epistemological constraints, in relation to each other, they require meta-epistemological study. Legal method has not before been subjected to such a comparative meta-epistemological study, whereby it is treated as expert epistemology and subjected to epistemological analysis as such, so that it can be transformed to a computational epistemology, then a program epistemology, a programming epistemology, and an application epistemology.

The theoretical and practical forms of legal reasoning, or of legal services involving the communication of legal intelligence, can be computerised through meta-epistemological methodology that incorporates a determination of the constant jurisprudential system and the legal choices that it contains. In the sequential stages of the specific meta-epistemological method, the jurisprudential system of legal choice is transformed variously to comply with the different constraints in each stage of the method; there is a schematisation with various forms of epistemology, each incorporating its own version of the constant jurisprudential system with its legal choices. The specific meta-epistemological method sustains a continuity of system identity in the transformations. What was used as deconstruction and reconstruction themes in the Master’s work, namely the jurisprudential system of legal choice, is used as essential constancy in the specific meta-epistemological method of the doctoral work.

2.2.7 Jurisprudential systems science and specific meta-epistemological method

The Master’s work employed systems theory, analysis and design in the specification of paradigms as elements or sub-systems of legal intelligence; it also developed jurisprudential systems theory, analysis and design as jurisprudential systems science for legal knowledge engineering. In particular, the Master’s work explained the development of the jurisprudential system of 3d legal logic as legal knowledge engineering methodology for representing legal choices in a formalised pattern. 3d legal logic was also posed as a basis for designing standardised processing of legal information to communicate to users the theoretical and practical forms of legal reasoning that amount to legal services. The jurisprudential system of 3d legal logic was shown to be a new paradigm with potential for the precision of reasoning required for automation.

In the doctoral work, paradigms of legal intelligence in the Master’s work are located firstly in the legal domain epistemology, and then systematised in the subsequent transformation processes of the specific meta-epistemological method. In particular, the doctoral thesis further develops the Master’s paradigm of 3d legal logic as a prototype computational domain epistemology by the identification of its logic characteristics; logic is a major part of epistemology and readily computational. Further doctoral prototypes that also illustrate the specific meta-epistemological method use this computational epistemology of 3d legal logic; epistemological transformations of the computational epistemology of 3d legal logic that ultimately produce a new genre of smart computer aids for the legal domain are explained.

The doctoral work places systems theory, analysis and design into the framework of epistemology, to further develop legal knowledge engineering methodology; jurisprudential systems science and the science of legal choice are incorporated in the specific meta-epistemological method.

The Master’s paradigms might still be useful in the particularisation of other epistemologies in the collective legal intelligence, and in the further enhancement and particularisation of the thesis prototypes and eGanges applications, as discussed in Chapter Four and Chapter Five.

2.2.8 Several and integrated knowledge engineering methodologies

It was acknowledged in the Master’s work that knowledge engineering methodology was used in the legal domain. Although there was an exploration into the use of systems science to develop additional legal knowledge engineering methodology, the Master’s work did not integrate knowledge engineering methodology and jurisprudential systems science. The doctoral thesis provides this rationalisation and at the same time integrates knowledge engineering methodology, jurisprudential systems science and meta-epistemology, as large scale legal knowledge engineering methodology. In particular, the specific meta-epistemological method is shown to be a system for legal knowledge acquisition, legal knowledge representation, and system design in legal knowledge engineering. As legal knowledge engineering methodology, the integrated method also provides, through the specific meta-epistemological method, for the transformations required in the process of constructing small or large legal expert systems.

2.2.9 Paradigms of legal intelligence and logic

The Master’s work developed the theory of 3d legal logic as a paradigm of legal intelligence that emerged when computational constraints were applied to legal expertise; the paradigm was not identified as epistemological, as the focus of the Master’s study was ontological. In the candidate’s revised Master’s thesis (Gray, 1997, pp.174-5), epistemology was explained as follows:

The ontologies and paradigms that constitute scientific knowledge, are established through scientific methodology. ...

Scientific knowledge is the end result of an epistemologically sound method for attaining knowledge. The term, epistemology, has Greek origins associated with the study of what knowledge is and how people can know something.

Eight categories of scientific method were listed in the Master’s work: (1) observation, (2) categorisation, (3) quantification, (4) theorisation, (5) systematisation, (6) speculation, (7) experimentation and (8) instrumentation (Gray, 1997, p.174). An epistemology may use one or more of these methods at a conceptual level; logic may infiltrate any of the eight methods, which are not mutually exclusive, but logic is primarily a theorisation method.

The Master’s work suggested that paradigms may be useful as pro tem ontology, and also as methodology; they allow the determination of what is to be categorised as ontology and what is to be categorised as methodology, and thus as epistemology. In the doctoral work the ontological and methodological paradigm of 3d legal logic is further developed as a computational legal epistemology, so that the reified three-dimensional structure, which is ontological, is given descriptions from the field of logic, which is methodological.

The doctoral work draws on the paradigms of logic, as paradigms of human intelligence, to develop the computational legal epistemology of 3d legal logic as an epistemology with the three types of logic, namely, deduction, induction and abduction. From the wording of the rules of law, Major deductive premises are wholly formalised as a system of conditional propositions. Each Major deductive premise is treated as a line with antecedent nodes concluding in an inference arrow followed by a final consequent node; thus the premise has a reified representation that can be visualised like Australian Aboriginal message sticks with dots and arrows.
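The node-and-arrow form of a Major deductive premise lends itself to a simple data structure. The following is a minimal sketch, assuming an invented contract-law premise for illustration; it is not the eGanges implementation:

```python
# Hedged sketch: a Major deductive premise as a line of antecedent nodes
# concluding, via an inference arrow, in a final consequent node.
# The contract-law wording is an invented illustration, not thesis content.
from dataclasses import dataclass

@dataclass
class Premise:
    antecedents: list[str]  # condition nodes, read along the line
    consequent: str         # final consequent node

    def __str__(self) -> str:
        # Render like a message stick: dots between nodes, then the arrow.
        return " . ".join(self.antecedents) + " -> " + self.consequent

premise = Premise(
    antecedents=["offer", "acceptance", "consideration",
                 "intention to create legal relations"],
    consequent="contract formed",
)
print(premise)
# offer . acceptance . consideration . intention to create legal relations -> contract formed
```

A premise rendered this way makes the extended deduction visible: each dot is a node that must be established before the arrow can be traversed to the consequent.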

In developing the doctoral epistemology of 3d legal logic, the distinction is drawn between legal domain epistemology and computational domain epistemology; this provides new dimensions for epistemology that make the philosophical concept useful for legal knowledge engineering.

Further, in the doctoral work, a generic domain epistemology for a shell is distinguished from application epistemology that is required for the creation of specific applications. Every application requires (1) applicable generic domain epistemology that is captured in the shell, (2) other generic domain epistemology that is not captured in the shell but is captured in a specific application, and (3) specific application epistemology that is exclusive to the specific application. These distinctions are explained in Chapter Four and Chapter Five.

By enhancing the Master’s theory of 3d legal logic as a computational epistemology, the doctoral work identifies more precisely the nature of legal expertise as a category of human intelligence; the same attributes might apply to other domains of expertise as well. Legal expertise is characterised by the knowledge of deductive, inductive and abductive premises in the legal domain, how they are related, and how they are applied to client cases. Moreover, those premises implicitly provide scope for possible cases, and control of combinatorial implosion and explosion, as well as control of conditions that are distinguished as neutral, that is as unnecessary and insufficient, in the process of establishing the legal consequent of a user’s situation.

The deductive and inductive premises that are known by legal experts as black letter law can be precisely formulated in an automatable scheme; for extended deduction processing, they are alternative, overlapping, hierarchical sets of necessary and sufficient conditions that may be interspersed irregularly with unnecessary and insufficient conditions. The abductive premises, such as reasons for the rules of law, may be located precisely in relation to the deductive and inductive premises of law, to illustrate the extent of the abductive support and/or critical evaluation of the deductive and inductive content. The expert knowledge of the division of deductive, inductive and abductive premises is set out in the legal domain as a matter of law-making authority; legal experts recognise the authoritative division and use it. It tells them what is to be enforced, how to argue for change, and how to suggest modification and adaptation, homeostasis and heterostasis of the system of rules.
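The division between necessary, sufficient and neutral (unnecessary and insufficient) conditions can be sketched as a toy evaluator. This is a hedged sketch under invented condition names; classifying a real rule’s conditions remains a matter for the legal expert:

```python
# Minimal sketch: establishing a consequent from necessary, sufficient and
# neutral conditions. All condition names here are illustrative assumptions.

def consequent_established(case_facts, necessary, sufficient, neutral=()):
    """True if any single sufficient condition holds, or if every
    necessary condition holds. Neutral conditions (unnecessary and
    insufficient) never affect the result."""
    if any(cond in case_facts for cond in sufficient):
        return True
    return bool(necessary) and all(cond in case_facts for cond in necessary)

facts = {"written notice given", "payment made", "notice sent by post"}
print(consequent_established(
    facts,
    necessary={"written notice given", "payment made"},
    sufficient={"court order obtained"},
    neutral={"notice sent by post"},
))  # True: all the necessary conditions are present
```

Note that the neutral condition is carried in the case facts but plays no part in the outcome, which is the sense of "unnecessary and insufficient" used above.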

The doctoral thesis redefines epistemology for legal knowledge engineering. It is not just methodology that establishes knowledge, but it is also the way of ascertaining new knowledge from available knowledge; legal expert epistemology is the method used in providing legal services. Legal experts provide for their clients, from their available expert knowledge and know-how, new knowledge about the legal status and consequents of the client’s situation, particularly with a view to assisting the client to plan actions. A meta-epistemological methodology is developed to transform legal domain epistemology to applications of a shell. Transformations of the logic in legal epistemology are required to permit human expertise to be computerised.

2.2.10 Rule-based legal intelligence and case epistemology

The development of 3d legal logic in the Master’s work emphasised rule-based reasoning. A similar emphasis is given in the refinement of 3d legal logic as a computational epistemology in the doctoral thesis. However, the doctoral thesis also explains case-based reasoning in the light of the eGanges shell epistemology. The doctoral work gives greater precision to the location of rule-based reasoning and case-based reasoning in the legal intelligence mosaic of hierarchical deduction, induction and abduction. Antecedent analogy, as a matter of one of several alternative instances in an authoritative inductive spectrum, is distinguished from hierarchical antecedent analogy, which is a matter of extended deduction of a case’s rule path through the sphere of possible valid arguments.

2.2.11 Survival metasystem and survival abduction

Apart from seeking to provide faster and cheaper access to legal services for a more effective legal system, the Master’s work was concerned to use the new technology to solve some of the problems of justice in the legal system. For instance, it was concerned to address a paradox of justice that is inevitable when a new case arises: it is unjust to create new liability for actions that the law did not curtail at the time they were done. The faster and cheaper access to existing law that the doctoral shell, eGanges, makes feasible deals with possible cases within the scope of known rules; this reduces the inherent injustice of new cases where they fall within the knowable combinatorial explosion of antecedents in the known rules. eGanges deals with possible cases that are known and knowable.

In the Master’s work, a jurisprudential system SURMET (SURvival METasystem), with survival goals that included quality of life, was posed as a framework for evaluating law, in order to anticipate its development, and for planning its use; SURMET is an evolved jurisprudential derivative of Kant’s psychological system, that incorporates some Darwinian tenets. In relation to eGanges, SURMET might be used as an application design aid, and as a source of abductive gloss commentary on the law. Through its gloss facilities, eGanges also provides for abductive anticipation of new cases that require further antecedents beyond those of the existing rules. If obiter dicta and persuasive rules are included in the rule system, then they may be annotated with that status; if not, binding rules may be annotated with authoritative obiter, persuasive rules, and other commentary and criticism. Thus, users may be given available evaluation of potential modification and adaptation of the authoritative premises of law; the inherent injustice of new cases can be minimised in a smart aid.

The doctoral thesis is concerned to produce a shell that allows SURMET information as commentary on the law, or SURMET premises that might be interwoven, as abductive premises, in the extended deduction of legal argument. The doctoral provision of the eGanges glossing facility recognises the importance of the annotation paradigm of the glossators of Bologna, from the eleventh century A.D., who studied the Roman codification by margin annotations of its provisions; the Master’s thesis considered the gloss as a paradigm of legal intelligence. The glosses produced in the new Law School at Bologna, following the Dark Ages, were sometimes cross-references, sometimes suggestions of inductive detail, and sometimes abductive by way of supportive or critical comment.

In SURMET, the legal system is treated as a sub-system of the human survival system that allows for a reconciliation of common, science-based, survival ethics or morality with the law. This evaluation framework is concerned to ensure an understanding, by users, of the direct and indirect human survival implications of the law; such an understanding may motivate better conformity to the law, improvement of it, and a cultural force vital for rational co-operation. To optimise user opportunities for this understanding, in the doctoral work, eGanges provides two-dimensional interactive visualisation that requires no more than urban street map intelligence, as a simplification of complex knowledge, and transparent processing that is apparent to the user in a way similar to a game. Conscious human selection that has logic structures can be integrated with evolutionary selection.

2.2.12 Science of legal choice and epistemology of legal choice

The Master’s work recognised that the technology of legal knowledge engineering required a specification of legal choices, and so too did lay users who were entitled to the benefits of the law through the choices it offers, as well as being entitled to know how to avoid its liabilities by making informed choices. In a book review of the revised thesis in the Harvard Journal of Law and Technology, Fall 1998, the science of legal choice was seen to be the major new contribution of the Master’s work. The science was developed in the Master’s work through the theory of 3d legal logic; the system of rules also defined the system of legal choices.

The doctoral thesis takes the science of legal choice, with its logic specifications, even further through the specific meta-epistemological methodology; systems of legal choice might be transformed in the development of a legal expert system. In the final development of the shell design, logic heuristics are reduced to simple computational processing of storing data, matching data, listing data, and reporting data. The generic framework of choice in the shell is distinguished from the substantive system of choice in an application. Further, the eGanges shell can accommodate large-scale systems of legal choice; the whole of the law’s choices and associated information for informed choice, could be codified for lay users. In the course of constructing the applications, choices become more clearly presented and understood; this also suits the purposes of law reform, especially where choices are complex and indirectly related. The Vienna Convention prototype in Chapter Five illustrates the mapping of legal choices as a science with a defined epistemology.
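The reduction of logic heuristics to storing, matching, listing and reporting data can be sketched as a small interactive loop. The class, its method names and the rule content are hypothetical illustrations, not the actual eGanges design:

```python
# Hedged sketch of the four processing steps named in the text:
# storing, matching, listing and reporting data. Illustrative only.

class MiniShell:
    def __init__(self, rules):
        self.rules = rules   # store: list of (condition list, consequent)
        self.answers = {}    # store: user input against conditions

    def put(self, condition, answer):
        self.answers[condition] = answer  # storing data

    def match(self):
        # matching data: rules whose conditions are all answered "yes"
        return [c for conds, c in self.rules
                if all(self.answers.get(q) == "yes" for q in conds)]

    def report(self):
        matched = self.match()  # listing data
        # reporting data back to the user
        return ("Established: " + ", ".join(matched)) if matched \
            else "Nothing established"

shell = MiniShell(rules=[(["offer", "acceptance"], "agreement")])
shell.put("offer", "yes")
shell.put("acceptance", "yes")
print(shell.report())  # Established: agreement
```

The point of the sketch is that no theorem-prover is needed: once the application’s system of choice is mapped, the runtime work is dictionary storage, equality matching, list building and string reporting.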

2.2.13 Small and large legal expert systems

The Master’s work described a variety of small legal expert systems, including the candidate’s many different experimental legal expert systems, that had been constructed in the twentieth century, following the Second World War. As many of the authors of these systems were not experienced in legal practice, and the legal experts with whom they worked were unable to articulate their computational epistemology, legal epistemology was not considered expressly or fully in the construction of these twentieth century systems.

Some researchers viewed legal expertise as non-monotonic (Sartor, 1991, 1994). Actual lawyers’ arguments are certainly presented in this way; they are often a mixture of deductive, inductive and abductive premises that are ordered in various ways. Lawyers, like most people, speak and write non-monotonically; original rhetorical embellishment might also be added as abductive reinforcement to an argument in the course of advocacy. It was difficult for legal knowledge engineers to extract the deductive premises for an automatable extended deductive argument, locate inductive instances in relation to the deductive premises, and properly apportion weight to abductive premises in relation to the appropriate part of the extended deduction and induction. What had to be acquired was legal expert meta-knowledge of the process of categorisation of premises as Major deductive premises (rules of law), inductive instances (case facts or legislative lists) and abductive premises (reasons for rules and reasons for inductive instances, or authoritative criticism thereof); even if this were sought, it was not meta-knowledge that a legal expert could explicitly articulate.

Legal method is not traditionally taught in logic categories or as non-monotonic compositions of logic categories. Also, lawyers sometimes use a systems argument that maintains that the whole of their argument is greater than the sum of its parts, in order to strengthen any weak links in their argument; systems arguments allow for a totalling up and rationalisation of abductive weights. However, weak links are often the reason given for the failure of an argument in court; they are certainly the focus of the opponent’s argument, which might offset a systems argument with a contradictory systems argument. An eGanges application can provide for a systems argument if this is allowed in the substantive field, as an antecedent in a rule of law, or as an abductive consideration, as it might be in equity. The nature of equity epistemology is considered in relation to eGanges in Chapter Four and Chapter Five.

The Master’s history of artificial legal intelligence showed that, with a few exceptions, such as the work of Layman Allen, the early legal expert system experiments of the twentieth century mostly followed in the wake of advances in the broader field of artificial intelligence; as new techniques were produced by the field of artificial intelligence, so they were tested in the legal domain. Thus the legal domain was subjected to the epistemological scope of (1) semantic networks (Stamper, 1980), (2) computer languages such as ANSI COBOL (du Feu, 1980), BASIC (Argy; Gray, 1997, pp.34-5), and PROLOG (Kowalski and Sergot, 1985), (3) rule base and inference engine systems (Capper and Susskind, 1988), (4) vector systems (Tyree, 1989), (5) pragmatic state machines (Greenleaf, Mowbray and Tyree, 1987), (6) intelligent database systems (Hafner, 1978), (7) neural network systems, (8) blackboard systems (Skalak and Rissland, 1991), (9) workstation toolboxes and, more recently, by way of the full circle, to (10) natural language parsers (Ruleburst, 2005).

The doctoral work provides new evaluation criteria, namely epistemological criteria, for legal expert systems. The elemental and holistic characteristics of eGanges are also the epistemological criteria that it satisfies. Although the twentieth century systems are not re-evaluated directly and specifically, by reference to these new criteria, in the doctoral thesis, relevant landmark aspects of these early precedents are considered where they arise in the remainder of the doctoral thesis, particularly where their epistemological inadequacy provides useful illustration.

The doctoral epistemology provides a specific technique for managing large scale legal expert system development; it can manage large-scale legal ontology. The Stage 5 prototype set out in Chapter Five shows how the large-scale legal ontology of the Vienna Convention is managed by the shell epistemology of eGanges. Just as philosophy is concerned with the epistemological matter of how we know things, and with the ontological matter of what exists, so legal intelligence is either legal epistemology, how we find and apply law, or legal ontology, what existences there are in law; epistemology manages ontology.

2.2.14 Justinian and technological codification

The eGanges prototype, based on the epistemological work of the doctoral thesis, indicates the nature of the potential computer codification of legal services suggested in the Master’s work. In a comparison of the history of the ancient Roman legal system with the history of the English legal system, the Master’s thesis revealed analogies that suggested there were cyclic paradigms of legal intelligence. Further, the English cycle has just entered its final stage, that of codification, comparable to the period in Rome which began with the Theodosian Code in the early part of the 5th century A.D. and ended with the Justinian codification in the 6th century A.D. Whereas the Roman jurists produced a large-scale comprehensive Code that exclusively set out all laws, eGanges legal knowledge engineering is capable of producing large-scale applications that provide comprehensive legal services; the Roman Codes were static but eGanges applications are interactive.

As a tool for epistemological codification, eGanges assists in the application of the law to the user’s situation, whereas users of the Roman codification could only wonder at that.

In the course of producing the prototype eGanges application, the techniques of the Roman jurists in producing their codes are shown to be useful methodology: the Roman jurists rationalised inconsistencies between rules, simplified unnecessary complexity, removed trivia, unnecessary detail and fine distinctions, and removed obsolete provisions. The emphasis in constructing an eGanges application is on logical streamlining for legal services; a reformer’s power in intelligent electronic codification might employ the Roman techniques to improve the interactive legal services and their resulting social organisation. The Master’s work suggested that intelligent electronic codification might facilitate the political creation of designer civilisation; the doctoral thesis provides the eGanges shell as an opportunity to take this political path.

2.2.15 Workstation and shell

The revised Master’s thesis concluded that a workstation, TECLAW (Total Environment for the Computerisation of Law), with twenty different facilities for constructing a legal expert system, should be developed (Gray, 1997, pp.303-4). The facilities list included a chaining process, hypertext processes, database retrieval, a document modeller, virtual reality graphics, judicial reasoning processes, and probability processes. It was envisaged that a comprehensive toolbox could permit a legal expert system to be constructed with various facets of legal intelligence. Development of legal expert systems using workstations during the 1990s proved too costly; the lack of an expedient knowledge engineering methodology suited to legal expertise added to the cost. Workstation development, which produced legal expert systems that were not transparent in their processing, has not been generally acceptable to the legal profession.

The doctoral thesis suggests that an epistemological shell, which can link to any other program, file, database, website, etc., is more effective than a workstation. Generic legal intelligence is captured in the eGanges shell epistemology. Provision is also made in eGanges to link any aspect of the application premises to another program, file, database or website. The meta-epistemological methodology was used to design the shell for use, as illustrated by the prototype Vienna Convention application. Construction of an eGanges application may proceed by direct input of expert knowledge by the legal expert, without the intervention of a knowledge engineer who seeks to acquire legal expertise. The shell permits the paradigm shift of the legal profession that is required for proliferation of legal expert systems.

2.2.16 Aims and their realisation

Following the Master’s work, the problems remained of developing (1) a prototype shell, based on a computational legal domain epistemology, and (2) a methodology for large-scale legal knowledge engineering. It is these problems that the doctoral work addresses: it refines the Master’s model of 3d legal logic as a computational domain epistemology, and develops a meta-epistemological method for producing large-scale legal expert systems. The aims of the doctoral work are to:

(i) develop a specific meta-epistemological method as large-scale legal knowledge engineering methodology;

(ii) reconcile the specific meta-epistemological method with systems science, the science of legal choice, and knowledge engineering methodology;

(iii) reconcile 3d legal logic with logic, legal reasoning, and legal practice heuristics, to formulate a prototype computational domain epistemology that will determine the form of knowledge representation and processing for expert systems in the legal domain;

(iv) develop, as a jurisprudential communication system, a user interface for legal experts and lay users that is user-friendly in the facets of cognition, speed, cost and transparency; and

(v) produce a prototype shell and demonstrate its use in a large scale Vienna Convention application.

These five aims are not mutually exclusive, but are woven together at various points.

The law is a large-scale body of information; therefore large-scale methodology is required to produce legal expert systems. The law is a complex system of choice; therefore a science of legal choice is required. In a large social population in an era of science, technology and information, law is part of the science of intelligent social organisation and management that is necessary for human survival. Systems science and epistemology provide methodology to manage large-scale, complex choice; this thesis integrates this large scale methodology into legal knowledge engineering methodology and in doing so clarifies and develops the technology.

3 CHAPTER THREE:

COMPUTATIONAL LEGAL EPISTEMOLOGY

3.1 COMPUTATIONAL EPISTEMOLOGY OF 3D LEGAL LOGIC

3d legal logic is a visualisation of legal logic that requires and uses notional three dimensional space (Gray, 1990, 1995, 1997). It has sixteen knowledge structures that are named according to the forms or paradigms of their nature or shapes. Logic descriptions of the sixteen knowledge structures of 3d legal logic refine the description of the jurisprudential visualisation as a computational epistemology. The sixteen knowledge structures, in sequence, are as follows:

1. Monads
2. Rivers
3. Fans
4. Strata
5. Nested Logic
6. Triads
7. Spectra
8. Double Negatives
9. Poles
10. Star
11. Adversarial fishbone
12. Neutral nodes
13. Criss-crossing
14. Rings
15. Sphere
16. Universe

These sixteen knowledge structures might be thought of as semi-fictions (Vaihinger, above p.) because they have their own ontological being that borrows a suitable name of another being, for epistemological purposes. The borrowed name assists identification, understanding and memory of the extensive complex structure and processing of legal information.

The names of the sixteen knowledge structures also might be thought of as metaphors that fit together in a conceptual system (cf. Lakoff and Johnson, 1980, 2003; Harre, 1996). While Lakoff and Johnson were concerned to find, in the use of language, metaphorical entailment that provides logic for a coherent system of metaphorical concepts, 3d legal logic was constructed by identifying the paradigmatic structures in legal logic and how they fitted together; then, by reference to the nature of the paradigms, those metaphorical descriptions were determined, rather than logic descriptions. Once the metaphors were identified and fitted together, it was then possible to examine the resultant holistic jurisprudential system and specify precisely a description of its gestalt in terms of logic, for the purposes of the design of a shell for the legal domain.

Lakoff and Johnson (2003, p.81) recognised that the concept of a gestalt, as a multi-dimensional structured whole, could capture both the structural representation of a conceptual system in various linear or non-linear forms, as well as the function, operations, and heuristics that applied to or were carried out by virtue of those structures. In the statics and dynamics of a conceptual system, metaphors could provide orientation for action based on that system, particularly through spatialization metaphors, such as those Lakoff and Johnson explained.

It is possible to find, in the language commonly used in the legal domain, metaphorical slants on legal logic that are consistent with the direct development of a metaphorical system of semi-fictions for legal reasoning. Thus, it is common to speak of:

Rules within the sphere of law
The flow of an argument
Cases on point
The parties in a case being poles apart
One party running rings around another party

Until the nineteenth century, trial by battle was a trial procedure that was permitted instead of legal argument. This may be the source of the metaphors that colour the language of adversarial litigation. As emphasised by Lakoff and Johnson (2003, p.5), ARGUMENT IS WAR:

The essence of metaphor is understanding and experiencing one kind of thing in terms of another. …

It is not that arguments are a subspecies of war. Arguments and wars are different kinds of things – verbal discourse and armed conflict – and the actions performed are different kinds of actions. But ARGUMENT is partially structured, understood, performed, and talked about in terms of WAR. The concept is metaphorically structured, the activity is metaphorically structured, and, consequently, the language is metaphorically structured. …

The language of argument is not poetic, fanciful, or rhetorical; it is literal. We talk about arguments that way because we conceive of them that way – and we act according to the way we conceive of things.

War metaphors in discourse about argument may be a stronger part of American language than of the English of other cultures. Lakoff, as an expert in linguistics, and Johnson, a philosopher, list some war metaphors in the language of argument, including:

He shot down all of my arguments.

He attacked every weak point in my argument.

3d legal logic provides a major paradigm shift for the legal domain, from the WAR metaphor to the SCIENCE metaphor; its macro form is astronomical, its micro form is atomistic, and its daily operations are geometric, geographical and computational. Just as the paradigm of WAR permitted scope for justice, so too the paradigm of SCIENCE permits a framework for engineering optimal freedom and quality of life through relative choice.

The visualisation of the conceptual system of 3d legal logic assists mapping of large scale law and the determination of processing heuristics, for the purposes of shell and application design. To the extent demonstrated in Chapter Four, the common familiarity of the metaphors is suited to human cognition and the management of jurisprudential intelligence.

3.1.1 Monads

The task of jurisprudential analysis for mapping large scale law begins with the identification of legal monads, that is, the single units of information that are the quanta of legal data. For instance, in contract law, each of the concepts of invitation to treat, offer, acceptance, and consideration, is a legal monad. The idea of a monad is derived from the ontology posed by Leibniz (1714), the monadology. For epistemological use, monads also have logical descriptions; they may be understood in the light of Russell’s logical atomism. In a computer program, legal monads are units of data.

The term monad is chosen because a Leibnizian monad is unique, with its own locus, yet it may have some attributes in common with other monads. Black letter law lays down its units of information with this monadic consistency, as ontological units that can also be used as epistemological units, in its system of application to possible cases within its ambit. The eGanges application set out in Chapter Four demonstrates the monads of the Vienna Convention and their systemic use. The interface of ontology and epistemology in relation to the units of black letter legal information is considered in this practical framework of Chapter Four.

Computer technology carries constraints of data that are similar to those of the legal domain. Every unit of data in a computer program has its own locus; a unique unit of data allows efficient identification of the locus of the unit. As understood by Lovelace (above p.), a computer program is a set of instructions on what is to be given as output and what is to be done with user input. An instruction on a response to input might identify directly or indirectly, data at an address in the system that will be the output; the computer program is the sequence of steps through certain loci in the computer’s memory. If the same unit of data appeared in two or more locations, the sequence of steps to the output would have to be adjusted by instructions that accommodated the different locations, to achieve the intended result; uniqueness of data units allows efficient identification of the locus of the unit.
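The uniqueness constraint on units of data can be made concrete with a small sketch. The class, its methods and the monad names below are illustrative assumptions, not part of the thesis or of eGanges:

```python
# Hedged sketch of the uniqueness constraint described above: each monad
# (unit of data) has exactly one locus, so a duplicate insertion is an
# error rather than a second location. Monad names are illustrative.
class MonadStore:
    def __init__(self):
        self._loci = {}  # monad name -> locus (a unique address)

    def add(self, name: str) -> int:
        if name in self._loci:
            raise ValueError(f"monad {name!r} already has a locus")
        self._loci[name] = len(self._loci)  # next free address
        return self._loci[name]

    def locus(self, name: str) -> int:
        # Uniqueness makes identifying the locus a single lookup.
        return self._loci[name]

store = MonadStore()
store.add("offer")
store.add("acceptance")
print(store.locus("acceptance"))  # 1
```

Because each unit appears at exactly one address, the sequence of steps through the store never has to disambiguate between alternative locations, which is the efficiency point made in the text.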

Likewise, every unit of legal information in an area of substantive law has its own locus in that system of ideas; a unique unit of information with a certain place in the system of information ensures consistency in its use. If the same unit of legal information appeared in two different locations, then special measures would have to be provided to keep its use consistent and to ensure that the alternate locus did not redirect the user into another, inappropriate sequence of steps to an incorrect conclusion. Legal reasoning is the sequence of steps through certain loci in the substantive field, according to the instructions of a client on the facts and goals of the client’s case.

In legal knowledge engineering, the distinction between similar units of information and the same unit of information is important. For example, intention to create legal relations in a contractual transaction is similar to, but not the same as, mens rea, the evil intent in crime; human intent may be the common factor, but the nature of it in contract and in crime is a major distinguishing feature, so that consistency in the use of each is not required. Each form of intent occupies a different locus in substantive law without interference in the reasoning steps in which each is involved. In Chapter Four, where the design of eGanges is discussed, the problem of the same monad in two different loci is considered further.

The content of legal monads may be concrete or abstract. Insofar as it is concrete, a monad may raise a question or issue of fact; this is an ontological monad that may be established directly by the evidence in a client’s case. Insofar as the content of a monad is a legal abstract, the monad may raise a question or issue of law; it is epistemological and may be established indirectly by the evidence in a client’s case. Thus legal monads may be ontological or epistemological; mixed questions of law and fact are also possible. Legal argument may be concerned with the content of ontological monads and the content of epistemological monads.

Monads with ontological or epistemological content are also antecedents and/or consequents in the rules of law; they are logic units in conditional propositions. Ontological monads can be used as epistemological units; internally they are ontological and externally they are epistemological. Where monads have epistemological content, they may have further epistemological use as antecedents and/or consequents in rules of law; they are both internally and externally epistemological.

Any monad may have any length, whether it be an ontological monad, or an internally and externally epistemological monad. As computing data, a unit of legal information may be a short string or a long chunk of symbols or signs. The term quanta indicates variable measures; the fundamental units of a legal expert system may be diverse in length or size, like a mosaic. An example of a short ontological monad is ‘offer’; an example of a two word wholly epistemological monad is ‘vitiating element’. The longer a monad is, the more difficult it may be to use as an antecedent and/or consequent in a deductive premise, or as a subject or predicate in an existential inductive premise. Long chunks may be acceptable for the purposes of an abductive argument, even though the chunk may be broken up into some deductive or inductive components that strongly or weakly support or challenge a rule of law.

The logical atomism of Russell (Russell and Whitehead, 1910) was developed after his critical review of the philosophy of Leibniz in 1900. It produced a Platonic realism whereby, in the real world as distinct from Plato’s ideal world, a plurality of irreducible, sensible entities with qualities entered into relations with each other. This logical paradigm was widely accepted in the twentieth century and was consolidated by the theories of relativity of Einstein (1920, 1922). In logical atomism, the irreducible, sensible entity could have many different relations with many other irreducible, sensible entities; it could be repeated in various places. The problem of its consistent use in each locus was not fully considered.

Russell pruned wholly epistemological units, using Ockham’s razor, from the scope of his logical atomism; every atom was ontological so that it could be established empirically. However, in the legal domain, epistemological units are used to simplify communications between legal experts or to indicate legal choice, and any such pruning would make the communications more complicated and points of legal choice less articulated. Purely epistemological monads facilitate disjunction which allows multiple alternative possible cases in the real world; they are the language of potential. This is discussed further in Chapter Three which poses a solution to the problem of unnamed or implied choice points in law; it is demonstrated in Chapter Five and the Appendix where the Vienna Convention has extensive disjunctions.

Insofar as the content of a monad is internally epistemological or ontological, it may be possible to find meta-rules about monad attributes that affect how they may be used epistemologically; such further epistemological attributes might expand or contract the derivations of meaning conveyed by the monads. For instance, in the legal domain, terms of law are sometimes triggers for rights and duties or other jural relations as described by Hohfeld (1913): power, privilege, immunity, liability, no-right, disability. This thesis does not explore such possible meta-rules.

Of course there may be a valid use of Ockham’s razor in the realms of logic, especially where brackets can be used in formal logic descriptions instead of an epistemological monad. For his logical atomism, Russell adopted the concept ‘terms’ to identify atoms that were suited to logical operations that determined truth or falsity. Terms might be true or false wherever and whenever they occur. Russell’s terms could be duplicated and relocated; he was not restricted, as law is, by a large, extensive system of terms, both ontological and epistemological, within a context of social power and organisation. An important principle of legal epistemology is that all people are equal before the law, i.e. that there will be the same judicial outcome in analogous case situations. A legal monad, especially one that is purely epistemological, must be used consistently in each case where it is used; the legal outcome is the exercise of judicial power to effect social organisation, not a matter of truth.

Constraints on the duplication of terms, through loci and uniqueness, are not a requirement of generic logic, and may be a hindrance to generic logical descriptions and operations.

Indeed, generic logic can transcend social power and organisation, and may be required to do so for critical evaluation. Such transcendental logic enters legal logic as abductive views of law; it may produce a heterostatic modification of the system of rules or it may have a homeostatic effect that stabilises the system of rules.

Like a context of social power and organisation, computational constraints also restrict duplication and location of units of data for the purposes of a computational system and program design. In the retrieval processes of computation, units of data have their own addresses; uniqueness of address is efficiently achieved by uniqueness of data. For large scale data, every efficiency reduces memory requirements. Efficient handling of legal information permits efficient programming.

The earliest stage of the legal system was ritualistic, with procedures that focussed on the uniqueness and location of words in the perfect recitation of oaths of formalised claims (Gray, 1997, pp.116-8); precise duplication was required for a successful court action. So too, a computer program requires data precision in order for its computer language to be directed by the instructions it receives from the user; automation is ritualistic.

The division of legal information into discrete monads is a matter of legal expertise; it is something lawyers learn to do and to recognise. Insofar as there are different expert opinions about the division, these differences can be accommodated in a legal expert system and explained to the user. Monads are also the quanta of the visualisation of law in notional logic space; mostly, each can be accommodated in a unique place although each may have some attributes in common with other monads. As data in computer memory, monads may be strings or chunks of bits, the binary digits of information.

Legal knowledge engineering is concerned with both jurisprudential constraints and computational constraints on monads. Even where legal terms and legal ontologies are one and the same, location and use of monads are epistemological, not ontological. Given that this location and use will serve social organisation and power, legal experts maintain these epistemological constraints as a matter of domain standards. For this reason, epistemological soundness, rather than epistemological adequacy, is required in legal knowledge engineering.

3.1.2 Rivers

It is the rules of law that are applied to any client situation to determine its legal description or outcome. In logic, formalised rules are called conditional propositions. A rule is formalised as a conditional proposition when it is put into the format:

if (antecedent(s)) then (consequent)

Formalised rules are epistemological structures. The formalisation assists the specification of rules for the purposes of their application or processing for some purpose. In formal logic, rules are conditional propositions that are expressed algebraically as follows:

a → c

A conditional proposition with seven antecedents, connected by the logic notation ‘∧’, meaning ‘and’, indicating conjunction, could be represented as:

a1 ∧ a2 ∧ a3 ∧ a4 ∧ a5 ∧ a6 ∧ a7 → c
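A rule in this conjunctive form can be sketched as a small data structure. The Python below is a minimal, hypothetical encoding; the names `Rule` and `fires` are mine, not the thesis’s:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Rule:
    """A formalised rule: if all antecedents hold then the consequent holds."""
    antecedents: tuple   # the conjunction a1 ∧ a2 ∧ ... ∧ an
    consequent: str

    def fires(self, established):
        # The conjunction requires every antecedent to be established.
        return all(a in established for a in self.antecedents)

rule = Rule(('a1', 'a2', 'a3', 'a4', 'a5', 'a6', 'a7'), 'c')
print(rule.fires({'a1', 'a2', 'a3', 'a4', 'a5', 'a6', 'a7'}))  # True
print(rule.fires({'a1', 'a2'}))                                # False
```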

In the legal domain, many rules are connected in a system of rules that has a tributary structure like a River, as illustrated in Figure 3.1. Recently, a logic flow has been called a fluent (Son and Tu, 2006). The arrow that signifies ‘then’ is used in Figure 3.1 to represent the direction of flow of the River system. The notation ‘∧’ is straightened to a stream line that clearly separates each node on the River and provides streamlining. A conjunction stream is like a list of items that implies conjunction, addition, accumulation, a progression or a pathway. Each rule stream has an arrow. In Chapter Five and the Appendix, the large scale River system of the Vienna Convention is mapped.

Figure 3.1: River map – a system of rules. © Pamela N. Gray, 1990

The epistemology of 3d legal logic transforms the algebraic representation of a system of conditional propositions to the geometric representation of a River system. Geometric rule formalisation provides object standardisation of legal information for determining knowledge structure and navigation, for identifying system heuristics, and for object-oriented programming, as explained in Chapter Four.

Legal monads, with either ontological or epistemological content, as antecedents and/or consequents in rule systems, are nodes in a River system; a legal monad may be both an antecedent in one rule and a consequent in another rule. In Figure 3.1, the monads c, d, e and f are antecedents of a primary rule as well as consequents of secondary rules. A jurisprudential system of rules arises where one or more monads occur in two or more rules. When each common monad is constrained to one locus, by locking together the rules in which it occurs at the overlapping point of the common monad, the conditional propositions flow together like tributaries of a river system. The visualisation of the jurisprudential system of 3d legal logic allows for three kinds of overlap of monads: where a monad occurs as an antecedent in one rule and a consequent in another rule, such as c, d, e, f and q in Figure 3.1; where it occurs as the same consequent for two or more different rules, as a matter of disjunction; and where it occurs as the same antecedent in two rules that have different flow directions, as a way of managing combinatorial explosion of possible cases. Below, disjunction is provided for as fan structures, and the control of combinatorial explosion in the legal domain is explained in 3d legal logic as part of a sphere of possible cases.

Once the monads in a selected field of law are identified, they can be ordered as antecedents and consequents in a system of rules of law, according to their meaning. Like the division of legal information into discrete monads, the characterisation of monads as antecedents and/or consequents in rules, the ordering of the monads as such, and the ordering of rules into a tributary structure are matters of legal expertise; different expert opinions about this can be accommodated and explained in a legal expert system.
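The locking together of rules at common monads can be sketched computationally. The fragment below is a hedged illustration with invented rule data loosely echoing Figure 3.1; it gives each monad one locus and then reports the junction monads that are an antecedent in one rule and a consequent of another:

```python
def build_river(rules):
    """rules: iterable of (antecedents, consequent) pairs.
    Returns one locus per monad, recording both of its possible roles."""
    loci = {}
    for ants, cons in rules:
        for m in ants:
            loci.setdefault(m, {'antecedent_in': [], 'consequent_of': []})
            loci[m]['antecedent_in'].append(cons)
        loci.setdefault(cons, {'antecedent_in': [], 'consequent_of': []})
        loci[cons]['consequent_of'].append(ants)
    return loci

# Invented rules loosely echoing Figure 3.1: a mainstream rule plus upstream rules.
rules = [(('a', 'b', 'c', 'd', 'e', 'f', 'g'), 'h'),
         (('i', 'j', 'k'), 'c'),
         (('r', 's'), 'q'),
         (('q', 'm'), 'd')]
loci = build_river(rules)
# Junction monads are an antecedent in one rule and a consequent of another.
junctions = sorted(m for m, v in loci.items()
                   if v['antecedent_in'] and v['consequent_of'])
print(junctions)  # ['c', 'd', 'q']
```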

The tributary structure of a River system represents a hierarchy of meaning. An antecedent on the mainstream may be defined or particularised by the antecedent(s) in a secondary stream. In turn, an antecedent in a secondary stream may be defined or particularised by the antecedent(s) in a tertiary stream. Quaternary stream antecedents, quinary stream antecedents and further upstream antecedents up to the watershed of the tributary structure, extend particularisation in a similar way.

The flow of a mainstream rule runs from its top or first antecedent through the subsequent order of antecedents to the consequent that is the Final result of the whole tributary structure. Secondary streams flow into mainstream antecedents where the consequent of a secondary rule is the same as a mainstream antecedent; the secondary rule is a particularisation of the mainstream antecedent. The particularisation of tertiary, quaternary, quinary etc. rules, as far as the black letter law actually goes, flows downstream in a similar way. It is in this sense that rules are hierarchical: the further upstream, the more intensive is the particularisation. The fine details of cases arise in the watershed of the tributary system. The paradigm of a River system accommodates the interlocking of a system of rules of law, for the logical management of extensive, complex meaning.

Some mainstream antecedents may be particularised more than others. The more particularisation of an antecedent there is, the more abstract a concept that antecedent is likely to be. A rule may contain both highly abstract antecedents and more concrete concepts. Some parts of a rule may be more particularised than other parts. The concept of granularity is useful to delineate the range of concrete to abstract antecedents. The principle of Ockham's razor (Russell, 1961, pp. 462-3), of only departing from the concrete to the extent necessary, can only be applied to the legal logic of Rivers by a holistic consideration of the effects of any pruning on flow. Unnecessary abstracts can be pruned for more efficient processing, just as they may be pruned by lawyers in taking instructions from a client. However, sometimes abstracts are useful to indicate choice points or provide deductive links between different concrete points, for the purposes of extended deduction.

Both legislative and common law rules may be formalised as Rivers. This representation of the rules of law is devised by an expert rationalisation of the collective of rationes decidendi, obiter dicta (noted as such) and statutory provisions.

For instance, the mainstream or most general rule in contract law provides that if there are all of:

a. parties with capacity,
b. consideration,
c. agreement,
d. compliance with form,
e. intention to create legal relations,
f. no vitiating elements, and
g. compliance with statutory requirements,

then there is h. a valid contract.

Secondary rules of contract law may more closely define each of these mainstream antecedents: for instance, there are a number of requirements (notionally like i, j, and k in Figure 3.1) to establish that there is c, agreement. In the mainstream rule in Figure 3.1, c is an antecedent, and in the secondary rule it is the consequent. It is settled in common law that there may or may not be certain preliminary contractual negotiations, and that there must be an offer and acceptance in order to form an agreement. In Australian contract law, there are legislative provisions on the age of majority, varying from state to state, that also define contractual capacity. There are also legislative provisions that determine some of the detail of required form and vitiating elements. Tertiary rules might more closely define secondary antecedents (notionally, r and s establish q in Figure 3.1), and so on.

The last consequent, h, can be thought of as the Final result of all the rules in the hierarchical structure. However, it is the Final result that largely determines the relevant monads and the order in which they should be placed. Some antecedents presuppose other antecedents; for instance acceptance presupposes that there is an offer to accept. Temporal considerations may also determine the order of antecedents. Ordering of antecedents and interim consequents affects the order of collecting evidence, presenting cases, and presenting legal arguments; it is a matter for legal expertise. To establish the Final result, h, all the antecedents in Figure 3.1 must be established, unless there is a choice of alternative ways of establishing a consequent, as a matter of disjunction.
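Establishing the Final result by extended deduction can be sketched as forward chaining over such a rule system. The monad names below are my paraphrases of the contract-law example, and the secondary rule for agreement is deliberately simplified:

```python
# A hedged sketch of forward chaining to the Final result. The rule data is
# illustrative only: one mainstream rule and one simplified secondary rule.
RULES = [
    (('capacity', 'consideration', 'agreement', 'form', 'intention',
      'no_vitiating_elements', 'statutory_compliance'), 'valid_contract'),
    (('offer', 'acceptance'), 'agreement'),   # simplified secondary rule
]

def chain(facts):
    """Repeatedly fire rules whose antecedents are all established."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for ants, cons in RULES:
            if cons not in facts and all(a in facts for a in ants):
                facts.add(cons)
                changed = True
    return facts

facts = {'capacity', 'consideration', 'offer', 'acceptance', 'form',
         'intention', 'no_vitiating_elements', 'statutory_compliance'}
print('valid_contract' in chain(facts))  # True
```

The secondary rule first establishes the interim consequent, agreement, which then completes the mainstream conjunction; this mirrors the downstream flow of a River.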

In order to wholly formalise a rule system, it may be necessary to construe the wording of a rule of law to fit the formalisation requirements. Over many centuries, rules of law have been set out in judicial dicta and legislation, in many different literary styles. For instance, a rule might be expressed in terms of 'where there is (antecedent(s)) then there shall be (consequent)’. Like 'where', the term 'whenever' may be used instead of 'if', and the term, 'then', may be implied rather than express. Provisos, clauses commencing with 'unless' or 'subject to', may also be standardised as antecedents; ‘unless x’ may be formulated as an antecedent, ‘not x’. Sometimes a consequent is stated in a rule of law before one or more of its antecedents. Reordering can place the consequent last. A rule statement sets out the circumstances, ordered as antecedents, in which its consequent arises in law. The formalisation of rules, as a matter of legal expertise, must occur either without changing the single meaning of the rule, or by proffering alternative interpretations as the formalisation reveals ambiguities of the legal logic in the language of the law.

In natural language, there are many ways in which a conditional proposition may be used, each carrying different logical significance. Different uses arise from the way in which items before the arrow and after the arrow are related, and from the different language that may be used before and after the arrow. For instance, if the antecedent(s) collectively imply the consequent, and, vice versa, the consequent implies the collective of antecedent(s), and this is a necessary relationship, then the conditional proposition is called a material implication; the arrow is replaced by a horseshoe:

a1 ∧ a2 ∧ a3 ∧ a4 ∧ a5 ∧ a6 ∧ a7 ⊃ c

Russell and Whitehead (1903) developed material implication, and used the logical word ‘implies’ instead of the ordinary use of ‘then’ to indicate a sequence. Peano reversed the letter C to indicate that the meaning of the conditional proposition included both if a then b and if b then a; he substituted the reversed C that stood for ‘consequent’ for the word ‘then’, to indicate this reversible meaning. The reversed C was developed as a horseshoe by Russell and Whitehead to indicate that it was this logical conditional with which they were concerned. Not all conditional propositions are material implications.

Where the meaning intended is that there is no other way of establishing c, then the sense used is:

If and only if (antecedent(s)) then (consequent).

This use is represented in formal logic by replacing the arrow or horseshoe with a tribar:

a1 ∧ a2 ∧ a3 ∧ a4 ∧ a5 ∧ a6 ∧ a7 ≡ c

An ‘if and only if’ material implication also may be possible, so that if and only if c, then a1 ∧ a2 ∧ a3 ∧ a4 ∧ a5 ∧ a6 ∧ a7, or if and only if c, then, in some order, a7 ∧ a6 ∧ a5 ∧ a4 ∧ a3 ∧ a2 ∧ a1.

Sometimes the standardised form of the conditional proposition uses alphabet symbols as variables so that alternate substantive content may be used, and sometimes the form does not represent alternate variables but some specific content only. Capital letters of the alphabet are used to represent predicates; thus a is a variable and Aa is a variable with a certain characteristic represented by A. An upside down A, namely ∀, indicates ‘all’ and a back to front E, namely ∃, indicates that the variable exists. There are various forms of logic notation, most of which are avoided in the epistemology of 3d legal logic; lawyers do not use logic notations, as law is a matter of natural language that can be used by people generally. Many legal terms such as marriage, licence, penalty, lease, and joint tenancy are used generally, even if they are only understood in a limited way; however, logic notations are generally not used as common language, except where they are taken from common language signs such as the ampersand, &.

Where c is expressed as a command, the antecedents are the occasion for the command; for example: if you go to school, take your umbrella. Some language indicates a hypothetical meaning; for instance, the subjunctive basis for the antecedent(s) would similarly qualify the consequent, for example:

Were there (antecedent(s)) then there would be (consequent).

It is also possible to set out antecedent(s) as steps in a procedure that achieves the consequent. Language may also be used unconventionally so that the meaning of a conditional proposition is untenable but may be humorous:

If Hannibal wins the race then I’ll be a monkey’s uncle.

Sometimes the relationship between antecedents and consequent may be true despite appearing unlikely:

If the train arrives at 9am then the zoo will open at 9am.

The meaning of a conditional proposition may be a chain of causation, whereby the antecedent(s) are the chain of causes and the consequent is the effect, or it may be a chain of reason whereby the antecedent(s) are the requirements for the consequent to be established. The rules of law are generally treated in this way and it is not usual that they are also treated as reversible; for example, the existence of a written contract does not thereby establish that the antecedents for its validity have occurred, or what they were. Even the existence of a valid contract does not imply a particular set of antecedents, as there are various ways of reaching a valid contract; disjunction limits the scope of reversible implication. However, as a matter of legal expert interpretation, the intended meaning or circumstances of some law might also be that the consequent presupposes the antecedent(s).

Material implication raises a question of fact that is not required in law. Law-making authority provides the relationship of antecedent(s) and consequent in hypothetical terms, and the relationships between rules, which have an inherent tributary system. Even legal abstracts that are epistemological are treated as hypotheticals. The rules of law are, prima facie, expressions with this ordinary hypothetical meaning of ‘if’ and ‘then’, with ‘then’ simply indicating the next step in the sequence listed by the conditional proposition. However, any further meaning of the black letter law that expresses a rule has to be considered for each system of rules, both in terms of ontologies of monads and epistemological structures.

The study of valid argument forms that use conditional propositions began with the Stoic philosophy of the fourth and third centuries BC in Greece. Conditional propositions were a way of posing and studying possibility. Two forms of argument involving conditional propositions, were regarded by the Stoics as valid:

1. If the first, then the second. (Major deductive premise)

The first. (Minor deductive premise)

Therefore the second. (Conclusion)

2. If the first, then the second. (Major deductive premise)

Not the second. (Minor deductive premise)

Therefore, not the first. (Conclusion)

In the legal domain, adversarial argument, subject to disjunctions, also requires the following form as valid:

3. If the first, then the second. (Major deductive premise)

Not the first. (Minor deductive premise)

Therefore, not the second. (Conclusion)

Since this is not a valid syllogism, the following must be used:

4. If not the first, then not the second. (Major deductive premise)

Not the first. (Minor deductive premise)

Therefore not the second. (Conclusion)

Once the meaning of the conditional propositions of law in a system of rules is determined as Major deductive premises for legal argument, then the appropriate valid argument form(s) can be used in extended deduction.
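The validity claims for these argument forms can be checked mechanically by enumerating truth assignments. The Python below is a generic truth-table check, not a piece of legal software; it confirms that form 1 is valid, that form 3 on its own is not, and that form 4’s extra premise restores validity:

```python
from itertools import product

def implies(p, q):
    # Truth-functional conditional: false only when p holds and q fails.
    return (not p) or q

rows = list(product([True, False], repeat=2))

# Form 1 (modus ponens): in every row where 'if p then q' and p hold, q holds.
assert all(q for p, q in rows if implies(p, q) and p)

# Form 3 (denying the antecedent): invalid — there is a row where the premises
# hold ('if p then q', not p) yet the conclusion 'not q' fails.
assert any(q for p, q in rows if implies(p, q) and not p)

# Form 4: with the premise 'if not p then not q' and not p, 'not q' always holds.
assert all(not q for p, q in rows if implies(not p, not q) and not p)
print('checks passed')
```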

In logical argument, conditional propositions may be given a value of true, false or some degree of probability. In the legal domain, their truth is established by law-making authority. Following the Stoic work on possibility, the Sceptic philosophers of the second and first centuries BC in Greece posed and developed degrees of probability; possibilities might have some degree of probability. Concerns of probability were adopted in legal epistemology as burdens of proof that pertain to the Minor premises; there are two standards of proof. In criminal suits, where the prosecution carries the burden of proof, the facts relied on must be proved beyond a reasonable doubt; in civil suits, the standard is the balance of probabilities. Probability is a concern of evidence that establishes the Minor deductive premises from the facts of a case; ontological monads are matters of evidence. However, these legal probabilities are calculated by reference to reasons, not mathematical calculations according to a theory of degrees of probability.

If a theory of degrees of probability were adopted by authorised law-makers, an epistemological monad could represent a total probability, as a summation of the different weights of several antecedents that are Minor premises in the system of argument, to be compared with the probability accumulated on a similar basis by the opponent in litigation. Thus far, trial procedures have not established how such mathematical probabilities of evidence might be collected; witnesses are not expected to give a mathematical estimate of their certainty. Only evidence that is true is admissible and false evidence is excluded. Burdens of proof are applied to select the true evidence from inconsistent evidence; all of the evidence is taken into account in making this selection. Probability equations have not been adopted in the selection process; it may be that one antecedent when singly compared to the opponent’s case, is decisive. One lie in the evidence of a witness may be treated as a reason to assume that all the remaining evidence of that witness is lies. However, it is possible for authorised law-makers to create Major premise monads with numerical information as antecedents or consequents in rules of law, so that Minor premises would have to be established accordingly.

In the legal domain, the River system is the structure of Major deductive premises for extended deduction. The Minor deductive premises arise from the instructions of a client or witnesses. A River system may have antecedents or consequents that are ontological monads, or monads with purely epistemological content. Ontological monads, as antecedents, may establish epistemological monads, as consequents. Epistemological monads may establish other epistemological monads. Through epistemological monads, ontological monads may establish other epistemological monads. The River system is an epistemological structure for applying the rules of law to a client’s case in order to produce the knowledge of a legal description or outcome of that case.

3.1.3 Fans

Just as a conditional proposition can have only one consequent in logic, so too, in 3d legal logic, a rule can have only one consequent. However, there may be many rules with the same consequent and that consequent may only be an interim consequent in a more extended legal argument. In the epistemology of 3d legal logic, rule disjunction is visualised as a fan structure, based on the paradigm of a hand fan and the logical notation for disjunction, ‘∨’. It is called a fan to clearly distinguish it from the ‘v’ for versus in case reports.

A disjunction fan occurs in a River system where, on the formalisation of rules of law, more than one conditional proposition has the same consequent. In a River system, where more than one stream flows into another stream at the same point, this represents disjunction, as indicated in Figure 3.2, where b is the common consequent of four secondary streams.

Fans in a River system indicate alternative pathways of requirements leading to the same result. The common consequent of several pathways has only one locus in the visualisation of the epistemology of 3d legal logic; to achieve this, the common consequent is given one place only in the tributary structure and alternate streams of antecedents flow into this junction. Confluence at a common point produces a fan of alternatives, indicating a choice of ways of establishing the common consequent.

To assist in the identification and construction of a River, formalised rules of law may be set out as a list in order to identify monads that are antecedents, consequents or common monads. Disjunction is indicated by common consequents. Figure 3.3 illustrates a list of rules that make up a River system. Overlaps of antecedents and consequents occur at c, d, e, f and q; an overlap of two consequents that produces a fan of two alternate streams, also occurs at e. Figure 3.4 and Figure 3.5 show how the list is constructed as a River system by locking together common monads so that they occupy only one place in the system.
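Reading fans off such a list of formalised rules is a matter of grouping by consequent. The sketch below uses invented rules in which two streams share the consequent e, as in Figure 3.3; the function name is mine, not the thesis’s:

```python
from collections import defaultdict

def find_fans(rules):
    """Group a list of (antecedents, consequent) rules by consequent and
    report the fans: consequents shared by two or more alternative streams."""
    by_consequent = defaultdict(list)
    for ants, cons in rules:
        by_consequent[cons].append(ants)
    return {c: streams for c, streams in by_consequent.items()
            if len(streams) > 1}

# Invented rules in which two secondary streams share the consequent e.
rules = [(('a', 'b', 'c', 'd', 'e', 'f', 'g'), 'h'),
         (('i', 'j'), 'e'),
         (('k', 'l'), 'e')]
print(find_fans(rules))  # {'e': [('i', 'j'), ('k', 'l')]}
```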

A River system may have one or more mainstream rules as a mainstream disjunction. Several secondary streams may also enter a mainstream at the same place as a secondary disjunction. Disjunction may occur at any level in the hierarchy of rules. In logic, there may be one or more antecedents in a conditional proposition. Conjunction may occur in any stream, including a fan stream, that has more than one antecedent; the several antecedents are linked by the conjunction ‘and’, indicating a path of requirements. Disjunction is represented by a fan of streams, each of which may have one or more antecedents; conjunction may occur within a disjunction.

Figure 3.2: Fan of wholly formalised rule streams in a River system. © Pamela N. Gray, 1996

Figure 3.3: List of wholly formalised rule streams of a River system. © Pamela N. Gray, 2003

Figure 3.4: Two formalised rule streams of a River system locked together. © Pamela N. Gray, 2003

Figure 3.5: River map of formalised rule streams. © Pamela N. Gray, 2003

Where there are two or more rules with the same consequent but different antecedents, they may form alternative ways of establishing an interim consequent or the Final result of the River system. In Figure 3.5, where two secondary streams share the same consequent, namely e, the mainstream antecedent e is also a consequent of the two alternate secondary streams. Whereas in Figure 3.1, which has no fans, every antecedent that is necessary and sufficient must be established in order to reach the Final result, in Figure 3.5, only one fan stream has to be established, along with all the other non-fan antecedents that are necessary and sufficient conditions, in order to reach the Final result. For instance, in the contract law mainstream, b, consideration, may be established in several alternative ways: by means of property, services, promises, money or some combination of these four forms. In an age of digital money, it is no longer clear that the jural relations associated with the digital record amount to a form of property to be given as a benefit or suffered as a detriment, or both.

To accord with the definition of consideration in Currie v Misa (1875) L.R. 10 Exch 153, each of the five alternative items of consideration must be given as a benefit or forgone by way of suffering a detriment; the benefit or detriment must also be by way of exchange. Prima facie, there are at least three antecedents in each secondary stream that establishes consideration: items, some value, exchange. If consideration is by way of services, there may also be a fourth antecedent, namely compliance with services law. For property, compliance with property law may also be a requirement, as well as ownership. Tertiary streams would be required to represent the further disjunctions where they arise from secondary stream antecedents. This choice of five alternatives can be represented as four secondary streams in a fan structure arising from the antecedent b in much the same way as illustrated in Figure 3.2. It is not necessary to have a fifth secondary fan stream, as the fans are not mutually exclusive. The combination of two or more of the alternatives is available. This is a fan meta-rule. Some fans may be mutually exclusive.

There are at least three fan meta-rules. Firstly, for some fans, any one or more of the alternative fan streams may be satisfied in order to establish at least one fan stream. Secondly, some fans permit a choice of one fan stream only, in which case the alternatives are mutually exclusive. Thirdly, some fans are conditional, whereby the selection of one alternative precludes the selection of others, or whereby the non-selection of one alternative allows the selection of others.
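These three meta-rules may be formalised, tentatively, as a validity check on a selection of fan streams. The encoding below is an illustrative assumption: fan streams are indexed 0 to n-1, and the conditional meta-rule is represented as a preclusion table.

```python
def selection_valid(chosen, n_streams, meta_rule, precludes=None):
    """Check a set of chosen fan-stream indices against one of the
    three fan meta-rules sketched above.
    'inclusive'  : any one or more streams may be chosen;
    'exclusive'  : exactly one stream (mutually exclusive fan);
    'conditional': choosing stream i precludes streams precludes[i]."""
    precludes = precludes or {}
    if not chosen or not chosen <= set(range(n_streams)):
        return False
    if meta_rule == "inclusive":
        return True
    if meta_rule == "exclusive":
        return len(chosen) == 1
    if meta_rule == "conditional":
        return all(not (chosen & set(precludes.get(i, ())))
                   for i in chosen)
    raise ValueError(meta_rule)

print(selection_valid({0, 2}, 3, "inclusive"))               # True
print(selection_valid({0, 2}, 3, "exclusive"))               # False
print(selection_valid({0, 2}, 3, "conditional", {0: [1]}))   # True
print(selection_valid({0, 1}, 3, "conditional", {0: [1]}))   # False
```

The consideration fan discussed above would be 'inclusive': any combination of the four streams may be chosen, so every non-empty selection passes the check.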

Not all fan streams in a fan may have the same number of antecedents. Fan streams may contain one or more antecedents, as illustrated in Figure 3.2. Depending on the number of monads in each fan stream, fan meta-rules, and the number of fans a River contains, the available alternative cases are captured, holistically, in the tributary structure; this holistic representation allows overlapping to be managed. All the alternative cases have the same Final result of their River system. Some of the available argument pathways will be shared by some cases; the same monads may be part of different legal arguments that have the same Final result. Some monads are the same antecedents in two different cases; there are overlapping sets of alternative valid arguments that achieve the same Final result. For instance, many different contractual transactions have valid contracts.

Disjunction in a River system produces a combinatorial explosion of possible cases that have the same Final result. There can be only one Final result because a common consequent in alternate mainstreams occupies only one locus. The more fans in a River system, the greater is the combinatorial explosion of possible alternate cases with the same Final result. Thus, the River system represents alternate, sometimes overlapping, possible cases within the ambit of a system of rules.
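The scale of this combinatorial explosion can be estimated by treating each fan as a choice factor. The counting sketch below assumes that an inclusive fan of k streams offers 2^k - 1 non-empty selections, while a mutually exclusive fan offers k; the fan profile used is hypothetical.

```python
from math import prod

def case_count(fan_profile):
    """Count the alternative cases generated by a River's fans.
    Each fan contributes a multiplicative factor: an inclusive fan
    of k streams offers 2**k - 1 non-empty selections; a mutually
    exclusive fan offers k. Non-fan antecedents contribute no
    choice, so a fanless River yields exactly one case."""
    factors = [k if meta_rule == "exclusive" else 2 ** k - 1
               for k, meta_rule in fan_profile]
    return prod(factors) if factors else 1

# Hypothetical River: two inclusive fans of four streams and one
# mutually exclusive fan of three streams.
print(case_count([(4, "inclusive"), (4, "inclusive"), (3, "exclusive")]))  # 675
```

Even this small profile yields 675 distinct cases with the same Final result, which illustrates why the tributary structure, rather than case-by-case enumeration, is used to hold the alternatives.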

The determination of fans is a matter of legal expertise. Lawyers are trained to recognise alternatives in the meaning of the rules of law. The alternative(s) selected as applying to a case, in order to determine the outcome of the case, will be argued as such; equally, it may be argued that the precluded alternative(s) do not apply.

The computational epistemology of 3d legal logic takes rule formalisation beyond the formalisation of single rules as conditional propositions. It requires sound formalisation by a map of all overlaps such that repeated legal monads occupy only one place in the map. Any implicit rules must be made express to reveal the system of overlaps; the formalisation of the implicit requires its expression. It is necessary to look at the meaning in the express units of information to identify overlap content and where the implicit must be expressed. Overlaps allow the connection of rules as the formalisation of a sequence of linked premises in an extended deductive argument.
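The extended deductive argument formed by linked premises can be illustrated as a recursive evaluation over the tributary structure: conjunction within each stream, disjunction across the alternate streams of a fan. The sketch below assumes inclusive fans and illustrative monad names; it is not the eGanges implementation.

```python
def established(monad, inflows, facts):
    """A monad holds if it is a proven fact, or if at least one
    inflowing stream (disjunction) has all of its antecedents
    established (conjunction)."""
    if monad in facts:
        return True
    streams = inflows.get(monad, [])
    return any(all(established(a, inflows, facts) for a in stream)
               for stream in streams)

# Mainstream q requires e and f; the interim consequent e is
# reachable via either of two alternate streams, a+b or c+d.
inflows = {"e": [["a", "b"], ["c", "d"]], "q": [["e", "f"]]}
print(established("q", inflows, facts={"c", "d", "f"}))  # True
print(established("q", inflows, facts={"a", "f"}))       # False
```

Because the common monad e occupies one place, the recursion automatically threads the overlap: establishing e by either stream feeds the same junction in the extended deduction towards q.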

In law, a choice of arguments is sometimes available. Fans provide alternative case pathways in the River to the common Final result. Boole (1847, 1952) formalised disjunction algebraically; the computational epistemology of 3d legal logic formalises it geometrically to visualise alternatives in the choice structure of the fan. Choice is necessary to logic. Natural language requires a choice formalism in order to represent the logic structure of disjunction. The formal structure can provide the choice points that are either express or implied by natural language; sometimes there is not even a name given to the point at which a choice is implicit.

3.1.4 Strata

Lawyers apply the rules of law to the facts of clients’ cases in order to develop arguments to achieve outcomes that clients want. The rules of law are applied deductively; if the antecedents of the Major deductive premise that is the rule of law are satisfied by the Minor deductive premise of the client’s case, then the consequent of the Major deductive premise necessarily applies as the legal consequent of the client’s case. Sometimes the rules do not clearly permit the outcomes sought by the client; so other modes of legal reasoning, pertaining to the rules, must be relied upon where available. These other modes of legal reasoning are abductive; they may support or oppose a deductive rule, strongly or weakly, but they do not apply as a matter of necessity to determine the legal consequent of the client’s case. In this sense, abductive means an argument that leads to the River of rules from somewhere else adjacent, such as a parallel area of reasoning. Abductive arguments may clarify the scope of the rules of law in regard to the client’s case. Six examples of abductive modes of reasoning are as follows:

● Reasons for the relevant rule or part thereof; for instance, Lord Atkin’s reliance on the biblical commandment to love thy neighbour, as the basis for a duty of care to one’s neighbour that founds an action in negligence, is a strong abductive argument: Donoghue v Stevenson. However, it is not a defence to an action for negligence that the defendant loved the plaintiff; this is not a rule of law. A defendant might argue that this Christian bias of love and caring is not a proper basis for law in a multi-cultural society; at present, this is a weak abductive argument.

● References for authorities for rules: binding and persuasive precedent cases, and legislative provisions. Where one case is authority for various parts of the tributary structure, it should be possible to plot the points of a case through the River systems in order to consider the importance of the case and the exact extent to which it might be overruled. Two cases so plotted can be seen to have similarities and differences. The end result of each can be related to these similarities and differences. Different cases may have the same or different outcomes.

● Commentary on the rules in terms of rights, duties or other jural or deontic relations.

● Moral, social, economic or other justifications or explanations for rules, derived from dicta, legislative commentary, or elsewhere.

● Critical comment on rules or part thereof, by reference to their systemic effect or incoherence.

● Legal expectations and strategic advice concerning rules or some part thereof.

Abductive modes of legal reasoning may be expressed in natural language or formalised in some way. If they are formalised as a River system of conditional propositions or deductive premises, then they will overlap at some point of an antecedent or interim consequent, with the River system of the rules of law. In the abductive River, which is deductive in itself, but not in regard to the rules of law, an antecedent or an interim consequent in the rules of law may be the Final consequent in the abductive River; it may be that there is a common interim consequent or common antecedent in the rules of law and their abductive companion reasoning, each belonging to a different deductive scheme. An abductive mode may constitute a parallel River system that is linked to an antecedent or interim consequent in the River system of rules of law.

Abductive legal reasoning may be visualised as a collection of monads in a certain relationship, that has a link to a specific monad in a system of rules of law; it enters the River of Major premises of rules of law from a place that is lateral to the River, like substrata of the River, to supplement the extended deductive reasoning that can be used when a client’s case invokes Major deductive premises of the rules of law. The substrata provide the depth of reasoning about rules of law.

In the epistemology of 3d legal logic, the abductive modes of legal reasoning are visualised as Strata which sit beneath the River structure, as illustrated in Figure 3.6 and Figure 3.7. Figure 3.7 is a rendered version of the 3d wire frame in Figure 3.6; it gives the virtual environment of a canyon to the metaphor of River Strata, so that a sense of play or fun, as well as a mnemonic shape for learning, is introduced. The canyon visualisation also indicates that the law, with its abductive Strata, can be operated within another system according to the elements and principles of that other system; this is an epistemological perspective captured in the work of Wittgenstein (1921, 1953) and the Legal Realist School of jurisprudence. Wittgenstein also introduced the limits and implications of the use of language as a system structure of epistemology; since language sometimes uses conditional propositions to mean a necessary relationship between the antecedents jointly and the consequent, this provides a logic structure that can be used as an epistemology. There could be other epistemological structures.

Figure 3.6: River map with Six Strata below it – wire frame. © Pamela N. Gray, 1996

Figure 3.7: River map with Six Strata below it – rendered. © Pamela N. Gray, 1996

Wittgenstein was initially concerned with the logic of language and the effect of language on logic. He was a protégé of Russell at Cambridge and at first adopted Russell’s logical atomism. However, he saw in a proposition that constituted logical atoms not only a picture of a possible thing or state of affairs that could be proven empirically, but also a logical structure, a concatenation of simple objects, represented by simple names, that amounted to facts. This epistemological structure was derived from language.

The epistemology of 3d legal logic extends Wittgenstein’s picture theory of meaning from the use of language in a representation of real world images, to pictures of the ontology of logic itself, both as geometric ideographs and as metaphysical objects or intellectual artefacts. The treatment of law in Figure 3.7, as a Japanese origami ornament, or an Australian schoolchildren’s paper aeroplane or dart, is sometimes a political approach. Knowledge domains can be game domains, consistently with the perception of the human species as homo ludens, gameplayers (Huizinga, 1938, 1957).

The mix of deductive, inductive and abductive premises in legal argument is an indication of the irregularity or non-monotonicity of legal logic. Generally, deduction prevails over induction and abduction, and induction prevails over abduction; but not always. Inductive premises are limited to inductive argument and abductive premises are limited to abductive argument. Inductive premises establish a deductive antecedent by necessity; they are existential assertions such as ‘one peppercorn is consideration’, where consideration is the deductive antecedent necessarily established by the inductive existential assertion. The weight of an abductive argument, be it strong or weak, reinforces or may break an extended deductive argument or one of its inductive sub-arguments. Unbroken extended deduction establishes the Final consequent by necessity.

The representation of abductive Strata information is one indication of the three dimensionality of legal logic. Further three dimensionality occurs in the extended deduction of large-scale law, adversarial deduction, and its inductive detailing. There is a composite three dimensionality in legal logic.

3.1.5 Nested Logic

Where River graphics become too dense to be useful, as indicated in Figure 3.8, massive representation is simplified by nesting to suit the constraints of human perception and cognition. Selected parts of a large-scale River system are nested as submaps. The nesting required to capture a large, complex area of law may be extensive, as illustrated in the Vienna Convention application in Chapter Four. The levels of nesting may be visualised, three dimensionally, as a stack of levels, or two dimensionally as a requirement for a microscope.

A monad or node may contain its nested submap so that, in order to see the nested submap, it is necessary to zoom in on the content of the node. In the submap, there may also be nodes with further submaps, and so on. To return from a submap, it is necessary to pan back out. Thus the River hierarchies may be broken up and arranged as nested sub-systems with a zoom from macrolevels through to microlevels.

Each level should represent no more detail than is comfortable for human perception and cognition. Nodes with submaps are identified by a pattern, whereas nodes without a submap are empty. A standard pattern such as that of a soccer ball or something similar may be used to indicate submapping; standard patterned nodes, called soccer ball nodes, are used in the Vienna Convention application.
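Submapped nodes and zoom or pan navigation may be sketched as a simple data structure. The node labels and the RiverViewer class below are illustrative assumptions, not the eGanges design: a non-empty submap plays the role of the soccer ball pattern, and a stack records the nesting path.

```python
class Node:
    """A River node; a non-empty submap marks it as a patterned
    'soccer ball' node that can be zoomed into."""
    def __init__(self, label, submap=None):
        self.label = label
        self.submap = submap or []   # nested Nodes, if any

    def has_submap(self):
        return bool(self.submap)

class RiverViewer:
    """Zooming in pushes a submap onto a stack; panning back out
    pops it, returning to the enclosing level."""
    def __init__(self, top_map):
        self.stack = [top_map]

    def zoom_in(self, node):
        if node.has_submap():
            self.stack.append(node.submap)

    def pan_out(self):
        if len(self.stack) > 1:
            self.stack.pop()

    def current(self):
        return self.stack[-1]

# Hypothetical nesting: one soccer-ball node with two nested nodes.
link = Node("No Australian link",
            submap=[Node("sub-requirement 1"), Node("sub-requirement 2")])
viewer = RiverViewer([link])
viewer.zoom_in(link)
print([n.label for n in viewer.current()])
viewer.pan_out()
print([n.label for n in viewer.current()])   # ['No Australian link']
```

The stack depth corresponds to the level of nesting currently in view, so each level can be kept within the comfortable limits of perception described above.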

Zoom or pan access to adjacent nested levels is via the antecedents at one level which are also interim consequents for the next nested level. The candidate’s video, 3D Law (1997), made with the assistance of her son, Xenogene Gray, video technician, John Merkel, and graphic artist, Jo Barnes, at Charles Sturt University, Bathurst in New South Wales, Australia, includes a short virtual reality program that shows zooming in and out of a nested submap in a River representation of the Anti-discrimination Act 1977 NSW s.24. The phenomenon of submapping requires notional three dimensional mindspace, or logic space, that is well represented in the cyberspace of virtual reality. However, two dimensional Rivers may be used to represent submaps. Figure 3.10 is the submap of the node ‘No Australian link’ which occurs in the initial map in Figure 3.9, shown in the Rivers window of the interface of the eGanges software.

Figure 3.8: Complex River – extensive system of rules. © Pamela N. Gray, 2003

Figure 3.9: Initial map for Ok to send message – Australian Spam Act 2003. © Philip Argy, Pamela N. Gray and Xenogene Gray, 2005

Figure 3.10: Submap for No Australian link – Australian Spam Act 2003 © Philip Argy, Pamela N. Gray and Xenogene Gray, 2005

In the course of nesting River submaps, there is an opportunity to shape the main map as well as its submaps, so that they take on mnemonic shapes that are easy to recognise and learn, partly by virtue of the artistic appeal of their form. This opportunity merges epistemology and art in much the same way as legal oratory merges argument and engaging speech. The Australian artist, Power (1934), sought to derive two dimensional wire frames of great paintings in order to map the designs that had such a great impact. His book contains leaves of tracing paper that he used over photographs of the paintings to trace the wire frame designs. It is this sort of study of art that may be pertinent to the design of visualisations of legal logic that promote cognition and memory of the law.

If an initial River map contains several nodes with submaps and it is possible to zoom into these nodes simultaneously to see their submaps, then the nested River system has a fractal nature; further nodes in the submaps may have submaps that can be zoomed into simultaneously, and so on as far as the knowledge goes. Legal fractals may be run recursively so that the different depths of a submapping series may provide irregularity and diversity in the simultaneous pictures at any time. Bolzano (1837, 1950) suggested that epistemological infinity, an ontology of logic, could occur through sub-sets ad infinitum; the paradoxical relationship between ontology and epistemology has a nexus in the concept of infinity. However, the law is limited and so too is its defined scope of possibilities. Zooming into the depths of submapping that may vary from one part of the River system to another is a process like knowledge mining.
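The varying depth of a submapping series can be measured recursively, in the manner of the knowledge mining described above. The sketch below uses hypothetical (label, submap) pairs; real submaps end where the knowledge ends, so the recursion is always finite.

```python
def max_depth(nodes):
    """Recursively measure the deepest submapping series reachable
    from a list of (label, submap) pairs. Depth may vary from one
    part of the River to another, giving the fractal irregularity
    of the nested representation."""
    if not nodes:
        return 0
    return 1 + max(max_depth(submap) for _, submap in nodes)

# Hypothetical map: node 'a' has no submap, while node 'b' nests
# two levels further down.
river = [
    ("a", []),
    ("b", [("b1", []),
           ("b2", [("b2i", [])])]),
]
print(max_depth(river))   # 3
```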

Strata and Nests are not the only representations that require notional three dimensional space for visualisation in the epistemology of 3d legal logic. Adversarial logic also has other knowledge structures that require notional three dimensional space.

3.1.6 Triads

Because law is adversarial, there is a corresponding set of hierarchical rules, or River system, for the opposing or negative case. For instance, in contract law, if the alternative possible cases for a ‘valid contract’ constitute the Positive River, then the Negative River will have the Final result, ‘no valid contract’. Thus, in contract law, if there is one or more of the following:

a. a party with no capacity,
b. no consideration,
c. no agreement,
d. no compliance with form,
e. no intention to create legal relations,
f. vitiating elements, or
g. no compliance with statutory requirements,

then there is h. no valid contract.

These negative monads are the contradictories of the positive monads; their disjunction makes up alternative negative cases. The negative mainstream also has a set of secondary, tertiary, quaternary, quinary etc. streams corresponding to those in the Positive River. The strongest negative case occurs if all the antecedents in the Negative River are established; this produces the Wholly Negative outcome or Final result of the Negative River.

The adversarial nature of logic was acknowledged by Whitehead (1933, p.291):

Since each proposition is yoked to a contradictory proposition, and since of these one must be true and the other false, there are necessarily as many false as there are true propositions.

In the real world, before a case is finally decided by a court, there may be uncertainty about the facts of the case that establish antecedents in the rules of law, and thus the Final result. Hence there is a third River, corresponding to the Positive River, that represents the possible factual or legal uncertainties in a case. These rules are heuristics of legal expertise, representing the issues of fact and law which must be managed by lawyers in the conduct of a case. Uncertain monads make up the uncertain case. If all antecedents are uncertain, then there is the strongest uncertainty, the Wholly Uncertain result pro tem.

For the positive mainstream in contract law, the corresponding uncertain mainstream is as follows. If there is one or more of the following:

a. a party with uncertain capacity,
b. uncertain consideration,
c. uncertain agreement,
d. uncertain compliance with form,
e. uncertain intention to create legal relations,
f. uncertain vitiating elements, or
g. uncertain compliance with statutory requirements,

then there is h. uncertain valid contract.

The strongest uncertain case occurs if all the antecedents in the uncertain River system are established; this produces the Wholly Uncertain result.
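The derivation of the Negative and Uncertain Rivers from the Positive River may be sketched as follows. The textual negation used here is a crude illustrative device: real contradictories require legal drafting (compare ‘vitiating elements’ above, which is not ‘no vitiating elements’ negated), so the function is an assumption for demonstration only.

```python
def contradictory(monad):
    """Form the contradictory monad by textual negation -- a crude
    illustrative device, not a substitute for legal drafting."""
    return monad[3:] if monad.startswith("no ") else "no " + monad

def triad(positive_rule):
    """From one positive rule, derive the corresponding negative
    and uncertain rules of the Triad. The negative consequent
    follows from any one negated antecedent (disjunction), and
    likewise for the uncertain consequent."""
    antecedents, consequent = positive_rule
    negative = ([contradictory(a) for a in antecedents],
                contradictory(consequent))
    uncertain = (["uncertain " + a for a in antecedents],
                 "uncertain " + consequent)
    return negative, uncertain

positive = (["capacity", "consideration", "agreement"], "valid contract")
neg, unc = triad(positive)
print(neg[1])   # no valid contract
print(unc[1])   # uncertain valid contract
```

Generating the two corresponding Rivers mechanically in this way preserves the monad-for-monad correspondence that the ringed triad of Figure 3.11 visualises.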

In order to illustrate graphically the three tributary structures, the Positive, Negative and Uncertain Rivers, in their corresponding positions, they must be set out in notional three dimensional space, as illustrated in Figure 3.11, where rings delineate the plane of each River and link the three starting points and end results of each River.

Figure 3.11: Adversarial triads – contradictory and uncertain correspondence. © Pamela N. Gray, 1990

In some areas of law such as the Australian federal Spam Act 2003, there are monads or nodes common to opposing cases; these areas of law raise only partially contradictory cases for opponents. The case for each opponent must be mapped as a Positive River with its own corresponding Negative River; it cannot be assumed that the Negative River of one opponent is equivalent to the Positive River of the other opponent. The two Positive Rivers, when compared, may have some contradictory monads and some monads that are the same.

Sometimes, where strict compliance is required, contraries in the two Positive Rivers are treated as contradictories; this may be so even if they are not mutually exclusive by necessity. For example, where a red tail light on a car is required, a faded red tail light or a bright orange tail light are contraries that are treated as contradictories for prosecution argument. There may be a requirement to remove tail lights of certain colours.

In the Australian spam legislation, ‘electronic message’ is one of the common monads that has compliance categories as well as non-compliance categories. The prosecution must show that there is an electronic message that falls into the non-compliance category, and if the compliance case wishes to rely on the compliance category, then prima facie, there must be an electronic message. This non-monotonicity explains why lawyers do not assume that the Negative prosecution case is equivalent to the Positive compliance case; if they did, they might miss arguments such as the compliance category of electronic messages. Legal experts advise on the basis of the Positive and Negative of both opponents’ cases; they use a meta-epistemological methodology.

Where there are monads common to opposing cases, both negative and contradictory arguments may be used by an opponent. For instance, in a spam prosecution, the defence may argue both that the prosecution has failed to prove that an electronic message was sent (a negative argument for both opponents or common contradiction) and also that if an electronic message was sent, then it falls into a compliance category (the negative argument for the defence or the opponent’s contradiction). Pleadings in the alternative have been permitted since 1705: 4 Anne, chap 3, 24 (Gray, 1997, p. 109). The disjunction of the two arguments of common contradiction and opponent’s contradiction, is a meta-disjunction that requires meta-epistemological methodology.

Mirror imaging of opponents’ Rivers, which is legal monotonicity, may also be lost where there are fans and where there are mutually exclusive (mutex) fans. In logic, disjunction that is not mutually exclusive is called weak or inclusive, and disjunction that is mutually exclusive is called strong or exclusive; logical notation uses v to indicate disjunction between two things, and places an underscore beneath the v for mutually exclusive disjunction. In the epistemology of 3d legal logic, the fan may consist of one v or more that overlap, arranged at any angle and size to suit the mapping visualisation. A single v has two fan streams and a multiple disjunction has multiple fan streams; any fan stream may have one or more antecedent nodes, irrespective of the number of antecedent nodes in the other fan streams of the fan. In law, fans are common, but mutually exclusive fans are not so common; fans may also be partially weak and partially strong.

A fan in the Australian spam legislation illustrates partially weak and partially strong disjunction as well as streams that have different numbers of nodes. The Initial compliance map has a fan of four mainstreams, three of which have single antecedents and the fourth of which contains, amongst others, the contradictories of all these antecedents. Thus the multi-antecedent fan is mutually exclusive to the other three fans; a client’s case cannot rely on the multi-antecedent fan if it satisfies any one or more of the single antecedent fans, and vice-versa.

Effectively, where a mutually exclusive fan arises due to contradictories, each contradictory is available to one party. Thus, under the Australian spam legislation, the party seeking to comply may send a spam message that has no Australian link, or a spam message that has an Australian link but complies with the other requirements for that option. There is no constraint on messages that are not electronic and no constraint on messages that are not electronic commercial messages, as defined. There are constraints on Australian link commercial electronic messages as defined that do not comply with the requirements for that option; cases will either comply or not comply. Figures 3.9 and 3.12 are corresponding Initial maps for the compliance case and the prosecution case, respectively, in regard to a prosecution of a spam offence. The two Positive Rivers in Figures 3.9 and 3.12 are not mirror images of each other, not just because they have common monads, but also due to the nature of the contradiction of disjunction; conjunction is the contradiction of disjunction in opposing cases. Instead of a prosecution fan corresponding to the compliance fan, there is a single prosecution stream of antecedents with stream links between nodes that represent ‘and’, as the single conjunction stream. Common nodes and fan disjunction provide choice that introduces non-monotonicity and removes or qualifies mirror imaging. Separate Positive maps for compliance and prosecution, each with its own Negative and Uncertain Rivers, permit a comparison of all Rivers to distinguish the choice of arguments in relation to the context of each map.

Figure 3.12: Initial map for prosecution – Australian Spam Act 2003 © Pamela N. Gray and Xenogene Gray, 2005

Prima facie, each fan on the Positive River has a corresponding fan on the Negative and Uncertain Rivers, respectively; however, where a River has fans, then the graphical correspondence may be only tentatively represented as a mirror image. Fan meta-rules qualify the implementation of the consequent of each fan stream until either one fan stream is established, for the Positive case, or all available alternatives are exhausted, for the Negative case. The structure corresponding to a fan is effectively a single rule containing the alternatives of the fan, as antecedents linked by conjunction. In short, the fan of alternatives in a Positive River has a corresponding single Negative stream that contains the negative alternatives and the negative consequent; vice versa, the single Positive stream with antecedents linked by conjunction has a corresponding Negative fan. In logic notation, an inverted v is used to indicate conjunction, but this is not used in the single conjunction River, which is more like a list that implies conjunction of the items on the list.

Where a fan stream has more than one antecedent, there must be a failure of at least one of these antecedents for that stream to fail; if all streams in a fan have more than one antecedent, then the consequent of the fan will fail when at least one antecedent in each of the fan streams fails.
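This failure condition may be expressed directly: a stream, being a conjunction, fails if any of its antecedents fails, and the fan's common consequent fails only when every alternative stream fails, which is De Morgan's dual of the fan's disjunction and corresponds to the single conjunction stream of the opposing River. The monad names below are illustrative.

```python
def stream_fails(stream, failed):
    """A fan stream (a conjunction of antecedents) fails when at
    least one of its antecedents fails."""
    return any(a in failed for a in stream)

def fan_fails(fan_streams, failed):
    """The fan's common consequent fails only when every
    alternative stream fails -- the De Morgan dual of the fan's
    disjunction, i.e. the opposing River's conjunction stream."""
    return all(stream_fails(s, failed) for s in fan_streams)

# Hypothetical fan of three alternate streams.
fan = [["a", "b"], ["c"], ["d", "e"]]
print(fan_fails(fan, failed={"b", "c", "d"}))   # True
print(fan_fails(fan, failed={"b", "d"}))        # False: stream ['c'] survives
```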

Implosion of choice occurs where a fan stream has more than one antecedent, so that in the opposing River, this single fan stream must be represented by a fan of alternative contradictory antecedents. This also is illustrated in Figures 3.9 and 3.12. The need for separate Positive maps where there are antecedents common to the opponents, may arise in all or part of the sub-mapping, depending upon where the common antecedent occurs.

Common monads may also indicate the distinction between freedom and prohibition. In the spam legislation, there are effectively two categories of electronic messages: (1) those that may be sent (the freedom category) and (2) those that must not be sent (the prohibition category). The contradiction of the freedom category is the category that ‘may not be sent’; some freedom is restricted but other freedom is gained, namely the freedom from spam. The contradiction of the prohibition category is the ‘not prohibited’ category, which sets out the freedom to spam, not a requirement to spam; there is no category ‘electronic message that must be sent’.

It is necessary for the prosecution to establish that an electronic message was sent, even though an electronic message may be lawful; an electronic message is not ipso facto unlawful. Compliance requires a consideration of electronic message before the permitted categories of electronic messages can be considered. The negative case of compliance is not equivalent to the prosecution case, although in some areas of law where there are no common monads, for both opponents, it may be.

If there are no common monads, the lack of mirror imaging due to fans can be accommodated in the triad representation; there is no argument disjunction of negative and contradiction. If one or only some fan streams are satisfied, the failure of the other fans is part of the case of the party with the fan options, and consistent with a successful outcome for that party.

Uncertainties are provided for pro tem by the burden of proof rules. Until an issue of fact is resolved, an uncertain antecedent favours the party who does not have the burden of proof in respect of that antecedent. It is possible for a party to succeed on uncertainties alone because the party with the burden of proof has failed to prove a case.

Each uncertainty in the Uncertain River is resolved firstly, for issues of fact, by reference to the relevant considerations of the burden of proof, and secondly, for issues of law, by reference to any legal arguments for clarification of the scope of the relevant deductive antecedent in the rule system. There are two considerations regarding the burden of proof, namely, whether the positive case or the negative case carries the burden of proof, and whether the burden of proof so carried is the civil burden (on the balance of probabilities), the criminal burden (beyond a reasonable doubt), or some statutory burden. Burdens of proof are concerned with empiricism in litigation, that is, the establishment of the Minor deductive premises of the user from the facts of the user’s case. Arguments to resolve issues of law may draw on abductive premises from the Strata available for positive or negative cases. The scope of the antecedent also may be argued inductively.
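The pro tem resolution of an uncertain issue of fact by the burden of proof may be sketched as follows. The function and its parameters are illustrative assumptions, not a model of the law of evidence: it records only which party the uncertainty favours until the issue is resolved, and which standard the burden-carrying party must ultimately meet.

```python
def resolve_uncertain(antecedent, burden_on, standard="civil"):
    """Resolve an uncertain antecedent pro tem: until the issue of
    fact is decided, the uncertainty favours the party who does
    NOT carry the burden of proof on it. 'standard' records
    whether the burden carried is 'civil' (balance of
    probabilities), 'criminal' (beyond reasonable doubt) or
    'statutory'."""
    favoured = "negative" if burden_on == "positive" else "positive"
    return {"antecedent": antecedent,
            "favours": favoured,
            "standard": standard}

ruling = resolve_uncertain("uncertain consideration", burden_on="positive")
print(ruling["favours"])   # negative
```

This captures the point made above that a party may succeed on uncertainties alone: every unresolved antecedent is scored, pro tem, against the party carrying the burden of proof on it.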

The choice of contradictories and uncertainties is another form of disjunction; it further increases the potential for combinatorial explosion of possible cases; unknowns can be anticipated and provided for in advance on a large scale.

3.1.7 Spectra

For dealing with the scope of an antecedent by induction, the epistemology of 3d legal logic adds inductive spectra to the triad structure, as illustrated in Figure 3.13. The instances of an antecedent are set out in an order that progresses with minimal difference from one to the next, as far as the range of instances permits. A spectrum might be thought of as a stream of stepping stones between the three Rivers. Each stepping stone is different from the others; some may be only marginally different.

The simplest example of a spectrum in law is the peppercorn rule that states that a peppercorn is consideration in a contractual transaction; a peppercorn is nominal quid pro quo. The spectrum that connects the negative, positive and uncertain peppercorn monads is as follows: zero peppercorns is the negative stepping stone, 1 to n peppercorns is the positive sector of the spectrum, and >0 to <1 (greater than zero to less than 1) is the uncertain sector. Although there is no specific case where 20 peppercorns were held to be consideration, it is clear that the rule of one peppercorn can be extended along the spectrum to any number. Each triad of corresponding antecedents, the positive, the negative and the uncertain, has a sector in the Spectrum of inductive instances. There may be a fine line between two sectors where there is only a marginal difference between the last instance in one sector and the first instance in the adjacent sector; there may be a rainbow effect where it is difficult to distinguish the point at which a significant difference becomes apparent, although the deductive antecedent of an applicable sector may be crucial to a litigant’s success.

Figure 3.13: Triad Spectra © Pamela N. Gray 1997
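The peppercorn spectrum reduces to a simple numeric classification. The following is a minimal sketch; the function name and string labels are illustrative assumptions only.

```python
def peppercorn_sector(quantity):
    """Classify a quantity of peppercorns into a sector of the Spectrum.

    Zero peppercorns is the negative stepping stone; 1 to n peppercorns
    fall in the positive sector; any amount greater than zero but less
    than one (loose grains of pepper) is pro tem uncertain.
    """
    if quantity == 0:
        return "negative"
    if quantity >= 1:
        return "positive"
    return "uncertain"  # 0 < quantity < 1
```

Although no case decides that 20 peppercorns are consideration, the classification extends the one-peppercorn rule along the spectrum: `peppercorn_sector(20)` falls in the positive sector just as `peppercorn_sector(1)` does.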

Any inductive instance in a sector establishes the deductive antecedent that it particularises. Thus any of the negative inductive instances establish the negative deductive antecedent; any of the positive inductive instances establish the positive deductive antecedent; and any of the uncertain inductive instances establish the uncertain deductive antecedent. Until all inductive alternatives in the positive sector fail, the positive antecedent does not fail; only the collective failure of the sector precludes a positive consequent. These are the Spectra meta-rules; they apply in a way similar to fan meta-rules. Fan streams are rules of law; inductive instances are derived from the facts of cases or the common sense spectra of meaning in language, as they detail the antecedents in the rules of law.

A Spectrum is a continuum of contrary possible instances. Each instance might also represent an existential statement or inductive premise that asserts that the particular instance is a member of the universal class described by the deductive monad. Where the universal is constituted by several variables, it may be that the Spectrum takes the form of a matrix Spectrum, still with three sectors supporting the corresponding deductive antecedents. For instance, if green leafy vegetables is the positive monad, there may be a positive range of species of green leafy vegetables in the positive sector of the Stratum, ordered by marginal differences in green, and in leafy, to cover the combinatorial explosion of possible combinations of more or less green and more or less leafy; in the negative sector there will be a list of species that do not fall into this category, with the ordering by marginal differences continuing along the matrix of possible combinations of more or less non-green and more or less non-leafy. Where the gradation of differences produces boundaries between the sectors that are fine distinctions and legal classification of species is required, an issue of law is raised. Until the issue of law is resolved by an authoritative ruling, a pro tem uncertain sector will include the unclassified species.

Some Spectra are numerical; some are qualitative; and some are a combination of the two. The resolution of a boundary between sectors may introduce a further attribute. For instance, the uncertain sector for peppercorns may be divided according to a taste rule: if the amount of pepper is ‘sufficient for taste’ then it is consideration. This could create a matrix of three variables: the range of peppers, the quantity, and sufficiency for taste. Spectra more readily permit the identification of gaps in the continuum, and indicate how the gaps might be filled as a matter of inductive Spectra argument.

Spectra also permit a better understanding of matters that are both issues of law and issues of fact. An inductive instance may raise both an issue of law, namely into which sector of a spectrum the item falls, and an issue of fact, namely whether or not the party with the burden of proof has established that this instance actually occurred in the case. Issues of law concern the monad(s) in the Major deductive premise; issues of fact concern the monad(s) in the Minor deductive premise. It may be that once the issue of law is resolved, the matter may not be pursued as an issue of fact. There is a procedure available to resolve issues of law before a trial to establish facts proceeds.

Uncertainty may be due to gaps in the law, or it may be due to factual uncertainty. If it is due to factual uncertainty, then the onus of proof determines the case that an uncertainty supports. If uncertainty is due to a gap in the law, then the party who has the onus of factual proof must also show that the gap is absorbed by her/his sector of the triad. Thus, the positive contract case with grains of pepper less than 1 must show that this is sufficient consideration: that the issue of law on nominal consideration is resolved by extension of consideration below one peppercorn to the grains of pepper that were given. The positive marker must be expanded to reduce the uncertainty zone by an even finer distinction.

Finer definition of a deductive antecedent may be by way of fan alternatives, finer hierarchical rules that may be submapped if necessary, or by way of inductive spectrum. In the legal domain, rules of law must be found for deductive submapping. Usually the facts of a precedent case, as distinct from the material facts of a case, are inductive instances for definitional spectra. The range of synonyms and antonyms may also be included in the contraries along a spectrum. Inductive detailing is applied as a matter of definitional, existential or material necessity to establish a deductive antecedent on the River.

The scientific epistemology of Whewell (1859), developed in his historical studies of induction, laid an inductive foundation for scientific theory that Russell and Whitehead, half a century later, sought to strip down to empirical atoms that could be used more flexibly. Whewell recognised that ideas were superimposed on data as generalisations; he called this phenomenon ‘colligation’. His epistemology imposed on colligations three tests of truth: (1) empirical adequacy, (2) successful prediction and (3) deductive consilience. If a colligation is verified empirically and can be used to make successful predictions, then it is valid scientific method. Furthermore if a colligation reconciles heretofore unrelated data in this way, then it is a consilience; if the consilience founds reasoning by necessity, then the method is a scientific epistemology. The overall visualisation of the method posed by Whewell was that of a River system with data flowing together through colligation and consilience to form the tributary system of a theory. The River visualisation of the epistemology of 3d legal logic is analogous to Whewell’s visualisation. However, in the legal domain, the tributary system is not constructed by Whewell’s three tests of truth. It is constructed as a matter of what has been laid down by law-making authority. The River in the legal domain is a system of rules rather than a system of scientific theory.

Induction in the legal domain, as visualised in the triad streams in Figure 3.13, arranged as far as possible in spectra, could be said to be a colligation, each inductive instance of which can be used in the process of arriving at a prediction about a client’s case. This legal induction is necessary due to each instance being either an authoritative existential or definitional statement of a law-maker, or it is necessary due to its position in the spectra vis-à-vis the three sectors of positive, negative and uncertain. Although there may be no case authority for 15 peppercorns being consideration, it can be assumed that, necessarily, if one peppercorn is sufficient as consideration, then any number of peppercorns above one will be sufficient. The inductive spectra arise, as matters of existential or authoritative necessity, to detail the deductive antecedents in the rules; once in place, gaps may be filled as a matter of spectral necessity or material implication.

Following his work with Russell on logical atomism, Whitehead left Cambridge and eventually took a post at Harvard where he produced his major work on process and reality that amounts to an epistemological realism (Whitehead, 1929). In the process of becoming, ‘actual occasions’ emerge as realities; so too a system of ideas is established, which Whitehead called the philosophy of organism. Bertalanffy (1968, 1972) applied this epistemology to real organisms to found systems theory, analysis and design. The epistemology of 3d legal logic uses systems theory, analysis and design to arrive at the jurisprudential system of 3d legal logic, its structures and processes, as an implicit epistemology in the legal domain, set out in this Chapter. Deductive and inductive monads arise over time by case law or legislation to become specific jurisprudential systems in accordance with the generic system of the epistemology of 3d legal logic; they continue to emerge as heterostatic or homeostatic additions to the systems. From time to time, some are discarded or relocated.

The process philosophy established by Whitehead provides an epistemology suited to technology and the arts. Using various hypotheses and theories, synthetically, technologies may be constructed as systems that achieve some purpose. Whitehead viewed philosophy as the critic of abstraction. In constructing and testing a system according to whether or not and how it fulfils its purpose, the hypotheses and theories relied on may be modified or adapted. As the plan of the art or craft of constructing a system for a purpose, technological epistemology is required for applied science and applied philosophy.

Jurisprudential systems may be discovered by finding in the law some purpose to be achieved, and then ordering the rules of law that achieve that purpose, as well as controlling the rules that do not; the construction has process and the constructed system has structure and process. The epistemology of 3d legal logic has a process of construction, as set out in this Chapter, and, on completion of the construction, its generic system has structure and process for the purpose of further construction of specific jurisprudential systems such as the Vienna Convention application in Chapter Four. Testing and co-adaptation of theories and system also involves a process that may be complex and extensive. Whitehead’s process epistemology, involving epistemological systems to produce and manage specific purpose systems, is suited to the technology of legal knowledge engineering, especially for large scale legal expert systems.

Whitehead was a professor at London University’s Imperial College of Science and Technology from 1914 to 1924, before he moved to Harvard. As Dean of the Faculty of Science, Chair of the University’s Academic Council and Chair of Goldsmiths’ College managing council in London University, he was involved in many practical concerns of working class education. His magnum opus in 1929 on process and reality was preceded by a collection of essays on the aims of education (Whitehead, 1928, 1950). In providing an epistemology for technology, Whitehead brought applied philosophy to technology. Critical legal studies may still evaluate substantively the abstractions in law that are accommodated by technological epistemology; the precision of legal knowledge engineering may facilitate this evaluation.

Technological epistemology may also draw on the linguistic philosophy of Wittgenstein which assumes that meaning in language is potentially boundless. New epistemologies may be found in the meaning of language. Foucault’s studies of epistemes of the past, his archaeology of knowledge, demonstrate a range of epistemologies that have been found and used. The meaning of meaning has also been studied (Ogden and Richards, 1923, 1949) as well as the meaning of meaninglessness (Blocker, 1974).

For computation in the legal domain, it is necessary to wholly formalise a rule system in law in order to provide program instructions on the sequences or pathways along the tributary structure that determine the description and outcome of all the possible cases that fall within its ambit; precision is required for automation. Accordingly, in legal knowledge engineering, a wholly formalised rule system must provide for all possible cases within the system, be they positive alternatives, wholly or partially negative, or wholly or partially uncertain. The result for any possible user case must be indicated in the representation. Spectra assist in determining gaps and how they should be dealt with; settled existential inductive premises may be used to necessarily establish an inductive antecedent in a rule of law, or there may be a spectral argument as to which of the adversarial antecedents should be established by an unsettled inductive instance, that is, an argument as to which sector of the spectrum the inductive instance in question should properly fall into, given its position in the spectrum.

Where there are gaps in the stated law, these must be filled and noted to complete the logic and provide for possible cases. The user can be advised of gaps. If the gaps can be filled by logically necessary extensions of stated law, then the user should be advised accordingly. Sometimes rules are stated in positive terms, sometimes they are stated in negative terms; positives can be worked out from negatives and vice versa. Sometimes obiter dicta discuss the gaps or uncertain cases. Logical irregularities in the judicial and the legislative statements of law may become apparent when the spectra are revealed following streamlining.

To be established as a Minor deductive premise, inductive instances in a spectrum have to be decided according to the appropriate burden of proof: on the balance of probabilities in civil matters and beyond a reasonable doubt for the prosecution in criminal matters. If the established Minor deductive premise is an instance in the Uncertain sector of a Spectrum, then resolution of the issue of law will adjust the sector boundaries to place it either in the Negative or Positive sector. Unless the issue of law is resolved in advance of the trial, the Minor deductive premise is the first step toward adjustment of the sector boundaries for particularising the Major deductive premises.

Whichever party carries the burden of proving an antecedent in a rule of law also carries the burden of proving the relevant instance in the sector that supports that party’s case; if the instance falls into the uncertain sector of the antecedent’s Spectrum, then that party will also have to successfully argue that the instance that applies should be moved into the sector that supports that party’s case. Failure to establish necessary instances, due to failure in an issue of fact or an issue of law, allows the opponent to succeed due to the burden of proof or a finding that the uncertain instance is to move to the sector of the Spectrum that details the opponent’s antecedent.

For instance, in criminal matters, the positive case is likely to be the case for the prosecution; in these matters, ultimately, a reasonable doubt about an instance that is necessary in order to prove the prosecution case is factual uncertainty that is decisive support for the negative case. Certain criminal defences place the onus of proof on the party claiming the defence, usually the party with the negative case; the defendant then carries a civil burden of proof in respect of the defence to be established. The prosecution will win if it can be shown beyond a reasonable doubt that the instance established by the defence on the balance of probabilities did not occur, provided the prosecution case has been otherwise established. In civil matters, if there is a necessary antecedent in the positive case in respect of which the onus of proof lies on the party with the positive case, and that antecedent is pro tem uncertain, on the balance of probabilities, the positive result cannot be established, even if there are other uncertainties, on the balance of probabilities, in respect of antecedents that carry an onus of proof on the party for the negative case.

Just as the failure of one antecedent in one positive fan will not prevent the positive result, so, too, the failure of only one instance among many in the positive sector of a Spectrum will not prevent the positive result. The sectors are mutually exclusive, although the inductive instances within a sector may not be. Sectors can be categorised according to the jurisprudential meta-rules, which determine the nature of the choice that they permit. Like fans, inductive instances in a sector of a triad Spectrum must all fail before the Spectrum fails to establish the deductive monad represented by that sector. Inductive choice places constraints on the effect of negative instances until all positive instances are exhausted.
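The Spectra meta-rule, like the fan meta-rule, is an any/all condition: one established instance suffices to establish the sector's antecedent, and only collective failure defeats the sector. A minimal sketch, with illustrative function names:

```python
def sector_establishes(instances):
    """A sector establishes its deductive antecedent if ANY of its
    inductive instances is established (instances: iterable of bool)."""
    return any(instances)

def sector_fails(instances):
    """A sector fails only when ALL of its inductive instances fail;
    only that collective failure precludes the corresponding consequent."""
    return not any(instances)
```

The same pair of conditions models a fan of alternative rule streams, with each boolean standing for the success of one stream rather than one inductive instance.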

3.1.8 Double Negatives

The law has a jurisprudential negative as well as a factual negative: for example, 'no peppercorns' is both a jurisprudential negative and a factual negative. However, no rejection of an offer is a jurisprudential positive as well as a factual negative. The association of jurisprudential and factual positives and negatives in regard to each monad is a matter of legal expertise. A factual determination establishes which antecedent monads apply to a client’s case; it occurs in the Minor deductive premise in the extended deductive argument based on a pathway through the River system. Once the factual monads are determined, then the associated jurisprudential negatives, positives or uncertains in the Major deductive premises that are the rules of law, apply; the jurisprudential antecedents determine the outcome of the case in law. The Major deductive premises are conditional propositions that are deemed to be true by virtue of their authority; the Minor deductive premises are, or rest on, existential assertions which may be found to be true in the course of litigation if there is evidence to support this. Minor deductive premises establish antecedents in the Major deductive premise so that the consequent of the Major premise will apply to the case.
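Because the association of factual and jurisprudential polarity is a matter of legal expertise, it can be recorded per monad as a lookup table. The entries below are the two examples from the text; the table name and labels are illustrative assumptions.

```python
# Each monad carries both a factual polarity and a jurisprudential polarity;
# the two need not coincide, as 'no rejection of an offer' shows.
POLARITY = {
    "no peppercorns": {"factual": "negative", "jurisprudential": "negative"},
    "no rejection of an offer": {"factual": "negative", "jurisprudential": "positive"},
}

def jurisprudential_effect(monad):
    """Return the jurisprudential polarity that a factually determined
    monad contributes to the Major deductive premises."""
    return POLARITY[monad]["jurisprudential"]
```

On this sketch, the factual determination (the Minor premise) selects entries from the table, and their jurisprudential polarities then drive the Major premises.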

3.1.9 Star Poles

Adversaries in litigation are sometimes said to be poles apart. In order to determine precisely how far apart the parties are, it is necessary to specify their differences in terms of negatives and uncertains. Possible differences are anticipated and provided for in the epistemology of 3d legal logic by adding to the representation in Figure 3.13, two Poles, being the Negative Pole and the Uncertain Pole, that give the visualisation the appearance of a star, as shown in Figure 3.14.

The Negative Pole represents the consequent of a Partially negative but conclusive Final result; the Uncertain Pole represents the consequent of a Partially uncertain result pro tem. Pole streams shown in Figure 3.14 represent Pole rules that have the same consequent, namely, the Pole result; all negative Pole rules have the consequent Partially negative and all uncertain Pole rules have the consequent Partially uncertain. A Pole rule has a single antecedent which is the same as an antecedent on the River from which it arises; thus an antecedent on the Negative River is also an antecedent in a Negative Pole rule, and an antecedent on the Uncertain River is also an antecedent in an Uncertain Pole rule.

Poles occur because of the meta-rule that a path of necessary and sufficient positive antecedents must be established in order to reach the positive Final result. Where there are alternative paths of necessary conditions, one of these paths is sufficient. Where no sufficient path is established, then the failure of antecedents may be calculated by reference to the number of antecedents by which the nearest path to the Positive result falls short of being sufficient, and whether or not the shortfall antecedents are negative or uncertain monads.

Figure 3.14: Star showing Poles and pole streams (Conclusive Negative Pole; Pro tem Uncertain Pole) © Pamela N. Gray 1997

For each of the shortfall antecedents, there is a Pole stream; the more Pole rules that are satisfied, the stronger is the Partial Pole result. Subject to fan constraints, prima facie, all negative monads that belong to the Negative River are also single antecedents in Pole rules, respectively, with a common consequent that is the Partially negative but conclusive Negative Final result. When all alternatives in a Positive fan fail as negative, then the Negative Pole consequent is established as the Final result. Likewise, if all alternatives in a Positive fan fail as uncertain, then the Uncertain Pole consequent is established as the Final result pro tem, pending judicial resolution. When the Positive case fails due to a combination of some negative and some uncertain monads being established in a case, then the net result must be considered by reference to whether or not there are fan considerations. If a Positive fan fails due to some alternative streams being negative and at least one alternative being uncertain, then the fan fails for uncertainty. If the only negative monads in a case are precluded from the Negative Pole result by virtue of fan(s) with a net uncertainty, then the Final result pro tem is the Uncertain Pole result. Otherwise, the negative monad(s) prevail and there is a Negative Pole result.

Because Negative Pole rules share the same consequent and Uncertain Pole rules share the same consequent, each Pole has a fan of Pole streams as the alternative possible ways of establishing the Partial result. As soon as a Pole fan stream is established by a case, then the Pole result applies to that case. The same negative antecedent in two different rules, one in the River system and the other the Pole rule, has two different consequents, the River one that is the Wholly negative Final result, and the Pole one that is the Partially negative but conclusive Final result; there is a corresponding Wholly uncertain and Partially uncertain result pro tem for each uncertain antecedent.
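The Pole fan meta-rule can be sketched as a count over Pole streams: the Partial result applies as soon as any one stream is established, and grows stronger with each further stream. The function name and return shape are illustrative.

```python
def pole_result(pole_streams):
    """Evaluate a Pole fan. pole_streams is an iterable of bool, one per
    Pole stream, True if that stream's single antecedent is established.

    The Partial Pole result applies as soon as any one stream is
    established; the number of satisfied streams measures its strength.
    """
    strength = sum(1 for stream in pole_streams if stream)
    return {"applies": strength > 0, "strength": strength}
```

The same evaluation serves either Pole, since Negative Pole rules share one common consequent and Uncertain Pole rules share another.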

As soon as a negative antecedent is established in the River system rule, it is also established in the Pole rule; subject to fan constraints in the River system, the more negative antecedents that are established in the Negative Pole fan, the stronger is the negative case. If all negative antecedents are established, there is the strongest negative case, the Wholly negative Final result. Likewise, as soon as an uncertain antecedent is established in the River system rule, it is also established in the Pole rule; subject to fan constraints in the River system, the more uncertain antecedents that are established in the Uncertain Pole fan, the stronger is the uncertain case. If all uncertain antecedents are established, there is the strongest uncertain case, the Wholly uncertain result pro tem.

The Poles reflect the differences between the uncertain, positive and negative cases. Prima facie, all the antecedents of the positive case must be established in order to reach the Positive result: as soon as there is a decisive failure of a positive antecedent in all the alternative Positive case paths of necessary and sufficient conditions, the positive result cannot be claimed. However, if one decisive negative antecedent is established, for example if there is no consideration in a contractual transaction, then prima facie the negative case will succeed; there may be a Partially negative but conclusive result. A negative antecedent is decisive when all positive alternatives are exhausted, or when there are no alternatives, in regard to that antecedent.

Although there is a Negative Pole rule for each negative antecedent that corresponds to a positive antecedent in a positive fan, there is a Negative Pole meta-rule that provides that none of these Negative Pole rules will apply until all rules in the Positive fan fail; if there is such a failure, then all the established negative antecedents will activate the common consequent. For example, if there is no consideration then there is no valid contract; this is a Negative Pole rule. The requirement of at least one alternative path of necessary and sufficient conditions, or the one and only path of necessary and sufficient conditions for a Positive result, is represented by the Pole cones. All partially negative rules share a common conclusive consequent, which must be represented separately from the Wholly negative consequent; the Wholly negative result of the Negative River is only correct when all negative antecedents are established.

If differences between cases, past, present or future, are to be assessed, then the Partially negative but conclusive Final result must be represented separately from the Wholly Negative Final result. Similarly, the Partially Uncertain pro tem result must be represented separately from the Wholly Uncertain pro tem result.

Thus, there are five sets of rules in a field of law:

1. those which lead to the Partially Negative result,

2. those which lead to the Wholly Negative result,

3. those which lead to the Positive result,

4. those which lead to the Wholly Uncertain result,

5. those which lead to the Partially Uncertain result.

The Positive result is mutually exclusive of the remaining four possible results. A cumulative Partially Negative result can amount to the Wholly Negative result just as a cumulative Partially Uncertain result can amount to a Wholly Uncertain result. The Wholly Negative result and the Wholly Uncertain result are mutually exclusive. However there may be partial negatives only, partial uncertains only, or a mix of partial negatives and partial uncertains; any of these cases may produce partial positives. However, there can be no partial but conclusive positive result unless the partial positives amount to one of the alternative necessary and sufficient pathways in the Positive River system. Where there is both a Partially Uncertain and Partially Negative case, prima facie, subject to fan meta-rules, the Partially Negative result will prevail as the final result.
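The five possible Final results can be sketched as a classification over the statuses of antecedents on the alternative Positive paths. This is a simplification under stated assumptions: the fan meta-rules (under which a net uncertainty in a fan can displace a Partially Negative result) are not modelled, and all names are illustrative.

```python
def final_result(paths):
    """Classify a case into one of the five prima facie Final results.

    paths: list of alternative necessary-and-sufficient paths, each a
    list of antecedent statuses in {'positive', 'negative', 'uncertain'}.
    A path succeeds only if every antecedent on it is positive.
    """
    if any(all(s == "positive" for s in path) for path in paths):
        return "Positive"
    statuses = [s for path in paths for s in path]
    if all(s == "negative" for s in statuses):
        return "Wholly Negative"
    if all(s == "uncertain" for s in statuses):
        return "Wholly Uncertain"
    # Mixed shortfalls: prima facie the negative prevails over the
    # uncertain, subject to the fan meta-rules not modelled here.
    if "negative" in statuses:
        return "Partially Negative"
    return "Partially Uncertain"
```

For example, on this sketch a single path of statuses `['positive', 'negative']` yields the Partially Negative result, while `['positive', 'uncertain']` yields the Partially Uncertain result pro tem.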

The adversarial features of the law can be said to polarise the parties because there are two ways of opposing the Positive case: by denying or by doubting the positive monads. Where there is a Positive fan which fails partly for Negative monads and partly for Uncertain monads, then the Partially Uncertain result prevails over the Partially Negative result to the extent of the River fan meta-rule that qualifies the Pole fan result. The polarisation of the parties is effected by the River fan meta-rules and the Pole fan meta-rules.

In Figures 3.13 and 3.14, the positive case is central, because the partially negative and partially uncertain rules that are the Pole rules, arise from the Negative and Uncertain Rivers, respectively. No such rules arise from the Positive River.

The partially negative rules with their common consequent can be represented graphically as the partially negative but conclusive cone with a pole like the North Pole of the earth, representing the common Partially Negative but conclusive result; this makes the Negative River similar to the Tropic of Cancer. There is a corresponding cone representing the Partially Uncertain rules, with a common Partially Uncertain pro tem result like the South Pole; this makes the Uncertain River similar to the Tropic of Capricorn.

The Poles are a judicial method of providing for possible cases in the law as real cases constitute various combinations of antecedents from the three tributary structures, including their nested detail. Combinatorial explosion and implosion occurs as possible cases zig-zag across the three Rivers and deep into their submaps, as antecedents from different Rivers and Final results are determined. The significance of each monad is directed to the prima facie five possible end results: Partially Negative, Wholly Negative, Positive, Wholly Uncertain, and Partially Uncertain; Poles assist the determination of the net result and what adjustments need to be made to the sector boundaries in inductive Spectra.

Like the expanding physical universe, the legal system ensures the growth of law. Every year parliament puts out new legislation, to vindicate the changing political platforms of adversarial democracy. With litigation, precedent cases accumulate.

Computer technology, which extends and assists human memory, supports this expansion of information and maintains its viability and profitability. In 1997, the Butterworths Catalogue (p.4) claimed that about 14 million documents per week were added to the online legal database of Lexis-Nexis.

As the judiciary embraces integrated databases that permit speedy cut and paste judgments, mixing extracts from black letter law with extracts from transcripts of evidence and legal argument, the legal system is likely to flirt with chaos and complexity that defies the consistency, orderliness and coherence of justice. Spaghetti access to the law through Boolean search may ultimately produce a dense and knotted law, at the mercy of language use statistics rather than the reasoning of prioritised substantive principles. Figure 3.15 demonstrates the chaos of a nested River system of the Australian Spam Act 2003 when it has no submapping. It is interesting to compare this chaos, systematically produced by eGanges, with the medieval representation of the complexity of human thinking, the geometry of the mind (Leith, 1990, p.86; Ong, 1958), shown as Figure 4.4.

Whitehead posed a process epistemology to manage language that uses conditional propositions and deductive necessity. If the structures are identified for this process, then chaos in the processing of language as logic, and the consequences of this, can be avoided.

3.1.10 Adversarial fishbone

If all the tributary streams of a River system are disconnected as in Figure 3.1, they may be reconnected as a linear sequence of antecedents that conclude with a Final result. When the positive, negative and uncertain tributary structures are unplugged and reconnected as linear structures in a two dimensional plane, the resulting linear structure looks like three parallel lines. The Pole streams rise from the antecedents in the negative sequence and antecedents in the uncertain sequence. The hierarchy of rules can no longer be seen and the representation looks like an adversarial fishbone, as illustrated in Figure 3.16; a dash over an antecedent is used to indicate negation, and a tilde over an antecedent is used to indicate uncertainty.

Figure 3.15: eGanges compliance map without submapping (chaos map) - Australian Spam Act 2003 © Pamela N. Gray and Xenogene Gray, 2005

Ishikawa (1985, p.63) first proposed the fishbone diagram to represent causation in a manufacturing process, to facilitate quality control in manufacturing in Japan after the Second World War; his graphic is reproduced in Figure 3.17. It has a paradigm that is like the River paradigm in the epistemology of 3d legal logic. Gray (1988) identified the River paradigm as the structure of a system of rules of law; in 1989 she worked with Tony Morgan to develop a program to represent and process the River paradigm for legal expert systems. Morgan (2002, pp.122-5) showed that the fishbone paradigm also applied to business rule systems; hence the epistemology of 3d legal logic set out in this thesis is applicable, to some extent, as a generic epistemology of 3d logic.

The fishbones of Ishikawa provided a two dimensional model of sequential information which avoided the repetition of factors in the alternative pathways of a tree flowchart. These diagrams are standard aids to quality control management. Their form seems similar to a layout of Samurai forces to spearhead a military campaign, in contrast to the regimentation of disjunction in decision trees. The representation was certainly effective and assisted common cognition of the individuals working in a factory production team. The fishbone was also adopted as one flowchart model for computer program design from the 1980s; it was used by the candidate in legal expert system design in 1987 (Gray, 1988). The fishbone has arrow flows that indicate the direction of causation or process toward another cause, and ultimately to a final effect.

Figure 3.16: Adversarial Fishbone © Pamela N. Gray, 1996

Figure 3.17: Ishikawa (1985, p.63) Fishbone: Cause and Effect Diagram

The adversarial fishbone does not use the River paradigm; it dismantles the Ishikawa fishbone to a linear form. Its fishbone paradigm is based on the linearity of the three adversarial Rivers with their Pole cones spread out as fans, like the fins of a fish. The adversarial fishbone is a representation that allows for a presentation sequence of the antecedents and a two dimensional reckoning of the Pole rules. It also permits a visualisation of antecedents which do not have Pole rules because they are neither necessary nor sufficient in the Positive case, although they may occur and should be understood as being consistent with the Positive case.

An order in which antecedents will be considered or presented may mix the hierarchical level of rules, depending upon whether:

● a series of antecedents will be considered or presented before stating their consequent; or

● a consequent will be considered or presented before stating the antecedents which establish that consequent.

The art of unravelling a Star of Rivers, selecting components from the deductive hierarchy, the inductive Spectra and the abductive Strata, and ordering them in a linear sequence is a formalisation of the art of rhetoric. It is assisted, as a science, by the adversarial fishbone.

Fishbones are also useful for the graphical representation of a Pole fan. It may be that a sub-cone in each Pole cone could collect respectively negative and uncertain antecedents pro tem, until all Positive alternatives are exhausted; then Pole dominance meta-rules could apply and the Final result of the dominant Pole could be triggered as either a Partially negative but conclusive result or a Partially uncertain result pro tem.

A negative River fan, as distinct from the Pole fan, corresponds to a single positive tributary stream that has more than one antecedent. Only one alternative in the negative River fan must be established before there can be a Partially negative but conclusive Pole result. This is so irrespective of whether the negative fan alternatives might be mutually exclusive, multiple choice, or conditional fan alternatives, as occurs in positive fans.
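The contrast between a positive tributary and its negative fan may be restated as the contrast between conjunction and disjunction. A minimal sketch, with purely illustrative names:

```python
# Illustrative sketch: a positive tributary is conjunctive (all its
# antecedents must be established), while the corresponding negative fan
# is disjunctive (any one established alternative defeats the positive case).

def positive_tributary_established(antecedents):
    """All antecedents of the positive tributary must be established."""
    return all(antecedents)

def negative_fan_triggered(alternatives):
    """Only one alternative in the negative fan need be established."""
    return any(alternatives)

print(positive_tributary_established([True, True, True]))   # True
print(negative_fan_triggered([False, True, False]))         # True
```

As the text notes, this holds irrespective of whether the negative fan alternatives are mutually exclusive, multiple choice, or conditional.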

In the Adversarial Fishbone, the Pole streams which oppose the positive case permit an evaluation of the strength of a case as positive, negative, and uncertain monads are established in a user case; numbers can be calculated in the Pole fan as processing proceeds, and antecedents in favour of each party in a dispute can be compared progressively as the Minor deductive premises of a client's case are established.
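The progressive comparison of antecedents established for each party can be sketched as a running tally. This is an illustrative sketch only, not the eGanges implementation; all names are assumed:

```python
# Illustrative sketch: a running tally of positive, negative and uncertain
# monads as Minor premises are established during a consultation.

from collections import Counter

def current_strength(answers):
    """answers: one of 'positive' | 'negative' | 'uncertain' per antecedent."""
    tally = Counter(answers)
    return tally["positive"], tally["negative"], tally["uncertain"]

answers = ["positive", "positive", "negative", "uncertain", "positive"]
print(current_strength(answers))  # (3, 1, 1)
```

The tally can be recomputed after each answer, giving the progressive comparison of the parties' positions that the text describes.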

3.1.11 Neutrality

Sometimes a positive monad has no corresponding negative or uncertain antecedent that prevents the positive result. The three possible cases represented normally by the positive, negative and uncertain antecedents are all consistent with a positive result; all are located on the Positive River as a fan without a corresponding single tributary stream in the Negative River, and thus with no Pole streams. In the adversarial fishbone, they are represented as a triad of monads that are all on the Positive sequence. For the epistemology of 3d legal logic, these are the neutral monads that indicate consistency rather than necessary and sufficient antecedents. They may occur and their effect needs to be understood, but they are dispensable.

An example of a neutral monad in contract law is an enquiry to clarify the meaning of an offer. It is not necessary to make this enquiry to establish a valid contract: there is no rule of contract law that requires the offeree to clarify meaning in this way. If no enquiry is made, a valid contract can still arise, although a contract of this sort may be defeated by another rule such as a rule of mistake. Even if it is uncertain whether or not an enquiry has been made, this will not, per se, defeat the positive result: if resolution of the uncertainty finds that a counteroffer, not an enquiry, was made, then the positive result will be defeated by the rule that treats a counteroffer as a rejection of an offer.

3.1.12 Criss-crossing

Sometimes two or more inductive Spectra in different parts of a River system form a continuum because the boundary between them is a fine distinction. The connection is represented by an inter-node link that crosses the River system where required in order to make the connection. Criss-crossing of inter-node links introduces irregular patterns in the Star paradigm. Examples of this in contract law are the enquiry-counteroffer spectra and the mistake-misrepresentation spectra. It is useful to delineate these spectra so that inductive instances can be carefully distinguished and located in the appropriate rules with attendant consequents.

Criss-crossing may take a matrix form, if the component spectra have this complexity. A matrix form in one of the criss-crossing spectra may suggest a continuum of the matrix or some variation thereof.

There is a form of reasoning about antecedents that can be thought of as an equity form. It manages evidence through four dimensional or moving inductive Spectra. The four dimensionality has a form of criss-crossing whereby an instance on one Spectrum changes the sector position of an instance on a second Spectrum. These Equity River systems permit flexibility through which equitable discretion may operate. The judiciary may determine from time to time when sector boundaries may move pro tem or in certain circumstances. An inductive instance may be in the Positive sector of a Spectrum of a certain antecedent in combination with certain other antecedents in one case and, with a change of antecedents in another case, it may move into the negative sector of the Spectrum.

Thus, in discretionary law, inductive instances may move from one sector to another of the Spectrum, according to the position of inductive instances on other Spectra and what Minor premises have been established by inductive instances on other Spectra. Movement within the static framework of rules due to information that is established by criss-crossing between Spectra in different locations in the River system gives potential four dimensionality to Equity law. When discretion is exercised, all the circumstances of the case are taken into account in making a determination. For instance, the Family Court in Australia has this flexibility in determining, by reference to a list of factors, entitlement to maintenance and property.

Four dimensionality introduces another form of combinatorial explosion. The number of instances on all Spectra provides a basis for calculating all possible combinations of sector positions for these instances on all available Spectra.
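The calculation described here is a product over Spectra. A minimal sketch, with hypothetical figures assumed for illustration:

```python
# Hedged sketch of the combinatorial calculation: each entry is the number
# of sectors available to one inductive instance on its Spectrum, and the
# realm of possible sector assignments is their product. The figures are
# assumed for illustration only.

from math import prod

sectors_per_instance = [3, 3, 2, 5, 3]   # hypothetical sector counts
print(prod(sectors_per_instance))        # 270 possible combinations
```

Even these modest figures yield hundreds of combinations, which illustrates why four dimensionality introduces another form of combinatorial explosion.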

A movie of Equity could show the Spectra positions of instances in one case and the changing of their Spectra positions for another case. If there are inconsistencies in the exercise of the discretion, the alternative movies for the same set of instances can show the difference in the changes. Alternative Equity movies can then be argued by reference to some coherence that might appear in the non-inconsistent part of the movies, or by reference to abductive argument in applicable Strata. In equity, discretion occurs through the potential combinatorial explosion that could be visualised and managed through movies.

Combinatorial explosion occurs when possibility is considered. Once antecedents and their instances are laid down in relation to each other in rule systems with inductive Spectra, a realm for possible cases is determined. Within this realm, cases might arise in various ways in the potential for alternative combinations of antecedents given available pathways and instances. New cases may introduce new antecedents, new instances, or some adjustment of pathways. Changes in a Star of substantive law produce combinatorial implosion that is increasingly dense. Submapping may be extended as far as required to accommodate growing complexity in increasingly large-scale River systems.

3.1.13 Rings

Legal domain epistemology also uses presuppositions and postsuppositions, circular concepts, and moral motivations. In the visualisation of the epistemology of 3d legal logic, rings are used to represent these phenomena and manage any paradoxical effects.

For example, when Lord Atkin laid down the rules of negligence in Donoghue v Stevenson, the justification that he used was a biblical commandment: love thy neighbour. Christianity was the dominant religion of Britain at the time, and this particular commandment was also a distinguishing feature of Protestant Christianity. The abductive religious justification was treated by Lord Atkin as a presupposition for the rule system of negligence. As such, it acted as a moral motivation for social acceptance of the radical new development of human welfare law. Such presuppositions may ensure a level of voluntary compliance, with a certainty that non-compliance is inexcusable, which makes law viable. As a postsupposition of the new law of negligence, a vast insurance industry grew up to cover negligence liability; this industry provided compensation payments and ensured a standard of human welfare in the age of technology and science.

Love is a circular concept. It belongs on a paradoxical spectrum, which has gradations through love, and its opposite, hate. It is a subjective state belonging to the realms of human emotions, and difficult to identify and prove as a fact. At any given time, a person might feel the full circular spectrum, both love and hate for another; customarily, a wedding ring symbolises the unbreakable legal commitment of a relationship based on love. While love may be used by the law as an abductive justification for the law of negligence, it is not suitable as part of the rules determining legal liability; the difficulty of its paradoxical nature is the basis of divorce law.

Rings may occur in the confines of abductive Strata. However, if an antecedent is potentially paradoxical, then the visualisation of the epistemology of 3d legal logic uses a ring to represent this where it occurs in the River system. The occurrence of paradoxes in complexification is considered extensively by Casti (1994). He identifies the circularity of paradoxes as features of the most complicated things, the fractals of Mandelbrot (1982). It can be expected that large scale legal expert systems will ipso facto involve increasing complexity and therefore paradoxes must be provided for and managed in their representation.

3.1.14 Sphere

The fundamental ring for a Star of rules represents the beginning and end of the rule system; this is the Boundary logic of the system. There is a Start presupposition and an End postsupposition that can be linked as Boundary logic to convert the three dimensional Star to a globe or Sphere. The Start presupposition is that there are three alternative monads, namely the first triad of antecedents, that are the starting antecedents of the system; the End postsupposition is that there are five alternative end results, being, clockwise, the Partially negative result, the Wholly negative result, the Positive result, the Wholly uncertain result, and the Partially uncertain result. The circular nature of these presuppositions and postsuppositions is shown by the ring that links them in the Sphere in Figure 3.18, as the Boundary logic of the rule system. Thus, a sphere of alternative overlapping valid pathways of Major deductive premises is produced in the computational epistemology of 3d legal logic. The user input of Minor deductive premises determines the selection from these alternative pathways, as an extended deductive argument that leads to a final consequent. In the epistemology of 3d legal logic, this sphere is comparable to a truth table in logic; it is a spherical table of valid arguments for extended deduction.

Figure 3.18: Sphere of legal knowledge © Pamela N. Gray, 1996
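One plausible reading of the End postsupposition, mapping the mix of established monads onto the five alternative end results, can be sketched as follows. This is a hedged sketch under the assumption that a negative Pole dominates an uncertain Pole (cf. the Pole dominance meta-rules mentioned in section 3.1.10); all names are illustrative:

```python
# Hedged sketch: mapping tallies of established positive, negative and
# uncertain monads onto the five End results of the Boundary logic.
# Assumption: the negative Pole dominates the uncertain Pole in mixed cases.

def final_result(pos, neg, unc):
    if neg == 0 and unc == 0:
        return "Positive result"
    if pos == 0 and unc == 0:
        return "Wholly negative result"
    if pos == 0 and neg == 0:
        return "Wholly uncertain result"
    # mixed cases: assumed dominance of the negative Pole over uncertainty
    return "Partially negative result" if neg > 0 else "Partially uncertain result"

print(final_result(3, 1, 0))  # Partially negative result
```

The five return values correspond to the five alternative end results of the End postsupposition; the dominance assumption in the mixed cases is the candidate reading, not a rule stated in the text.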

3.1.15 Universe

The legal system has many fields of law which can be represented in Spheres, like a legal Universe, as illustrated in Figure 3.19. It is unlikely that Spheres in different fields for different purposes would have identical River system structures. Differences may occur substantively and simply by virtue of the mnemonic and cognitive arts in the design of the River system structures.

It might be possible to develop categories of Spheres such as compliance Spheres, prosecution Spheres, Strategic Spheres, litigation management Spheres, vocational procedure Spheres, specific task Spheres, conflict prevention or resolution Spheres. There might be multi-field Spheres, legal practice Spheres, or case specific Spheres with relevant law included.

In the epistemology of 3d legal logic, Spheres of different fields of law may be linked. For instance, there is a link between the Sphere of contract law and the Sphere of tort law; there may be alternate remedies in contract or in tort. Travelling through the legal Universe, via links or freely, provides a visualisation of the vast collective, potentially immortal, intelligence of the legal profession - good practice for the individual to understand the detailed holism of the contemporary collective and how to develop the future collective.

3.2 INTELLECTUAL ARTEFACTS

Logic descriptions of the visualisation in the epistemology of 3d legal logic add clarity to legal domain epistemology and vindicate the reification for object-oriented programming. The legal Universe provides intellectual artefacts for system design of a shell for constructing legal expert systems, and for object-oriented programming of that shell. Because the visualisation permits object-oriented programming, it is a computational epistemology.

The visualisation also indicates that human intelligence is three dimensional and can be modelled for the cyberspace of a virtual mind; the realm of the meta-epistemological method demonstrated in this thesis is three dimensional. While not simple, the epistemology of 3d legal logic is holistic and therefore coherent; as such, it is potentially sound as an epistemological framework. Even without computer aids, lawyers may yet be space travellers in their metaphysical legal Universe - a training ground for the language and culture of real space travel. With the visualisation of the epistemology of 3d legal logic, the expert systems of the legal Universe can be charted and travelled interactively.

Figure 3.19: Legal Universe © Pamela N. Gray, 1996

3.3 STAGE 2 SPECIFIC META-EPISTEMOLOGICAL METHOD

The specific meta-epistemological method uses the epistemological ideographs of the computational epistemology of 3d legal logic to achieve the transformation of Stage 1 domain epistemology to Stage 2 computational epistemology. The ideographs provide objects for interactive visualisation, and object-oriented programming. Logic descriptions of the visualisation provide validity and soundness for the representation and assist the determination of sound heuristics for object-oriented program epistemology.

In the legal domain and in legal knowledge engineering, ideographs have been used customarily to design legal expert systems. Chapter Four considers the historical development of the River ideograph and other epistemological ideographs that employ logic. As a method of meta-epistemological transformation, epistemological ideographs are particularly effective for large-scale systems; a picture says a thousand words and may clarify and readily convey complex meaning.

LEGAL KNOWLEDGE ENGINEERING METHODOLOGY FOR LARGE SCALE EXPERT SYSTEMS

VOLUME 2

by

Pamela N.Gray, LL.B.(Melb.), BA.(Melb.), LL.M.(Syd.)

Barrister and Solicitor (Vic, 1968, Aust., 1968, N.T., 1974)

Solicitor (Eng., 1974)

This dissertation is submitted for the degree of Doctor of Philosophy

University of Western Sydney

School of Law

March 2007

4 CHAPTER FOUR:

SHELL EPISTEMOLOGY

4.1 DESIGN OF EGANGES

The eGanges (electronic Glossed adversarial nested graphical expert system) shell was designed in 2002 on the basis of the computational epistemology of 3d legal logic. The design sets out the shell epistemology. There were two main tasks in the design: firstly, the creation of the interface and secondly, the determination of operating heuristics as instructions for processing user input and providing feedback. The second task depended on completion of the first task.

The design of the user interface for construction and consultation of an application, the first task, required a reconsideration of the domain epistemology: a retroduction to matters in the domain epistemology relevant to lawyer-client communication, rather than computational epistemology, was a necessary step before settling the nature of the communication system of the user-interface and the epistemological processing of the computational epistemology of 3d legal logic that would be required to give effect to the interface. The communication system characterises the shell epistemology. The computational epistemology of 3d legal logic is transformed to suit the communication system of the interface.

4.2 INTERFACE

4.2.1 Dialogue epistemology

Legal domain epistemology includes some considerations relevant to lawyer-client communication that should determine the design of a legal expert system interface. Lawyers take instructions from clients, and clients take advice from lawyers based on those instructions. Primarily, there must be effective communication between lawyers and their clients. A legal expert system application may be used instead of a lawyer, so the interface of the legal expert system, to be effective, must be user-friendly. To be user-friendly, the law must be comprehensible to the subjects who are expected to comply with, and benefit from, it. The interface also, to be feasible, must be expert-friendly for quick and easy construction of such an application; the epistemology of the program must be comprehensible to a legal expert, and be seen to encompass a sound legal domain epistemology.

Comprehension is a matter of human cognition. Since cognition is concerned with the understanding of knowledge, it is a psychological aspect of legal domain epistemology. Ignorance of the law is no excuse, and a legal expert system should convey a knowledge of the law so that subjects do not encounter trouble with the law due to their ignorance of it. To avoid or resolve such troubles, lawyers interact with clients, with an appreciation of the comprehension of the client, to receive instructions and convey advice. Legal knowledge engineering methodology must include some appreciation of human cognition and the lawyer-client communication system. Hence, design of the interface should provide a communication system which permits the ordinary subjects of the legal system to comprehend the requirements and benefits of the law; it must be suited to the constraints of human cognition.

Language effects communication; legal language is translated into facts in the lawyer-client communication, and into computational logic by legal knowledge engineers. Following the search for a language of legal discourse as a processing solution, particularly the work of McCarty, Smith and Allen, and the schemes for implementing non-monotonic sequences of legal reasoning, particularly the work of Gordon, Sartor and Prakken, referred to in Chapter One, Lodder (1999) developed a dialogical model for a legal expert system, called DiaLaw. Gordon had suggested a dialogue model and Prakken saw his non-monotonic system as dialectical. Lodder did not claim that DiaLaw was epistemologically sound, but he did claim that it was useful. DiaLaw relied on game theory for its epistemology. This results, effectively, in a private law game, with some paradigms of real law. For instance, as explained by Lodder (2004, p.574):

In DiaLaw the burden of proof is simple. The player who claims has the burden to prove that the claimed statement is justified. This means that if a player has claimed a statement and the statement is questioned, he must adduce other statements that support his claim.

Lodder relies on rhetoric, not logic, as the jurisprudential basis of the game; he defines rhetoric as enthymemes which have inconclusive results but are conductive arguments (Wellman, 1971). If the Ramist view of legal reasoning is taken, logical arguments are not knitted together, but enthymemes are presented in an exchange sequence of rhetorical arguments; this may be a form of legal abduction dialogue. An enthymeme is an argument which has no conclusion and is thus incompletely stated; for example, a thief will eventually be caught. The conclusion might be, therefore do not steal, but it is not stated. Without the rhetoric of the enthymeme, the conclusion might not be adopted, so the enthymeme might be useful persuasion. For a jurisprudential basis for DiaLaw, the Scandinavian Realist School (Curzon, 1979) of Hagerstrom, Lundstedt, Olivecrona, and Ross may provide a framework of justification; natural law jurisprudence and Hegel's (1830) scheme of dialectical logic and free will might also be used as jurisprudential guides in a private law dialogue system. However, Ong's (1958) study of the decay of dialogue due to the dichotomy method of Ramist structures of logic columns and spatial graphics should also be considered; dialogue, like dialectic, and games are inherently dichotomous.

For people requiring a settlement of their dispute according to law, DiaLaw offers no application of the relevant rules of law. Maxims float as guides, like enthymemes, but maxims are not the deductive rules of law. Since many legal disputes involve risks of imprisonment, large monetary losses, and damaging effects on personal security, a dialogue that resolves such matters according to law, is required.

The program epistemology of eGanges uses the communication system which is effected through the interface, as the implementation of rules of law through sound legal domain epistemology, including the computational epistemology of 3d legal logic. An interactive visualisation of a prima facie positive River, Star heuristics, heuristics for management of positive disjunction and neutral antecedents, and the availability of any inductive and abductive glosses for each antecedent in the River, are employed to achieve the ostensible transparency of the shell.

The communication system of a legal expert system, through the design and functionality of its program interface, is also a matter of legal domain epistemology; it must transmit the relevant legal knowledge and legal reasoning, as well as provide conclusions, even if the conclusion is an uncertainty. Primarily, a lawyer will take instructions from a client by asking questions and receiving answers that suggest further questions. The interrogation derives the evidence of the client's case, by reference to the antecedents in the relevant rules of law that must be established; witnesses may also be interrogated for proofs of evidence. Thus a client who seeks damages for injuries suffered at his worksite when a scaffold fell on his head would be asked, not just about the event, but about facts pointing to the negligence of the party liable, for instance: Why did the scaffold collapse? Had it collapsed before? Who installed it? Were safety checks done on it?

Were you required to be under it?

The skill in constructing a legal expert system is to match questions to antecedents, and to refine rules of law to suit the evidence required. Thus negligence rules might be refined by rules that use an ontology of alternate worksites, for unsafe system of work cases; the ontology can be derived from the refined assumed material facts of precedent cases, many of which may be unreported, but within the knowledge of a specialist practitioner who may have conducted many of those cases, or discussed them with colleagues at a Law Society educational or social event. Occupational health and safety administration also proactively builds up ontologies of safe and unsafe systems of work through OHS standards.

Transparency is a requirement of user-friendliness. As clients are asked for instructions, they should be able to see which antecedent is being addressed in the whole system of rules. In the legal domain, transparency has long been a principle of justice; inter alia, it is effected by the Writ of Habeas Corpus, and the requirement for reasons in a judgment. The use of the interactive visualisation of the River, which is the representation of the positive rules of law as extended deductive premises, also provides a map of the questions that must be answered to obtain advice. As questions are answered, they are sorted into the relevant adversarial window, which is colour co-ordinated with the answer type. The Current result of the answers is available, cumulatively, at any point during a consultation. Thus the rules and their processing are transparent to the user.

Artificial legal intelligence should not give the appearance of rule by the black box; the technology should not be a covert Star Chamber in cyberspace. In the communication system of the interface, all relevant legal knowledge should be accessible and navigable; any processing should be understood as a sequence whereby output advises in advance what the effect of a selection of input will be. What the program advises will happen to user input should be seen to happen as soon as that input is given; justice must be seen to be done. The program should be easy to learn and use. Both the statics and dynamics of the program epistemology should be tractable, within the cognitive constraints of the ordinary subjects of the legal system. The program epistemology of eGanges meets these standards.

The problem for the specific meta-epistemological method of legal knowledge engineering, is to transform the complexity of the computational epistemology of 3d legal logic into the program constraints of user-friendliness.

Cost-effectiveness is a further constraint of user-friendliness. The shell allows quick construction of an application; the River, its associated questions and answers, and gloss advice, must be constructed, with easy construction assistance available in the Build menu. However, deriving the extended deductive rules for construction of the River, from black letter law, formulating questions and answers for each antecedent node in the River, and drawing up inductive and abductive gloss advice, may be difficult, even for specialist lawyers. It may require a major paradigm shift in the legal profession. eGanges is as cost-effective as the technology can be, but artificial legal intelligence is a new kid on the block that requires nurturing and care.

4.2.2 eGanges interface statics

The design of the eGanges interface, which conforms to the communication system requirements of the legal domain epistemology, and the requirements of the computational epistemology of 3d legal logic, is shown in Chapter Three, Figure 3.9 and in the Appendix Figure A-1. Ex facie, it has the following windows, buttons and other features:

Epistemological windows:

● 1 Rivers window
● 2 Interrogation windows
● 3 Adversarial windows
● 1 Current result window

Epistemological buttons:

● 5 Answer buttons
● 1 Current result button

Other features:

In addition, the eGanges interface has further facilities, some of which have epistemological aspects:

● Construction menu
● Consultation glosses
● Navigation options
● Map finder
● Search options
● Save options
● Report options
● Print options
● Help

These features of the eGanges program design will be considered as shell epistemology under the following heads: Rivers and Communication system. On the basis of the static features of the interface, the eGanges interface dynamics are explained. The heuristics of the communication system that give effect to the computational epistemology of 3d legal logic may then be understood. So too the transformation from Stage 1 legal domain epistemology and Stage 2 computational epistemology of 3d legal logic to the eGanges shell epistemology may be understood, along with the methodology of the specific meta-epistemological method. A synthesis of Stage 1 and Stage 2 epistemology is achieved within the constraints of the Stage 3 epistemology.

4.2.3 eGanges interface dynamics

The dynamics of the interface permit (1) free navigation of the application River in the Rivers window, to see the rules of law and questions to establish each antecedent, (2) ready access to any inductive and abductive glosses of an antecedent, as data, and

(3) the exchange of input answers and advisory feedback through the Adversarial windows and Current result window.

The design of the statics and dynamics of the interface requires an understanding of the eGanges application River and the communication system of the programming epistemology that is based on it, and gives effect to the computational epistemology of 3d legal logic.

4.2.4 eGanges Rivers

Following the necessary prior analytics of the legal expert, a nested River system of extended deductive Major premises may be constructed in the Rivers window for consultation and processing. To construct a River, it is necessary to determine, by prior analytics of the expertise, the premises that can be arranged in this way; some prior analytics and procedures for constructing a River are dealt with in Chapter Five in the context of constructing the Vienna Convention application. Expert knowledge of the rules of law is acquired and represented within the epistemological constraints of the extended deductive River; the system of rules of law relevant to the positive Final result is represented in this way, as distinct from the entire system of rules of law that is captured in the Star structure of the Sphere of the ontology of legal possibilities. Depending on the natural language used by law-makers to set out these rules, the task of prior analytics may be more or less difficult. Some natural language more clearly conveys the logic of choice than other wordings.

The application River that appears in the eGanges Rivers window is prima facie the positive river of the Sphere in the computational epistemology of 3d legal logic. However, as input is received from the user in a consultation, the blank or keyhole nodes of the prima facie River are coloured to match the triad scheme of Rivers: a blue node indicates that the node belongs to the Positive river, a red node that it belongs to the Negative river, and a yellow node that it belongs to the Uncertain river. This permits the user to understand the mix of pathways along which the case is proceeding. Such a pathway may notionally zig-zag through the Sphere, but in the prima facie eGanges River, the alternate node colours in the 2d tributary system indicate the set of necessary and sufficient conditions, which may also be accompanied by some unnecessary and insufficient conditions, that produce the result shown in the Current result window at any stage of the consultation process.
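The triad colouring described above can be sketched as a small routine. This is an illustrative reconstruction only, not the eGanges source; the names (Answer, node_colour, TRIAD_COLOURS) are assumptions introduced for the sketch.

```python
from enum import Enum

class Answer(Enum):
    """Possible user determinations for an antecedent node."""
    POSITIVE = "positive"      # antecedent established
    NEGATIVE = "negative"      # antecedent negated
    UNCERTAIN = "uncertain"    # antecedent unresolved

# Triad colour scheme: blue for the Positive river, red for the
# Negative river, yellow for the Uncertain river.
TRIAD_COLOURS = {
    Answer.POSITIVE: "blue",
    Answer.NEGATIVE: "red",
    Answer.UNCERTAIN: "yellow",
}

def node_colour(answer):
    """Return the display colour for a node; None while it is still blank."""
    if answer is None:
        return None  # blank or keyhole node: no input received yet
    return TRIAD_COLOURS[answer]
```

A consultation would call node_colour for each node as answers arrive, so the mix of coloured pathways accumulates on the prima facie River.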

At the website of Xenogene Gray, Grays Knowledge Engineering (www.grayske.com), there is a small applet in the area of Australian finance law, namely, Corporations Act 2001 (Cth) Chapter 7 (Financial Services Reform Act 2001 (Cth)), section 767A(1), which defines 'financial market'; the processing of the prima facie River of rules implicit in this section can be seen in a consultation of the applet. The applet was developed in 2006 for the candidate's online teaching of finance law at Charles Sturt University. eGanges applications may be consulted, or converted to applets for consultation, without the need for the shell; only Java, which can be downloaded free from the web, must be available to run the applet.

4.2.5 Transformation of 3d visualisation by extraction of 2d River

By using the prima facie River as the form of graphical representation for the interactive visualisation of the shell epistemology, the methodology for large scale knowledge engineering is contracted from 3d epistemological complexity to the 2d visualisation of the program epistemology. The use of the prima facie River is an alternative way of managing the complexity of the 3d visualisation; the remainder of the 3d complexity is accommodated in the feedback of the Adversarial windows and the Current result window, and in other processing of the program epistemology.

To produce user-friendly large scale artificial legal intelligence, 2d visualisation of the expert knowledge is a more effective technique than 3d visualisation. This is borne out by the virtual reality studies (Gray, 1995) conducted from 1995 to 1997, using Australian constitutional law, anti-discrimination law, and the virtual reality shells Superscape, Generic 3D, INFINI D, and Extreme 3D.

In 1995, 3D OZCON1 was constructed by the candidate, as legal knowledge engineer, in collaboration with J. Zufi and D. Gruber of the Virtual Reality Corporation, Melbourne, Australia, as the first prototype 3d legal logic system, using Superscape. The candidate provided the sample positive River to begin a three dimensional graphical representation of the Australian Constitution s.51, which sets out the federal powers to legislate. The plan was to build up the sphere of rules from the positive River. The experiment did not go beyond the positive River representation. It was apparent that labelling of the River nodes brought difficulties: the node labels were required to swing around, as the user moved in the virtual reality space, to suit the perspective of the user; words in immersive cyberspace could not be read from behind, as it is difficult to read words backwards. Conceptual relativity took on new meaning; two dimensional language symbols were not suited to three dimensional space. Even as the labels swung around, they were not always visible at certain angles, from certain positions of the user in the virtual world, in which the scaffold structure of the River was notionally stationary. If the sphere had been completed, it would have amounted to a three dimensional semantic net within the boundary of a sphere.

Shortly after, in the same year, 3D OZCON2 was constructed by the candidate, as legal knowledge engineer, in collaboration with Glen Peterson, a mathematics student at Charles Sturt University, Bathurst, New South Wales; a more extensive sample three dimensional graphical representation of the Australian Constitution s.51 was provided by the candidate. The program was presented at a special lecture, Cosmic Law in Cyberspace, given by the candidate at the University in 1995. Generic 3D was used in the second 3d Law project to see how a user might navigate the three dimensional knowledge structures if the program was not immersive.

Autodesk's Generic 3D was originally produced as an engineering drawing tool during the 1980s by Ian Braid of Three Space, a small software house in the 'silicon valley' of Cambridge, England. In 1989, Braid showed the software to the candidate, who was then a Visiting Fellow in Law at Hughes Hall, Cambridge University; knowledge of this tool no doubt directed her formulation of the visualisation of 3d legal logic.

In 1996, a short video was constructed using INFINI D, based on a stereotype River figure without labels, to illustrate how abductive logic strata attach to the River. Jo Barnes, a graphics artist, and John Merkel, a video producer, at Charles Sturt University, worked with the candidate to produce the INFINI D program on video. The frame which showed the strata beneath the River was rendered and located in the dawn environment and the canyon environment offered by INFINI D. The video sequence showed firstly the River figure, then its strata falling below it, as a frame, and then the transformation of the frame to a rendered object with a dawn background. Further transformation of the environment of the rendered object is shown with the transition of the dawn background into the canyon; the rendered object then moves in and out of the canyon like a flying machine. This video was shown at the combined Conference of the Australian Legal Philosophy Society and the Australasian Association for Philosophy, University of Queensland, Brisbane, Australia, in 1996, in association with the candidate's presentation, The Theory of Three Dimensional Legal Logic.

A second, more substantial video, 3D LAW, also produced by Merkel, included the INFINI D program and also a more extensive Extreme 3D program constructed by the candidate's son, Xenogene Gray. For the Extreme 3D program, the candidate prepared graphics that illustrated the River and sphere of the Anti-Discrimination Act 1977 (NSW), s.24(1); she also wrote accompanying text screens, and a script which she spoke as voiceover that explained her theory of 3d legal logic. The Extreme 3D program also portrayed the transformation of the graphics to implement amendments to s.24(1) in 1994. Where nesting of the tributary structure was required, the nested map was located inside the node which it particularised, like some micro map that could be zoomed into; this was the technique used in the Generic 3D program, but it was shown more clearly in the Extreme 3D program, which had spherical nodes that were rendered and a more effective zoom function. The video was widely shown and used by the candidate in teaching; an extract of the Extreme 3D virtual reality graphics was also included as a news item on Prime Local News television in 1997.

The second video explained the prior analytics of s.24(1) and its subsequent amendment that was necessary to prepare the River representation; it also showed the adversarial fishbone and how it was constructed, as well as the Sphere and how it was derived from the fishbone. Images of the mobility and navigation of the sphere showed clearly that immersive 3d visualisation is cognitively disorienting; the problem remained of how to automate the 3d extended deductive reasoning, as the virtual reality programs had no such facility. AVS (Advanced Visual System), produced by the North Carolina Supercomputing Centre, did have some intelligent processing of virtual reality graphics, but this required hardware that was not available to the 3d Law Project.

Interactive virtual reality programs require extensive computer memory, which would unnecessarily restrict the hardware on which a 3d legal logic program could be accessed. The efficiencies in the design of eGanges produced a small program of about 250 KB, and the design also minimises application size. Furthermore, the small size of eGanges and its applications suits PDA (Personal Digital Assistant) technology, including some mobile phones, which so far offers limited memory.

4.2.6 Object-oriented logic

At the conclusion of the virtual reality experiments, further exploration of a suitable programming environment led to the LIL (Legal Intelligence Language) project, which commenced in 1997 at the University of New South Wales. The task of the project was to produce an object-oriented computer language for law, based on the model of 3d legal logic, which would run on the World Wide Web. The project was undertaken by the candidate as Honorary Visiting Fellow in the School of Business Law and Taxation, and Tim Menzies, then Research Fellow in the School of Computer Science and Engineering. At that time, Menzies had embraced object-oriented programming and abductive logic as learning nodes from which further advances in artificial intelligence could proceed. The collaboration was cut short when Menzies joined NASA in the USA in 1998. Programming of a language to read the scheme of legal knowledge engineering instructions that the candidate developed during the project (Gray, 1999 – this has publishing errors in the graphics; also at www.grayske.com) never commenced.

The work of development of an object-oriented shell, eGanges, that provided for the deduction, induction and abduction of the legal domain, was undertaken as this doctoral study. After the candidate designed the shell in 2002, as explained in this Chapter, collaboration commenced for its programming by the candidate's son, Xenogene Gray, now a computational physicist. The shell was first demonstrated in June 2003 at a seminar at the Singapore Supreme Court, and shown at the Jurix Conference (Gray and Gray, 2003) in December 2003. Refinement of the shell continued until it became commercially available in 2006 (www.grayske.com).

The system of objects in the epistemology of 3d legal logic provides for interactive visualisation of a system of rules in the legal domain, and also the processes and heuristics by which they may be applied to a user's case. Flows in the Star conclude at a Pole or at the Final consequent of the Positive, Negative or Uncertain Rivers. The Positive River, which has the Positive rules of law, determined by the Positive Final result, was selected as the 2d object for user-friendly interactive visualisation in the design of eGanges; the meta-rules and heuristics for this interaction were determined from the remainder of the epistemology of 3d legal logic. A client may be concerned with the Positive or the Negative result in an application, but an eGanges River may be drawn with the client's goal as the Positive result. In any event, eGanges shows, in the Adversarial windows, the points for each of the opponents' cases.

4.2.7 Extraction of 2d River from Sphere

The River which has the Positive Final result lies in the equatorial plane of the Star or Sphere of the computational epistemology of 3d legal logic. The Star encompasses the equatorial River; the Northern hemisphere negative rivers, including the Tropic of Cancer River system and the Partially negative pole rivers that arise from it; and the Southern hemisphere uncertain rivers, including the Tropic of Capricorn River system and the Partially uncertain pole rivers that arise from it. The inductive spectra with three triad sectors that link the three River systems of the equator and Tropics are also included in the Star. The Sphere contains the Star and the abductive strata associated with its rules, as well as the boundary of presuppositions and post-suppositions of the three dimensional rule logic system.

The equatorial River is the central epistemological ideograph, as it represents the positive rules and is also the point of departure for the negative and uncertain rules. The case for either party in litigation may be treated as the positive case, with the positive Final result being the result actually sought by that party. The corresponding negative case is then the case for the opponent, subject to common antecedents, as explained in Chapter Three, in relation to the opponents' Initial Spam maps.

The eGanges River is an isomorphic representation of a system of rules that indicates the alternative pathways of valid inferencing to a single Final result; the negative and uncertain Rivers, including their pole streams, follow from the positive River to complete the logic in terms of the alternative pathways to one of the other four possible Final results. It is possible to identify the negative and uncertain deviations from the positive River, including the effects of disjunction and neutral nodes, by reference to the substance of any positive River, without requiring a visualisation of the three dimensional system of the Star of rules; meta-rules or heuristics for processing the deviations may represent and account for the remainder of the computational epistemology of 3d legal logic. Generally, the conjunction of antecedents in a positive rule transforms into a fan of disjunctions in the corresponding negative rule and uncertain rule.
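The transformation of a conjunction of positive antecedents into a fan of disjunctions for the corresponding negative rule follows a De Morgan-style duality, and can be sketched minimally; the function name and string form below are illustrative assumptions, not part of the eGanges design.

```python
def negative_fan(positive_antecedents):
    """Given the conjoined antecedents of a positive rule
    (A1 and A2 and ... -> Positive result), return the fan of
    disjuncts of the corresponding negative rule
    (not A1 or not A2 or ... -> Negative result)."""
    return [f"not {a}" for a in positive_antecedents]
```

For example, a positive rule conjoining 'offer', 'acceptance' and 'consideration' yields a fan of three alternative pathways to the Negative result, one for the failure of each antecedent.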

All the tributary rules of the positive River have to be satisfied in order to establish the Final consequent, except where a positive disjunction in the form of a fan provides alternative pathways. Any fan requires processing by reference to any tributary hierarchy of fans in which it occurs. In order for the River ideograph to be constructed as an isomorphic representation of a system of rules, the rules of law are distinguished from other legal information with which they are associated, such as the reasons for the rules and the case authorities for the rules. The heuristics of the hierarchies of fans in a system of rules may then be processed without the complications of inductive and abductive premises. Extended legal deduction is automated in a simple way, by input of Minor premises, as answers from the user that establish certain antecedents in the River; the automated extended deductive processing also provides for the opponents' rules and the rules for uncertainty, which might arise as issues of law, from uncertainties about the scope of application of the antecedents in the rules of law, and/or issues of fact, from uncertainties in the evidence.
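The processing just described, conjunctive tributaries, fans of alternative pathways, and unresolved antecedents, can be sketched as a three-valued recursive evaluation. This is a hedged reconstruction under stated assumptions (tuple-encoded nodes; True, False and None standing for positive, negative and uncertain), not the eGanges implementation.

```python
def eval_node(node, answers):
    """Evaluate a River node against user answers (Minor premises).

    node is ('leaf', name), ('and', children) or ('fan', children);
    answers maps antecedent names to True or False, and is silent
    for uncertain antecedents. Returns True, False, or None (uncertain).
    """
    kind = node[0]
    if kind == 'leaf':
        return answers.get(node[1])  # None when not yet established
    results = [eval_node(child, answers) for child in node[1]]
    if kind == 'and':   # all tributary antecedents must be satisfied
        if any(r is False for r in results):
            return False
        return True if all(r is True for r in results) else None
    if kind == 'fan':   # a disjunction: one alternative pathway suffices
        if any(r is True for r in results):
            return True
        return False if all(r is False for r in results) else None
    raise ValueError(f"unknown node kind: {kind}")

# A toy River: one antecedent conjoined with a fan of two alternatives.
river = ('and', [('leaf', 'offer'),
                 ('fan', [('leaf', 'written'), ('leaf', 'oral')])])
```

With answers {'offer': True, 'oral': True}, the Final consequent is established even though 'written' remains unresolved, because the fan provides an alternative pathway; with 'offer' negated, the whole River fails regardless of the fan.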

The advantage of keeping distinct the premises used deductively, inductively and abductively is that the arguments in which they are used may be evaluated accordingly: as necessarily valid for legal deduction, definitionally or existentially valid for legal induction, and strong or weak for legal abduction. To extend or overturn legal deduction, there must be a modification of the rules of law. To extend or overturn legal induction, there must be a modification of the scope of application of the antecedents in the rules of law; this modification occurs by redefinition or existential reconsideration. Sometimes legal deduction is modified by legal induction, without the need to change the rules of law. Legal reasoning proceeds in this way; where there is an inductive solution, it is preferred over a change to the rules raised by legal abduction. Abductive arguments may be extended or overturned by further information that requires an adjustment to the perceived strength or weakness of the abductive argument; social change, or the anticipation of social change, sometimes effects this. People are organised intricately according to existing law, so that a change to law by virtue of a strong abductive argument usually only follows social change. A non-monotonic sequence would require a sequence of prioritisation of arguments, whereas extended deduction through the entirety of the Star structure of rules, aided by connective spectra of inductive instances to assist the user in the selection of answers, keeps the automation to necessary deduction.

Law that is large scale consists of complex epistemology and complex substance, making up a complex system. Metaphors have been shown to be useful in portraying complex systems that otherwise are not identified. Complex systems theory (Auyang, 1998), which uses 3d graphs of complex relationships, such as that shown in Figure 4.1, which might be compared to a Star of a legal application, suggests that existing methods of formal logic that are algebraic and linear may not be suitable for the management of large scale complexity. The current conceptual system of rules in the legal domain takes the form of graph (b) in Figure 4.1, which must be clarified as graph (a) to be computational. Algebraic expression may be useful for ensuring the validity of a complex system, but it may not be manageable for a large scale sequence, which might notionally extend some 200 kilometres, say from Sydney to Bathurst, with a bracket that might run from Penrith to Katoomba, say 50 kilometres. The one dimensional, algebraic code of a linear sequence is not user-friendly; in large scale complexity, compared to a graphic, a linear sequence of code may not provide ordinary insight into complex relations and operations, given the nature of human cognition.

The 2d River ideograph in the computational epistemology of 3d legal logic advances 2d epistemological ideographs in two ways. Firstly, unlike other 2d epistemological ideographs, such as a taxonomy tree or a decision tree, it is an isomorphic representation of a system of extended deductive Major premises in the form of conditional propositions; as such, its extended deduction may be automated on confirmation of the relevant Minor premises. Secondly, it is distinguished from a taxonomy tree and a decision tree, and from other 2d epistemological ideographs that permit a mixture of deductive, inductive and abductive premises. A River paradigm with a mixture of deductive, inductive and abductive premises is apparent in the 2d epistemological ideographs of case arguments in the work of Fraunce (1588).

Figure 4.1: System of binary relations and Situate system. Source: Figure 4.1 in S.Y. Auyang (1998): Foundations of complex-system theories, Cambridge University Press, Cambridge, England, p.119.

Apart from the task of legal knowledge engineering, the use of a River paradigm in an epistemological ideograph to represent legal reasoning is supported by the judicial, evidentiary ideographs of Wigmore (1913), as well as the case argument ideographs of Fraunce. The 2d legal ideographs posed by Conover (1988) in Figure 4.30 also have linear, meandering River characteristics. Further, an aerial photograph of the tributary system of the Colorado River was suggested as a paradigm for a mind map by Buzan and Buzan (1995).

4.2.8 Extraction of deductive River from mixed argument River

The Elizabethan legal logician and Gray's Inn legal practitioner, Abraham Fraunce, was concerned to show the pattern of arguments in legal reasoning, where lawyers used a sequence of various logic forms in a case. He illustrated, in 2d ideographs, the arguments and judgment in an Exchequer case, namely, Earl of Northumberland's Case (1567), taken from Plowden's Reports. Fraunce's ideographic representation of the case revealed a tributary pattern of logic arguments; some of the legal arguments in the ideographs were deductive, some were inductive and some were abductive. However, the arguments were linked continuously and, where they were too extensive for a page, they were linked to further pages like sub-maps.

As a student at St John's College, Cambridge University, England, Fraunce had studied the recent logic methodology advanced by the French philosopher, Peter Ramus (1515-72), whose revolutionary revision of Aristotle's Organon provided, in sequence, concise explanations of available argument forms, based on Aristotle's works; the logic forms were set out in Latin as a Porphyry tree taxonomy of logic, the General Table of the Dialectic of P. Ramus, shown in Figure 1.7 (Ong, 1958, 1974, p.202). Columns and parentheses were used extensively by Ramus and other medieval philosophers, as geometric logic. Samples reproduced by Ong are shown in Figures 4.2, 4.3 and 4.4; the latter two might be seen as a medieval comprehension of the complex system shown in Figure 4.1. In 1574, two years after Ramus was assassinated in his room at the College des Presles during reformation riots in Paris, his two books of Dialectic were translated into English by Roland MacIlmaine and published as The Logicke of the Most Excellent Philosopher P. Ramus Martyr. The Porphyry tree hierarchical dichotomies of logic clarified the alternative forms and structures of logical argument. Ramus (1574, 1969, p.56) also suggested that different argument forms could be 'knit' together by a transition from one to another; he (1574, 1969, p.18) further clarified a scholastic view that things, in logic, had a place:

A place is the space in the which the thing placed is contayned: ...The naturall philosophers also more accurately in the heauen, symple elementes, and compounde thinges, dothe acknowledge a place: Which is nothing els, but the subjecte of the thing contayned in it: ...

In legal knowledge engineering, logical place assumes notional or metaphysical space; there are places in epistemological spaces. In the specific meta-epistemology method, each Stage has its own space, with its own constraints, and the task is to transfer information from the space of each stage through the spaces of the following stages, so that the fifth stage has an epistemologically sound legal expert system. Epistemological soundness must be maintained at and through each stage; each of the stages, in its sequence, is necessary, according to the five steps in knowledge engineering, for the construction of a legal expert system.

Figure 4.2: Ramist Chart of columns as outline of an art. Source: W.J. Ong (1958): Ramus, method and the decay of dialogue, Harvard University Press, Cambridge, MA, USA, p.181.

Figure 4.3: Celaya's The Geometry of the Mind. Source: W.J. Ong (1958): Ramus, method and the decay of dialogue, Harvard University Press, Cambridge, MA, USA, p.81.

Figure 4.4: Tartaret's Logic in Space. Source: W.J. Ong (1958): Ramus, method and the decay of dialogue, Harvard University Press, Cambridge, MA, USA, p.80.

In his study of the influence of Ramus at Harvard College, Cambridge, New England, from the seventeenth century, Miller (1939, 1954, p. 125) called the General Table the standard diagram, the “epitome” or blueprint that accompanied the work of Ramus in its teaching and study at Harvard College; he observed:

Only when the arguments were liberated from the categories, and allowed to fall into their proper places in the art of reasoning, could the whole pattern of reason be recognised.

Fraunce's diagrams of the legal arguments and judgments in Earl of Northumberland's Case showed the whole pattern of arguments, liberated from the Ramist hierarchy of categories, and allowed to fall into their proper places in the art of reasoning within the constraints of legal epistemology. In 29 pages of legal logic ideographs, Fraunce (1588, pp.125r-139r) identified in the case reasoning the forms explained by Ramus, in their case pattern. The arguments and judgments in Earl of Northumberland's Case were mapped out by Fraunce in a representation of the sequence of the various forms of logic arguments that were put in the Exchequer mining matter. The ideographs showed the whole structure of the knitted transition in a sequence of logic components of the arguments for each side, and for the judgments of the Court of Exchequer. Northumberland's Case was a royalties dispute between Queen Elizabeth I and the Earl. At issue was the entitlement to the gold and silver in mines on land that was granted to the Earl by Acts 4 and 5 of Philip and Mary. Earl of Northumberland's Case is a case on that part of the law of Royal privilege that is concerned with mining royalties.

Fraunce developed logic methodology further than the Porphyry tree, in a diagrammatic way, based on legal practice epistemology, and showed how a selection from the alternative Ramist argument forms might be used in a continuous sequence of legal argument, their places knit together at related transition points; this diagrammatic methodology clarified the nature of applied logic, or technology, as the 'knitting' together of arguments, and the complex use of logic forms for practical purposes. Knitting transitions from one sub-argument to the next was the basis for the art of logic, or technology, meaning the plan of the logic art or craft. Miller explains the development of technology at Harvard College from the Ramist springboard.

Thus, the teachings of Ramus that brought the clarity of methodology to Aristotelian logic provided, for Fraunce's diagrammatic representation, the various argument components that were each distinct Aristotelian logic forms, with a continuity for a practical purpose. It was the flow of continuity to the end result of the practical purpose that produced a River structure as a two dimensional ideographic analysis of practical complex logic. In his tributary system of arguments, Fraunce mixed deductive, inductive, and abductive arguments non-monotonically, in accordance with the reasoning that happened in an actual case; inductive and abductive components were knit, by transitions, into the extended deductive River structure. The parentheses used by Fraunce and other Ramists have a mid-point like the arrow in a bow, indicating the point at which the knit connection enters the flow of the associated argument: the transition where one sub-argument joins another.

In the sixteenth century, the inference arrow had not yet been posed as a formal logic symbol to signal the end of the antecedents and the consequent to follow; but Fraunce's use of parentheses with a central arrow brought into legal ideographs the metaphor of the central place on a bow holding an arrow. The parenthesis could give the sense of the flow of argument intended, giving the whole structure the form of a tributary River system; it provided a transition from mixed deductive, inductive and abductive argument to streamlined extended deductive argument with inductive and abductive supports. This was important because the syllogisms of extended rule deduction were interwoven with the deductive syllogisms of legal abduction. In Fraunce's representation, the inductive parts of the argument could be questioned as definition or existential detail; the abductive parts could be questioned for strong or weak support. However, the deductive components could only be questioned as rules of law; if they were rules of law, they applied necessarily, unless they were changed.

It was noted by Leith and Hoey (1998, p.292) that Fraunce's model of the whole reasoning in a case had not been studied or adopted by contemporary legal knowledge engineers. The legal logic ideographs of Fraunce stood alone as unvisited beacons for hundreds of years, probably because their labelling was written in Anglo-Norman with some Latin, the languages of the legal system at the time. After English replaced Anglo-Norman as the official written language of lawyers in 1730 and 1732, by 4 Geo. 2, c. 26 and 6 Geo. 2, c. 14, the meaning of Fraunce’s ideographs faded with their archaic language.

In 2001, for the purposes of this doctoral study, the candidate arranged for a translation of the Anglo-Norman and Latin by Sybil Jack, a retired history professor of the University of Sydney, with expertise in the Anglo-Norman language and the relevant legal period. Jack also had a specialist knowledge of Plowden's Reports. A comprehensive study of Fraunce's work is outside the scope of this thesis, but samples of his translated ideographs are shown in Figures 4.5, 4.6 and 4.7 to illustrate the tendency of the ideographs to a River paradigm; these samples are discussed by Gray and Mann (2003). The ideographs also illustrate the non-monotonicity of the logic components.

Figures 4.5 and 4.6 show some of the arguments for the Queen. They follow the method specified by Ramus (1574, pp.54-5) as the method of Aristotle, whereby the most general matter is placed first, by Fraunce, at the left or at the top, and the special or singular last, at the right or at the bottom; also, as Aristotelian method, antecedents are placed before their consequent. As indicated by the mid-point arrow of the parenthesis of Fraunce, the flow is from the details at the right to the conclusion that they support at the left. Sometimes the parenthesis captures a disjunction with alternatives related in some way, and sometimes it captures a syllogism.

Figure 4.7 is Fraunce's representation of the categories of the Court's major findings in the Exchequer judgment (1588, p.139r). Details of the Exchequer findings appear in the vertical column at the far right of the judgment ideograph. The reasoning system flows from right to left, and the order of specific findings of the Court can be read, in the final column at the right, from the most general at the top to the most special at the bottom. Fraunce sets out the contradictory arguments of the opponents in Northumberland's case seriatim, each roughly in two dimensional River ideographs. It can be seen from these samples of Fraunce's ideographs that they represent the components of specific logic forms in the opponents' arguments, and the reasons for the decision. They have a systemic tributary structure that indicates a River paradigm.

In Figures 4.5 and 4.6 (1588, pp.125r and d), AA is used as a symbol to indicate where a sub-argument joins into another argument in the system of arguments; it is like a soccerball node, the sub-map indicator in the computational epistemology of 3d legal logic and eGanges. The ideographs in these Figures are a sample of the arguments for the Queen. They reveal a variety of argument forms running vertically and horizontally. Parentheses segregate each argument form and indicate, by the arrow shape midway on the parenthesis, the link to the next argument.

Figure 4.5: Arguments for the Queen in Northumberland's Case by A. Fraunce (1588), Lawiers Logike, p.125r, translated by S. Jack.

Figure 4.6: Arguments for the Queen in Northumberland's Case by A. Fraunce (1588), Lawiers Logike, p.125d, translated by S. Jack.

Figure 4.7: Judgment for the Queen in Northumberland's Case by A. Fraunce (1588), Lawiers Logike, p.139r, translated by S. Jack.

As extended arguments, they have a mix of deduction in the form of categorical syllogisms, induction including analogies, and abduction that provides strong support for the content of the deduction. The argument forms in the case are identified by Fraunce as syllogisms, analogies, authorities, rules and reasons for rules or coadjunct causes. The arguments are put by way of rules to be applied, and the development of those rules to suit the party's claim; in the judgment, the rules are developed to support the final decision of the Court.

The extended, non-monotonic argument for the Queen commences on page 125r (meaning the first side of page 125 in the early page numbering system) with an inductive spectrum argument that arises from a principle of parallel excellence: royal excellence, the excellent minerals of the earth, and other excellent things are instances of excellence on the same inductive spectrum. The thrust of the argument is that the spectrum of excellence necessarily entails ownership by the crown, which is one instance of excellence, of other excellent things on the spectrum, such as the excellent minerals; the spectrum provides a common Platonic form of excellence, or relationships of definitional necessity between the inductive instances, that bind them together. The spectrum is used to establish the antecedent, monarch, the antecedent, excellent things, and a conjunction of the two antecedents, for a rule with the consequent of Royal prerogative. This is not a valid inductive argument, and can only be a weak abductive use of an inductive spectrum. It is not accepted by the Court.

The principle of parallel excellence, which is prima facie an inductive spectrum argument, is used as an abductive argument for the Queen, to justify the rule that if there are excellent minerals, then they are the property of the crown. The abduction appears to fuse all excellent things by reason of the necessity of their common form, excellence; it relies on definitional necessity. In Elizabethan times, when any form of denial of the excellence of the monarch might be found to be treason, punishable by an extremely torturous and barbaric death penalty, this abductive argument actually provides strong support for the rule that excellent minerals are owned by the crown. Abduction allows for power to have a place in a scheme of legal logic, namely, a place where reality and presuppositions enter the deductive system of the rules of law.

It can be seen from the judgment ideograph of Figure 4.7 that Ockham’s razor was used to slash the requirement for excellence. The rules of law go directly to the relevant adversarial deductive disjunction of base, silver, and gold metal mines, which is stated to include mines that have more than one of these; for combined metal mines, the prerogative cuts out where the quantity of base metal exceeds the gold and silver, so that the cost of extracting the gold and silver is greater than the value of the extracted gold and silver. Thus some combined metal mines in the spectrum belong to the Monarch, and some do not. As indicated in the judgment, the Earl may not have been advised of all the categories of mine that he might claim, particularly the combined metal mines where the quantity of base metal exceeds the gold and silver so that extraction is not cost-effective; his pleading neither shows the cost-ineffectiveness nor claims such a mine with base metal greater than gold and silver.

There are three sub-arguments given to support the principle of parallel excellence; the sub-arguments move from top to bottom of the page, while the extended argument structure moves from initial premises on the right to the final conclusion of parallel excellence on the left. A modern logician would probably present the whole structure vertically, starting from the initial premises, rather than having two directions of argument.

The Queen’s sub-arguments for the parallel excellence argument begin on page 125d (meaning the back side of page 125 which is then opposite page 126r in the early page numbering system), with two authorities for the common law royal prerogative in regard to fish: first the Treatise of Praerogativa regis as an authoritative declaration of the common law, that sets out the rule that the monarch ‘shall have whales and sturgeons taken at sea or elsewhere within the kingdom,’ and second, the earlier authority of Britton’s chapter on ‘trouvailles’ (things found), that sets out the common law royal prerogative in regard to ‘fishes’. These authorities are referred to as witnesses; they are set out vertically with the more recent authority first and then the older authority beneath it. Statements of authorities are abductive legal arguments that strongly support the existence of the deductive rules of law. Britton is an authority for the royal prerogative in regard to fish generally, and the Treatise of Praerogativa regis is the authority for the instances of the most excellent fish, namely whales and sturgeons.

Thus, part of the spectrum of fish, the whales and sturgeon, is selected out through the more recent authority, as special instances of fish taken from the general category of fish, for membership of the spectrum of excellence. The spectrum of excellence then replaces the spectrum of fish as the antecedent required for the consequent of Royal prerogative. Instances in the inductive spectrum of excellence form a new rule: if a whale or a sturgeon, then an excellent fish. Accordingly, it is then argued, vertically, that ‘of sea things the fishes and of the fishes the sturgeons and whales are the most excellent. Therefore the common law assigns to the [monarch] the most excellent things of the sea and the waters.’

Fraunce’s identification of this argument as a syllogism suggests the classic categorical syllogism, consisting of three propositions, each relating two classes or categories of things, and containing three different terms, each of which appears twice in distinct propositions. The statement of the argument is represented by Gray and Mann (2003) in terms of modern predicate logic, as follows:

[p1] (∀x) ((Sx v Wx) ⊃ Kx)

Everything which is either a sturgeon or a whale is the property of the monarch.

[p2] (∀x) (Ex ⊃ (Sx v Wx))

Here the material implication should probably be material equivalence as follows:

(∀x) (Ex ≡ (Sx v Wx));

Everything which is a most excellent sea creature is either a sturgeon or a whale [and vice-versa]

[ic1] (∀x) (Ex ⊃ Kx)

So everything which is a most excellent sea creature is the property of the monarch

In Aristotelian terms, (Sx v Wx) is the middle term; Kx is the major term and Ex is the minor term.

This is a valid deductive argument.
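The validity of this quantified argument form can be confirmed mechanically, since it reduces, for an arbitrary individual, to a propositional schema. The following is a minimal illustrative sketch, not part of Gray and Mann's (2003) formalisation; the function name and checking strategy are assumptions of the illustration:

```python
from itertools import product

def implies(a, b):
    """Material implication: a > b is false only when a is true and b false."""
    return (not a) or b

# Brute-force validity check: for every combination of truth values of the
# predicates S (sturgeon), W (whale), E (most excellent sea creature) and
# K (property of the monarch) applied to an arbitrary individual, the
# premises must never be true while the conclusion is false.
def valid():
    for s, w, e, k in product([False, True], repeat=4):
        p1 = implies(s or w, k)   # [p1] (Sx v Wx) > Kx
        p2 = implies(e, s or w)   # [p2] Ex > (Sx v Wx)
        ic1 = implies(e, k)       # [ic1] Ex > Kx
        if p1 and p2 and not ic1:
            return False          # a counterexample would refute validity
    return True

print(valid())  # True: no counterexample exists, so the syllogism is valid
```

The exhaustive search over sixteen valuations finds no counterexample, which is the semantic counterpart of the syllogism's formal validity.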

The valid deductive argument is used abductively to justify a rule. The Queen’s arguments to establish the parallel excellence principle continue on page 125r. Gray and Mann (2003) put the argument as follows:

[p1] the land [of the kingdom] is like the sea in being an integral part of the monarch’s domain

[p2] the monarch owns all of the most excellent things of the sea [by law, the conclusion of the first argument]

[ic2] so the monarch also owns all of the most excellent things of the earth.

On the face of it, this is an inductive argument, with some element of analogy and some element of abduction. The new implicit antecedent, brought in abductively, with inductive instances, is the realm of the monarch, which may be earth, its waters (e.g. rivers and lakes) or adjacent, encompassed sea. Probably it is best regarded as an argument from inductive analogy.

Again, this conclusion becomes a premise in the final, deductive argument of the chain of reasoning. Because all of the most excellent things of the earth are the monarch’s property, and gold and silver are the most excellent things of the earth, the gold and silver [of the kingdom] belong to the monarch.

This is expressed in logic formulae by Gray and Mann (2003) as follows:

[p1] (∀x) (Ex ⊃ Kx)

All things are such that if they are most excellent things of the earth [middle term] then they are the property of the monarch [major term]

[p2] (∀x) ((Gx v Sx) ⊃ Ex)

Gold and silver [minor term] are most excellent things of the earth

[fc] (∀x) ((Gx v Sx) ⊃ Kx)

So gold and silver are property of the monarch

[the x class here is the class of things, or things capable of being property]

The second premise seems unproblematic in the circumstances. And the argument is valid.

Next, on page 125d, are the Queen’s abductive arguments that strongly support the prerogative rules. Gray and Mann (2003) also put these arguments into the form of a categorical syllogism:

[p1] (∀y) (Ry ⊃ My)

All effective rulers are rulers with access to the material means of ruling

[p2] (∀y) (My ⊃ (Gy & Sy))

All rulers with access to material means are rulers with access to the gold and silver of their kingdoms [access to gold and silver is a necessary condition of access to material means]

[c] (∀y) (Ry ⊃ (Gy & Sy))

So all effective rulers are rulers with access to the gold and silver of their kingdoms

[the y class here is the class of people or rulers]
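Both of these further argument forms, the gold-and-silver syllogism and the rulers syllogism, can likewise be checked by exhaustion over the truth values of their predicates. The sketch below is illustrative only; the helper `is_valid` and the predicate orderings are assumptions of the illustration, not part of Gray and Mann's (2003) formalisation:

```python
from itertools import product

def implies(a, b):
    return (not a) or b

# Generic brute-force check: an argument form over one individual is valid
# iff no assignment of truth values to its atomic predicates makes every
# premise true and the conclusion false.
def is_valid(premises, conclusion, arity):
    for vals in product([False, True], repeat=arity):
        if all(p(*vals) for p in premises) and not conclusion(*vals):
            return False
    return True

# Gold and silver syllogism: Ex > Kx, (Gx v Sx) > Ex |- (Gx v Sx) > Kx
gold_silver = is_valid(
    [lambda g, s, e, k: implies(e, k),
     lambda g, s, e, k: implies(g or s, e)],
    lambda g, s, e, k: implies(g or s, k),
    arity=4)

# Rulers syllogism: Ry > My, My > (Gy & Sy) |- Ry > (Gy & Sy)
rulers = is_valid(
    [lambda r, m, g, s: implies(r, m),
     lambda r, m, g, s: implies(m, g and s)],
    lambda r, m, g, s: implies(r, g and s),
    arity=4)

print(gold_silver, rulers)  # True True
```

Each form passes the check, confirming that the deductive cores of both the rules-of-law line and the reasons-for-rules line are valid.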

Fraunce shows how these two different lines of reasoning, from rules of law and from reasons for the rules of law, converge upon the same conclusion, namely, the monarch’s right of ownership of gold and silver within the territory of the kingdom. This, in turn, becomes a premise for further extended argument. Fraunce’s diagrams are argument maps, not rule maps (cf. Gray, 1988, 2002), and not practice maps like the Latent Damage Law tree (Capper and Susskind, 1988). However Fraunce's argument maps include rules of law and show arguments that support the existence of the rules that are posed for determination of the judgment.

The deductive rules of law posed in the Queen’s arguments could be extracted from the Queen’s non-monotonic sequence and mapped as an eGanges River system in accordance with the epistemology of 3d legal logic; this is shown in Figure 4.8. If the deductive premises of law are distinguished from the deductive premises that are used in legal abduction, then the reasons for the rules of law may be considered as strong or weak arguments for the adoption of the rules. Also the legal deduction in the application of rules of law to facts of the case may be seen clearly for its validity.

Figure 4.8: P.N. Gray's eGanges River of Queen's arguments in Northumberland's Case

In Figure 4.8, the interim consequent, Most excellent fish, becomes the antecedent in a further extension of the deductive rules of law. The new rule is that if there are Most excellent fish, then there are Most excellent things, and these are subject to the royal prerogative. There may be Most excellent things other than the Most excellent fish that are subject to the royal prerogative; among these are the most excellent minerals, the most excellent things of the earth.

If these arguments of the Queen are expressed as conditional propositions, they can be clearly constructed as a River, according to the computational epistemology of 3d legal logic. A River requires streamlining of antecedents in a system of conditional propositions. The order of antecedents indicates the flow direction of the River logic. Separating antecedents on a primary stream from those on a secondary stream allows separate particularisation and an interim consequent.
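The streamlining just described can be sketched computationally as forward chaining over conditional propositions, with interim consequents feeding later rules as antecedents. This is a minimal illustration only, not the eGanges implementation; the node labels paraphrase the Queen's rules and are assumptions of the sketch:

```python
# Each rule pairs a list of antecedents with a consequent; interim
# consequents ("most excellent fish", "most excellent thing") become
# antecedents of later rules, converging on the Final result.
rules = [
    (["whale or sturgeon"], "most excellent fish"),
    (["most excellent fish"], "most excellent thing"),
    (["gold or silver mineral"], "most excellent thing"),
    (["most excellent thing", "within the monarch's realm"],
     "royal prerogative"),  # Final result
]

def infer(facts):
    """Forward-chain over the rules until no new consequent fires."""
    established = set(facts)
    changed = True
    while changed:
        changed = False
        for antecedents, consequent in rules:
            if consequent not in established and \
               all(a in established for a in antecedents):
                established.add(consequent)
                changed = True
    return established

facts = ["gold or silver mineral", "within the monarch's realm"]
print("royal prerogative" in infer(facts))  # True
```

The two streams (fish authorities and earth minerals) reach the same interim consequent, so either suffices, with the realm antecedent, to establish the Final result.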

By comparison to the arguments of the parties, the judgment is simple, and extracts the deciding factors in the case. The content of Fraunce's diagrams integrates legal logic arguments, litigation procedure and a scheme of presentation. Fraunce effectively nests his 2d diagrams, as the legal reasoning is too extensive for a single page. Nesting indicates the need for 3d notional space. Also he uses a precedent case (p.127r) as a gloss; as a note in a paragraph outside his diagram, in relation to a point of argument in a preceding diagram (p.126d), he gives the 'example' (cf. Detmold, 1984, p.179) of the case of Roger Chambernoune (1457-8), a nefarious taker of ore.

By rendering the reasoning logically transparent, Fraunce's form of representation opens up such reasoning to critical analysis and assessment. The Earl's case also has elements of non-monotonic logic, what might be termed defeasibility argument, propositions of the ontology of law, of the epistemology of legal doctrine and of the epistemology of adjudication, as components of legal expertise (cf. Bankowski, White and Hahn, 1995, pp.12-13).

For instance, a point of abductive legal ontology is made (p.126r) in the Queen's argument that, because the monarch needs gold and silver for the minting of coins, there is no royal power to grant the mining of gold and silver to anyone else. In the judgment there is a point of epistemology of adjudication, where there is a finding that the Earl has not proved the ratio of gold and silver to base metals, so as to fall within the requirement of the rule that where the ratio of gold and silver to base is such that the cost of the process of separation would be greater than the value of the gold and silver, then there is no royal prerogative in relation to these deposits. The onus is on the Earl to plead his case, as there is a presupposition in regard to the ratio of the metals in favour of the Queen.

Fraunce’s River did not distinguish legal deduction, induction and abduction; all arguments were included in his case ideograph to represent the arguments that were actually put, as a study of legal logic. His River was not isomorphic to the rules of law that could be applied by extended legal deduction. The River structure of Fraunce was not clearly distinct as pure application of rules. It mixed deductive rules, abductive arguments to support the relevant rule formulation, inductive analogy, and abductive authorities to support the validity of the rules. The more mixed the logic, the more aberrant is the extended deductive River structure.

Fraunce's model of integrated legal expertise also reflects the mix of science and art in legal reasoning. Where it is a science, it may be suitable for automation. Where it is an art, it may not be. In addition to the automation of a system of deductive rules as a scientific core, a legal expert system may provide legal data for some specified non-monotonic art, whereby an irregular ordering or processing, of induction and abduction, expands on the automation.

In Fraunce's graphics, the classification of details proceeds from the left to the right; however, like a River, there is a flow in the order of extended deductive argument from the top of the details, through interim conclusions, from the right to the left, and down to the Final result that the claim of the Queen for royalties is successful. This deductive flow is similar to the eGanges River in which deductive flows may also be drawn from left to right, as a matter of design balance in the graphic.

Fraunce’s legal logic diagrams were not included in the candidate’s Master’s thesis, due to the lack of an English translation. However, they are consistent with one of the tenets of the candidate’s Master’s work, namely that there is a cyclic paradigm of legal intelligence, and in the life cycle of the English legal system, there was a Theoretical Stage in the period 15th Century A.D. to 17th Century A.D. (Gray, 1997, pp.123-6). During this period, a theoretical paradigm dominated legal thought. In the Master’s work, evidence of this dominance was given, apart from the work of Fraunce: it began with the modification of the common law by the development of equity in the fifteenth century, and thereafter proceeded with the conversion of the forms of action into a theoretical exposition of principles of law for the categorisation of cases. In the light of the translation of Fraunce’s ideographs, in the sixteenth century, his diagrammatic legal logic contributed to the transition from forms to principles. The Theoretical Stage of the legal system culminated in the seventeenth century with the pattern analysis of Bacon, the systems analysis of Hobbes, and the empirical epistemology of Locke that flagged the next stage, the Stage of Casuistry.

Ramus, whose method was embraced by the Protestant religions of the Reformation, provided the logic palette for coherent compositions or systems of different categories of logic forms; he has been regarded as the only Renaissance philosopher (Dunn, 1969, p.xiii). Through Fraunce's first pattern of Reformation legal epistemology, the Reformation logic gives rise to one line of study of epistemological ideographs in the legal domain, and a second line of study that develops legal epistemology. The computational epistemology of 3d legal logic may be seen as an advance along these two lines of investigation, extending epistemological ideographs and legal epistemology into computational legal epistemology. The deductive River ideograph is a central part of this development.

Fraunce (1588) also mapped, as a taxonomy tree, the criminal law set out in Stanford’s Crown Pleas, an authoritative text; contrary pleas were included in the criminal ideograph. In his studies of logic at Cambridge University, Fraunce is likely to have seen Porphyry's tree (c.300) that synthesised the form philosophy of Plato and the logic form of Plato’s most renowned student, Aristotle. Porphyry (c.232-c.304), who is known as the father of taxonomy, was a Greek philosopher who worked in Rome with Plotinus, the founder of Neoplatonism. The simple Porphyry tree, which is shown in Figures 4.9 and 4.10, is a 2d ideograph of Aristotle's ontology of substance; mutually exclusive disjunctions are clarified in relation to universals. The Isagoge, an introduction to Aristotle’s Categories, was written by Porphyry and translated into Latin by the Roman philosopher Boethius (c.480-525);

Figure 4.9: Tree of Porphyry (c. 300 AD) Source: R. Audi (ed.) (1999): The Cambridge Dictionary of Philosophy, 2nd ed., Cambridge University Press, Cambridge, England, p.928.

Figure 4.10: Tartaret's Tree of Porphyry (c. 300 AD) Source: W.J. Ong (1958): Ramus method and the decay of dialogue, Harvard University Press, Cambridge, MA, USA, p.78.

during the Middle Ages the Isagoge was treated as a companion to Aristotle's treatises on logic, the Organon, and regularly published with it.

Fraunce’s case ideographs were an innovative representation of practical reasoning toward a teleological end, unlike the taxonomy tree of Porphyry. The methodology of Ramus enabled him to weave the rhetoric of legal practice with its mixed logic of invention and judgment. Hierarchies exist in both the case ideograph and the tree. However, the tree taxonomy represented Aristotle's hierarchy of categories of substance, whereas the hierarchy in the case ideograph is concerned with categories for extended deductive arguments for a certain legal outcome. Antecedents in rules of law are not related in a taxonomic hierarchy, although there may be definitional rules that give a taxonomic effect by a hierarchical definition. Both the tree and the case ideograph are structured by reference to greater particularisation, but in the Porphyry tree taxonomy, all categories are within the universal, whereas in the case ideograph, there is no all encompassing universal, but rather a Final result from all the arguments related and represented hierarchically. Taxonomy trees arise through differentiae, whereas deductive Rivers represent extended, overlapping deductive premises, in the form of sequential conditional propositions, with a Final consequent. However, factual taxonomy trees also may be considered in the formulation of antecedents in the deductive rules; there are inductive or abductive glosses on River nodes, available in eGanges, as will be explained.

As a qualitative development of geometry, the epistemological ideograph of the tree of Porphyry (c.300 AD), represented Aristotle’s ontology of qualities of substance; the ideograph shows a hierarchy of sub-categories as divisions of the universal, namely substance, which might encompass everything. At each level of sub-categories, there are mutually exclusive categories that are opposites in the nature of contradictories; the substance ideograph is an epistemological formalisation showing conceptual entities in their relationship to each other. If the ontology is correct, then the taxonomic hierarchy would permit reasoning about the relationship between the categories, with necessary validity derived from the tree hierarchy.
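The kind of reasoning the hierarchy permits can be sketched as subsumption over parent links. The category labels below follow common renderings of the tree of Porphyry, and the exact labels vary by source; the code is illustrative only:

```python
# A Porphyry-style taxonomy as a dictionary of parent links, descending
# from the universal, substance, by successive differentiae.
parent = {
    "body": "substance",
    "living body": "body",
    "animal": "living body",
    "rational animal": "animal",
    "human": "rational animal",
}

def is_a(category, ancestor):
    """True if `category` falls under `ancestor` in the hierarchy,
    i.e. the subsumption inference the tree makes necessarily valid."""
    while category != ancestor:
        if category not in parent:
            return False
        category = parent[category]
    return True

print(is_a("human", "substance"), is_a("substance", "human"))  # True False
```

The asymmetry of the two results reflects the one-way character of taxonomic subsumption: every sub-category is within the universal, but not conversely.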

The taxonomy of Porphyry's tree with its Aristotelian genus of substance, subordinate genera derived from differentiae, and finally species, was developed extensively for the botanical and zoological classification systems of Linnaeus (1707-1778) and Jussieu (1748-1836). Linnaeus, or Linne, (1751) described over 7000 plants, giving each a binomial nomenclature of genus and species. Jussieu (1789) provided the genera plantarum of plant families. The tree is a structure for large scale systems. With the prior analytics of botanical and zoological classification systems that were established by Linnaeus, more complex Porphyry trees were required. Classification trees represented taxonomies with many hierarchical levels that provided for classes, orders, genera, species, sub-species etc. The American jurist, Wigmore (1913), also used two dimensional ideographs, now known as Wigmore Charts, to develop a judicial epistemology for managing evidence in complicated cases. However, Wigmore was largely concerned with sorting out the reliability of differently weighted conflicting evidence, rather than the complexity of deductive legal rules and their inductive particularisation in authoritative cases. Wigmore’s emphasis reflects the dominance of the Casuistry Stage in which he was working. His ideographic epistemology might be developed as an extension of eGanges. Figure 4.11 is Wigmore’s representation of his evidentiary evaluation system. The system is not established by law, but it suggests a pragmatic regularity for systematic and consistent evaluation of evidence. The symbols of Wigmore in the chart indicate whether evidence reaches, or blocks, a conclusion.

Wigmore, in the development of his graphical evidentiary evaluation system, may have been influenced by the works on logic of Peirce, particularly the system of 2d entitative graphs (Peirce, 1897, 1933, Vol III, Exact Logic, p.295ff) in his logic of relatives, extended by the system of 2d existential graphs that Peirce (1911, 1933, Vol IV, The Simplest Mathematics, Book II) developed, based on the logic diagrams of Lambert (1764; Peirce, c.1903, 1933, Vol. IV, p.297), Euler (1772; Peirce, c.1903, 1933, Vol. IV, p.294ff) and Venn (1894; Peirce, c.1903, 1933, Vol. IV, p.296), known for his diagrams of set relationships.

Peirce's inductive method of scientific inquiry distinguished abduction as the context or source of hypotheses, deduction as the context or source of verification, and induction as the context or source of validation. Just as legal abduction may sometimes use deductive forms of argument, so an evaluation of evidence to make findings of inductive material facts, also may sometimes use deductive forms of argument. These distinctions were not incorporated into jurisprudence, and lawyers were free to argue how they saw fit; laws were not viewed as hypotheses, although

Figure 4.11: A Wigmore Chart Source: Wigmore, J.H. (1913): Principles of judicial proof as given by logic, psychology and general experience and illustrated in judicial trials, Little, Brown and Company, Boston, USA, following p.756.

they take that form of conditional proposition. However, the representation of possibility, necessity, and actuality is a matter for judicial consideration that concerned Wigmore; it progresses from the range of possibilities revealed by the evidence, through inferences that might be made from these possibilities, to actualities based on these inferences.

Wigmore was concerned to categorise evidence by reference to the reliability of witnesses as well as the facts of their testimony. His ideographs incorporated his factors of reliability, which are not consistently adopted by judges, as well as the testimony given. They show the difficulty of synthesis in weighing up the evidence and making findings of material facts by reference to the antecedents of the rules. Evidence in the alternative, for prioritisation, will require a flow like the fan of Rivers toward a Positive, Negative or Uncertain material fact; for example, the more fan streams established for a Positive material fact, the greater the weight for that finding. eGanges does not provide for evidence in this way, although it could be adapted to do so, as it allows for parallel Rivers to be constructed as separate applications which can be connected at nodes in each, selected as the points for connection.

An eGanges parallel River for the cases that are authorities for the positive sector of an antecedent's spectrum, may have a Final result of 'Material fact establishing antecedent', and an initial disjunction of (1) precedents with a positive Final result and (2) precedents with a negative Final result. There may then be fans of relevant material facts with case citations, from each of the two sets of precedents.

In his theory of signs and the study of perspicuous graphical representations of ontology and logic, Peirce sought to clarify his version of pragmatism, as drawing on abduction for reason and decisionmaking. Quality and relations could be represented in graphs; quality included mere appearance as well as specific objects or general types; relation covered similarity or difference, dynamic or causal, and habit or rule-determined. In regard to representation, Peirce distinguished assertions, as predicate premises, propositions, as conditional premises, and arguments which required an application of propositions to assertions to arrive at a conclusion in the nature of a further assertion. Thus ideographs could map each part of an argument, namely, the proposition, the assertion and the conclusion, in order to show the expansion of the concluding assertion by reference to the premises; thus ontology could expand through conditional propositions which might be drawn from abduction. However, the expansion of ontology only occurs by the epistemology of the argument; this was Peirce's seminal pragmatism, distinct from any subsequent versions of pragmatism that might discard logic. Peirce's (1906, 1933, Vol IV, pp.410-11) explanation of, and justification for, his theory of signs, applies to the use of eGanges maps and their more extensive associated maps in the theory of 3d legal logic, the hemispheres de inesse, that might be viewed through a River keyhole node that is coloured red or yellow:

Come on, my Reader, and let us construct a diagram to illustrate the general course of thought; I mean a system of diagrammatization by means of which any course of thought can be represented with exactitude.

“But why do that, when the thought itself is present to us?” Such, substantially has been the interrogative objection raised by more than one or two superior intelligences, among whom I single out an eminent and glorious General.

Recluse that I am, I was not ready with the counter-question, which should have run, “General, you make use of maps during a campaign, I believe. But why should you do so, when the country they represent is right there?” Thereupon, had he replied that he found details in the map that were far from being “right there,” that they were within the enemy's lines, I ought to have pressed the question, “Am I right, then, in understanding that, if you were thoroughly and perfectly familiar with the country, as, for example, if it lay just about the scenes of your childhood, no map of it would then be of the smallest use to you in laying out your detailed plans?” To that he could only have rejoined, “No, I do not say that, since I might probably desire the maps to stick pins into, so as to mark each anticipated day's change in the situations of the two armies.” To that again, my sur-rejoinder should have been, “Well General, that precisely corresponds to the advantages of a diagram of the course of a discussion. Indeed, just there, where you have so clearly pointed it out, lies the advantage of diagrams in general.”

Peirce (1906, 1933, Vol IV, p.421) also defined his term graph:

By a graph (a word overworked of late years), I, for my part, following my friends Clifford and Sylvester, the introducers of the term, understand in general a diagram composed principally of spots and of lines connecting certain of the spots. But I trust it will be pardoned to me that, when I am discussing Existential Graphs, without having the least business with other Graphs, I often omit the differentiating adjective and refer to an Existential Graph as a Graph simply.

Lambert, in his Neues Organon (1764), proposed the use of parallel lines to represent a category and sub-category; Peirce (c.1903, 1933, Vol. IV, p.297) gives an example of this graphics system of Lambert, which is reproduced as Figure 4.12.

Figure 4.12: J.H. Lambert (1728-77) - 'Some A is B' Source: C.S. Peirce, 1933, Vol. IV, The Simplest Mathematics, p.297.

One of Peirce's early graphs, considering 'non-relative logic' (Peirce, c.1885, 1933, Vol. III, Exact Logic, p.242), shows the paradigm of the Ishikawa fishbone that was apparent in branching diversifications; it is reproduced in Figure 4.13. However, Peirce probably saw this as a Porphyry tree, since he describes it as 'branchings from a stem.'

Figure 4.13: One of Peirce's early graphs, in considering non-relative logic Source: Peirce, c.1885, 1933, Vol. III, Exact Logic, p.242

When Peirce (1883, 1933, Vol. III, Exact Logic, p.412) encountered four dimensional logic in his early work, he observed:

It is much as if a geographical position should be expressed by a single algebraical letter; the value of this letter could only be defined by the use of two numbers, say the latitude and the longitude.

With the development of systems science in the twentieth century, the logic of dynamics and the dynamics of logic has continued to develop with the use of lattices, represented in graphs, to locate related positions and changes. Coecke, Moore and Smets (2004, p.534) show a photon lattice that is star shaped, with positives and negatives distinguished; this lattice is reproduced in Figure 4.14.

Figure 4.14: B. Coecke, D. J. Moore and S. Smets: a photon logic lattice (Logic of dynamics and dynamics of logic, in S. Rahman, J. Symons, D.M. Gabbay and J.P.v. Bendegem, Logic, epistemology and the unity of science, Kluwer Academic Publishers, Dordrecht, Holland, Figure 1, p.534.)

In the late twentieth century, mind mapping was accepted as a method of analysis of thoughts; it commences with a central word or concept and proceeds by adding associated words or concepts without any necessary reasoning. Usually 5-10 main ideas that relate to the central word are drawn around the central concept. Then 5-10 ideas are drawn in relation to each of the main ideas, and so on, exponentially. Depending on the nature and use of any actual mind map, the mind mapping method may or may not be epistemological; it is not inherently epistemological. Its value is in the construction of semantic associations free of epistemological constraints. The website at www.mindtools.com explains the use of these 2d ideographs; 3d mind mapping of associated ideas, based on the paradigm of the brain, is also available at: www.thebrain.com. eGanges deductive Rivers, the mixed logic river of Fraunce, and taxonomy trees could be regarded as epistemological mind maps. Just as taxonomy trees are based on ontologies with hierarchical categories, Rivers are based on ontologies structured as conditional propositions of extended deductive Major premises.

4.2.9 Extrapolation of River from mixed logic decision trees

The methodology offered in knowledge engineering for devising expert systems was decision tree modelling. It was discovered in the prior analytics for the first CLIMS Pilot that a River ideograph (Gray, 1988), rather than a tree, was an isomorphic representation of the hierarchies of rule sub-systems of law, suitable for acquiring and managing selected legal information for the design of an expert system. In 1990, CLIMS Pilot No.3 (Gray, 1997, pp.262-3) was constructed by the candidate as legal knowledge engineer and Sue Gates as package adviser, using the Texas Instruments shell Procedure Consultant, which permitted the construction of the application with the visual aid of a decision tree. The 3d visualisation of the system of rules in a Sphere had been devised (Gray, 1990) before the construction of CLIMS Pilot No.3. The intention was to construct a larger program than the first and second CLIMS pilots, and adapt the Sphere of Rivers to a decision tree of a contractual transaction strategy. It was thought that Procedure Consultant might permit legal experts without programming expertise to construct legal expert systems readily, provided a decision-making procedure could be selected from the Sphere of rules, as a tree representation. The tree was not available in a consultation of the expert system, so that it did not add transparency for the consultation user.

Spherical knowledge for the design of a program provided for the combinatorial explosion of possibilities arising in the potential cases covered by the rules, and the combinatorial implosion of inductive factual instances of antecedents in the rules of law. Inference engines of knowledge base shells use a cache as a storage device for accumulated answer input, so that a question or repeated part of a pathway would not be put to a user twice. The repetition of pathways to deal with possible cases would not be noticeable to a user if they changed pathways. The Procedure Consultant tree construction made it possible to mix a choice of inductive precedent material facts with deductive disjunctions from positive, negative and uncertain rule systems. Abductive information, including question glosses and information about the consequence of possible answers, could also be included in a tree node. This complicated the preparation of the tree. The CLIMS Pilot No.3 tree was constructed from part of the Sphere of contract rules, using Procedure Consultant. This experiment confirmed that the tree was not suitable for large scale systems, due to the repetition required in providing decision paths for all alternative combinations of antecedents. The decision tree methodology is not user-friendly and not cost-effective for large scale legal knowledge engineering. Rivers make repetition unnecessary, as they require a specification of the positive rules only; combinatorial heuristics can be devised by reference to the prima facie positive River.
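The contrast drawn above between tree repetition and River economy can be illustrated by simple arithmetic: a full decision tree over n antecedents, each answerable in three ways (positive, negative, uncertain), must provide for 3 to the power n leaf paths, whereas a River specifies each of the n antecedents once. The function names below are illustrative assumptions, not part of CLIMS or eGanges.

```python
# Illustrative arithmetic for the combinatorial explosion discussed above.

def tree_leaf_paths(n_antecedents, answers_per_node=3):
    """Leaf paths a fully elaborated decision tree must provide."""
    return answers_per_node ** n_antecedents

def river_nodes(n_antecedents):
    """A River specifies each antecedent exactly once."""
    return n_antecedents

for n in (5, 10, 20):
    print(n, tree_leaf_paths(n), river_nodes(n))
```

At twenty antecedents, a modest size for a legal rule system, the tree must account for several billion paths while the River still contains twenty nodes; this is the sense in which the tree methodology is said not to be cost-effective at scale.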

Further CLIMS experiments attempted to use hypertext to effect a River graphic, and to scale up CLIMS Pilot No.3. The scaling up attempts made apparent the need for nesting. The development of prior analytics methodology was hampered by the transformation of a Sphere of rules to a 2d tree in which it was difficult to maintain design control of the repetition required for combinatorial explosion. A systematic methodology for producing large scale applications arising from the Sphere was identified as a prerequisite for further development of the technology, using virtual reality software.

Porphyry trees are decision trees based on a scheme of hierarchical ontology. Decision trees may also be based on other ontological decision-making schemes. For instance, a River might be used to design a decision tree; it is possible to extract an antecedent from a positive River, as a tree node that asks the question that might produce a positive answer that establishes that antecedent. Alternative answers that are negative or uncertain are possible, so that the three alternative answers can be shown as alternative pathways or branching of the tree to the next antecedent selected from the River, which is regarded as a child of the first selected antecedent. Each alternative answer to the second selected antecedent would then have the answers as pathways to the third selected antecedent as a child of the second antecedent, and so on. The order of antecedents may begin with a selection at the top of the mainstream rule in the River, and as mainstream antecedents are selected, selection of antecedents for the tree order may backtrack to the top of any secondary, tertiary, etc., stream, to the top of any hierarchy. To complete the alternative tree pathways, where there are positive disjunctions, these would have to be treated in order, according to their antecedents and the combinatorial explosion that they raise. Each negative answer would go to the sequence of fan alternatives until they were exhausted, before proceeding on to either a negative Final result, an uncertain Final result, or the next positive antecedent. The complexity of the combination of positive and adversarial disjunctions has to be iterated in transforming a Sphere of rules to a tree representation.
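The extraction described above can be sketched as a recursive construction: each antecedent taken in order from a positive River becomes a tree node whose three alternative answers branch to the next antecedent or to a Final result. This is a deliberately simplified sketch under stated assumptions: the antecedent labels are illustrative, and the fan alternatives and adversarial disjunctions discussed in the text are omitted.

```python
# Sketch: turning an ordered list of River antecedents into a three-way
# branching decision tree. Fan alternatives are omitted for brevity, so
# negative and uncertain answers go straight to Final results.

def build_tree(antecedents):
    if not antecedents:
        return "positive Final result"
    head, rest = antecedents[0], antecedents[1:]
    return {
        "question": head,
        "positive": build_tree(rest),          # child: next antecedent
        "negative": "negative Final result",   # fan alternatives omitted
        "uncertain": "uncertain Final result",
    }

tree = build_tree(["offer", "acceptance", "consideration"])
print(tree["question"])              # offer
print(tree["positive"]["question"])  # acceptance
```

Even in this reduced form, every node must carry all three exits; once adversarial disjunctions are added, the branching the text describes multiplies rapidly.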

A simple decision tree is shown in Figure 4.15, with its characteristic hierarchical regimentation. Trees are particularised from the root to the leaves exponentially, and to some extent, so too are Rivers particularised from mainstream antecedents up to the watershed nodes. However, as the eGanges River is confined prima facie to positive rules only, its disjunctions are captured in fans, and no repetition of nodes or pathways is permitted or required. A tree may be turned upside down so that the root is at the bottom of the diagram, or sideways so that the root is at the right or left extremity of the diagram; upside down, a tree may begin to look like a River.

Figure 4.15: Tree showing root node and leaf nodes

The schemes for chaining in rule-based expert systems were designed by reference to decision trees. In his description of backward and forward chaining of an inference engine, Susskind (1987, p.209) identified the nodes in a tree turned sideways, with the root node at the right, as antecedents in rules:

In short, forward-chaining involves finding the antecedent of a rule on the left-hand side that is satisfied and then moving through the tree from left to right until a conclusion is reached, whereas backward-chaining entails starting at the conclusion on the right-hand side, as it were, and progressing leftwards in a search for antecedents that apply to the facts.
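The two directions Susskind describes can be sketched over a small rule base. The rules and facts below are the present writer's illustrative assumptions from contract law, not Susskind's examples; the function names are likewise invented for the sketch.

```python
# A minimal sketch of forward and backward chaining over rules of the form
# antecedents => consequent. Rule content is illustrative only.

RULES = [
    ({"offer", "acceptance"}, "agreement"),
    ({"agreement", "consideration"}, "contract"),
]

def forward_chain(facts):
    """Left to right: fire any rule whose antecedents are all satisfied."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for antecedents, consequent in RULES:
            if antecedents <= facts and consequent not in facts:
                facts.add(consequent)
                changed = True
    return facts

def backward_chain(goal, facts):
    """Right to left: start at the conclusion and search for antecedents."""
    if goal in facts:
        return True
    return any(all(backward_chain(a, facts) for a in antecedents)
               for antecedents, consequent in RULES if consequent == goal)

print("contract" in forward_chain({"offer", "acceptance", "consideration"}))
print(backward_chain("contract", {"offer", "acceptance"}))  # False
```

Forward chaining moves from established antecedents towards a conclusion; backward chaining starts at the conclusion and works leftwards, exactly the traversal of the sideways tree that Susskind describes.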

An extensive decision tree was used by Capper and Susskind (1988) in the development of the Latent Damage Law expert system. In doing so, they followed the practice at the time in knowledge engineering so that they could use a commercial shell, Crystal. Their tree diagrams are shown in Figure 4.16 and Figure 4.17. As indicated in Figure 4.17, it was also necessary to expand some parts of some pathways. In the formulation of a decision tree, legal experts were required to specify all possible paths and any necessary duplications; this was a time-consuming and tedious task, not suitable for large scale legal knowledge engineering. The tree methodology is not suitable for the legal domain, which has rule systems that are large scale and produce extensive combinatorial explosion.

The difficulty of duplication is complicated if the nodes of a decision tree include not just antecedents in the relevant rules of law, but also inductive choices established by case authorities, and abductive information that comments on the rules, cases or other matters. Nodes in the Latent Damage Law system have some mix of premises of legal deduction, induction and abduction. The knowledge base is considerably larger than any of those of the CLIMS pilots.

The large decision tree of Capper and Susskind, used to design and construct Latent Damage Law, represented expertise in the law of latent damage in building disputes; it consists of expert communications in chunks of data schematised as pragmatic expert advice rather than as an isomorphic representation of a system of rules of law. Latent Damage Law is a system which provides advice of an expert legal practitioner. The massive Latent Damage Law decision tree may have been smaller if its nodes had represented antecedents in a rule of law; however, sometimes chunks of pragmatic expertise combining deductive, inductive and abductive information, can compress the rule system. Latent damage law is a limited area of law, yet the tree is massive and complex.

In his earlier work, Susskind (1987) saw that rules of law could be represented isomorphically by and/or trees, but he did not consider this extensively. He was more concerned to find a commonly acceptable range of variations in automatable rule

Figure 4.16: Latent Damage Law Tree – top level. Source: P. Capper and R.E. Susskind (1988), Latent Damage Law: the expert system, Butterworths, London, Figure 1, p.66.

Figure 4.17: Latent Damage Law Tree – breach of duty. Source: P. Capper and R.E. Susskind (1988), Latent Damage Law: the expert system, Butterworths, London, Figure 1, p.72.

forms, to accommodate the scope of analytical jurisprudence in legal expert systems technology. Effectively, Susskind reconciled analytical jurisprudence to the epistemology of rule-base systems that were used during the 1980s in knowledge engineering. He initially produced a small Scottish divorce law program with a rule-base containing a scheme for chaining, that was applied by an inference engine which carried out the chaining search according to the user's answer input. Although Susskind adopted the rule paradigm based on legal positivism as a suitable basis for legal knowledge engineering, he did not consider that law-making power to determine the rule made legal positivism the source of truth of the Major premises to be used in extended legal deduction. The source of truth for the Minor premises lay in the evidence provided in the client's case. A tree representation may blur this distinction where chaining incorporates simultaneously both search and inference. A search for applicable Major premises only amounts to legal inferencing if it gives effect to extended legal deduction. In the legal domain, the failure to establish a Minor premise never affects the truth of a Major premise, as it might do in a system of premises in a field of science that uses similar extended deductive logic. Provision must also be made for neutral antecedents, common antecedents in opponents' cases, and disjunction pro tem in the adversarial context.

In his use of trees, Susskind also observed:

In a sense, the duality of a consequent in one rule that is also an antecedent in another rule, cuts off the ‘if’ clause from the ‘then’ consequent; the consequent can only be an interim consequent. The interim consequent is then the established antecedent in the other rule. Conditional propositions that are truncated in this way amount to antecedent linkages which may be used as a protracted sequence of Major premises for extended deduction, known as Horn clauses; only the Final consequent which is not also an antecedent in another rule operates solely as a consequent. This duality of some antecedents is a heuristic or meta-rule for the design of a chaining process of an inference engine.
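The duality noted above is mechanically detectable: a consequent that also appears as an antecedent in another rule is only an interim consequent, while the sole consequent that never recurs as an antecedent is the Final consequent. The following sketch uses illustrative contract-law rule content assumed by the present writer.

```python
# Sketch: detecting the Horn clause duality of interim consequents.
# Each rule is (list of antecedents, consequent); content is illustrative.

rules = [
    (["offer", "acceptance"], "agreement"),        # Horn clause 1
    (["agreement", "consideration"], "contract"),  # Horn clause 2
]

antecedents = {a for body, _ in rules for a in body}
consequents = {head for _, head in rules}

interim = consequents & antecedents   # dual role: consequent and antecedent
final = consequents - antecedents     # operates solely as a consequent

print(interim)  # {'agreement'}
print(final)    # {'contract'}
```

Here 'agreement' is the truncation point: it closes one conditional proposition and opens the next, which is the heuristic for chaining design that the text identifies.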

Rivers represent the rules of law without any such Horn clause truncation, as the Major premises of the hypothetical syllogism, explained by Waller, as referred to in Chapter 1, by which law is applied to a user's case. As such, they are a departure from decision trees, which were used in twentieth century knowledge engineering to devise Horn clauses, representing Minor premises, or inductive material facts, with their associated conclusions, for knowledge acquisition and representation, and to aid Horn clause programming design.

The tree is a comprehensive hierarchy of disjunctions that, for legal knowledge engineering, must include all the disjunctions that are represented by the Star in the computational epistemology of 3d legal logic. The same problem of showing the combinatorial explosion of legal possibilities applies to trees as well as to their rule base iteration. Even if there is an attempt to reduce the combinatorial explosion by compacting deductive, inductive and abductive information about legal choices into a scheme of legal information nodes, such as the mixed logic decision tree of the Latent Damage Law expert system, shown in Figures 4.16 and 4.17, trees are not suitable for large scale legal expert system methodology; their lack of a legal domain epistemological system allows the legal knowledge engineer to get lost in the complexity and extent of the tree. Even in the natural environment, rivers, due to their flow, allow those lost in the surrounding forest to find their way out; similarly, the concrete jungle of modern cities, with its forest of information, can benefit from eGanges quality control, micro-management Rivers.

In the computational epistemology of 3d legal logic, the Star differentiates rules according to their logical relationships and synergistic flow direction, even though the three dimensional structure is not user-friendly for interactive visualisation and quality control micro-management of large scale, complex legal information. However, all tributaries in the Star of Rivers are either included in the central equatorial River, or they are connected to it by the elastic inductive spectra, which can expand to include new material facts of more cases without disturbing the rule system; the Polestreams are connected indirectly to the equatorial River, systematically, operating when their single antecedent is established in a Tropic River. This central role of the equatorial River makes it suitable for use as the prima facie 2d River in the eGanges Rivers window, and as the basis for capturing the heuristics of the potential Star syllogisms. The deductive Star heuristics process the combinatorial explosion implicit in the prima facie equatorial River, without the need for visualisation of the whole Star of rules as alternative pathways of extended deductive premises, or syllogism sequences.

Furthermore, the relevant part of the combinatorial explosion is shown in the eGanges program epistemology, during a consultation, through the colour recording of input significance in the keyhole nodes of the eGanges River. In addition to input colour coding in the River nodes, the program epistemology might be varied so that a red node permits zooming in to show the northern hemisphere of the combinatorial explosion with the user's path through it, and a yellow node permits zooming in to see the corresponding southern hemisphere path, like domes supported by a framework of Tropic Rivers and Polestreams. eGanges does not incorporate any such 3d graphics; the Clapham bus passenger might find this too demanding; it is kept simple for efficient communication with minimal complexity. In the eGanges shell epistemology, the logic synergy of the Star provides coherence and consistency of the rules in the course of processing the user's answer inputs. Feedback is given as if there had been processing of the relevant syllogisms in the combinatorial explosion of the ontology of legal possibilities; the Star of Rivers with their flow directions is used to derive generic heuristics for the shell's processing instructions through the communication system of the interface. Yet the user is not concerned with the complexity of the computational epistemology of 3d legal logic; ex facie, the processing produces the expected feedback results, given the labelling of answers with the same colour coding as the adversarial windows.

The equatorial River that is constrained by a Star structure may be extracted from the Star with all its coherent and consistent deductive connections, which are its synergistic implications. They may be thought of as the loose ends of the differentials shown in Korzybski's logic object. Once extracted with its synergistic implications, the equatorial River could be regarded as a type of decision tree, but this may be misunderstood and confusing, compared to the clear delineation of a River in the computational epistemology of 3d legal logic. Rivers have synergistic connections that are defined by the Star structure from which they are taken; decision trees are not epistemologically defined with these synergistic implications, although they could be.

The decision tree may represent decisions required by antecedents of the rule system, without differentiation of their logical structure of extended deduction and its associated synergistic effect. In the way trees are used in large scale systems, it is usually not clear at each choice point in the hierarchy of tree disjunctions what the interim and ultimate consequents of each available alternative are. It is usually necessary to repeat nodes in different paths, with different conclusions, in order to capture the combinatorial explosion for different user cases. However, the tree is suited to the branching instructions available in programming; nevertheless, in legal knowledge engineering, branching instructions must accord with legal epistemology. The River also requires branching instructions, and makes use of the branching instructions available in programming.

The unconstrained nature of a decision tree, like that of a mind map, is its value. The paradigm structure of a tree may be used semantically in different ways, so that it rests on various decision-making ontologies or epistemologies. The programmer might know the relationships and flow represented by the tree; they may be logical relationships, ad hoc relationships, or some combination of the two. However, these relationships are not communicated ex facie to an application user by the single epistemological definition of a tree. A lack of constraints, which may suit ad hoc thinking, does not permit the automation of deductive reasoning or a mastery of extensive logical complexity; in this case, legal expert systems would be controlled by a technology that does not advance understanding of the coherence of law, the optimal freedom that the law offers, and the human intelligence required to produce epistemologically sound technology for the legal domain. Those who provided ad hoc, incoherent, disparate artificial legal intelligence would be the contemporary successors of the ancient priest-rulers who kept the formulae of law covert for the processes of trial by ritual articulation (Gray, 1997, pp.116-7). Ritual was the antithesis of judicial decision and broke the recursive revenge blood feuds that are still chronic in some ethnic conflict; where the litigant had the power to perform the ritual, but failed in the required articulation, no judicial bias could be blamed for the outcome. However, the skills of recitation are not a sound basis for human justice.

Justice requires epistemology and the computational epistemology of 3d legal logic captures the epistemology of justice that is actually used in the legal domain. It provides for every antecedent, every rule, every material fact and every reason that is actually determined by authorised law-makers to achieve justice, just as legal experts are required to do. The synergy of the Star is the synergy of relative justice. This is maintained in the program epistemology of eGanges, without troubling the user with its full complexity.

The River structure could be deployed in some epistemology other than as an isomorphic system of rules in the computational epistemology of 3d legal logic. Rules might be replaced by other forms of deductive premises or procedures.

Susskind (1987, p.144) referred to the use of trees in jurisprudence by Raz (1980, p.99) and observed:

Although a tree diagram as a means of representing parts of legal systems is no innovative image for jurisprudence, and it can be deployed as a means of representing both static and dynamic systems, neither has its full potential been exploited by legal theorists, nor has the nature and contents of such trees been fully clarified.

Two tree diagrams, Figure 4.18 and Figure 4.19, are illustrations by Susskind (1987, pp.145 and 146) used to explain the difference between what he terms the Kelsenian model (Figure 4.18) and the and/or logic tree used in artificial intelligence (Figure 4.19). In these two examples, only the positive disjunctions are represented, so that the hierarchy of disjunction is limited; thus the trees conform to the constraints of the positive River. Figure 4.20 is the River equivalent of Figure 4.18 and Figure 4.21 is the River equivalent of Figure 4.19, assuming that the substantive contents of Figure 4.18 and Figure 4.19 satisfy the requirements of the computational epistemology of 3d legal logic.

The substance of the model of Kelsen (1911; 1945) is a hierarchy of norms; the root node is the basic norm, or grundnorm, and all nodes denote norms or rules as sub-categories, like a tree taxonomy. Figure 4.19, the and/or tree of Susskind (1987, p.146), uses an arc to represent 'and', i.e. conjunction; where there is no arc, there are

Figure 4.18: Kelsen tree of R. E. Susskind (1987), Expert systems in law, Clarendon Press, Oxford, Figure 1, p.145.

Figure 4.19: and/or tree of R.E. Susskind (1987), Expert systems in law, Clarendon Press, Oxford, Figure 2, p.146.

Figure 4.20: eGanges River map of Susskind's Kelsen tree by P.N. Gray (2006).

Figure 4.21: eGanges River map of Susskind's and/or tree by P.N. Gray (2006).

alternatives or disjunction. A conjunction arc on an and/or tree may curtail the flow, as all the child nodes have to be established before proceeding to the next decision; yet each child in the arc may have further children, and so on. In the design of computer programs, and/or trees have been commonly used to instruct programmers.

The Kelsenian model is explained by Susskind, on the basis of its interpretation by Raz, as a 'pyramid'; this is similar to the Linnaean scheme of categories in the classification of a genus as sub-genera and species. In the Linnaean scheme, the root node is the genus and the leaves are species; nodes in between are sub-genera. So in the Kelsen model, the root node is the basic norm, or grundnorm, from which sub-genera of norms are derived and, in turn, give rise to the species as the leaves. Alternatives in the disjunctions of the Linnaean tree are mutually exclusive contraries; they are not contradictories, as contradictories are a simple denial of a category without the simultaneous assertion of another substantive category. Mutually exclusive contraries are each consistent with the generalisation that they particularise. Porphyry's tree does include contradictories.

In the disjunctive hierarchy of a tree that includes contradictories as well as contraries, each alternative is consistent with the choice point from which it springs, not by virtue of that choice point being a generalisation in a classification system, but by virtue of the decision-making order; a tree may represent a decision-making procedure that is not confined to a genus hierarchy.

It would be possible to apply the Kelsen model to the common law, to assist abductive argument. For instance, if the basic norm were 'just optimal freedom' and the first layer of sub-genera derived from this were agreement, trust, care, territoriality, and protection, then rights of action could be derived from each of these sub-genera as species of 'just optimal freedom'. However, lawyers do not argue in this way in the application of rights of action to clients' cases. It is usually not necessary to approach the rules of law in this justice framework, although the root node of justice may ground the application of the rules and any challenge to the rules. If appropriate, a lawyer may evaluate the justice of the client's goals or situation, especially if the client seeks an explanation of the justice in the rules of law applicable to the fact situation of the case.

The taxonomy of the Linnaean tree is not concerned with sub-genera or species that do not exist, although the classification could be expanded at any level if new information came to light. The categories of the taxonomy are contraries like red and blue; whether or not a particular individual fits into a category is a matter of predicate logic. However, predicate logic may be expressed as propositional logic, as illustrated by Waller, above. For example,

Predicate logic:
All larcenies have criminal penalties.
This case is a larceny.
Therefore this case has a criminal penalty.

The first two premises in this deductive syllogism are assertions, descriptive in nature, and the totality of the conclusion is that this case is both a larceny and a crime, which is descriptive of a specific instance of the generalisation. If the generalisation in the first premise and the classification in the second premise are correct, then the conclusion is correct.

Propositional logic:
If a larceny, then a criminal penalty.
This case is a larceny.
Therefore this case has a criminal penalty.

The first premise in this deductive syllogism, which is the Major premise, is a conditional proposition which is a prescriptive assertion, rather than a descriptive assertion. It floats across the future like a Roman maxim (Gray, 1997, p.154), setting out that when certain condition(s) are in existence, a certain consequent will apply. The second premise, which is the Minor premise, is an assertion that the condition exists. The totality of the conclusion is that the consequent has been applied, or, in the case of conditional propositions that are rules of law, it may be that it should be applied. Although an 'ought' cannot be derived from an 'is', as pointed out by Hume (1740, 1975), an ought might be a matter of practical reasoning from an hypothetical assertion. In the legal domain, what is applied is the consequent of a rule of law, rather than the description of the generalisation in the Major premise, even though the difference is semantic rather than logical. The semantic distinction is part of legal domain epistemology. It is how the legal domain treats its own classification system in regard to facts, and how the requirement for ontological truth of the generalisation in predicate logic is replaced by the power of law-making authority, with the necessity of deduction in its law-enforcement.
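The contrast between the two syllogisms above can be sketched computationally: the predicate form classifies an individual under a generalisation, while the propositional form applies a conditional Major premise to an established Minor premise by modus ponens. All identifiers and case labels below are the present writer's illustrative assumptions.

```python
# Propositional form: the Major premise is a conditional proposition,
# applied by modus ponens once the Minor premise is established.
def modus_ponens(major, minor_established):
    antecedent, consequent = major
    return consequent if minor_established else None

major = ("larceny", "criminal penalty")
print(modus_ponens(major, True))   # criminal penalty

# Predicate form: the generalisation ranges over classified instances.
larcenies = {"case_1", "case_2"}   # cases classified as larcenies
def has_criminal_penalty(case):
    return case in larcenies       # "All larcenies have criminal penalties"

print(has_criminal_penalty("case_1"))  # True
```

The propositional sketch does nothing unless the Minor premise is established, which mirrors the point in the text: the conditional prescribes a consequent for a condition, rather than describing a class.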

Susskind explained that in the nodes of the and/or logic tree, Figure 4.19, the substance in all nodes is 'propositions or statements or words' and the branches represent logical relationships between them. The technology is not concerned with the truth of the Major premises; acquired expert knowledge is engineered so that it can be used or applied. All trees are given a regularity of structure, as if the truncated Major premises have this regimentation by virtue of their inherent logical relationships, like categories in Linnaean trees. Tree-based artificial intelligence assumed that tree regimentation is required for logic processing, as the totality of expert intelligence. eGanges is designed on the basis of a communication system suited to the epistemology of the legal domain, with minimal regimentation. The graphical irregularity of River ideographs retains some structural idiosyncrasies of the Major premises, consistent with natural language usage; they may also be arranged to suit natural language labelling of antecedents.

Figures 4.20 and 4.21 illustrate the sort of irregularity that can be processed by the automation of extended deduction that applies the Major premises of the River to the user's case, in a consultation; the differentiation in the tributary structure indicates the character of the knowledge which may aid application builder and user memory. Mnemonic attributes of an ideograph may assist in the construction of large scale legal expert systems; they are also helpful where interactive visualisation is part of the transparency and user-friendliness of the system. A memorable or perspicuous map may be a landmark, and may say a thousand words, so that it may also expand recognition and understanding very quickly, like the map of a bus route.

The ordinary subject of the law has been portrayed, judicially, as the man on the Clapham omnibus (Rogers, 1979, pp.46-7; Vermeesch and Lindgren, 1998, pp.417-8). This man has the urban intelligence which understands bus routes. Public transport routes are not laid out with the hierarchical regimentation of a decision tree, or the classification pyramid of a taxonomy tree; such regularity does not assist memory on long journeys, as does differentiation in the stages of the journey. The River map is irregular in its shape, unlike the and/or tree, and thus invokes this ordinary intelligence and may aid its memory.

More importantly, the River uses different structures to represent conjunction and disjunction; a single stream of nodes represents conjunction and the fan structure is only used to represent disjunction. In the and/or tree, the use of a fan structure to represent both conjunction and disjunction invites errors if the arc, indicating conjunction, is missed or overlooked. The River also has one-directional logic flow, allowing for temporal considerations, such as that an offer must come before its acceptance, in accordance with the legal domain epistemology; it does not have two-directional logic relationships like the taxonomy tree and the and/or tree.

The formal logic expression of Figure 4.19 is set out by Susskind (1987, p.146) as the summary of a set of rules, as follows:

(1) (h ∧ i) ∨ (j ∧ i) <=> g; more elegantly, (h ∨ j) ∧ i <=> g
(2) g <=> c
(3) d ∧ c ∧ f <=> b
(4) b ∨ c <=> a
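Susskind's rule summary can be evaluated over truth assignments to the leaf nodes. The sketch below reads each connective left to right only, which is the one-directional reading argued for in the surrounding discussion; the function name and the chosen assignments are illustrative assumptions.

```python
# Sketch: evaluating Susskind's four-rule summary over boolean leaf values
# d, f, h, i, j, reading each rule left to right (antecedents to consequent).

def evaluate(h, i, j, d, f):
    g = (h or j) and i   # rule (1), in its more elegant form
    c = g                # rule (2)
    b = d and c and f    # rule (3)
    a = b or c           # rule (4)
    return a

print(evaluate(h=True, i=True, j=False, d=False, f=False))  # True: c holds
print(evaluate(h=False, i=True, j=False, d=True, f=True))   # False
```

Note that the sketch cannot run the rules in reverse; establishing a tells us nothing here about h, i or j, which is precisely the asymmetry the text goes on to contrast with the biconditional <=>.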

The root node is the Final consequent of this system of rules, so that the logic description of the tree is similar to the River, which places the Final consequent at the bottom, rather than the top, of the diagram. The <=> symbol, which may be used instead of the tribar, ≡, indicates that the epistemology supporting the and/or logic tree is equivalence, or the material biconditional, namely: if and only if the antecedent(s), then the consequent, so that if the consequent, then the antecedents. This is not an appropriate representation for legal epistemology, as antecedents must be established to prove a consequent but not vice versa, unless there is a legal presumption to this effect. If legislation uses the words 'if and only if', as appears in Scotland's divorce legislation, then they are interpreted according to the rebuttable presumption that whatever is done officially is presumed to be done correctly. The tribar operates once the decree of divorce is made, subject to provisions for appeal. Until a decree is made, on the basis of evidence that establishes the grounds, the tribar is not given effect.

The <=> symbol indicates that the logic processing of the and/or tree is logically valid in either of the two directions of inferencing, like a Porphyry tree, whereas a system of rules of law is one directional only; furthermore, a rule of law is contingent, not existential. The authoritative truth of a rule does not give it existential validity without an authoritative presumption to that effect. In the eGanges shell epistemology, a River may be navigated freely to explore its content, or to give input at random, but this does not implement two directional inferencing. The processing heuristics of eGanges maintain the one directional inferencing indicated by the flow arrows of the River, irrespective of user navigation and order of input. At the same time, eGanges has a default order for input that accords with the one directional inferencing; the default order also gives effect to the sense of temporal order or presuppositions that may be reflected in the ordering of antecedents in the River.

The material biconditional could be replaced by the material conditional, also called material implication, represented by the reversed C of Peano, ⊃. The biconditional licenses inference of the antecedent(s) from the consequent, as well as the simple semantics of the conditional proposition, that the consequent is inferred from the antecedent(s). The legal domain requires the one-directional arrow, indicating that the simple semantics of the conditional proposition only applies; this is consistent with an appellate jurisdiction, where proof of antecedents may be challenged, and with the heterostasis or evolution of a system of rules of law, where antecedents may be removed or added, and consequents may be changed.
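The one-directional requirement can be sketched computationally. The following minimal Python sketch (illustrative rule and node names only; it is not the eGanges implementation) shows that a rule licenses inference from established antecedents to the consequent, but not the reverse:

```python
def forward_infer(established, rules):
    """Infer a consequent only when ALL of its antecedents are established."""
    derived = set(established)
    changed = True
    while changed:
        changed = False
        for antecedents, consequent in rules:
            if consequent not in derived and all(a in derived for a in antecedents):
                derived.add(consequent)
                changed = True
    return derived

# Hypothetical rule: if offer and acceptance, then agreement.
rules = [(("offer", "acceptance"), "agreement")]

# Downstream inference (valid): established antecedents yield the consequent.
assert "agreement" in forward_infer({"offer", "acceptance"}, rules)

# Upstream inference (invalid in the legal domain): the consequent alone
# does not establish its antecedents.
assert "offer" not in forward_infer({"agreement"}, rules)
```

Under the biconditional, the second assertion would fail; the one-directional arrow is what the sketch enforces.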

It may be that the one-directional inference arrow, →, could replace the <=>, so that the representation in the and/or tree conformed to the requirements of the legal domain epistemology. Lawyers are always able to agree that an interim consequent is established without reference to the antecedents upstream of it; they are always able to argue for the establishment of an interim antecedent on the basis of uncertainties upstream of it. However, the establishment of an interim consequent per se does not necessarily establish antecedents upstream of it. eGanges will undo an interim consequent if there is an inconsistency established upstream of it. In the legal domain, antecedents upstream must be established directly, not as a matter of rule inference but as a matter of evidence that proves the Minor premise. Rule inferencing that is not evidentiary always occurs downstream of antecedents only. eGanges will automatically inference downstream if established upstream antecedents establish consequents downstream; this automation will proceed through the hierarchy of extended deduction as far as the collected input and basis for inferencing permits.
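The cascade-and-undo behaviour described above can be illustrated with a hedged sketch. The class and rule names are assumed for illustration, not taken from the eGanges code; the rules chain a & b -> c and c & d -> e, and all inferred consequents are recomputed from direct answers on each input, so an upstream failure automatically undoes everything downstream of it:

```python
class RiverSketch:
    def __init__(self, rules):
        self.rules = rules          # list of (antecedent tuple, consequent)
        self.status = {}            # node -> "yes" / "no" direct answers
        self.established = set()

    def answer(self, node, value):
        self.status[node] = value
        self._reinfer()

    def _reinfer(self):
        # Recompute all consequents from direct answers only, so an
        # upstream "no" undoes downstream interim consequents.
        inferred = {n for n, v in self.status.items() if v == "yes"}
        changed = True
        while changed:
            changed = False
            for ants, cons in self.rules:
                blocked = any(self.status.get(a) == "no" for a in ants)
                if not blocked and cons not in inferred and all(a in inferred for a in ants):
                    inferred.add(cons)
                    changed = True
        self.established = inferred

r = RiverSketch([(("a", "b"), "c"), (("c", "d"), "e")])
r.answer("a", "yes")
r.answer("b", "yes")
r.answer("d", "yes")
assert "e" in r.established          # extended deduction cascades downstream
r.answer("a", "no")                  # inconsistency established upstream
assert "c" not in r.established      # interim consequent undone
assert "e" not in r.established      # and so is everything downstream of it
```

The sketch proceeds through the hierarchy of extended deduction only as far as collected input permits, as the text requires.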

It is to be noted that unlabelled or contrived choice points, such as h/j in Figure 4.21, may be required to do the job of brackets that are used in algebraic expressions. The choice cannot be logically exercised without this choice point, which also facilitates the River visualisation of disjunction that is not otherwise fully articulated. The Ishikawa fishbone in Figure 3.17 provides the model for articulating choice points in order to remove duplication of tree nodes. Duplication complicates the determination of heuristics and processing.

Some decision trees have only disjunctions that are consistent with the root node; both Figure 4.18 and Figure 4.19 are of this sort, so they are, or can be reconfigured as, an eGanges River; this is indicated in Figure 4.18 and Figure 4.19 by the node that is a parent of a single child, namely c in each of these Figures. The eGanges River is also of this sort; all antecedents are consistent with the Final result. Figure 4.20 and Figure 4.21 are largely formulated by simply inverting the trees in Figure 4.18 and Figure 4.19, respectively. In this sense a River may be regarded as a tree, but a certain epistemologically defined tree that is distinguished as a River. However, decision trees usually do include contradictory disjunctions so that every node has at least two children, namely, the positive child and the contradictory child; there may also be another inconsistent child such as the uncertain antecedent. All pathways commence at the same root node, but may end at contradictory leaves, as different conclusions. Many leaves may in fact be of only a few types.

Contradictory and inconsistent antecedents of the eGanges River are conditions that directly or indirectly lead to a failure of the Final result of the positive River; an indirect failure has to be directed by the burden of proof rules, which deal with who has the burden of proof and the standard of proof. For a tree, multiple root nodes may be used to represent the overlapping pathways to contradictory or inconsistent conclusions in the ontology of legal possibilities. Such trees are illustrated by Winston (1984, p.152), as shown in Figure 4.22. Complexity of trees might be further increased if there is more than one root node; Winston's trees are turned ninety degrees sideways, so that they are read from the left or from the right rather than from the top down or from the bottom up. Both root nodes may direct backward and forward chaining.

Figure 4.22: Multiple root node trees - P.H. Winston (1984): Artificial intelligence, 2nd ed., Addison-Wesley Publishing Co Inc, London, England, p.152

For Star dimensions, five root nodes would be required, and the tree pathways would require duplication to manage the combinatorial explosion; the Star is an isomorphic representation and requires no duplication of pathways, so that programming is kept simple and minimal. In his illustration, Winston compares trees where the root node is the goal and the leaves are the initial states (for forward chaining, as in a River) with trees where the root node is the initial state and the leaves are the possible goals (for backward chaining, as in a taxonomy tree). Duplication may produce a repetition of nodes that would require careful programming to keep separate the logical places of each. A duplicated node is not to be confused with a neutral node; it would require the development of a system of pro tem neutral status to effect processing of combinatorial explosion.

A tree that represents both backward and forward chaining should be such that the root node, as the goal, can also be the initial state, and the leaves that are goals can also be the initial states. Since a forward chaining tree has many leaves and thus many alternative initial states, it requires a program epistemology that determines which of the possible alternatives will be adopted as the actual initial state from which the search will commence. An eGanges River could begin with a fan of mainstreams, any one of which may be selected by the user as the initial state; however, usually a system of rules of law has only one mainstream, so that parties may begin at its top antecedent to manage exploration of detail upstream from each node on the mainstream.

The problem of multiple initial states is limited in eGanges Rivers by the mainstream or mainstream rule ordering of antecedents with secondary, tertiary, etc. rules. Where there is a fan of mainstreams, as in the Spam application, the Initial map of which is shown in Figure 3.9, alternative initial states are shown in a confined and manageable way. Where there are too many initial states due to multiple mainstreams in a large scale River system, then they may be given a contrived classification to reduce the number that need to be shown at once; this is a simple matter for an eGanges River, but it might be more difficult in a decision tree which fully particularises the disjunctive logic, especially if the decision tree has been created ad hoc to the requirements of a chaotic or logically inefficient epistemology, sometimes given the guise of pragmatism.

Programs such as the commercial shell Ruleburst (2005), which converts formalised rules, provided ad hoc in pseudo-code suited to automated computational parsing, into ad hoc tree flowcharts, rely on the loose epistemology of decision trees and goal-constrained flowcharts. Ruleburst applications might be evaluated against the epistemological specification of eGanges; it is outside the scope of this thesis to do this.

It might be possible to flatten out the Star in the computational epistemology of 3d legal logic as a decision tree, perhaps as a triple root node system where the root nodes are the alternative Final consequents of negative, positive and uncertain, in the system of rules; the study of a possible method of doing so is not within this thesis. A flat picture of the Star is shown as an adversarial fishbone in Figure 3.16. The adversarial fishbone preserves the five possible Final results as necessary to the complete set of possible valid legal arguments in the system of rules; however, it does so at the expense of the more significant nested complexity of the rules and their River logic. River hierarchies, together with the triad of Rivers, ipso facto require notional 3d logic space.

A triple root node tree, as a Star representation, would be unnecessarily large, as it would articulate the combinatorial explosion of all possible cases that are provided for as pathways through the tree; however, the logical completeness of rules enables the judiciary to decide new cases consistently, with minimal inventing of new laws. This is not to say that the judiciary never makes new antecedents, new consequents or new rules, as new rules may be required where the existing logical synthesis does not cover a new case. The rules are said to have run out. The synthesis of existing rules in an eGanges River, that shows the limits of the positive system of rules, also shows how the system might be expanded or modified to accommodate the new parts of a new case, requiring reasoning with parts of premises, or some of the premises, in an extended deductive pathway through the River. A limited positive River is more manageable than a flattened Star with squashed up links; it allows ease of access to nested maps, as deeply as the rule system requires.

An eGanges River is the equatorial River in the Star or Sphere of the computational epistemology of 3d legal logic; the extraction of this equatorial cross-section simplifies the representation for interactive visualisation, and is validated by processing heuristics that implement the remainder of the Tropic and Pole rules that are not visualised, namely the contradictory (negative) and inconsistent (uncertain) rules, as well as the consistent disjunctions of the equatorial River, including neutral disjunctions.

Winston's trees in Figure 4.22 may be processed by backward chaining, which commences at a selected root node, as initial state, and progresses by a choice from the alternative pathways to that node from the next nodes in the hierarchy of disjunctions. This might be appropriate for some taxonomies of decisionmaking, but if this were a representation of rules, like a River, it would move from the Final consequent to a choice of the last antecedents on the mainstream rules; this would be inappropriate for the legal domain. It would be like asking if there were an acceptance of an offer before establishing that there was an offer; in the legal domain epistemology, a proper basis must be laid for questions that are put to a witness. The isomorphism of the extended deductive rules of a River takes care of these problems of meaning.

Nor is it appropriate to assume that it makes sense to put a collection of leaves to the user to choose from as a starting point. Forward chaining commences at the leaves, which are the initial antecedents of alternative pathways of overlapping rules; a selection from the leaves is required initially. The program must search for all possible Initial leaves, and then present them as the set of alternative nodes in the initial choice set; this may produce disparate choice for the user. If the selection from this initial choice set determines the next set of alternatives in a similar way, the sequence of each choice set may be disparate. In a Porphyry tree hierarchy, a choice of nodes on the same level of the hierarchy may be a meaningful set, but this does not apply in a River. From this set of alternative nodes in the same hierarchical level of a Porphyry tree, a selection must be made; the selected node then determines the next search and selection of a set of nodes from the next hierarchy applicable to the selected node only, from which to make the next choice; and so on. The decision tree could be regarded as a taxonomy of a decisionmaking process, where the alternative pathways identify the next hierarchical level of alternatives. However, in a tree, unless it is expressed as a Popple tree, Figure 1.1, there is nothing to indicate in the hierarchy of disjunctions which nodes are interim consequents of disjunctive rules, and which nodes are antecedents of interim consequents; nor is it clear what the indirect consequents are and at what point in a consultation the Final positive consequent fails. eGanges Rivers are designed to emphasise interim consequents and their direct and indirect effects; this suits the legal domain epistemology.

Porphyry's tree permits the reasoning of predicate logic, whereas a decision tree may permit the reasoning of propositional logic. Backward and forward chaining applies to both. As shown, the deductive syllogism may be expressed in terms of predicate logic or propositional logic. Whereas the tribar of a conditional proposition permits valid reasoning backwards and forwards, the Porphyry tree permits valid deductive reasoning from the general to the particular but not valid deductive reasoning from the particular to the general. It cannot be deduced that if Socrates is a man and is mortal, that therefore all men are mortal. The Major premise that all men are mortal may have been established by a study of all men that rests on full and complete examination, or on some statistically valid sample of men; this is a process of inductive reasoning that rests on existence. However, once the Major premise is established, then it applies deductively, as a matter of necessary inference.
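The asymmetry can be made concrete with a small illustration (the encoding is assumed for this sketch only): the universal Major premise licenses deduction from the general to the particular, but a particular instance does not establish the universal.

```python
men = {"Socrates", "Plato"}
mortals = set()

def apply_major_premise(individual):
    """Major premise 'all men are mortal': a necessary inference downstream."""
    if individual in men:
        mortals.add(individual)

apply_major_premise("Socrates")
assert "Socrates" in mortals     # general -> particular: valid deduction
# The converse does not follow: Plato's mortality is not thereby
# established merely because one man has been shown mortal.
assert "Plato" not in mortals
```

Only an inductive survey of men, or a further application of the Major premise, would establish Plato's mortality; the sketch never reasons from the particular back to the general.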

In an eGanges application, logical processing may commence anywhere that the user chooses. The River indicates where the logic flow originates and the user may choose to follow these indications; a user may choose to chain, but not inference, backwards or forwards. In the eGanges shell epistemology, the navigation of the River is distinct from the logic flow.

The River indicates a starting point by mainstream differentiation, which can manage large scale complexity through its nested sub-maps. A left-right, or top-down ordering of a nested tree may introduce similar tree nesting, but the rationale for nesting would have to be established; in eGanges, inferencing may transit sub-mapping. Search for a valid path through a tree would have to be managed as the search of all valid arguments in the Sphere of possible valid arguments.

As each user answer to a question in the Questions window is given on a consultation of an eGanges application, establishing a node, the node in the eGanges River changes colour to indicate the logical significance of the user input. A negative answer is a failure of an antecedent that is required to establish the Final result of the prima facie positive River; except where there are consistent alternatives in a fan, all River nodes must be established for the positive case to win. When all the alternatives in a positive fan are exhausted, their total failure will produce a Pole Final result.

In the eGanges shell epistemology, pro tem controls on inferencing give effect to positive disjunctions. eGanges provides these controls in its adversarial feedback windows, by noting negative and uncertain nodes from a positive fan as such, in the positive adversarial window, and retaining them in that window until the alternatives of the positive disjunction are exhausted. Once the positive fan has completely failed with negative and/or uncertain answers, pro tem negative and uncertain fan nodes that complete the necessary and sufficient conditions required for the negative or uncertain Final result, respectively, are then transferred to the appropriate adversarial window, where the Final result label will also be listed. eGanges heuristics work to the requirements of the positive River; a flattened Star does not necessarily have this orientation and may require a refinement of the pro tem heuristics that withhold the application of the rules in the Northern and Southern hemispheres, pending the completion of positive disjunction processing.
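The pro tem fan control just described can be sketched as follows. This is a hedged illustration with assumed names, window labels, and an assumed precedence rule for mixed failures (any uncertainty leaves the Final result uncertain), not the shell's actual code:

```python
def resolve_fan(answers):
    """answers: alternative node -> 'yes' | 'no' | 'uncertain' | None (unanswered)."""
    windows = {"positive": [], "negative": [], "uncertain": []}
    if any(v == "yes" for v in answers.values()):
        # One consistent alternative suffices to establish the fan.
        return "established", windows
    if any(v is None for v in answers.values()):
        # Fan not yet exhausted: hold failed alternatives pro tem
        # in the positive adversarial window.
        windows["positive"] = [n for n, v in answers.items() if v in ("no", "uncertain")]
        return "pending", windows
    # Fan exhausted: transfer nodes to the appropriate adversarial window.
    for n, v in answers.items():
        windows[v if v == "uncertain" else "negative"].append(n)
    # Assumed precedence: any remaining uncertainty yields an uncertain result.
    result = "uncertain" if "uncertain" in answers.values() else "negative"
    return result, windows

assert resolve_fan({"x": "no", "y": "yes"})[0] == "established"
status, w = resolve_fan({"x": "no", "y": None})
assert status == "pending" and w["positive"] == ["x"]   # held pro tem
assert resolve_fan({"x": "no", "y": "no"})[0] == "negative"
```

The point of the sketch is the middle branch: a failed alternative is noted but does not trigger a Pole result until every alternative in the positive disjunction has been answered.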

A River node, like each node in a tree, requires a decision to be made; in the eGanges shell epistemology, some of these decisions may be automated. If antecedents are satisfied, an interim consequent node is automatically established; this may produce further automated inferencing, if other antecedents are thereby satisfied.

There may be an evaluation of each form of representation as a program design aid, including as a design aid for an interactive visualisation, and as a programming aid. The tree may be used to represent different knowledge schemes. A River may be regarded as a particular form of tree, and is most closely related to an and/or tree; however, the River is isomorphic to extended deductive rules and has the constraints of the computational epistemology of 3d legal logic. For this thesis, there has been no comprehensive study of the transformation of Rivers to trees or vice versa, and no systematisation of any such transformation.

The eGanges River represents the overlapping alternative sets of necessary and sufficient conditions which will establish the positive result; it also represents the system of conditional propositions that are the Major deductive premises available for an extended deductive argument that establishes the Final result via the hierarchy of interim consequents.

The eGanges River graphic allows for quick changes, and for angling that accommodates labelling of nodes. The hierarchical regimentation of trees does not allow space for natural language labelling. Labelling might not be crucial to the logic processing, although the unique labels for each node are processed to effect logic; but labelling is crucial to a communication system involving complex logic.

A River may be extrapolated from a decision tree by a process of deconstruction. In the specific meta-epistemological methodology, trees may be useful also in the prior analytics that permit formulation of a River; they may be regarded as a free formulation, to be followed by a logically constrained reconfiguration. Derrida (1988, p.21) defined deconstruction as follows:

Deconstruction does not consist in moving from one concept to another, but in reversing and displacing a conceptual order.

The transposition of the tree in Figure 4.18 to the River in Figure 4.20, is largely a reversal and displacing of conceptual order; care must be taken to comply with the constraints of the computational epistemology of 3d legal logic in the transposition of a tree to a River. A tree provides for the selection of one pathway, whereas an eGanges River envisages that all pathways in the tributary structure will have to be established, unless there are disjunctions providing alternatives. To extrapolate a River from a tree, the decision tree pathways are harvested for antecedents that are also interim consequents for the purpose of extended deduction; these interim consequents provide the hierarchy for the River.

Sometimes, in prior analytics, the Final result of an eGanges River may be selected from the leaves of a tree by identifying one type of outcome from the various leaves; instead of numerous leaves of the same type, the leaves of this type may be compressed as one Final River node. All the antecedents which are not consistent with this outcome are excised. The River distinguishes and excludes rules that lead to outcomes other than the Final result of the River; only disjunctions that have alternatives all of which are consistent with the Final result of the River are included in the River. Thus, a litigation team relying on a tree plan, may focus on the outcome sought by the litigation. eGanges, in its processing, will deal with possible alternative cases for the opponent, and warn of these in the adversarial windows, the colouring of nodes in the River, and the Current result that is available at all times in a consultation.
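The harvesting step in this prior analytics can be sketched minimally. The sketch assumes a decision tree given as root-to-leaf paths ending in an outcome label; this encoding and the example nodes are illustrative only, since the thesis does not systematise the tree-to-River transformation:

```python
def extrapolate_river(paths, final_result):
    """Keep only paths ending in the chosen Final result and pool their
    antecedents; antecedents on excised paths only are excluded."""
    consistent = [p for p in paths if p[-1] == final_result]
    antecedents = set()
    for p in consistent:
        antecedents.update(p[:-1])
    return antecedents

# Hypothetical decision tree paths for a contract outcome.
paths = [
    ["offer", "acceptance", "consideration", "contract"],
    ["offer", "no acceptance", "no contract"],
    ["no offer", "no contract"],
]
river = extrapolate_river(paths, "contract")
assert river == {"offer", "acceptance", "consideration"}
assert "no offer" not in river       # antecedents inconsistent with the outcome are excised
```

The numerous leaves of one type are thereby compressed into a single Final node, and only antecedents consistent with that outcome survive into the River.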

Requirements of conjunction in the single streams of the River hierarchy are clearly distinguished from the choice of fan streams that represent consistent disjunction; the eGanges River is based on a confluence of differences for a common purpose, rather than comprehensive disjunction without strategic discernment of differences. All the antecedents in a River must be established to reach the Final result, unless there are alternatives in consistent disjunctions. In a decision tree, only one path may be selected and it is difficult to evaluate the significance of alternative paths in terms of consistent and contradictory alternatives.

The River is a taxonomy of decisions in a rule system, whereas the decision tree is a taxonomy of decisionmaking that may be concerned with rules. The River has one-directional flow which can be automated as extended deduction. The pathways of a decision tree may be treated as 'if and only if' pathways, so that both chaining forward to the root node and chaining backward from the root node will be valid inferencing; a decision tree may have dual direction inferencing. This must be considered in any deconstruction. In legal domain epistemology, only one-directional flow is permitted; evidence is not assumed if a general legal concept is admitted. eGanges does not permit upstream assumptions to be made, but it will undo downstream admissions if they are inconsistent with nodes subsequently established upstream.

Interim consequents in the River taxonomy of a system of rules, are isolated for detailing adjustments upstream. Susskind (1987, p.146) noted the phenomenon of overlapping antecedent and interim consequent nodes, although he did not explore these as a source of an isomorphic formalised rule hierarchy or heuristics for an epistemologically sound legal expert system; he had limited programming support for any design. Capper and Susskind used the large decision tree to represent a taxonomy of legal decisionmaking prescribed by the latent damage law expert.

In regard to legal heuristics, Susskind regarded these as part of the expert knowledge to be acquired and provided for in a legal expert system. Latent Damage Law was knowledge engineered by a process of interview of the latent damage law expert, as recommended in the teaching of knowledge engineering skills (cf. Teknowledge's 1980s knowledge engineering methodology course). Also, Susskind regarded the propositions of law as a potential source for deriving this heuristic knowledge.

Although Susskind did not pursue the logic system of extended deduction implicit in the hierarchical duality of the antecedent/interim consequent into the adversarial dimensions of legal reasoning, his trees represented rules of law that were the Major premises of deduction, which would also function as the Minor premises of Horn clauses in an adversarial context; the facts of a case would establish the 'truth' of antecedents in the rules of law, and thus the applicability of the rule. A pathway through the tree would arrive at the consequent leaf. All antecedents, including the interim consequents, were posed in the tree with alternative values for the establishment of Minor premises. However, without further meta-rules and their heuristic implementation in the processing, only one value would be consistent with the one value of the Final consequent. Susskind did not consider that rules of law were ipso facto presumed to be true, due to the exercise of authorised power by which they were created, and that there is a set for each opponent in litigation. A shell design should make transparent which set of rules is available for each opponent, and how various meta-rules apply to available Minor premises of a case in a consultation. eGanges meets this standard in its communication system by showing the positive Rivers, showing the adversarial consequences of alternative answers, and reporting lists of antecedents established for each opponent in the adversarial windows. In the eGanges shell epistemology, the River ideograph is used interactively to establish the user's antecedents, and interim consequents that are processed according to the flow heuristics of the Star in the computational epistemology of 3d legal logic.

If a decision tree, such as the Latent Damage Law decision tree, includes antecedents from all rules of contradictory and inconsistent hierarchies, this literal expression expands the data of a large scale system unnecessarily. eGanges requires labels for each alternative at each choice point in its prima facie River, with its positive outcome consistency, and consistent, as distinct from contradictory and uncertain, disjunctions. The hierarchy of rules permits automated extended deduction according to input answers. The labelling of possible answers, with their Final result consistency or inconsistency, distinguishing any inconsistency of uncertainty and any negative contradictory, permits both freedom of natural language in the formulation of a question, and transparency. The three possible positive answers provide for a lack of inconsistency or contradiction for neutral nodes, where every possible answer is consistent with the positive Final result. Neutral answers are also consistent with the contradictory and inconsistent Final results of the unshown adversarial trees; the parts of these trees that are relevant to the user's case in a consultation will appear, as River node labels sorted according to the answer's outcome consistency (positive, negative or uncertain), in the adversarial windows that represent the three dimensionality of the logic. Whatever node labels appear in the adversarial windows indicate the Minor premises of the user's case. For example, if a label appears in the Negative window, it indicates the relevant negative antecedent of the negative Major premise, as well as the Minor premise of the user's case.

A decision tree that includes contradiction or inconsistency inevitably uses a hierarchy of fans; it is a hierarchy based on outcome disjunction, so that there is no distinction at each choice point between alternatives in the fan that are contradictory, inconsistent or consistent with the Final result sought by the user. In a large scale tree, the Final result at the leaf may be a long way off and difficult to find through the disjunctive fan hierarchy. A decision tree is a taxonomy that maps alternative possible pathways to their Final outcome. In dealing with combinatorial explosion, a decision tree might have various alternative pathways that have some common and some different nodes; it might be necessary to repeat part of a pathway in a tree, as also part of another pathway in the same tree. Outcomes might also be duplicated many times. Programs have to manage such repetition. The eGanges River precludes duplication and has a clear single outcome; however, it also shows where this outcome fails at each point of answer input.

The River prescribes constraints in deconstruction of a tree to form a River. A program that uses an and/or tree must provide for the order in which nodes in an arc are treated. The legal domain requires some ordering of the antecedents in a rule; for example some temporal ordering may be required. In eGanges, antecedents in a rule stream are given an order by virtue of the linear sequence of conjunctions and the flow direction of Rivers.

In the legal domain, antecedents in a rule of law must be established severally, by reference to the facts of a case. Therefore it is appropriate to include in a River node only the deductive antecedent; such an antecedent may be an express or implied node. An antecedent may also be an interim consequent of a rule upstream from it, and the question that establishes a node establishes it as both an antecedent and an interim consequent. However, if upstream antecedents of the interim consequent undo an interim consequent, they also undo it as an antecedent in the lower stream, with all the consequent inferencing changes that this may entail.

Depending on the decisionmaking ontology and epistemology, the root node in a tree may be the equivalent of the first antecedent in a River mainstream, and the leaves of the tree may be the outcomes, some of which are the Final result in a River, and all of which are one of the five possible Final results in a Star or Sphere. Where the tree is not isomorphic to the rules, it must be construed as rules in the extended deductive structure of the River. The effect of a selection of a node from a tree must be followed through to the leaves with which it is consistent; in a large scale system, this might be a set of very long alternative paths, some concluding with one outcome and some concluding with other outcomes. Legal domain epistemology requires a knowledge of the outcome value of rule antecedents at the point when they are raised for consideration; this is provided in the eGanges shell epistemology by so labelling the possible user answer input. The partially duplicated pathways of a decision tree hierarchy, and fully duplicated outcomes in the leaves, may flatten all the possible pathways of the Star into one plane, but they do not fully reconcile or coherently manage the overlapping of pathways and backtracking required. Combinatorial explosion is reined in to the synergy of the positive eGanges River, which also maps backtracking required for the collection of input in a particular consultation so that a Final result can be given as feedback; this also clarifies the heuristics of managing a consultation cache that, for trees, collects input by recursive search to make sure the input has not already been given.
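The consultation cache heuristic mentioned above can be sketched simply. The class and function names are assumed for illustration (not the eGanges or any tree program's actual code): before a question is put, the cache is checked, so input already collected on one pathway is not requested again when another pathway reaches the same node:

```python
class ConsultationCache:
    def __init__(self):
        self._answers = {}

    def ask(self, node, oracle):
        """Ask the user (oracle) only if this node has not already been answered."""
        if node not in self._answers:
            self._answers[node] = oracle(node)
        return self._answers[node]

calls = []
def oracle(node):
    # Stands in for putting a question to the user.
    calls.append(node)
    return "yes"

cache = ConsultationCache()
cache.ask("offer", oracle)
cache.ask("offer", oracle)     # reached again via another pathway: served from cache
assert calls == ["offer"]      # the question was put only once
```

For partially duplicated tree pathways, this check is what the recursive search over collected input amounts to; the River's preclusion of duplication makes the check trivial.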

In large scale systems, the removal of the logic hierarchy of interim consequents, pragmatic duplication of pathways, and total severalisation of pathway outcomes as leaves, make it difficult to see where the consistent and contradictory disjunctions provide choices in the whole system. In the eGanges shell epistemology, the River contains choices in pathways, all of which consistently lead to the positive result. The choice of answers to establish a River pathway indicates the answer that stays on the positive path and the contradictory (negative) and inconsistent (uncertain) pathways that will defeat the positive path, unless there are other positive fan alternatives not yet exhausted. Legal choice is the measure of freedom in social organisation where freedom, inherently in social relationships, must be relative, and at best optimal; the legal expert system user is concerned with the scope of freedom and constraints that apply to the user's situation, and this must be transparent in a legal expert system. A user cannot learn from mistakes without this transparency; a user cannot make informed decisions without all the relevant information.

For the purposes of large scale systems, the graphical specification of all possible partial and wholly contradictory or inconsistent pathways unnecessarily increases the size of the instructional ideograph; the Star ideograph in the computational epistemology of 3d legal logic shows that most of the rules are in the hemispheres, not the equator. There are fewer positive rules than negative rules or uncertain rules. When the positive nodes bring with them, through the interrogation system, the synergy of the Star, then the synergy of contradictory and inconsistent disjunctions is transformed to outcome inconsistency, and is managed in relation to outcome consistent disjunctions by equatorial processing heuristics, as in the design of eGanges. It is not necessary to focus on the whole Sphere of information to process the equatorial plane. The focus is on the positive River, which has the least number of rules. The extrapolation of the equatorial River from the Star and any decision tree confines the focus and the information for the user to a single outcome, and to the disjunctions that are consistent with that outcome. Other information is brought in where it is relevant, on the failure of a positive node. The prima facie equatorial River also acts as a homeostatic control in the shell epistemology, to stabilise transparent processing that might affect the relevant part of the whole of the three dimensional logic in a consultation. Complex choice is managed visually and relevantly, for the user.

Legal knowledge engineering has generally followed the advances and practices as they have been established in the field of artificial intelligence (Gray, 1997). This has led to a Feigenbaum bottleneck in the legal domain because the epistemologies of artificial intelligence have not accommodated the domain epistemology of law soundly, and, to the extent that the legal domain has been accommodated, albeit inadequately, problems remain in the acquisition and representation of legal information for large scale applications. The specific meta-epistemological methodology is developed to redress these problems and provide domain advances and practices in legal knowledge engineering methodology.

Susskind did recognise that available computer tools were limited, given the nature of legal expertise. He did not consider that rules of law are not 'if and only if' propositions. Since the tribar or hook cannot symbolise 'then' in a formalised rule of law, backward and forward chaining does not amount to valid inferencing; prima facie, only forward chaining through a decision tree is valid legal inferencing. The discrepancies between Horn clauses and legal deduction syllogisms did raise the avoidance of rules of law as Major deductive premises; this made easier the assumption that an interim conclusion in a system of rules of law, which is usually an abstract concept, could be pruned by the Ockham razor of Russell and Whitehead's logical atomism. This may expedite the interrogation of a witness and limit the points of law in a legal expert system. A Minor premise that establishes a material fact automatically establishes the antecedent satisfied by the material fact, and this process may ultimately establish interim and then Final consequents; each antecedent must be established directly or indirectly by user input.

However, expert users may establish an interim consequent without factual input for its antecedent(s), if such matters are not in issue. Parties to a legal dispute might agree that any antecedent or interim consequent is not in dispute, and a party may choose not to accept that it is established, to limit or raise issues that are significant; however, the antecedents required to establish the interim consequent are not thereby proved except by agreement. In the eGanges shell epistemology, the Note window, directly below the Questions window, permits the legal knowledge engineer to gloss the question and also permits the user to make notes for the report of the consultation which is available. In the Note window, the user may record that the answer to be given is agreed, with details of the agreement. Details of evidence relied on for the answer may also be recorded. These facts are legal potentialities brought into the reach of the legal possibilities of the rules of law. The eGanges shell epistemology provides for legal potentialities in this way.

Decision trees for backward and forward chaining are limited as representations of legal inference pathways; there is more to inferencing in the legal domain than ordering decisions in the disjunction hierarchies of trees. Trees are not isomorphic structures of extended deductive rules of law, and decisions in the process of legal inferencing are ordered by reference to the epistemological synergy of adversarial extended deductive rules of law. Inferencing in legal reasoning is effected by epistemologically sound decision-making order, and decision tree order is not readily suited to the construction of large-scale, epistemologically sound, transparent legal inferencing.

However, a rule base could be created from a River just as well as from a tree. In 1987, the applicability of the River structure to the legal domain was discovered in the first CLIMS (Contract Law Information Management System) Project, in which a two dimensional River ideograph of a small sample of contract rules of law was used as a legal expert system design, to construct the first CLIMS Pilot (Gray, 1988, 1997). The programming was carried out by Carl Jackson, using a small, public domain shell, ESIE, for his undergraduate computer student project at Charles Sturt University, Bathurst, Australia, to construct a knowledge base (Gray, 1997, pp.230-254). The rule representation of the River ideograph, formalised according to the ESIE syntax requirements, was processed by the shell’s inference engine. The contract law ideograph posed a tributary paradigm as a representation of a system of formalised rules of law, with River flow directions that indicated the reasoning from one antecedent to the next, concluding with a Final result, namely the consequent of the mainstream rule. The contract law River was construed to the requirements of Horn clauses that provided the scheme of the ESIE rule base, through which the inference engine could chain in order to select questions as output and apply the rules to the sequence of the user’s answer input.

If there were a duplication of rules in the ESIE knowledge base, this would be rationalised in the cache that collected and maintained input. It is not good design to ask a user the same question twice, and difficulties might arise if different answers were given each time. The content of the cache was processed before each question was presented to the user, to prevent a duplication of questions. Duplication of rules becomes necessary when disjunction creates a combinatorial explosion of alternative possible pathways to the Final result, or to the possible Final results, each of which carries its own River of alternative pathways. If there were rules in the ESIE rule base that had no overlap or link with other rules, so that they were not incorporated in the available chaining pathways, those rules were never implemented by the inference engine. These problems do not arise in the eGanges program epistemology, as the nodes in the prima facie positive River must have unique labels; however, a label that commences with a blank space is different to a label that does not, so that two distinct labels may appear to be the same.
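The caching discipline described above can be sketched in a few lines of Python. This is a minimal illustration of the mechanism only, with a hypothetical interface and question text; it is not the ESIE code or the eGanges implementation.

```python
# Sketch of an input cache that prevents a user being asked the same
# question twice (hypothetical interface; illustrative only).

class AnswerCache:
    def __init__(self):
        self._answers = {}

    def ask(self, question, get_answer):
        """Return the cached answer if the question was already put;
        otherwise obtain an answer, store it, and return it."""
        if question in self._answers:
            return self._answers[question]   # never re-ask the user
        answer = get_answer(question)
        self._answers[question] = answer
        return answer

cache = AnswerCache()
first = cache.ask("Was an offer made?", lambda q: "yes")
# A later rule needing the same antecedent reuses the stored answer,
# so contradictory answers to a single question cannot arise.
second = cache.ask("Was an offer made?", lambda q: "no")
assert first == second == "yes"
```

Because the cache is consulted before each question is presented, duplicated rules that share an antecedent converge on one stored answer, which is the rationalisation described in the text.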

The River could also be used as a Prolog programming aid; this was demonstrated in the first ESPRIT legal project in 1989, when Tim Flannagan of Machine Intelligence, the company leading the project, used the contract River of the first CLIMS ESIE rule base to produce a more advanced prototype legal expert system, CLIMS Pilot No 2, programmed in PROLOG. However, the simple positive River also indicated that PROLOG programming to capture the extent and complexity of a River hierarchy, in order to give reasons for a conclusion, would not be viable. Reasons for a judgment include all the rules in the case pathway through the River. When CLIMS Pilot No 2 was shown to members of the legal profession at a seminar at St John's College, Cambridge University, in 1989, a preference was expressed by one lawyer for the first CLIMS Pilot, as it contained a trace, albeit a user-unfriendly trace that referred to the list of syntax-coded antecedents that had been established. Following this seminar, the River representation was developed into the visualisation of 3d legal logic as a complete representation of all possible pathways to all possible Final results, with a view to developing a program with a trace that produced this visualisation.

Use of a series of decision trees to develop legal practice games was demonstrated by Baird, Gertner and Picker (1994). The practice games used game theory to assess the wins and losses of each party that would result from alternative ways of proceeding through a legal transaction or settlement negotiation. Game theory was used in this way to develop legal strategies by reference to the rules of law and an ontology of legal potentialities. Figures 4.23 to 4.29 are examples of the trees devised by Baird, Gertner and Picker; they show how legal practice problems can be broken down into sub-problems for reconstruction as larger, more encompassing representations. Advice of this nature is given to a litigation client in terms of how much will be won with the estimated percentage chance of winning, against how much will be lost with the estimated percentage chance of losing.

Figure 4.23: Baird, D.G., Gertner, R.H. and Picker, R.C.: Figure 2.1 Game theory and the law, Harvard University Press, Cambridge, MA, USA, 1994, p.52.

Figure 4.24: Baird, D.G., Gertner, R.H. and Picker, R.C.: Figure 2.2a Game theory and the law, Harvard University Press, Cambridge, MA, USA, 1994, p.54.

Figure 4.25: Baird, D.G., Gertner, R.H. and Picker, R.C.: Figure 2.2b Game theory and the law, Harvard University Press, Cambridge, MA, USA, 1994, p.55.

Figure 4.26: Baird, D.G., Gertner, R.H. and Picker, R.C.: Figure 2.3 Game theory and the law, Harvard University Press, Cambridge, MA, USA, 1994, p.56.

Figure 4.27: Baird, D.G., Gertner, R.H. and Picker, R.C.: Figure 2.7 Game theory and the law, Harvard University Press, Cambridge, MA, USA, 1994, p.64.

Figure 4.28: Baird, D.G., Gertner, R.H. and Picker, R.C.: Figure 5.1 Game theory and the law, Harvard University Press, Cambridge, MA, USA, 1994, p.161.

Figure 4.29: Baird, D.G., Gertner, R.H. and Picker, R.C.: Figure 5.2 Game theory and the law, Harvard University Press, Cambridge, MA, USA, 1994, p.163.

In a Porphyry tree, there is no provision for the Negative and Uncertain Poles, the fins of the adversarial fishbone, or for the continuous connection between the negative monads, if any such connection could be shown. A tree compression of adversarial pathways in two dimensions does not readily accommodate the epistemological synergy of the logic of justice and the relative freedom and constraints of the rules of law. The game trees show not relative freedoms and constraints in legal paths, but relative gains and losses. A user may treat eGanges as a game for learning the law, how it is applied, and the synergy of justice; this learning could occur by systematically exploring the River and testing the effects of variations in answers.

More recently, a tool has been produced to allow construction of argument graphics of a specific kind. Reed and Walton produced the public domain program Araucaria (2001-2003) (www.computing.dundee.ac.uk/staff/creed/araucaria). Epistemological chaining representations are the basis for determining a flow of reasoning or proceeding, in discourse or action. Araucaria was constructed to assist argument mapping, rather than logic processing by reference to an argument map.

Although decision trees are modelled on the paradigm of taxonomy trees, unlike the original Porphyry tree, they do not limit the repetition of nodes. Taxonomy trees are classification systems that are comprehensive. If a decision tree that provides for all possible pathways is a taxonomy of a decision, showing the totality of available decision options, then it has a problem in managing large-scale possibility in a user-friendly way, suited to the limits of ordinary human cognition; paradoxically, the decision tree is founded on disjunction, yet manageable choices are not shown clearly in relation to each other when nodes are extensively duplicated in various ways.

In a decision tree, the hierarchy of a tree structure replaces the hierarchy of extended deductive Major premises of a rule system. The pathways of a decision tree are linear, irrespective of interim conclusions. This conceptual order is displaced in the River paradigm by reversing the basis of disjunction upon which the decision tree rests. A River represents a hierarchy of confluence; it is based on conjunction rather than disjunction. Repetition of nodes is not permitted, so the stable position of every node allows an understanding of all the nodes relative to each other. Disjunction is confined to where it occurs in the pattern of confluence. All streams must be satisfied to establish the Final outcome unless disjunction permits alternative ways of doing so.

Decision trees provide continuity for the purposes of backward and forward chaining. Each pathway produces its outcome if and only if each step in the pathway is established. The pathway could be treated as a rule with the logical character of a hook before the conclusion: a ⊃ b. Thus the chaining is valid reasoning whether backward or forward. However, rules of law are not hook propositions; extended deductive rules have interim consequents. In a tree, negation of a node on a pathway is failure of that pathway, and the user is moved to a pathway where the negated node is not required; backtracking along that pathway is required to check its consistency with the cache information. When a consistent pathway is found, processing may proceed. Neutral nodes are iterated in a tree as another disjunction.
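The one-way character of extended deductive rules can be sketched in a minimal forward-chaining routine. The rule content below is hypothetical and purely illustrative; the point is that a consequent fires only when its antecedents are established, and establishing a consequent does not license inference back to its antecedents.

```python
# A minimal forward-chaining sketch over Horn-style rules (hypothetical
# rule content; illustrates the mechanism only, not any real rule of law).
# Each rule is (set_of_antecedents, consequent): the consequent is
# established only once every antecedent is already established.

def forward_chain(rules, facts):
    """Repeatedly fire rules whose antecedents are all established,
    until no further consequents can be added."""
    established = set(facts)
    changed = True
    while changed:
        changed = False
        for antecedents, consequent in rules:
            if consequent not in established and antecedents <= established:
                established.add(consequent)   # interim or Final consequent
                changed = True
    return established

rules = [
    ({"offer", "acceptance"}, "agreement"),        # interim consequent
    ({"agreement", "consideration"}, "contract"),  # Final consequent
]
facts = {"offer", "acceptance", "consideration"}
result = forward_chain(rules, facts)
# The rule is a one-way conditional, not a biconditional: "agreement"
# being established does not entail "offer" in reverse.
```

Backward chaining over such rules is a search for antecedents still needing proof, not a valid inference from consequent to antecedent, which is the distinction the text draws between the hook proposition a ⊃ b and an extended deductive rule of law.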

The eGanges River streamlines legal expertise so that large-scale choice in the law is manageable in several ways. Firstly, the deductive rules of law are selected out of the mixed logic legal information for epistemological mapping. These maps provide the location of relevant induction and abduction. Legal opinion is minimised by the deductive River, as it is isomorphic to the rules of law or procedure; any difference of opinion about what the rules or procedures are is accommodated as a parallel map gloss for an alternative authority. Secondly, the River is a reconciled ideograph of all pathways that lead to a common outcome. In order to effect this synthesis, repeated nodes are removed and the alternative outcome leaves are reduced to a single Final consequent. Contradictory nodes and outcomes are removed for heuristic processing. Consistent disjunctions are retained, whether they are mutually exclusive or not. Consistent disjunctions are shown as alternative ways of establishing the same consequent, be it an interim consequent or a Final consequent. Rivers comply with the epistemology of conditional propositions, namely that only one consequent is permitted in a rule; there may be many antecedents in conjunction or disjunction, but there can be only one consequent in a rule. Disjunctions may produce several rules that share the same consequent. Disjunctive antecedents in a fan of rules that share the same consequent are all consistent with the consequent but, if they are mutually exclusive, they are not consistent with each other; although one must be chosen to establish the consequent, both cannot be chosen as simultaneous selections. The River epistemology makes the adjustments necessary for a coherent synthesis with a streamlined practical orientation; the program epistemology inherits these adjustments.
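The single-consequent constraint, and the treatment of a fan as several rules sharing one consequent, can be sketched as a small data structure. The node labels below are hypothetical and the structure is illustrative only; it is not the eGanges internal representation.

```python
# Sketch of the River constraint that a rule has exactly one consequent:
# a disjunctive "fan" becomes several rules sharing that consequent.
# (Hypothetical node labels; structure only.)

river = {
    # consequent: list of alternative antecedent sets (a fan if len > 1)
    "agreement": [{"offer", "acceptance"}],
    "contract":  [{"agreement", "consideration"},
                  {"agreement", "deed"}],     # fan: two ways to establish
}

def established(node, facts, river):
    """A node holds if it is a given fact, or if some fan alternative
    has all of its antecedents established: conjunction within each
    alternative, disjunction across alternatives."""
    if node in facts:
        return True
    for antecedents in river.get(node, []):
        if all(established(a, facts, river) for a in antecedents):
            return True
    return False

# The deed alternative of the fan suffices even without consideration.
assert established("contract", {"offer", "acceptance", "deed"}, river)
```

Each consequent appears exactly once as a key, which mirrors the uniqueness of node labels in the River, while the list of antecedent sets carries the consistent disjunctions.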

The Star is an alternative representation to a tree; it could be regarded as a transformation of a decision tree that sets out all possible pathways to all possible Final results of a system of rules. Thus, the River system is extracted from a tree; the prima facie positive River extracts only the branches that are required to arrive at one possible Final result. Other branches, which obfuscate the choices available to reach the Positive result, are excluded. Moreover, these branches are reconciled so that all the nodes are unique; to do this, the tributary structure of the River requires satisfaction of all its hierarchy, except where there are alternative ways of establishing a consequent in the system of rules.

Chaining through trees may be a matter of navigation rather than reasoning. A River has a clear epistemology. The direction of the arrow shows the inference pathways, but in the eGanges design, free navigation is possible, independently of the interrogation system and independently of the inference flow of chaining. The extended deductive premises indicate that the inference flow is valid, but there is no hook or tribar to permit valid inference either way. Establishing an interim consequent does not establish as a matter of necessity any antecedent that precedes that interim consequent in the flow. However, if an answer that establishes an interim consequent is changed, then the program dynamics restores to an unanswered state the antecedents that precede it.

The one common Final result that captures the alternative pathways for arriving at the Final result of the River permits a strategic focus that is especially important in managing litigation. Lawyers always consider the opponent's case, but their focus is on their client's case. The meta-rule that the opponent wins if only one positive node fails, unless there is a fan, largely incorporates the opponent's case in the client focus. If and only if all positive nodes are established, except for disjunction choices and neutral nodes, then the Final positive result is attained. The River discards the use of an arc to represent conjunction, in order to give the ideograph a differentiation between conjunction and disjunction; this produces different lengths of conjunction pathways, in terms of number of nodes, so that the graphic can take on individual characteristics with mnemonic features. Cognitive art is given a greater scope. The River becomes an object of reified adversarial logic. Corresponding Rivers for opponents may be prepared, or a River may be processed according to heuristics that incorporate the opponent's Rivers, to identify weak points in a client's case.
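The adversarial meta-rule, that the opponent prevails if any required positive node fails with no remaining fan alternative, might be sketched as follows. The labels and the answers mapping are hypothetical; this is an illustration of the meta-rule, not the eGanges heuristics themselves.

```python
# Sketch of the adversarial meta-rule: the negative case prevails if any
# required positive node fails and no fan alternative remains.
# (Hypothetical labels; answers map each node to "yes"/"no"/"uncertain".)

def positive_result(required, fans, answers):
    """required: nodes the positive case must establish.
    fans: node -> list of alternative node lists; one alternative
    with all 'yes' answers suffices to establish the fan node."""
    failed = []
    for node in required:
        alternatives = fans.get(node, [[node]])   # plain node: itself
        if not any(all(answers.get(n) == "yes" for n in alt)
                   for alt in alternatives):
            failed.append(node)   # a weak point in the client's case
    return ("positive" if not failed else "defeated", failed)

answers = {"offer": "yes", "acceptance": "yes",
           "consideration": "no", "deed": "yes"}
fans = {"consideration or deed": [["consideration"], ["deed"]]}
required = ["offer", "acceptance", "consideration or deed"]
outcome, weak_points = positive_result(required, fans, answers)
```

The returned list of failed nodes corresponds to the weak points in a client's case that the opponent-incorporating heuristics are intended to identify.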

The River visualisation is a departure from the common practice in knowledge engineering, established during the 1980s, of using decision trees in the knowledge acquisition and representation stages of knowledge engineering. The River ideograph may be viewed as an extraction from a certain decision tree, by way of a deconstruction and reconfiguration of a selected part of a comprehensive decision tree that sets out all possible pathways to all possible Final results. It sets out the pathways severally, so that, like a tree, only one pathway has to be taken, with contradictory as well as consistent disjunctions provided at each decision point. In a tree it is difficult at any point to see which are the contradictory disjunctions and which are the consistent disjunctions for the user's purposes; a selected prima facie positive River makes this plain and user friendly, for both opponents. The opponent with the negative case must see that there are many opportunities to win, whereas the opponent with the positive case can stay focused on the task of establishing all points necessary to win.

In the twentieth century, in the new field of artificial intelligence, more complex tree search strategies were developed to find classification and problem solutions. These solution search heuristics might proceed backward from the root node, or forward to the root node at some depth, or sideways across the breadth of the tree. Chaining through taxonomies is a knowledge search process that might simulate the backward and forward reasoning described by Ritchie (1923, p.10), following the establishment in science of the logical atomism of Russell and Whitehead (1910), as follows:

Logicians, of whom Mathematicians are a species, have started by considering the formal relations of familiar propositions which are considered to be true as a matter of fact, for instance, the propositions that two and two make four and that things which are equal to the same thing are equal to one another. These propositions are always asserted as describing relations believed to be true as a matter of fact, and to be in effect assertions of laws holding between the things of the external world.

That some people have also said they were necessary or self-evident, or implied by the laws of thought is beside the point. From such starting points the logician has worked both backwards and forwards, forwards to find what other propositions can be inferred and backwards to the propositions from which they can be inferred. The reason for working backwards is to obtain a coherent system of propositions beginning with a finite number of axioms which are not capable of being inferred from anything else and which are taken as primitive. Definitions are further required in the course of the process, a definition, of course, is not a proposition as it is neither true nor false. It is an act of will and the logician has theoretically a free choice in definition, as he chooses to define his terms so his system of propositions will develop. Actually, of course, he is very largely restricted by the facts of the external world and the limitation of his imagination.

However, search chaining must be distinguished from reasoning flow. The user must signify which is required in order to drive the chaining. In the eGanges program epistemology, search chaining may be purely navigational, according to a random exploration of the River; or it may be inferencing according to the flow of the River system. Chaining can only claim to be logic processing if it follows logic links. However, in the eGanges program epistemology, it is possible for navigational processing also to drive logic processing, when input is provided in the course of navigation.

The structure in Figures 4.27 and 4.28 can be compared to that part of the adversarial fishbone structure of Figure 3.16 along the central line of positive monads, each of which is linked to its contradictory opposite in the negative line. Porphyry's tree does not show any links between the contradictory opposites to establish a wholly negative result; the taxonomy is not a complete system of adversarial logic. The strength of a negative case can be determined by how many negative points are established.

A negative Pole could not be added to the Porphyry tree, even if ‘substance’ were the Final consequent; no contradictory of substance, i.e. ‘no substance’, is encompassed in the taxonomy. If substance were not all-inclusive, the simple Porphyry tree could be a positive River, similar in structure to the River in Figure 3.1. The River might be read as a set of Major deductive premises in the form of rules, or as a set of Horn clauses to be used as the Minor premises for an extended deductive argument that assumes its Major premises.

The logical completeness of adversarial propositional deduction is not based on a comprehensive root node that subsumes the whole tree; rather, it has a meta-rule of propositional deduction that all possible cases are provided for by the stellar system of rules. Alternative adversarial Final consequents are necessary to the structure of all possible pathways in the conflict, beginning with a triad of Star points and ending with the five points of the Final results of the Star.

4.2.10 Rivers and flowcharts

Systems require flow, which is represented in a flowchart that links the various components or sub-systems of a system. A flow arrow indicates the direction of flow of the system processing. From the initial designs of computer technology by von Neumann and Goldstine (Knuth and Pardo, 1980, p.208), and Turing (Copeland, 2005), flowcharts were established as an epistemological technique for designing hardware and software architectures and systems. In the prior analytics required for legal knowledge engineering, flowcharts, like trees, may also be useful, especially in representing systems in fields of law that might require various modules of connected eGanges Rivers, such as the complex system of contract law (Gray, 1986). In contract law, there are three possible stages of a contractual transaction: (1) formation, (2) performance and (3) closure, each of which has a set of rules that establishes the ontology of the stage, and its pathways and flow of legal choices. Some transactions jump the middle stage; in the stage of closure, there may be a return to the formation stage for the formation of a settlement contract, and so on. Semantic nets may also be useful in the prior analytics of legal knowledge engineering. Like eGanges Rivers, semantic nets are isomorphic with formalised natural language, with the formalisation including processing directions of information pathways to interim or final results; they may approximate intelligent verbalisation. The rudimentary system of the specific meta-epistemological method is set out in a simple flowchart in Figure 1.6; this methodological system could be further developed. A system of client thinking, called SURMET (SURvival METasystem), was developed to assist application design (Gray, 1982, 1990 and 1997).
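The three contract-law stages and their flow, including the jump over the middle stage and the return from closure to formation for a settlement contract, can be sketched as a small transition map. The labels follow the text; the structure is a minimal illustration, not an eGanges module design.

```python
# Sketch of the three contract-law stages as a simple flow system
# (transitions as described in the text; labels illustrative only).

transitions = {
    "formation":   ["performance", "closure"],  # some transactions jump performance
    "performance": ["closure"],
    "closure":     ["formation"],               # settlement contract returns to formation
}

def can_proceed(stage, next_stage):
    """True if the flowchart permits moving from stage to next_stage."""
    return next_stage in transitions.get(stage, [])

assert can_proceed("formation", "closure")    # the jump over the middle stage
assert can_proceed("closure", "formation")    # loop back for a settlement contract
assert not can_proceed("performance", "formation")
```

Each stage could then be associated with its own River module, with the transition map supplying the flowchart-level connections between modules.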

Systems ideographs were initially called flow diagrams, and this was later shortened to flowcharts; they showed circles, squares, and similar symbolic geometric shapes (Fitzpatrick, Keane and Montgomery, 2000, pp. 122-3), that are now flowchart conventions, joined by lines with arrows to indicate the direction(s) of proceeding. System components, with any subsystems, were not knitted together, as conceived by Ramus; rather they were 'glued' together, or connected dynamically by a flow, as originally suggested by Thales.

Software is available for creating flowcharts, with conventional symbols, and some provision may be made for processing the directions of the chart. Critical path packages allow flowcharting with additional considerations that identify mission critical points in the path, where certain steps have to be taken.

A River is a hybrid flowchart, tree and semantic net; the computational epistemology of 3d legal logic, from which the eGanges prima facie River is derived, is a system. In considering the application of systems theory to the development of jurisprudence, Rourke (1986) recognised the need to consider epistemology in the legal domain.

The problem was to construct the legal logic system from the natural language of the law and how it is used in legal practice.

This thesis advances Fraunce’s logic model of case argument, by systematic separate provision for legal deduction, induction and abduction, and by the extension of deduction according to the ontology of legal possibilities, with corresponding extensions of induction and abduction. Fraunce was concerned to represent the forms of logic used in the sequence of arguments in a real case; this thesis is concerned to keep separate, for the purposes of automation, the deductive, inductive and abductive premises of legal information that might be applied to a user’s case. Further development of Fraunce’s model was required.

4.2.11 Rivers and other 2d epistemology

Apart from geometry itself, since the Tree of Porphyry and the Ramist graphics, there have been, from time to time, other 2d ideographs representing other epistemologies, such as those of Venn, Peirce, Korzybski and Ishikawa. Some of these have greater similarity to the visualisation in the epistemology of 3d legal logic than others; none are legal ideographs. However, any of these graphics may be useful in the prior analytics or glossing for an eGanges application.

The work of Australian artist, John Power (1934) instigated a search for Platonic perfection through the graphic representation of the design of great paintings; this work may add to the development of cognitive art in eGanges mapping, along with the work of indigenous Australian artists who use an ancient pictorial system with dots, circles, lines and arrows as their written language.

Comprehensive discussion of the historical development of epistemological ideographs exceeds the scope of this thesis; an exploration of legal and social anthropology would be required. Where earlier epistemological ideographs, not yet explained, have some bearing on the computational epistemology of 3d legal logic, it is helpful to understand the ideographic categories to which the computational epistemology of 3d legal logic belongs. To this extent, earlier works may be considered, not in chronological order, but in genealogical groups as determined by the structures of the computational epistemology of 3d legal logic; any of the genealogical groups may also be incorporated in the epistemology of 3d legal logic through the provisions for strata abduction. At least eight genealogical groups might be identified: (1) Trees, (2) Rivers, (3) Semantic nets, (4) System flowcharts, (5) Rings, (6) Matrices (Hollis and Hollis), (7) Fields (Venn) and (8) Aesthetics (Power).

4.2.12 3d epistemology

Prior to the construction of Latent Damage Law, Susskind (1987, p.46) considered legal epistemology in regard to legal knowledge engineering, asserting that:

… legal epistemology falls firmly within the bounds of jurisprudence.

He also noted that Tur (1977; 1978) had argued that jurisprudence is legal epistemology. On this basis, Susskind identified various epistemologies of selected jurists of the analytical school of jurisprudence. In a tabular summary, he compared their epistemologies, with a view to forging a common basis of legal expert systems, namely a rule base system, that could accommodate this range of jurisprudential epistemologies, which used various forms of rules. He acknowledged the importance of logic but, except in terms of outcome prediction, he did not consider the implications of logic for those lawyers who have a litigation practice that requires understanding of the cases for both opponents, and the possible cases covered by the rules. It is probably necessary for legal practitioners always to have a view of both sides of a case, as well as the judge's view; that is the basis of their claim to work for justice. Lawyers who argue for one opponent must live with their successes if they later find they are acting for a party in the position of the other opponent. eGanges does not make law to clarify the accommodation of the ontology of legal potentialities that falls into the realms of inventive reasoning; it just permits better management of new cases for the development of inventive argument. It also limits the problem of hard cases by providing inductive and abductive information at the point where a case is hard. It can give all the legal information that is available in contemporary legal resources to clarify the penumbra, but if some cases are still hard, that is no reason to abandon technology that minimises hard cases. The real problem in legal knowledge engineering is not to anticipate the success of inventive legal argument, which usurps law-making authority, but to map the present large scale complexity of the rules of law that amounts to a bureaucratic crippling of the legal system, comparable to the Roman legal system at the outset of its codification stage.
Without a logic front end such as eGanges, legal databases like AUSTLII, LEXISNEXIS and WESTLAW are contributing to the information explosion of the law, as it is now possible to cite in legal argument numerous networked retrieved cases, which considerably expands the cost of litigation. If electronically codified laws assist the use of the Roman techniques of codification (Gray, 1987, pp.132-5), such as simplification, clarification, condensation, and removal of inconsistencies, uncertainties and inequities, then such costs can be minimised and the problems of hard cases and inventive arguments can also be minimised.

Where the law is limited, Susskind saw uncertain cases as hard cases, to be resolved by judicial discretion, according to Hart, or unspecified expert heuristics, rather than by the nature of the nearest logical certainties or reasons. Expert heuristics were not derived from a precise specification of expert logic, but from a mystical expert knowledge:

Expert systems are usually … heuristic, by which is meant they reason with the informal, judgmental, experiential, and often procedural knowledge that underlies expertise in a given field … (Susskind, 1987, p.9)

Korzybski's logic teaching object suggested a three dimensionality in alternatives to Aristotelian logic; legal expertise that was not simply a Porphyry tree of Aristotelian predicate logic might be thought to be three dimensional. The categories of a Porphyry tree, strung together as Korzybski differentials, could be seen to have many unused slots for systematic, semantic connections, and many unused potential links to further unused slots, or to each other. In early artificial intelligence, there was no clear distinction between expert epistemology and programming epistemology; it was thought that artificial intelligence simply required the simulation of human intelligence by duplication of it in the machine. PROLOG vindicated this view, and transparency was not seen to be a standard that technologically uninformed lay users should expect.

After the Second World War, in the early development of database systems, with designs based on filing cabinet models (Hollis and Hollis, 1969), a three dimensional matrix epistemology emerged for improved data analysis. Three dimensional categorisation and organisation of data were shown to enhance the information available from the data. This signalled the requirement for a transformation of a three dimensional epistemology of filing system intelligence into a computational form for program design and programming. Searches across related fields of data to find different aspects of a common element could extract a more complete picture of that element, and permit a comparison of complete pictures of different common elements. Algorithms for relational databases could be devised to knit data extracts together to suit various purposes. The filing cabinet model of Hollis and Hollis was pragmatic, in keeping with the pragmatism established by Peirce. Spreadsheets are a type of relational database that offer a choice of algorithms for processing slots relative to other slots.

Relational databases were rigorously systematic, so that database design could simulate intelligent matching of data in various locations. Hollis and Hollis (1969) posed a filing cabinet model, or box matrix, that captured categories of information at different depths over extended periods of time, for the filing, collating, retrieval and coding of complex information. They were concerned to develop information systems, in an emerging age of computer databases, so that the relevant information could be retrieved to assist the task at hand. Their file plans for the utilisation and interpretation of information had a three dimensional schematic code system that determined the scope of coverage, showing the relationship between categories, depth, and time. Relational database design advanced from the box paradigm to the paradigm of a brick wall, where ultimately each brick might represent a filing cabinet. From these designs, relational databases could provide varied information about the totality of the content of files. Search algorithms also suggested the techniques of semantic nets, which were usually illustrated by 2d ideographs.

Database design initially used the three dimensional filing cabinet model. A pattern of different fields was applied to each file, so that a search could find all the files that shared a common piece of information in any of those fields. Relational databases allowed a search of a database through cross-sections that linked information in various files, in a way that would produce a taxonomy of the total information in the files.
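The cross-field search and collation described above can be sketched computationally. The field names and records below are hypothetical illustrations, not drawn from any system discussed in this thesis; the sketch only shows how a value shared in any field links files into a cross-section, and how a relational algorithm knits extracts together:

```python
# Minimal sketch of a cross-field relational search.
# Records and field names are hypothetical illustrations.

files = [
    {"id": 1, "category": "contract", "party": "Smith", "year": 1969},
    {"id": 2, "category": "tort",     "party": "Smith", "year": 1972},
    {"id": 3, "category": "contract", "party": "Jones", "year": 1969},
]

def cross_section(records, value):
    """Return every file holding `value` in any of its fields,
    simulating a search across related fields for a common element."""
    return [r for r in records if value in r.values()]

def collate(records, field):
    """Group file ids by a shared field, knitting data extracts
    together in the manner of a relational-database algorithm."""
    groups = {}
    for r in records:
        groups.setdefault(r[field], []).append(r["id"])
    return groups

smith_files = cross_section(files, "Smith")   # the two Smith files
by_year = collate(files, "year")              # files grouped by year
```

The sketch illustrates the point made above: the same set of files yields different taxonomies depending on which field supplies the cross-section.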

The development of search epistemologies incorporated new intelligence into database technology. Artificial memory and electronic processing speed were inventions that brought a leap in the evolution of the capacity of human intelligence. The legal domain led the way forward in the development of databases, in 1956, with Horty's health law data retrieval project (Bing, 1984, p. 29).

Taxonomy trees and decision trees were used as teaching methods in the legal domain, independently of technology. Castro (197?) flowcharted much of Australian tax law as trees, and the candidate flowcharted much of the common law of contract. Such trees can be used to develop eGanges Rivers. The visualisation of the computational epistemology of 3d legal logic can likewise be used, independently of technology, to develop more systematic methods of legal practice - ones that can bring under control the vastness and complexity of legal information.
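The use of such decision trees can be sketched in code. The node structure, questions and outcomes below are hypothetical and are not taken from eGanges or from Castro's flowcharts; the sketch only illustrates how a tree of legal conditions might be traversed against a client's answers to yield the kind of reasoning path that a River diagram would display:

```python
# Hypothetical sketch: a decision tree of legal conditions,
# traversed against a client's answers to record a reasoning path.
# Node names and questions are illustrative only.

class Node:
    def __init__(self, question, yes=None, no=None, outcome=None):
        self.question = question   # condition to test; None at a leaf
        self.yes, self.no = yes, no
        self.outcome = outcome     # result reached at a leaf

# A tiny contract-formation tree (illustrative only)
tree = Node("offer made?",
            yes=Node("offer accepted?",
                     yes=Node(None, outcome="contract formed"),
                     no=Node(None, outcome="no contract")),
            no=Node(None, outcome="no contract"))

def traverse(node, answers, path=None):
    """Follow the tree according to the answers, recording the
    conditions visited as a River-like trace of the reasoning."""
    path = path or []
    if node.question is None:
        return node.outcome, path
    branch = node.yes if answers.get(node.question) else node.no
    return traverse(branch, answers, path + [node.question])

outcome, path = traverse(tree, {"offer made?": True,
                                "offer accepted?": True})
```

The recorded path is the feature of interest: it is the linear sequence of conditions that, strung with its tributaries, would form one stream of a River.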

Intelligent legal database systems were developed by Hafner (1978, 1987) who treated the database as legal knowledge with its categories, such as issues of law, and issues of fact, as fields for retrieval. The candidate (Gray, 1997, p.229) also conducted numerous student projects which developed databases by reference to legal decision-making trees, which effected some three dimensionality, where one field in the database provided information on the selection of the next tree nodes for search.

The 3d complexity of legal epistemology was recognised in the works of Terrell (1984), Conover (1988) and Samuel (2003). Terrell and Conover suggested 3d ideographs; Samuel, who was largely concerned with European epistemology, did not. None of these works posed a comprehensive, three dimensional visualisation of legal epistemology, although Conover suggested that this seemed inevitable for the development of a legal epistemology for legal expert systems. Conover also thought that legal expert systems would assist in the specification of this three dimensionality.

Terrell's work was inspired by the Thoughtland allegory of Abbott (1884), a school headmaster in England, who originally published his work under the pseudonym, A. Square. Abbott began the development of three dimensional thinking just after the time that Peirce (1883; 1933, Vol. III, Exact Logic, pp.411-16) considered ideographs for four dimensional logic. Abbott popularised the idea of three dimensional thinking as progressive.

The leading character in Abbott’s story is a Square who lives in Flatland, an imaginary place inhabited by two dimensional personalities; Abbott used geometric diagrams to illustrate its occupants. The two dimensional personalities of Flatland were given a social status according to the rule that the more sides there were to a personality, the greater was the social status: women were straight lines and, viewed sideways, they were a point or 'all point'; soldiers and workmen were triangles; professionals were squares, rectangles, or five-sided figures; and so on. Social status continuously increased with the number of sides until the figure became a circle; priests were circles. During the twentieth century, the term ‘square’ was used in a similar way to describe conservative people who rejected the cultural vibrance of the revolutionary new rock and roll music. When the Square happened to meet a Sphere, a three dimensional visitor from Spaceland, the Square gained a new understanding, and thereafter imagined worlds beyond Flatland as three dimensional worlds, four dimensional worlds, five dimensional worlds, and so on. The Square was able to pose the realms of 4d, 5d and so on by applying the same principle of extension from 2d to 3d to an extension from 3d to 4d, from 4d to 5d, and so on: in the range of perspectives, just as 3d structures could be viewed as a compound of 2d structures, so 4d structures could be viewed as a compound of 3d structures, and so on. The Square stated (1884; 1992, pp.79-80) that, from the fourth dimension, a view could be had of what is inside three dimensional structures. Finally, the Square is imprisoned in Flatland indefinitely, as his attempts to relate publicly his new knowledge from the encounter with the Sphere are seen to be madness, dangerous, revolutionary and seditious.
The allegory was regarded as a satire on England's conservative Victorian culture, which had recently resisted Darwinian evolution, even though Darwin had journeyed extensively abroad and examined a broad range of environments that provided an empirical basis for his reasoned theory.

Terrell (1984) applied Abbott's three dimensional thinking to the legal domain to produce flatlaw, three dimensional law, four dimensional law, five dimensional law, and so on. He set out a three dimensional approach to law similar to the geometric box, the filing cabinet model of databases, and also a tesseract, or hypercube, as a four dimensional representation of Abbott’s 4d world, applicable to the legal domain.

Flatlaw was posed by Terrell (p.288) as a region of the human mind ‘within the larger mental territory known as legal reasoning’; he used this idea to create a model of legal reasoning. Flatlaw was ‘the realm of the legal positivist (and some non-positivists)’, such as Dworkin and MacCormick, whose reasoning he described as follows:

Linelaw and Flatlaw are therefore closely related, since both depend on this concept of institutional support to connect points and to form lines and figures, but they differ in analytic sophistication. To take another example: Where the student sees lines of cases dealing with offer, acceptance, consideration, and so forth, the teacher sees, and tries to communicate to the student, a geometric figure called “the law of contract” which contains these lines but connects them through broader concepts (nevertheless quite “legal” in character) such as “agreement,” “reliance,” and “expectation”. (Terrell, pp.297-8)

The dimension from which the geometric figures of Flatlaw can be evaluated is Terrell’s third dimension that he illustrates by reference to Plato’s Allegory of the Cave, set out in the dialogues between Socrates and his students in the Republic, Book VII.

Socrates proposes that most of the world can be compared to a cave in which the habitants are chained, unable to escape, unable to see direct sunlight, and unable to turn their heads to see one another. Their world is made up of the shadows of themselves and other objects in the cave created by the indirect light that manages to reach them. Reality, then, is entirely two dimensional, for all objects appear as only flat, shadowy forms. (Terrell, p.301)

The moral of Plato’s allegory is that those who are able to get outside the cave and attain enlightenment should, when they return, take on the burden of leadership of the unenlightened, even though this may be difficult. They have found and know the true three dimensional metaphysics, and thus can lead with wisdom through informed decision-making. The Allegory of the Cave is applied to law by Terrell as follows:

What is this new perspective that permits the traveller to depart Flatlaw and enter another dimension of legal reasoning? Stated most generally, it is the recognition that legal rules and institutions are only “evidence” or “data” of larger social phenomena associated with the concepts of other disciplines. These phenomena include: community standards of morality and justice, economic efficiency, group behaviour, social and political forces, primal or sociobiological urges or necessities, and so on. (Terrell, p.303-4)

Terrell’s third dimension consists of interdisciplinary views of law. He wanted to expand the scope for solutions to legal problems, such as the environmental law problem of a homeocentric legal system. He also posed a transcendental fourth dimension of legal reasoning, described (pp.310-16) as a dimension of creativity, like an art form, that could have religious characteristics; it could accommodate the recursion of paradoxes. The fourth dimension provided a higher view from which the lower dimensions could be evaluated critically; it was visualised by Terrell as a hypercube, shown as the last figure in Figure 4.30.

Figure 4.30: Graphics of Terrell's four dimensions of legal reasoning of M. Conover (1988), Applying three-dimensional thinking and systems technologies to jurisprudence and legal management, in T. Rasmussen (ed.), Interactive systems and law, Spartan Press, Lansing, Michigan.

Although Terrell was unable to clearly specify the content of his fourth dimension, he suggested, consistent with the view of Finnis (1980, pp.59-80), that its matters might include the principle of ‘the good of the pursuit and acquisition of three-dimensional knowledge’, because the fourth dimension presupposed such three-dimensional knowledge. He observed:

One of the lessons of the analogy between levels of reasoning and physical dimensions is that one level or dimension necessarily builds upon, and is understood in relation to, other levels or dimensions. (Terrell, p.334) …

…I must reemphasize that we cannot establish the fourth dimension apart from the other three. (Terrell, p.337)

…That is, if we seek to change the law, we will almost inevitably need justification from extralegal sources. (Terrell, p.339)

I do not seek to replace the first three dimensions of legal reasoning with a new one; rather, I seek to supplement and complement these other dimensions with one that attempts to reach beyond, and hence make more humble, our powers of logic and reason. (Terrell, p.342)

The fourth dimension posed by the candidate (1997, p.223) accommodates the flows inside and between the spheres of the legal universe, and also between their homeostatic differences and their change process sequences. Whereas the third dimension is thought, the fourth dimension is thinking. Terrell did not account for the flow or process in legal reasoning in a separate dimension, although he did acknowledge that it was Einstein’s Theory of Relativity that opened up dimensions beyond the first three (p.309).

The fourth dimension described by Terrell, is similar to the candidate’s concept of the fifth dimension of legal reasoning, the creative realm of designer legal intelligence.

It may be necessary to consider alternate four dimensional structures or processes, especially where there are inconsistencies in blackletter law or legal opinion. Alternate four dimensional intelligence can be set out, compared and developed in five dimensional mind space. The mind, which is real but intangible, may be thought of as the fifth dimension of the physical existence of humankind. In the human mind, real, potential, hypothetical and physical existences can be represented and used to determine interactions between people, and between people and their environment. Mind has five dimensional metaphysical space in which alternate four dimensional metaphysical structures and processes can be arranged, and also rearranged, to suit design specifications. Designer legal intelligence operates in the fifth dimension of the mind, which is apperceptive. It is the fifth dimension of metaphysics, which permits the design, evaluation and development of choice. The science of legal choice is a natural study of the fifth dimension. (Gray, 1997, p.108)

If the mind is the fifth dimension of the physical world, and is called the metaphysical dimension, then the metaphysical dimension itself has five dimensions, according to the views of Abbott and Terrell. In the fifth dimension of the five metaphysical dimensions, there is a space to model blueprints developed for the other four metaphysical dimensions, according to the constraints of the corresponding physical four dimensions. For instance, in legal knowledge engineering, a design task of a 4d jurisprudential system, for installation in a computer, might be seen to be homeostatic, making no changes to the law, or heterostatic, making changes.

This provides a new understanding of what Julian Huxley (Huxley, Hardy and Ford, 1954, p.13) observed, not long after the destruction of Hiroshima by atomic warfare, about the emergence of human intelligence:

The new phase of evolution thus opened up was characterised by a new relation between organism and its environment. The human type became a microcosm which, through its capacities for awareness, was able to incorporate increasing amounts of the macrocosm into itself, to organise them in new and richer ways, and then with their aid, to exert new and more powerful influences on the macrocosm. And the present situation represents a further highly remarkable point in the development of our planet – the critical point at which the evolutionary process, as now embodied in man, has for the first time, become aware of itself, is studying the laws of its own unfolding, and has a dawning realisation of the possibilities of its future guidance or control. In other words, evolution is on the verge of becoming internalised, conscious and self-directing.

The fifth dimension of mind may be evolving, so that it has a sixth dimension for the conscious orchestration of survival and evolution, by further transcendent evaluation and design of meta-epistemology. Three and four dimensional metaphysical systems may be planned in the fifth dimension, for implementation in the four dimensional physical world. Legal knowledge engineering meta-epistemology belongs in the fifth dimension of legal intelligence; its prototypes are created as three and four dimensional metaphysical entities for implementation in the four dimensional physical world of computer technology. However, programming epistemology is not a jurisprudential matter. It belongs in an adjacent area of mind, differentiated or undifferentiated, in an area of five dimensional scientific intelligence, on par with the area of five dimensional legal intelligence.

Stratified levels of metaphysical or logic space, in a transcendental hierarchy, corresponding to cyberspace, are useful as legal knowledge engineering methodology, as they assist the distinctions required in the design and implementation tasks. The metaphysical levels may contain constraints derived from the corresponding physical levels; a one dimensional level may have a constraint of single points and single things, two dimensional constraints permit linear connections between points or things in one plane, a three dimensional level deals with three dimensional structures, and four dimensional levels deal with processes or structural dynamics.

In the fifth dimension of legal reasoning, four dimensional jurisprudential systems might be seen to be greater than the sum of their parts; their synergy may be understood or created. Like Terrell’s fourth dimension, the fifth dimension of legal knowledge engineering might contain principles of wisdom, and some artistic and religious considerations that function at this level. In her revised Master’s thesis, the candidate drew on Schelling’s (1800) system of transcendental idealism, which posed levels of understanding, the highest of which transcended all others. Aristotle’s definition of wisdom as the science of first principles (Hutchins, Vol. 8, p.587) was also relied on in delineating the fifth dimension of legal intelligence. In the Parmenides dialogues, Plato's suggestion that largeness was a first perfect form was criticised by the Third Man argument, which added a third man to the first two men who established their common perfect form for man, and then a fourth man, and so on, ad infinitum; it may be that largeness is the first consideration in enlightenment for an escapee from Plato's cave. Largeness may be cumulative and divisible; it might take account of substance and no substance.

It is suggested that the intellectual artefacts of the computational epistemology of 3d legal logic, and of the eGanges applications referred to in Chapter Five, with the examples in the Appendix, may populate the fifth dimension of legal knowledge engineering as metaphysical galaxies of legal reasoning. The scope for aesthetic variations of the graphics does not change their logic, and the significance of the choices represents consistency in diversity. In the fifth dimension, harmonious differences are possible and can be mapped as design. If Plato’s allegory is applied to these designed artefacts, the perfect art form studied by Power (1934), for design, becomes even more pertinent.

If it is the light outside the cave that reveals the perfect three dimensional structure of the logic formulae of the rules of law, with their spectral connections, sub-strata, and flow, then it might be seen as dynamic legal logic of four dimensional law. From the fifth dimension, with use of meta-epistemology, four-dimensional law might be reconfigured and managed; its metamorphosis and evolution might be consciously envisaged and planned. However, for those still inside the cave, as Socrates envisaged, there must be communication in terms of shadows, which at the same time ensures that the shadows truly reflect the outside four dimensional metaphysics; in the reality of the cave, computers bring some artificial light to Plato's perfect forms.

Figure 4.30 shows Conover's representations of Terrell’s four dimensions of law, which were based on Abbott’s concepts. Terrell had shown the hypercube. Conover adapted Terrell's cube to represent a three dimensional, procedure-based legal epistemology; Figure 4.31 is Conover's cube, a further development of Terrell’s three dimensional model, showing how economics might be a plane in the cube of three dimensional law. The unidimensionality of law is represented in the first two diagrams in Figure 4.30 as a number of points. When these points are connected, by virtue of their relationship, so that the resulting lines run roughly in parallel, they become two dimensional, like the simple linear adversarial parallels in the adversarial fishbone of 3d legal logic, without the tributary and Pole structures; they require height and width when they are represented diagrammatically. Conover shows that two dimensional patterns may be placed in parallel in three dimensional cubes to show connections between them; he sets out, as planes in the cube, two dimensional diagrams of jurists’ epistemology and lawyers’ epistemology, placing reasoning flows between them. He grasped the three dimensionality of the structures of legal reasoning in specific terms, as system structure with flows therein, and thus advanced, by structural particularisation, from the three dimensional allegorical framework of Terrell.

Figure 4.31: Cube of lawyers' epistemology of M. Conover (1988), Applying three-dimensional thinking and systems technologies to jurisprudence and legal management, in T. Rasmussen (ed.), Interactive systems and law, Spartan Press, Lansing, Michigan.

The legal epistemology of Conover illustrated the three dimensional reasoning space required for legal practice. He suggested that three dimensional diagrams clarify the connectedness of professional tasks, and how other disciplines might be employed in these tasks. A diagram or an icon can convey compressed information: it can communicate simply what might require many words, and what words cannot capture easily. Consequently, visualisation may efficiently manage and communicate infodensity in an information age. Like the customary men and women toilet signs around the world, it may also transcend or minimise national language differences and facilitate international understanding.

Although he saw flows in his two- and three-dimensional structures, Conover did not suggest that this brought in a fourth dimension. Without a fourth dimension of flow and processing, he could view Terrell’s fourth dimension as another stratum of his three dimensional model. Conover considered that Terrell’s fourth dimension was another form of critical level, parallel to others in his three dimensional matrix, similar to the Hollis and Hollis filing cabinet model of database design shown in Figure 1.7. The legal epistemology of Conover further developed the work and diagrams of Terrell, but failed to differentiate the meta-epistemological dimension, which distinguishes legal abduction in the fourth dimensional model from design abduction at the fifth dimensional level.

Although Conover’s epistemology was procedure-based, he acknowledged that the ‘logic flow’ and ‘path of logic’ within the epistemology, required clarification:

A somewhat similar chart can be drawn to describe the substantive analysis of a case. Clarification of the logic flow can help make the analysis both more understandable within a team of litigators, and more defensible to a client or managing partner. This epistemology of substantive law would be the traditional realm of jurisprudence analysis by a lawyer, but the path of logic would be shown more clearly. (Conover, 1987?, p.159?).

Conover was aware of the legal expert systems research at the time, and pointed to this research as the field likely to provide the logic clarification that his epistemology required. However, he gave no indication that the logic itself might also be three dimensional in structure, and he did not consider the epistemological adequacy, for the legal domain, of the legal expert systems to which he referred. The computational epistemology of 3d legal logic provides the three dimensional paradigm that clarifies the logic flow and choice of logic paths in substantive law that might be presented as legal argument. It can incorporate procedural matters and specifically link parallel systems, as shown in the design of eGanges, where one application can be linked to another via the link form of gloss. The work of this thesis is to refine the visualisation of 3d legal logic as a computational legal epistemology, and to show its use in the design and production of a new genre of legal expert systems, through the specific meta-epistemological method.

Like the candidate (Gray, 1986), Conover (1986) commenced with a systems approach to law, before he posed a three dimensional legal epistemology to assist in computer-aided legal education and practice. His epistemology was a modification of four patterns of legal reasoning in jurisprudence posed by Terrell, and it located the place of legal expert systems in the modified pattern.

Also drawing on the three dimensional legal reasoning posed by Terrell, Conover (1988) suggested a three dimensional system of legal expertise, particularised as an approach to legal expert system development; he suggested integration of his three dimensional model with existing rule processing expert systems. Conover applied Terrell’s three dimensional model to the problem of legal expert system design. He explicitly called this application of three dimensional thinking to law a legal epistemology, by reference to the work of Rourke (1986) on legal epistemology in experimental jurisprudence. It was Conover (1988) who first recognised that it was legal epistemology that was three dimensional; he hoped that legal expert systems technology would provide the logic specifications for such three dimensional system designs. In the same year, the candidate (Gray, 1988) identified the major two dimensional ideograph in legal epistemology, and called it a River system, due to its tributary structure. A year later, the River system was expanded, for her Master’s thesis, as a visualisation of 3d legal logic. In the development of eGanges, the River structure is retained for the interactive visualisation of legal logic, and the remainder of the computational epistemology of 3d legal logic is accommodated in other interface knowledge structures and processing, as explained in the eGanges communication system. eGanges might be thought to have Flatland or Flatlaw status, as it has two dimensional logic ideographs and was designed to suit the ordinary cognition of Lord Bowen’s man on the Clapham omnibus (Rogers, 1979, pp.46-7), who is representative of the majority of subjects of the law. User-friendliness is specified as part of the legal domain epistemology. An effective legal system requires access to the law by all those who must obey it; otherwise they will not know what is required.
Those who do obey the law should also have ready access to its benefits, as a matter of quid pro quo under an elementary social contract, and as a matter of reinforcing the legal system as a system of justice. Although the eGanges ideographs are two dimensional, its processing assumes the computational epistemology of 3d legal logic, and is based on a 4d model of law with heuristics providing controlled flows within and between the 3d logic structures. The meta-epistemology might be thought to belong to a fifth dimension, where the content, structures and flow of the 1-4d can be evaluated, designed and managed.

Samuel addressed the matter of legal epistemology directly as a requirement for legal knowledge engineering methodology. He also recognised that there was a three dimensionality in legal reasoning.

Although his book was published in the same Applied Legal Philosophy Series as the candidate's (Gray, 1997), Samuel (2003) did not consider as a legal epistemology the theory of 3d legal logic, which provided a three dimensional model of rule system structure that was inherent in legal reasoning, and had emerged from legal knowledge engineering experiments; nor did he deal with the work of Conover, Terrell or Abbott. Nevertheless, he asserted that no one had been able to model the legal mind for artificial intelligence purposes, although he had asserted (1995) that three and four dimensions were needed. Following the inclusion of the theory of 3d legal logic in her Master’s thesis, the candidate first presented it in 1992, at the Subtech Conference at the Chicago-Kent Law School, and in 1993 at Jurix, after which it was published (Gray, 1995 and 1997).

The three dimensionality of law conceived by Samuel was different to the candidate's theory; Samuel (1995, p.217) defined the three dimensionality of legal reasoning as follows:

Yet once it is appreciated that the legal remedy itself (actio) is capable of acting as an active institution within an institutional legal model that is three-, rather than two-, dimensional in structure, the easier it is to appreciate how legal thought can be carried from obligatio (the relation between two legal subjects) to dominium (the relationship between legal subject and a legal object capable of being owned) via the concept of an ‘interest’ which is a notion that attaches to the relation between legal subject (persona) and legal remedy (actio).

At the conclusion of his work, Samuel (2003, p.339) suggested that a major paradigm shift in jurisprudence would be required to find a legal epistemology that could be automated, and that it might take a three dimensional or four dimensional form. For Samuel, three dimensionality was inherent in legal reasoning itself. However, Samuel did recognise that a systems approach might produce a three dimensional legal epistemology:

For example, systems theory can be used as a means of understanding law as a discourse in itself; here it has a role in appreciating the nature and definition of law in relation to other knowledge discourses such as economics and political science. Equally, it can be used internally. That is to say, a systems approach can be employed as a form of analysis to explain the functioning of particular areas of law such as ‘contract’, ‘tort’ and ‘property’. …

What gives it a special relevance to the epistemologist is that systems analysis can be used to escape, not only the old dichotomy between a whole and its parts, but also the traditional (and two-dimensional) paradigm of logical positivism. …

Modern systems thinking is, accordingly, less about simplifying complexity than embracing it. A phenomenon can be modelled in terms of a mass of interrelating multidimensional systems whose constructed relations allow for a more sophisticated construction of the object-project. Contradictions and paradoxes can be understood in terms of elements and relations functioning in separate, yet connected dimensions. … (Samuel, 2003, p.305)

Systems theory could provide a three-dimensional model by which the substantive in rem right or relation is seen to function in a dimension that is conceptually separate from, yet interrelated with, the dimension in which the in personam relation functions. (Samuel, 2003, p.306)

The epistemology of 3d legal logic provides for Samuel’s three dimensionality in the same way as Conover provided for Terrell’s four dimensionality, as part of the three dimensional structure. The epistemology of 3d legal logic, consistent with Conover’s model, accommodates the reasons for the rules of law and interdisciplinary matters that are relevant to rules of law, as part of its three dimensional structures; it recognises that sub-epistemologies might be required for parallel related areas. However, its primary three dimensional structure rests on its adversarial rule logic.

In his earlier work, Samuel (1995) was also concerned to distinguish legal ontology and legal epistemology. The distinction is dealt with in Chapter One of this thesis, and in the context of the Vienna Convention as a sample of substantive ontology that conforms to the computational epistemology of 3d legal logic. In the course of defining epistemology, Samuel (2003, p.11) settled on what it is concerned with, namely, scientific discourse, which he fashioned according to the view of Blanché (1983, p.120):

…treated as a system of signs combining between themselves according to certain rules independent of what they can evoke.

4.3 COMMUNICATION SYSTEM

4.3.1 Legal choice

The eGanges communication system is designed to show the user the legal choices in a substantive field of law, and the effects of any selection from the alternatives. The River shows the alternative choices of requirements for establishing the positive Final result of the system of rules; fans indicate these alternatives, and there may be a fan of mainstreams or hierarchies of fans arising from any mainstream node. The interrogation system gives the choice of adopting or not adopting the positive requirements; for each node there may be assistance in selecting an answer through any available inductive spectrum and other abductive glosses. The River maps in the Appendix may be seen as the formalised extended deductive rule structures and premises of the Convention; they are teleological, as they are goal-directed toward the Final result of a contractual enforcement entitlement. Also, the maps are technological, insofar as their particular arrangement is a matter of art within the constraints of the space necessary for node labelling requirements; they also have implicit processing heuristics, indicated by the flow arrows, and the fan structures as distinct from the stream structures.

There are several heuristics of legal deduction that apply in the processing of answer input. Failure of a fan choice is not fatal to the positive Final result until all alternative fan streams fail. Failure of one node in a fan stream entails failure of the whole stream; it may fail for negative input or uncertain input. If some streams in a fan fail for negative input and some fail for uncertain input, so that all fanstreams fail, then the net result is an uncertain Final result. Uncertainty prevails in the failure of a fan, where the uncertainty can be rectified by evidence or legal argument, in order to be replaced by positive input, so that one stream in the fan of alternatives then satisfies the fan requirement for a positive Final result; much depends on whether the user has the burden of proof. However, where a multi-node rulestream, including any particular multi-node fanstream, fails for both negative and uncertain input, then the negative prevails over the uncertain; salvaging the uncertain point would not remedy the negative.
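These fan-failure heuristics can be rendered as a short sketch. The following Python is a hypothetical illustration only, not the eGanges implementation; the function names and the answer strings 'positive', 'negative' and 'uncertain' are assumptions:

```python
def stream_result(answers):
    # A fanstream fails on any negative or uncertain node; within a
    # single stream, a negative prevails over an uncertain, since
    # salvaging the uncertain point would not remedy the negative.
    if "negative" in answers:
        return "negative"
    if "uncertain" in answers:
        return "uncertain"
    return "positive"

def fan_result(streams):
    # A fan of alternative streams is positive if any one stream is
    # wholly positive.  If every stream fails, a stream that failed
    # only for uncertainty keeps the fan salvageable, so the net
    # result is uncertain; the fan fails conclusively (negative)
    # only when all streams fail for negative input.
    results = [stream_result(s) for s in streams]
    if "positive" in results:
        return "positive"
    if "uncertain" in results:
        return "uncertain"
    return "negative"
```

For example, a fan with one stream failed for negative input and another failed only for uncertain input yields an uncertain result, matching the heuristic described above.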

There are two modes of use in the eGanges communication system: consultation and construction. It is necessary to understand the consultation mode before proceeding with the construction of an application.

In the consultation mode, heuristics of the communication system drive the following:

● Output questions/commands, Input adversarial Minor deductive premises

● Output of Minor deductive antecedents in adversarial windows

● Output - Cumulative results – Current pro tem or Final result.

These heuristics will be explained in the context of the question and answer logic and the Current result logic in the communication system.

The validity of the consultation depends on the legal expertise in connecting appropriate questions to River nodes, and arranging answers appropriately to suit the sorting of input into the appropriate adversarial window. Prior analytics in Stage 5 of the specific meta-epistemological method should account for this substantive validity. The program epistemology and design assumes this substantive validity; this is a form of hypothetical deduction in the legal knowledge engineering methodology. Instead of going back for more information, as in retroduction, there must be a forward projection, or production, to the Stage 5 application information, by way of anticipation of what must be assumed about it; it is a contingency deduction that must be realised for the validity of the Stage 3 programming epistemology to be vindicated.

4.3.2 Question and answer logic

The question-answer logic of an application is the method of applying the rule maps to the user's case. A left click of the mouse on a node in the River graphics produces, in the question window, a question that may be answered by the user. The eGanges shell epistemology distinguishes factual negation from adversarial negative. The answer buttons are aligned to show which answer will support which adversarial case: positive, negative, or uncertain. After the user selects an answer, by a left click on the requisite answer button, the node label is listed accordingly, subject to fan processing, in the Negative Case window, the Positive Case window or the Uncertainties window. At any time during a consultation, the user can see how many points there are, if any, for each adversary or side, and how many uncertainties there are.

Although there are five answer buttons, only three are available for each question. The three alternative answers are positioned as necessary and sufficient conditions for a negative, positive, or uncertain Final result respectively, or as neutral antecedents, all three of which are treated as consistent with the positive Result. The questions are put succinctly in the natural language that will establish the material fact that is necessary to establish the node antecedent; answers may be arranged to suit the natural language of the question. For instance, three alternative answers, no, yes and uncertain, may each be positioned on the negative, positive, and uncertain buttons, respectively. However, sometimes a yes answer will support the positive case, and sometimes it will support the negative case; likewise, a no answer may sometimes be adversarially positive or negative. Answers must be positioned accordingly. The uncertain answer may bring on, in the Note window, hardcoded advice to answer further questions upstream to clarify the situation; otherwise, an uncertain answer, subject to fan processing, produces the node label in the Uncertainties window. If a node is neutral, all three alternative answers, yes, no and uncertain, will sort the node label into the positive adversarial window; all three possible answers to a neutral node will be presented to the user as positive buttons.
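This answer-sorting arrangement can be sketched as follows. All names here, including the per-question answer-to-polarity maps, are hypothetical illustrations of the principle that the same natural-language answer may support either adversarial case:

```python
def sort_into_window(windows, node_label, answer, answer_map, neutral=False):
    # Sort an answered node's label into the adversarial window given
    # by the chosen answer's polarity for this question; a neutral
    # node sorts every possible answer into the Positive case window.
    polarity = "positive" if neutral else answer_map[answer]
    windows[polarity].append(node_label)
    return polarity

windows = {"positive": [], "negative": [], "uncertain": []}

# 'Was an offer made?' -- a yes answer supports the positive case.
offer_map = {"yes": "positive", "no": "negative", "uncertain": "uncertain"}
# 'Has the offer been rejected?' -- a yes answer supports the negative case.
rejection_map = {"yes": "negative", "no": "positive", "uncertain": "uncertain"}
```

For instance, `sort_into_window(windows, "offer", "yes", offer_map)` lists 'offer' in the Positive case window, while a yes to the rejection question lists its label in the Negative case window.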

The order in which antecedents are placed in a rule is sometimes determined by the sense required for questions. A temporal sequence or presupposition may be required. For instance, a question, "Has the offer been accepted?", presupposes that there has been a question answered positively: "Was an offer made?" In the eGanges shell epistemology, a default order of questions operates, to put questions in sensible temporal sequence and to ask questions on a proper basis. If the user chooses to give answers in some other order, then the user may have to select a previous node in the River, by way of backtracking to recover the temporal sequence or presupposed answer.
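The default ordering with presuppositions might be sketched as below; the node names and the flat list structure are hypothetical simplifications of a River's default question sequence:

```python
# 'acceptance' presupposes a positive answer on 'offer', so the
# default order asks the offer question first.
DEFAULT_ORDER = ["offer", "acceptance", "consideration"]

def next_question(answered):
    # Put questions in sensible temporal sequence: the first node in
    # the default order not yet answered is asked next.
    for node in DEFAULT_ORDER:
        if node not in answered:
            return node
    return None  # all questions answered
```

A user who jumps ahead would, as the text notes, have to backtrack to a previous node to recover the presupposed answer.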

A user may change an answer that has been given, by selecting the node label from the positive, negative or uncertainties lists in the respective Adversarial windows. A left mouse click on the requisite node produces the rule map that contains that node, now the current node coloured green, and the node question in the question window with the alternative answers to it on the answer buttons. The new answer may then be selected and the list windows will be adjusted accordingly.

The label on a node in the graphics window is the label that appears in an Adversarial window list, following the selection of an answer. The labels are appropriate to the positive case. For example, in a contract law application, there may be a label 'offer'. If the user answers the offer question positively, then the label 'offer' will be listed in the Positive case window. If the offer question is answered negatively or as 'uncertain', then the label 'offer' will appear in the Negative case window or the Uncertainties window, respectively. In the Negative case window, the label 'offer' means that there is no offer; in the Uncertainties window, the label 'offer' means that it is not certain that there is an offer.

If there is a contract node label, 'no rejection', meaning no rejection of an offer, when this appears in the negative list following a negative answer, a double negative is intended: there is not no rejection, meaning there is rejection. The question will take the negative viewpoint: Has the offer been rejected? The negative answer is yes. If the user requests an eGanges Report on the consultation, then the Report will print beside the label 'No rejection', the question that was asked and the answer that was given.

Once a Final result is reached, according to the computational epistemology as partially negative but conclusive or as partially uncertain but inconclusive, then, from this time on, the label of the positive Final result in the eGanges River, will appear in the Negative case window or the Uncertainties window, respectively; the positive Final result is no longer attainable, although questions may continue to be answered to see how strong the negative case is relative to the positive case, and how many uncertainties there are. Inevitably, further questions will assume that no uncertain or negative point has been established previously. For example, further questions may assume that there is an offer even though input has established that the offer has been rejected; they may assume that there is acceptance although input has established otherwise. A user may wish to proceed as if each failure was the only failure; a wholly negative or a wholly uncertain Final result could be attained in this way. The positive result can only be attained if all the nodes in the River system are established, subject to fan processing, where only one positive fanstream is required for each fan.

In the fan processing, negatives and uncertains on the fanstreams not established as positive, are treated as part of the positive case, as happens in legal practice. In the eGanges epistemology, negative case answer input and uncertain answer input in a fan are treated as pro tem positive, until the fan alternatives are exhausted. A pro tem status is shown as feedback by the addition of (Neg) or (Unc) as appropriate beside a node label in the Positive case window. Once a fanstream fails for negative answer input, the default order of questions moves to the top node in the next fan stream; processing of a fan stream will continue if failure is only due to uncertain input, in case there is a negative that will eventually make the fan stream fail conclusively for the assessment of the Final result of all the fan processing. If each fanstream fails for some negative answer(s), then the fan is negative, and the Final result is negative. If all fanstreams fail only due to uncertain answer input and there is no negative answer input, then the whole fan fails for uncertainty.
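The pro tem feedback described above could be rendered as a one-line helper; the function name and answer strings are hypothetical:

```python
def pro_tem_entry(node_label, answer):
    # While a fan's alternatives are not yet exhausted, a negative or
    # uncertain answer on a fanstream is listed pro tem in the
    # Positive case window, flagged (Neg) or (Unc) beside the label;
    # a positive answer is listed unflagged.
    suffix = {"negative": " (Neg)", "uncertain": " (Unc)"}
    return node_label + suffix.get(answer, "")
```

Once the fan is exhausted, such pro tem entries would be relocated to the Negative case or Uncertainties window, as the Current result logic below describes.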

Before an answer to a node question is given, eGanges allows a user to enter, in the Note window, the evidence in the user's case that supports the answer, or other user comment. This is the way in which the user's ontology may be raised; sometimes a client's view of facts does not match the legal ontology of the application, and sometimes the legal potentials that have been realised in the user's case, are not provided for in the limited ontology of legal possibilities in the application ontology. The Report of the consultation will include user Note input. For a legal practitioner, user's notes may be a source of inventive legal argument where there is a new case to be accommodated in the system of rules by some modification. User notes and user answers are the only input available to the user in a consultation.

4.3.3 Current result logic

The eGanges Current result button permits the user to obtain confirmation of the Current result during the use of an application. The Current result may or may not be a Final result; Final results are positive, negative or uncertain and if none of these results have been reached, in accordance with the computational epistemology of 3d legal logic, then the current result is given as the pro tem result, unanswered. The concept of pro tem is used in this way in connection with the Current result, as distinct from the pro tem positive Minor premise of a fan. If the user left clicks with the mouse on the Current result button, then the system will show the current result in the Current result window, according to the adversarial flows of the computational epistemology of 3d legal logic; these flows are activated by user answer input, and their alternative nature may be regarded as the adversarial meta-rule of the eGanges shell epistemology. In summary, there are three sets of heuristics that give effect to this major adversarial meta-rule:

1. If an antecedent in a negative Pole rule is established, then the node label is shown in the Negative case window with the Final result label indicating that the negative case is established or wins; the user's case immediately has a partial but conclusive negative result. The more negative antecedents of Pole rules that are established in the Negative Case window, the stronger is the negative case; if all negative antecedents of Pole rules are established, the case has a wholly negative result. It is likewise if an antecedent in an uncertain Pole rule is established; then the node label is shown in the Uncertainties window with the Final result label indicating that the uncertain case is established. The user's case immediately has a partial but inconclusive Final result. The more uncertain antecedents of Pole rules that are established in the Uncertainties window, the stronger is the uncertainty of the case; if all uncertain antecedents of Pole rules are established, the case has a wholly uncertain result. Pole rules do not effect their consequent if they are constrained by positive fan processing; positive fans may produce pro tem positive antecedents that stop a Pole rule from firing. Some negatives and uncertains are consistent with a positive Final result pro tem.

Fan processing effects conjunction constraints of Pole rules. A positive disjunction (a fan) is equivalent to a negative conjunction, in accordance with de Morgan's laws of logic which state:

not (a and b) = not a or not b

if (a and b) → c, where a and b are necessary and sufficient for c, then (not a or not b) → not c

These laws assist the prior analytics for an eGanges application to ensure the reconfiguration of the ontology of the stated rules of law, as a positive River in an adversarial system of Rivers. Pole rules, which are used to control combinatorial explosion in the computational epistemology of 3d legal logic, are derived from the second de Morgan law:

not a → not c; not b → not c
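The derivation can be checked mechanically. The following truth-table sketch (in Python, as an illustration only) verifies de Morgan's law and the derived Pole rules, reading the rule's antecedents a and b as jointly necessary and sufficient for the consequent c:

```python
from itertools import product

def implies(p, q):
    # material implication: p -> q is equivalent to (not p) or q
    return (not p) or q

for a, b in product([True, False], repeat=2):
    # a and b jointly necessary and sufficient for c
    c = a and b
    # De Morgan's law: not (a and b) == (not a) or (not b)
    assert (not (a and b)) == ((not a) or (not b))
    # Derived Pole rules: not a -> not c ; not b -> not c
    assert implies(not a, not c)
    assert implies(not b, not c)
```

The failure of either antecedent alone thus suffices for the negative Final consequent, which is the logical basis of the unconstrained Pole fan described below.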

As noted in Chapter One, de Morgan (1847) was the tutor of Ada Lovelace, who was concerned with the automation of Combinatorial Analysis. The derivation of Pole rules from de Morgan's laws should not be confused with the requirement of judicial ontology that introduces legal ontology as the subject of legal logic, in a true Major deductive premise; establishing a logical place is known as the Law of Identity, a fundamental Law of Thought in logic, which is existential, such as A is A. Logic space is Absolute, constant space that meets the requirement of truth in the physical world. However, the physical world is not constant. In the computational epistemology of 3d legal logic, the Law of Identity is a meta rule or logic heuristic that gives logic validity in the physical world, and allows logical space to build up to three dimensionality, like physical space.

As a consequence of de Morgan's laws, if a positive River has disjunctions, there will be a corresponding negative conjunction. Pole rules are constrained by this conjunction; they are represented as having one antecedent that is both the interim consequent of the negative conjunction rule which is part of the negative tributary system of rules, and also the antecedent of a negative Pole rule; Pole rules arise from negative interim consequents and from the failure of each antecedent in positive conjunction rules. The several antecedents of the conjunction in a negative rule, that corresponds to the positive fan, are not each also antecedents of a Pole rule. Where the positive River has conjunctions in a rule stream, then the corresponding negative structure is a Pole fan, where each negative antecedent shares the same Pole consequent; each antecedent is unconstrained and as soon as it is established, the Pole consequent, which is a Final consequent, applies. Where there is a positive disjunction, with a corresponding negative conjunction rule stream, until all of the several antecedents of the negative rule, linked in conjunction, are established, the interim consequent of that rule will not be established and therefore the antecedent of the Pole rule will not be established to effect the Pole Final result.

Thus there are two sorts of Polestreams or Pole rules, each of which has only one antecedent: those whose single antecedent is not also an interim consequent of another Tropic River system rule, and those whose single antecedent is also an interim consequent of a rule with which it overlaps. Some Polestreams have antecedents that overlap with antecedents in a Tropic rule, and some have antecedents that overlap with interim consequents in a Tropic rule. In the latter case, an interim consequent effects a Final consequent.

Just as in the positive rule system, a negative conjunction stream may have a fan or a hierarchy of fans to establish any of its antecedents. Three dimensional visualisation of the structures of the corresponding conjunction and disjunction Rivers in an adversarial sphere, would require virtual reality cyberspace and would involve such complexity that a lawyer or other person on the Clapham omnibus or in Plato's cave would only be able to understand that it is complex, and not exactly the details of the complexity. In legal knowledge engineering, this Plato's cave phenomenon also has been a source of the bottleneck in legal knowledge acquisition.

However, the complexity can be managed in a program by processing of the data according to heuristics that give effect to the complexity of what corresponds to the positive fan, for the negative case. Pro tem negative or uncertain Minor premises may accumulate in the positive window until their interim negative or uncertain consequent is established as also the antecedent of a Pole rule which then effects the Pole Final result. In a hierarchy of negative and uncertain fans, the inferencing is similarly withheld, by virtue of the positive conjunction requirements and processing. If a deeply nested positive fan is satisfied, it may trigger inferencing downstream; correspondingly, a negative or uncertain fan, located in deep nesting, may trigger downstream inferencing. Sequential ordering of processing is heuristic and algorithmic; the eGanges shell epistemology is linear in receiving logic input and giving logic output. Even if the user gives answer input at random, or changes answers, the inferencing system makes only the established steps of proceeding in the chosen path, according to user instructions. Adjustments will also be processed upstream if a change to an interim consequent is made and requires this. Antecedents must be established to satisfy their interim consequent; if the interim consequent is changed, then their established antecedents will automatically be reset to unanswered.
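The automatic resetting of upstream antecedents on a change might be sketched as follows; the mapping structure (`antecedents_of`, from an interim consequent to the antecedents that establish it) and the node names are hypothetical:

```python
def reset_upstream(answers, antecedents_of, consequent):
    # If an interim consequent is changed, its established antecedents
    # are automatically reset to 'unanswered', recursively upstream.
    for node in antecedents_of.get(consequent, []):
        if answers.get(node) != "unanswered":
            answers[node] = "unanswered"
            reset_upstream(answers, antecedents_of, node)
    return answers
```

For example, changing an established 'agreement' consequent would reset its 'offer' and 'acceptance' antecedents to unanswered.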

2. If an antecedent in an uncertain Pole rule is established, then the positive case cannot win unless uncertainties are resolved as positive. The user's case immediately has a partially uncertain result. The more uncertain Pole rule antecedents that are established, the stronger is the uncertainty in the case. Uncertainties may be viewed as risks for both the positive and the negative case. If all uncertain antecedents are established, the case has a wholly uncertain result. In the legal domain, uncertainties are included in settlement advice that might resolve adversarial conflict.

3. Until one of the alternative sets of necessary and sufficient positive antecedents is established for a positive result, a Current result cannot be positive. However, it may be tenuously positive, such as 'unanswered' or 'no negative or uncertain results'; the eGanges shell epistemology uses 'unanswered', as the simplest advice indicating that the user has not yet answered enough questions for a Final result.

If the user does not use the Current result button, then the system will show the Final result as soon as a negative or uncertain antecedent appears, respectively, in the negative or uncertain case list; this will indicate a Pole rule antecedent has been established. The Current result will then be partially, but conclusively, negative or uncertain, as the case may be; the label of the positive Final result in the River is shown at the top of the list in the Negative case window, as soon as a node label appears there, or in the Uncertainties window, as soon as a label appears there, if the Final result label has not appeared in the Negative case window.

If a Final result label appears in the Negative case window, all the pro tem negative nodes that have been listed in the Positive case window, that complete the set of necessary and sufficient conditions for a negative Final result, are removed from the Positive case window and relisted in the Negative case window. Likewise, if the Final result label of the Positive River appears in the Uncertainties window, any pro tem uncertain nodes that have been listed in the Positive case window that complete the set of necessary and sufficient conditions for an uncertain Final result, are removed from the Positive case window and relisted in the Uncertainties window. If it is an uncertain Final result, rather than a negative Final result that is first established, then the Current result will be shown as uncertain, whether there is a partial uncertain Final result of the Pole, or a wholly uncertain result. Subsequently, if a negative Final result that is either partial or wholly negative is established, then the Final result label is removed from the Uncertainties window and relisted at the top of the Negative case window list. Current result will change from uncertain to negative, but not from negative to uncertain. Negative Final results override uncertainty. The Final result that is reported in any of the adversarial windows is always at the top of the list of antecedents in that window.
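The precedence among the possible Current results can be summarised in a minimal sketch, assuming hypothetical window lists and a flag for completion of the positive River:

```python
def current_result(negative_window, uncertainties_window, positive_complete):
    # Negative Final results override uncertainty; a positive result
    # requires every necessary and sufficient antecedent to be
    # established; otherwise the simplest advice, 'unanswered', is
    # given.
    if negative_window:          # a negative Pole rule has fired
        return "negative"
    if uncertainties_window:     # an uncertain Pole rule has fired
        return "uncertain"
    if positive_complete:        # all River nodes established
        return "positive"
    return "unanswered"
```

The ordering of the tests encodes the rule that a Current result may change from uncertain to negative, but never from negative to uncertain.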

The communication logic has the effect of transforming the user's pathway, which might zig-zag through the Sphere, into a 2d linear pathway that records any zig-zagging as a colour change of the River nodes. A large scale legal expert system is likely to be deeply nested, so that nodes that are coloured to indicate adversarial significance of the user's case, will be recorded in depth.

4.3.4 Glosses

In the eGanges shell epistemology, Glosses are available to the Builder for construction of an application, and to the user during a consultation of that application, in two ways:

• question notes in the Note window

• strata logic listed in the gloss menu

Questions can be glossed by Builder's hard-coded notes about the question or other related matters, in the Note window, directly below the Questions window. As the Note window is small, it may be necessary to scroll down in the window to see all of the notes.

If there are strata glosses available for the current node during a consultation, this will be indicated by a green icon with vertical and horizontal lines, that replaces the plain green square Build menu icon, at the top right corner above the Rivers window; the Build menu includes facilities to expedite the construction of a River, and also a range of gloss options for each River node. The Build menu icon remains constantly available in the construction mode; the gloss icon only appears during a consultation where glosses are available. In the consultation mode, a right click of the mouse on the gloss icon, produces a menu of glosses that are available, as strata logic associated with the node that is current, indicated by its green colour, in the Rivers window. In this way, the rule maps locate any inductive and abductive logic that is relevant to a deductive antecedent River node; legal induction and abduction are located precisely in relation to the necessary deduction of the rules of law.

In the eGanges shell epistemology, there are two types of strata glosses: inductive glosses and abductive glosses. The major form of inductive gloss is the spectrum gloss, which is a construction option in the Gloss sub-menu of the Build menu. If a Spectrum gloss is selected, it offers, vertically, three sectors, labelled Negative, Positive and Uncertain, respectively, in each of which gradient or iterative lists may be made to identify what items would, and what items would not, constitute the material fact that would establish the current antecedent node in the River: gradients may suggest (1) arguments by analogy, (2) fine distinctions between items either side of the sector borders, (3) where gaps fall in the spectral continuum of negative-positive-uncertain material facts, and (4) how these gaps might be filled. Items in each list need not be related analogously; they may be iterations of material facts from precedent cases which are referenced in the list as authorities for that material fact establishing the current antecedent in the rule.

In the three sectors of a spectrum gloss, three lists may be entered, respectively, one for the material facts, any of which establish the negative antecedent, one for the material facts, any of which establish the positive antecedent, and one for the material facts, any of which establish the uncertain antecedent. For example, to show the material facts for the peppercorn rule in contract law, the negative material fact would be no peppercorns, the positive list would be 1 - n peppercorns, and the uncertain list would be greater than nil and less than one peppercorn. Such a spectrum shows the sector into which a material fact not yet decided, such as 24 peppercorns, is likely to fall, and also suggests other rules for resolving uncertainty, such as whether or not the specks of pepper could be tasted. Spectral arguments may be put that rely on the position of a material fact in the spectrum, or that use the nature of the uncertainty to raise other rules to distribute the uncertain cases into either the positive or negative sectors.
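The peppercorn spectrum can be sketched as a simple classifier; this is a hypothetical illustration of the three sectors, taking the quantity of peppercorns given as consideration:

```python
def peppercorn_sector(quantity):
    # Place a quantity of peppercorns in a sector of the spectrum
    # gloss for the peppercorn consideration rule.
    if quantity == 0:
        return "negative"    # no peppercorns
    if quantity >= 1:
        return "positive"    # 1 - n peppercorns
    return "uncertain"       # more than nil, less than one peppercorn
```

On this sketch, the undecided material fact of 24 peppercorns falls in the positive sector, as the text suggests.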

A spectrum arranged from negative to positive to uncertain, reflects the adversarial structure. If the spectrum was arranged from positive to uncertain to negative it might reflect the gradation of quantity or quality; it may be easier to determine some gaps in this gradation. Sometimes in a gradient, it is apparent that a gap is located in the positive or negative sector of the continuum, or is in the uncertain sector.

Spectrum glosses are not the only form of inductive gloss that is available in the eGanges shell epistemology, but they readily accommodate judgments in which there are findings of material facts that establish antecedents in rules of law.

The remaining forms of gloss may be used for further inductive glosses or for abductive glosses; these remaining forms are:

1. Text gloss: these glosses might state the authority, justification or some explanation, including inter-disciplinary information, for the antecedent of the current node, its position in the rule, the rule itself, or some other information about these matters. Also text information might contain the natural language definition of the label of the glossed node, the natural language statement of the rule in which the glossed antecedent is located, and authorities for these definitions and statements. Any inconsistencies, alternative definitions and statements, critical comment, or reform views, might be included as commentary.

2. Links: the options include inter-node, parallel River, other files, programs, databases, or websites. Examples are as follows:

Links to nodes in other applications are parallel River or legal universe links between Spheres, such as a link from a contract application to a negligence application, giving effect to inter-sphere links in a legal universe, according to the computational epistemology of 3d legal logic.

• Inter-node links: sometimes there is a relationship between nodes in a Positive River that are on different tributaries in the same River system. For instance, in a contract law application, the nodes of misrepresentation and mistake could be viewed as a double spectrum with fine distinctions that can be clarified in relation to each other. An inter-node link permits the spectrum gloss for each node to be considered relative to the other.

• Parallel river links: there may be rearrangements of river systems to suit different perspectives or purposes of an area of expert knowledge. For instance, rivers of legal strategy may be created from the rules of contract law. Gloss links may be created to permit a consideration of other river systems as a parallel logic.

• File, program, database links: a link to another file, program or database may be created; for instance a current node labelled injuries in a negligence application, might be linked to a file with a glossary of injuries. There may also be links to databases of precedent documentation with which the application is concerned. Images such as

Venn diagrams for clarification of the logic sets in the law, or anatomical drawings for understanding injuries, may be linked; arithmetic calculation programs that might be needed in tax applications, may also be linked to carry out legal formulae. An epistemic logic program (Meyer and Hoek, 1995) which deals with the logic of knowing certain things in relation to other things, might be useful in expanding or verifying evidence as what is known. Enhanced PROLOG has been used to effect epistemic logic programming as a form of modal logic. Programs of other forms of modal reasoning, such as Hohfeld categorical logic, deontic logic and action logic, might also be linked as enhancements of an application.

• Website links: An example of an inductive link to a website is where there is a link from a River node such as a statutory concept of threatened species in environmental protection legislation, to a list of such species on a government website. An abductive link might be to the exact wording of the section of the legislation in which the concept occurs, in a database of legislation such as in AUSTLII. eGanges applications could also act as front ends and be given links to primary source law in legal databases such as AUSTLII, LEXISNEXIS and WESTLAW.
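The gloss and link options above can be pictured as data attached to a River node. The following Python sketch is purely illustrative: the class names, fields, and the glossary file name are assumptions, not the actual eGanges data model.

```python
# Hypothetical sketch of a glossed River node; class and field names
# are illustrative assumptions, not the actual eGanges data model.
from dataclasses import dataclass, field

@dataclass
class Gloss:
    kind: str        # e.g. "text", "inter-node", "parallel-river", "file", "website"
    target: str      # node label, file path, or URL
    note: str = ""   # authority, definition, or commentary

@dataclass
class RiverNode:
    label: str
    glosses: list = field(default_factory=list)

# A negligence node glossed with a file link and a text authority
# (file name and note are invented for illustration).
injuries = RiverNode("injuries")
injuries.glosses.append(Gloss("file", "glossary_of_injuries.txt"))
injuries.glosses.append(Gloss("text", "injuries", "statutory definition and authority"))
print([g.kind for g in injuries.glosses])  # ['file', 'text']
```

A shell following this shape could dispatch on `kind` to open a file, follow a website link, or display a text gloss.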

In legal epistemology, precedent material facts do not always conform to a spectral model; there may be several criteria requiring consideration. It may be possible to use several spectra in sequence, so that each of the several spectra is considered, in a sequence that gradually reduces the gradients to a list of material facts for each adversary; otherwise a 3d matrix might be required. There also may be a taxonomy tree to determine hierarchical categories of the three sectors of an inductive spectrum, or an evidentiary decision tree or River for each sector of the spectrum. The 3d Sphere of legal logic may contain inductive and abductive gloss forms that assist determination by the user of deductive answer input.

The gloss forms available provide for the different forms of definition in the field of logic that permit different forms of logic processing. The categories of definition may assist the prior analytics of legal knowledge engineering to determine appropriate glosses. The definition categories are listed and well defined by Baum (1996, pp.572-600); briefly, they are as follows:

1. Synonyms: one-word definitions where there is sufficient overlap in meaning between the definiendum (what is defined) and the definiens (what defines it); the synonym, as definiens, stands in the place of the definiendum.

2. Enumerative: there are two kinds of enumerative definitions: (1) ostensive definitions, constituted by a list of examples in which the definiens share common characteristics but also have different characteristics; and (2) denotative definitions, constituted by a list of examples that together correctly cover all the characteristics of the definiendum, although some of the definiens in the list do not have all the characteristics of the definiendum and may also have different characteristics. The sum total of denotative characteristics is called the extension of the definiendum.

3. Connotative: the definiens provides the essential characteristics of the definiendum to which only the definiendum can be applied. The sum total of these characteristics is called the intension of the definiendum. Connotative definitions may be given by genus and differentia, as in a Porphyry tree.

4. Operational: the definiens is a procedure for determining whether or not the definiendum applies.

5. Stipulative: new meaning of the definiendum is introduced by the definiens. Legal definitions may be of this kind.

6. Precising: any of the above kinds of definitions used to eliminate ambiguity and vagueness without distorting the meaning of a definiendum or creating circularity.

7. Persuasive: any of the above kinds of definitions used for a purpose, by distorting the meaning of a definiendum for that purpose.

A distinction is also made between affirmative definiens, which state applicable attributes, and negative definiens, which state attributes inapplicable to the definiendum. What is inapplicable might be extensive, so affirmative definiens are generally more efficient.
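The contrast between intension and extension in the definition categories above can be sketched computationally. In the following Python sketch, the characteristic names and instances are invented for illustration: intension is modelled as a set of essential characteristics, and the extension is computed as the set of instances to which every essential characteristic applies.

```python
# Illustrative sketch only: intension modelled as a set of essential
# characteristics, extension computed from invented instances.
intension = {"rational", "animate", "corporeal"}   # connotative definiens

instances = {
    "Socrates": {"rational", "animate", "corporeal", "mortal"},
    "statue":   {"corporeal", "inanimate"},
}

# The definiendum applies only where every essential characteristic holds.
extension = {name for name, traits in instances.items()
             if intension <= traits}
print(extension)  # {'Socrates'}
```

This also illustrates why affirmative definiens are more efficient: the subset test checks a short list of applicable attributes rather than an open-ended list of inapplicable ones.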

Spectrum glosses may provide finer definition of rule antecedents in a particular way; they are suited to lists of synonyms, and to enumerative definition by ostensive and denotative examples. Venn diagrams may be useful glosses to explain the overlapping of definiens, or the overlapping sets of characteristics concerned, particularly for connotative definitions.

To capture both enumerative and connotative material facts, River nodes may also be linked to Porphyry trees to locate material facts, and their source, in a taxonomy. Material facts may occur at any level of a Porphyry tree; they may be the root node, a genus, differentiae, species, or an instance at the bottom of the tree. From the relationships between the categories in a taxonomy tree, it is possible to argue upward from any of the specific instances to the all-encompassing universal; this might be regarded as inductive predicate reasoning:

All mortals have substance.
All immortals have substance.
Therefore all mortals and all immortals have substance.

Alternatively, deductive reasoning might be possible in applying the hierarchical taxonomy to an instance of it: All mortals have substance. Harry is mortal. Therefore Harry has substance.
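The deductive step can also be stated formally. The following Lean 4 sketch of the Barbara form of the syllogism uses hypothetical names (`Being`, `Mortal`, `Substance`, `harry`) that are assumptions for illustration, not part of the thesis's own formalism:

```lean
-- Illustrative sketch only; Being, Mortal, Substance and harry are
-- assumed names for the syllogism's terms.
variable (Being : Type)
variable (Mortal Substance : Being → Prop)

theorem harry_has_substance (harry : Being)
    (h1 : ∀ x, Mortal x → Substance x)   -- All mortals have substance.
    (h2 : Mortal harry) :                -- Harry is mortal.
    Substance harry :=                   -- Therefore Harry has substance.
  h1 harry h2
```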

As an hierarchical classification of inductive instances of ‘substance’, the Porphyry tree might be set out as a 3d matrix spectrum, like the filing cabinet model shown in Figure 4. or Conover's cube in Figure 4.31. All nodes below the root are predicates of the root, and nodes below other nodes are predicates of those above; some are also predicates of other predicates, horizontally or diagonally. Some predicates entail other predicates, given the taxonomy. The categories are intensional from the root to its intensional sub-categories; the lowest level of instances is extensional. In the original Porphyry tree, all the categories necessarily fall within the concept of substance. They include opposites that are contradictories, so that there are no excluded middle categories; the substance tree is comprehensive, covering all possible existences. Instances of any category are necessarily part of the system of categories; they are located, as a matter of predicate deduction, at the extensional levels at the bottom of the tree. Branching of the Porphyry tree separates sets of intensional and extensional items that might establish the material fact of the universal, or a genus that establishes the material fact and excludes the sets in other branchings. The genus and differentiae levels of the original Porphyry tree are the intensional levels of the root node, indicating connotative definition. The categories of the original Porphyry tree are intended to cover all possible instances; everything is either corporeal or incorporeal, every corporeal is either animate or inanimate, and so on. It would not be possible to establish No substance.

If the root of a tree represents a static whole, such as substance in the original tree of Porphyry, then reasoning may proceed from the starting point of the modern tree root node towards the leaves, to see the categories in the taxonomy. If something is to be categorised within the static whole, then it may be tested through the system of defined categories to see if and where it fits. This requires backward chaining; the search looks from the widest universal to the narrowest specific categories, in order to locate the relevant category of the thing in question. A hierarchy of defined categories follows, or can be inferred, from the root. The knowledge structure of the Porphyry tree implies a flow through disjunction of pathways based on the mutually exclusive contraries and contradictories in the knowledge.

If the root of the taxonomy tree is the material fact that establishes a deductive antecedent in a rule of law, then the root is the goal; but in a taxonomy tree, there is no single starting point to proceed to that goal; a selection would need to be made from the leaves, as facts of the user's case, to find the alternative pathways to the root. This requires forward chaining; the reasoning proceeds by finding the applicable detail of a leaf and working through further details of the categories in the pathway from that leaf until the root is found as the conclusion that is the material fact.
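The forward chaining described above, from a leaf of the taxonomy up to the root as goal, can be sketched on the classical Porphyry tree. In this Python sketch the parent map and function name are illustrative assumptions; the categories follow the traditional tree of substance.

```python
# Sketch of the classical Porphyry taxonomy as a child-to-parent map;
# the dict layout and function name are illustrative assumptions.
parent = {
    "corporeal": "substance", "incorporeal": "substance",
    "animate": "corporeal", "inanimate": "corporeal",
    "sensitive": "animate", "insensitive": "animate",
    "rational": "sensitive", "irrational": "sensitive",
    "mortal": "rational", "immortal": "rational",
    "Plato": "mortal",
}

def forward_chain(leaf):
    """Work forward from a leaf (a fact of the user's case) to the root."""
    path = [leaf]
    while path[-1] in parent:
        path.append(parent[path[-1]])
    return path

# From the instance Plato, the root material fact 'substance' is reached:
print(forward_chain("Plato"))
# ['Plato', 'mortal', 'rational', 'sensitive', 'animate', 'corporeal', 'substance']
```

Backward chaining would run the same structure in the opposite direction, testing an unclassified thing against the categories from root towards the leaves.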

In the field of logic, the first Law of Thought, the Law of Identity, A is A, establishes the constraint of logical space, similar to the physical constraint of space, as a fundamental requirement of logic. Identification or definition confirms the content of a logical space. By adopting the space constraint as if logical space were physical space, the verification of logical reasoning should be applicable in the physical world. However, as Jones (1911) pointed out, the Law of Identity is not axiomatic; she explains that A is A establishes nothing about B. Accordingly, the other Laws of Thought, namely the Law of Contradiction and the Law of the Excluded Middle, are not derived from the Law of Identity. In terms of logic space, the Law of Contradiction states that A cannot occupy the same logical space as not-A; the Law of the Excluded Middle states that A and not-A have separate identities which cannot both occupy the same logical space; they each occupy different logical space and thus something is either A or not-A.
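The Law of Contradiction and the Law of the Excluded Middle can be checked mechanically under the assumption of classical two-valued logic. The following Python sketch is only an illustration of that bivalent case:

```python
# Classical two-valued illustration of two Laws of Thought; bivalence
# is an assumption here, and the sketch says nothing about identity.
for A in (True, False):
    assert not (A and not A)   # Law of Contradiction: A and not-A cannot co-occupy
    assert A or not A          # Law of Excluded Middle: everything is A or not-A
print("both laws hold for every bivalent A")
```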

Jones suggests an alternative to the Law of Identity which she calls the Law of Significant Assertion, as a Law of Thought that is presupposed by the use of the Law of Contradiction and the Law of the Excluded Middle. The Law of Significant Assertion requires an assertion of a relationship between A and B, such as A is B or A is not-B; it is defined by Jones (1911, p.66) as a definitional assertion for the purposes of predicate logic, thus:

Every subject of predication is an identity (of denotation) in diversity (of intension).

This means that a root node material fact, as a subject of predication, provides intensional diversity for the location of identities of instances at the bottom of the Porphyry tree; selected characteristics in the diversity defined by the genera and differentiae that lie between the root node and the bottom level of instances define the instances.

The effect of the Law of Significant Assertion is that the taxonomy of the Porphyry tree must first be established in toto by ontological assertions, before it can be used inductively or deductively. The holistic logic of Toms (1991) reinforces this requirement by looking at difference in identity, but it is more concerned to test consistency in a holistic entity of relative existences and no existences; his ideal of holistic logic accommodates legal logic, which also treats adversarial contradictories as each having logic places. In the legal domain, premises mean a place or places.

Toms, who set out to explore the relationship between consciousness and matter, for a resolution of idealism and realism, is particularly concerned with overlapping logical identities, and relative negation, that qualify consistency and prevent erroneous ontological presuppositions in a logical system. He initially compares his holistic logic to standard mereology, and to set theory illustrated by Venn diagrams, in a table which is shown as Figure 4.32.

Free logic (Lambert, 1991) is defined as presupposition-free logic in the existential import of a deductive premise. It may be useful in prior analytics of legal knowledge engineering, especially to ascertain existential presuppositions of the rules of law which might need to be taken into account in proving material facts, and in glossing the phenomenon where it may occur. Complex legal definition may require careful determination and logical arrangement of legal ontology, according to the language that sets it down.

Figure 4.32: Toms, E.: Holistic logic: A formalisation of metaphysics, second ed., published by the author, Edinburgh, Scotland, 1991, p.8.

Mereology is derived from the Greek word for part, meros; it is the study of wholes and parts, and of the plan or scheme of parts. It may be useful in the development of prior analytics of legal knowledge engineering, as a form of systems analysis and design. Ramus includes parts in his logic method for judgment, as distinct from invention; for instance, antecedents and consequents are parts of syllogisms. A taxonomy consists of kinds of parts. Lesniewski (1886-1939) introduced mereological studies to relativize semantic concepts as a matter of linguistic stratification, and to deal with the implications of relations between parts. There are mereological categories of parts by which to logically order ontology (Simons, 1987), such as proper part, improper part, overlapping (having a part in common), disjoint (not overlapping), mereological product (the intersection of overlapping objects), mereological sum (a collection of parts), mereological difference, mereological complement, universal sum, and atom (that which has no proper parts); different categories have different implications relative to each other.
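Several of the mereological categories listed above can be sketched by modelling objects as finite sets of atoms. Treating objects as sets is a simplifying assumption (full mereology does not require atomism), and the function names are illustrative:

```python
# Sketch of mereological relations modelled on finite sets of atoms;
# the set-based modelling is a simplifying assumption.
def is_part(x, y):          return x <= y    # part (improper part allowed)
def is_proper_part(x, y):   return x < y     # proper part
def overlap(x, y):          return bool(x & y)   # a part in common
def disjoint(x, y):         return not overlap(x, y)
def product(x, y):          return x & y     # mereological product
def mereological_sum(x, y): return x | y     # mereological sum

a, b = frozenset("abc"), frozenset("bcd")
assert overlap(a, b) and product(a, b) == frozenset("bc")
assert is_proper_part(frozenset("ab"), a)
assert disjoint(a, frozenset("xy"))
assert mereological_sum(a, b) == frozenset("abcd")
```

The different implications of the categories relative to each other show up directly: for instance, any proper part is a part, and overlapping objects always have a non-empty product.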

In the legal domain, it is material facts that establish rule antecedents. Sometimes the findings of fact in a precedent case determine material facts that are suited for inclusion in a spectrum list. If a legal practitioner wishes to access a precedent case to see how a material fact was proved, or how an antecedent was established, Wigmore charts of the evidence in the precedent case would be useful glosses; they may show how the facts that are not material facts, the mere facts, established the material facts (cf. Anderson and Twining, 1991). This is also useful to clarify the rule of evidence, res ipsa loquitur (the thing speaks for itself), the use of which usually has to be learned in legal practice. For instance, if scaffolding at a work site collapses, injuring a worker, then it is not enough to provide evidence of the collapsed scaffolding. The collapsed scaffolding is not a thing that ipso facto establishes negligence. Until there is evidence of how the scaffolding collapsed, there is no evidence of the point of negligence and of who was responsible for that particular point of care required. If there is evidence of a significant crack in the scaffolding just prior to its collapse, that might be enough to invoke res ipsa loquitur; however, it is still prudent to call an expert on scaffolding to confirm that such a crack would of necessity produce a collapse of the scaffolding, and that there was no indication of other faults that could have produced the collapse. The nature of the crack, how it occurred, and how it could have been prevented, would also have to be explained by the expert witness.

In the field of legal knowledge engineering, Bex and Prakken (2004) applied arguments in dialogue to evidential reasoning, using graphical illustrations of arguments and argument moves. eGanges applications provide a map for the parts of evidential reasoning that might apply in the application of the rules of law to a user's case.

Logic moves involve changes in logic space; changes in the logic space position of logical units such as antecedents, can be shown as a sequence of changes in Absolute logic space. Argument moves, which are four dimensional, can be shown as a sequence of different three dimensional logic states, in much the same way as movie film. Geometric ideographs can now replace algebraic formal logic. In the eGanges shell epistemology, a change of node colour indicates a change of argument, and sometimes these changes may be the automated consequence of a user's change of answer. However, eGanges maps showing different arguments can be used to make up a sequence of changes in argument or conflicting arguments, as argument moves.

Porphyry trees of inductive instances may arise from facts and findings in case authorities; predicate trees may be a structure of inductive instances that ultimately define antecedents in the rules of law. The legal domain uses inductive logic in the form of spectra, but sometimes it requires taxonomy trees, as a gloss on its adversarial propositional deduction in the form of Rivers.

Text glosses are suited to stipulative, precising and persuasive elaborations of antecedent definitions, as they allow specification of what is required. Operational definitions of material facts that are not already provided in the rules of law, may require a link to an evidentiary River, or some other appropriate knowledge structure. Antecedents in Rivers or trees may represent Horn clauses, as well as isomorphic rules. In finding the material facts that satisfy antecedents of rules of law, Horn clauses are appropriate because they may assume the material fact, whereas, jurisprudentially, a rule of law should not be assumed.

The concept of inference in Ritchie’s pre-computing statement of backward and forward reasoning in logic could cover both induction and deduction. Evidentiary taxonomy trees may be used deductively, and antecedents in a set of deductive Horn clauses may be defined inductively. Inference engines were designed to carry out the subtle shifts between induction and deduction that might occur during a process of extended deduction that involved inductive definition and selection. Backward and forward reasoning was used as a paradigm in artificial intelligence for data structuring and processing. Trees were seen as static structures of knowledge, to be searched, according to search heuristics, forward, backward, sideways, or some combination of these, according to a program epistemology that might determine depth-first, breadth-first, or some combined sequence of depth and breadth searching. Searching subsumed reasoning in semantic matching. However, inference engines only effected deduction and its attendant induction when the branching represented the required deductive/inductive relationship between connected nodes; prior analytics is required to ensure that the semantics in the static knowledge structures permit simulation of inferencing by the chaining. Backward chaining may be useful in prior analytics (cf. Dewey, 1924, p.23).
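A minimal inference engine of the forward-chaining kind discussed above can be sketched over propositional Horn clauses. In this Python sketch the rule base (offer and acceptance yielding agreement, and so on) is invented for illustration and is not taken from any actual statute or eGanges application:

```python
# Minimal forward-chaining sketch over propositional Horn clauses;
# the rule base is invented for illustration only.
rules = [
    ({"offer", "acceptance"}, "agreement"),
    ({"agreement", "consideration", "intention"}, "contract"),
]
facts = {"offer", "acceptance", "consideration", "intention"}

changed = True
while changed:                      # fire rules until a fixed point is reached
    changed = False
    for body, head in rules:
        if body <= facts and head not in facts:
            facts.add(head)
            changed = True

print("contract" in facts)  # True
```

Backward chaining would instead start from the goal "contract" and recursively seek rules whose heads match the outstanding sub-goals.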

In data processing, the meaning of the data in the nodes of a tree, and the relationship between the nodes, may determine whether or not the sequence of decisions represented by the tree has some logical significance greater than the defined taxonomy of a decision-making process. Trees may represent defined taxonomies or the hierarchical logic of a complex decision-making process, as the basis for backward, forward and/or sideways chaining, involving sequences of depth and breadth heuristics that does or does not follow logical inferencing; they may employ various epistemologies. Whatever is required to simulate an operational definition of a material fact must be considered to prepare an appropriate gloss; such definitions might become increasingly used in scientific areas of law, such as pollution control legislation.

Search epistemology, upon completion of matching user input to a knowledge structure, may effect some logical inferencing process that may be deductively necessary, inductively definitional, or abductively more or less weak or strong. Alternative possible pathways through the knowledge structure may or may not be comprehensive. Leaves at the end of these pathways may be homogeneous, or differentiated so that each is unique, or belongs to one or several categories. Nodes may contain data that is a single antecedent in an evidentiary rule to be applied deductively, or some combination of deductive, inductive and abductive premises. Knowledge structures have the potential to accommodate various epistemologies, as qualifications of a decision-making taxonomy.

Where a tree is a decision tree rather than a taxonomy tree, the root node is the starting point of the decision-making process. The leaves are the final results of each path of branching. Each possibility must be set out and the leaves do not have to be different from each other. Given the semantics of the node information and their relationships represented by the branches, it may be necessary to duplicate the content of a node in a node that is located in another position in the tree. An example of duplication can be seen in Figure 2.24, in the repetition of the node i. It is also to be noted in regard to the tree in Figure 2.24, that a node without conjunction or disjunction, such as node c, which indicates conformity with the positive prima facie River, may have a single branch to another node, representing the next stage in the decision-making process. Nodes with single branches offer no consistent disjunctions, that is, no alternatives that are all consistent with attaining the Final positive result of the prima facie River.

In the case of a taxonomy tree, if the purpose of the search is to understand the place in the taxonomy of an unclassified member of a sub-category, then the search might proceed backwards through the tree, as through a list of attributes of categorisation, until the sub-category that applies is determined as a leaf result. Alternatively, for instance, in Porphyry's original taxonomy tree, a leaf, Plato, might be the starting point to discover his characteristics that are matters of substance. If the interest is in immortal, and immortal connects to rational, then the decision tree would next connect to rational. If the interest was not in rational, then the next characteristic would be sensitive, and so on. The Porphyry tree does not reflect an operational definition. However, a decision tree may represent an operational definition, even if parts of it involve a Porphyry tree structure.

As indicated by Winston's trees, for a decision tree, the root node or a leaf also may be the starting point to search the tree; thus a leaf or the root node of a decision tree may be the Final result or goal, not the universal or an instance of a Porphyry tree. The leaves as alternative starting points, or alternative goals, are independent rather than necessarily intensional or extensional. It is the same for the River with its Final result and pathways of antecedents; the Final result does not encompass, intensionally and extensionally, all of the antecedents or decision nodes, even though they are a set of related parts. Antecedents may be categories related to effect decision, choice or contingencies, not categories of existence, even though an existential assertion may be the basis of effecting decision, choice or contingencies.

The Final result of a River is not a universal but a goal; any of its leaves are its starting points. The processing is forward chaining only, to find the decision outcome as the root node of the River, that the Final result is established or not; a leaf or antecedent is not the Final result of processing the River, as it might be in a tree. For instance, in a decision tree, which is also a River, to select an employee for a job, the criteria for appointment would be considered in a predetermined sequence as the antecedents to be satisfied in order to reach the selection decision of interview or not; if only one applicant qualifies, it may not be necessary to consider others for interview. The decision tree captures the taxonomy of the decision-making process itself; the epistemology of the knowledge structure of a taxonomy tree depends on the intensional and extensional definition of the universal and is not focussed on the process of a decision. The taxonomy of a Porphyry tree, in a sense, provides a foregone conclusion of the universal or the leaf, as determined by the taxonomy which covers all possibilities by excluding the relationships between contradictories; in a decision tree that conforms to a River, the process of an operational definition determines the conclusion. In either case, the conclusion might be a material fact that establishes an antecedent in a rule of law.
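The job-selection example can be sketched with tri-valued answers of the eGanges kind. In this Python sketch, the aggregation rule (any negative answer defeats the Final result, any uncertainty otherwise suspends it, and only all positives establish it) and the criteria names are assumptions for illustration, not the shell's actual processing:

```python
# Sketch of the job-selection decision River with tri-valued answers;
# the aggregation rule and criteria names are illustrative assumptions.
def final_result(answers):
    """Conjunctive antecedents: No defeats, Uncertain suspends, else positive."""
    if "No" in answers.values():
        return "negative"
    if "Uncertain" in answers.values():
        return "uncertain"
    return "positive"

criteria = {"qualifications": "Yes", "experience": "Yes", "referees": "Uncertain"}
print(final_result(criteria))  # uncertain
```

Because the goal is the root and the antecedents are considered in a predetermined sequence, this is forward chaining only: answers flow from the leaves towards the Final result.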

The propositional logic of a River, rather than the predicate logic of a Porphyry tree, can accommodate the independent parts of a system of evidence raised by an operational definition. As in the system of deductive rules, a system of evidence, determined by an operational definition, may be represented as a tree or a River, so that it is possible to convert certain trees to a River or a River to a certain tree, and thus to convert the ontology of a tree to a River and vice versa. However, the ontology of possibilities of evidence may be unmanageable, so the evidentiary tree, like the River, should be limited to the selected ontology of positive possibilities, managed by an evidentiary eGanges River that is linked as a parallel River to the law application. A River has one Final result only; node labels are unique, consistent with the Law of Identity, and arcs to indicate conjunction in and/or trees are linear in Rivers. The consequent of the evidentiary antecedents of an arc in an and/or tree is located after the sequence of antecedents in a River. Like a tree, there is branching in a River and available processing flow in alternative paths. The flow in a tree is in a selected path through disjunction; in a River, backtracking is required to satisfy all streams in the River, except where there are fans; a choice of pathways in the River is based on mutually exclusive contraries. If trees have disjunctions based on contradictories as well as mutually exclusive contraries, then these aspects of the tree cannot be included in its transformation to a prima facie River; contradictories are not consistent with the positive final result. In accommodating contradictories, trees must use conjunction arcs that complement positive disjunctions without arcs, in the same tree.
The tree structures that convert to River structures must be selected, and the remainder left to eGanges processing of the River; interim disjunctive structures and cumulative conjunction structures in a River, may terminate the sequence of evidentiary antecedents to be considered beyond the consequent of the arc in a tree. Provision must be made for these structures in the transformation of a tree to a River.

An evidentiary tree would be required for each sector in a spectrum of material facts, to capture the ontology of evidentiary possibilities, and given the overlapping of possibilities in a combinatorial explosion, this would be difficult to draw as a tree, even as a three dimensional, three node tree, or a sphere, even if the sphere were flattened. A 3d tree structure still raises the problems of how to accommodate Pole rules, pro tem phenomena and neutral nodes as well as how colour coding might assist logic processing and user cognition. The 3d tree might be effectively transformed into a Star of Rivers. An operational definition is likely to require heuristic processing of its contradictories and uncertainties, relative to its positives, in order to establish the material fact that is its conclusion.

The legal system is only feasible if evidence is kept separate from law, and material facts in cases are kept separate from antecedents in rules of law. Otherwise the ontology of potentialities suggested by mere facts of precedent cases may be unmanageable, whereas the ontology of possibilities of established material facts may be determined in the same way as the ontology of legal possibilities, for rules of law.

There are similarities and differences between trees, but some trees can be transformed into prima facie Rivers. The eGanges shell epistemology may process transformable trees as Rivers. Gloss trees of this sort, as links, may provide evidentiary supplements to an eGanges application.

Plug-ins such as negotiating aids may also supplement eGanges applications. Where consultations are exchanged between negotiating parties, differences can be identified for resolution by exchange of benefits or detriments which they represent. Notes on suggested exchanges may be considered so that complex conflict may be avoided or settled by micro solutions.

A strategic application may be needed to avoid potential conflict in a transaction, such as the two party Convention application, where both parties are accommodated in the one application. The buyer's obligations are what the seller wants, and vice versa. Seeing the occasions for successful litigation is also seeing how to avoid it.

Various commentary may be relevant as abductive strata logic, and suitable glosses may be used from the range available. If there is non-monotonic or irregular but related logic that is relevant, in some way other than evidence that establishes a material fact, it may be included in an application at any precise point of relevance, as an abductive gloss.

A gloss that links nodes in different rules in the same tree may represent a complex interweaving that can be delineated as gloss arguments. Glosses may limit or modify rules. Reasons for a rule of law are abductive logic. Sometimes the reasons include concepts that have paradoxical or circular limits. For example, the concept of love in the reason for a duty of care, the biblical commandment, love thy neighbour, is an inherently circular concept: the greater the love, the closer love grows to being harmful, while the lesser the love, the closer it also grows to being harmful. These problems of balance make such concepts unsuitable as antecedents in rules of law, but useful for motivating approval of and compliance with the rule of duty of care.

Strata glosses indicate fields of abduction. Through its glosses, an application may have a blend of additional and various ideographs, including icons such as traffic or industry signs, pie charts, Pareto columns, scattergrams, and technical drawings. Photographs of equipment or places may also be added. As well as prior analytics, there is considerable scope for design of an application for automation and communication that is a matter of legal knowledge engineering methodology. The eGanges shell allows logical attributes of legal epistemology to be designed as an interactive visualisation, and provides a communication system that, by computation, can convey the complexity of legal epistemology within the scope of ordinary human cognition, for subjects of the legal system to be users. For the purposes of developing legal knowledge engineering jurisprudence, the art of application construction might also be developed. The legal domain grew out of the art of rhetoric, and now, in the field of technological jurisprudence, it reaches into the visual arts.

Fraunce began with 2d geometric representations of non-monotonic legal argument; the eGanges shell epistemology, via 3d representations of distinctions between legal deduction, induction and abduction, provides 2d extended deduction maps that can be processed for adversaries, as well as locating relevant induction and abduction, including evidentiary logic.

The developments of legal epistemology and legal ideographs, which began and were integrated in the work of Fraunce, thereafter took separate paths, and are now reintegrated in the eGanges shell epistemology; in their reintegrated form, they fall into the field of legal knowledge engineering methodology where they may undergo further separate or integrated development. The reintegration of the eGanges shell epistemology reinstates 2d legal ideographs, but in a processible form, due to the metaphysical story of Abbott which coaxed ordinary people to see better their position in Plato's cave, and founded the introduction of three dimensional logic space to jurisprudence. The suppressed computational epistemology of 3d legal logic, could then be explored and compressed for the practical constraints of the eGanges shell epistemology. Perhaps further development will establish the interactive visualisation of 3d legal logic as an art show for the development of human intelligence suited to a space age.

4.3.5 Communication by Notes

The Note window may be used in several ways:

1. It may provide output advice about the question or about the best order in which to select antecedents for input where the user has departed from the default order;

2. It may offer the user an opportunity to type in data, such as evidence that details how an antecedent may be proved in the user's case;

3. It may be used to advise the user on absurdities or impossibilities in the user input, if a certain answer is chosen, given prior answers.

4.3.6 Build Functionality

In order to build an application, the expert is required to construct:

● a River in the Rivers window;

● questions typed in the Questions window;

● any adjusted answers on the answer buttons to replace the default answers;

● any Notes required in the Note window;

● any information in the Introduction page, which opens for typing on selection of Build - New River, and then Build - current introduction. There is one button to toggle between Build and Stop Build, and one button to toggle between Consult and Open. However, the Options menu is currently under review and this may be simplified soon.

The Build button, which is the plain green icon at the top right hand corner of the Rivers window, allows the user to develop nested tributaries of a River as rule or procedure maps, and glosses that pertain to antecedents or procedural points on the rivers; the menu drops over the adversarial windows, which are not required in Build mode. Node Questions, possible Answers to the Questions, and notes that pertain to the Questions are constructed in the Questions window, on the answer buttons and in the Note window respectively, simply by placing the cursor there and typing what is required. Default answers appear on the answer buttons as No on the negative button, Yes on the positive button and Uncertain on the uncertain button. If the order of Yes and No is to be changed, new answers are typed over old answers. Answers to questions of neutral nodes are all located on the three positive answer buttons; a left click on a positive answer button that has no default answer on it automatically moves all answers to the positive answer buttons. Other forms of answers can be used by changing default answers as required; for instance, if commands are used instead of questions, answers might be Not done for negative, and Done for positive. The HELP option provides the user manual for further clarification of functionality in Build and Consult modes.
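The answer-button behaviour described above can be sketched as a small data structure. This is an illustrative sketch only; the class, function and attribute names are hypothetical and are not drawn from the eGanges implementation.

```python
from dataclasses import dataclass, field


def default_answers() -> dict:
    # Default labels, as described: No / Yes / Uncertain
    return {"negative": "No", "positive": "Yes", "uncertain": "Uncertain"}


@dataclass
class Node:
    """A River node with its question, note and answer-button texts (hypothetical)."""
    label: str
    question: str = ""
    note: str = ""
    answers: dict = field(default_factory=default_answers)

    def set_answer(self, polarity: str, text: str) -> None:
        # Typing a new answer over an old one, as in Build mode
        self.answers[polarity] = text


# Command-style answers in place of question answers, as suggested in the text
node = Node(label="notice served", question="Serve the notice.")
node.set_answer("negative", "Not done")
node.set_answer("positive", "Done")
```

A freshly built node carries the three defaults until the expert types over them.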

In eGanges, strata logic may be added as glosses on an antecedent node. A right mouse click on a selected node will permit the user to build glosses. During the construction of an application, the Builder may navigate the nested maps and may switch to Consult mode. The Consult button enables the user to consult an application that has been built, by (1) navigating the nested River tributaries, (2) accessing the glosses, questions and notes, (3) providing input by selecting from the available answers and by typing input data in the Note window, and (4) receiving feedback, following the input, in the Negative Case, Positive Case, Uncertainties and Current Result windows.

The Report button produces a Report of a consultation, which lists the antecedent labels that have been recorded in the positive, negative and uncertainties windows, together with their questions, the user's answers, Notes advice, user Note data input, and the Current result. There is always a current result of a consultation, which may be a pro tem result, or unanswered if the consultation is incomplete. A consultation may be saved and resumed, and an application construction may be saved and resumed. Work on an application may produce cumulative benefit; whatever has been constructed may be useful.

4.3.7 Conveniences

Certain convenient buttons are available in both Build and Consult modes. There are options to go Back and go Forward, Undo, Redo, Save, Print, Search, Clear, and Close. The Save options allow an application construction, a consultation, a screen or a map to be saved. An application and a consultation are saved as a .gan file, and a screen or a map may be saved as a graphics file, initially as a .bmp file that can then be converted to a .gif file. Clear is useful only in Consult mode, where a consultation may be cleared after it has been saved as a .gan file under its own file name.

4.3.8 Communication system heuristics

The five alternative Final results of the computational epistemology of 3d legal logic are reduced to three: (1) negative, which includes both partially and wholly negative, (2) positive, and (3) uncertain, which includes both partially and wholly uncertain. Answer input triggers sorting of the current node label of the question for which the answer input was given. The sorting gives effect to the flows of the Star of adversarial rules that is activated by the answer; the adversarial label of an answer indicates whether the answer is a selection from consistent, contradictory or uncertain disjunctions. Answers are consistent with the positive Final result if labelled positive, contradictory to the positive Final result if labelled negative, and an uncertain disjunction if labelled uncertain. If the current node is in a fan on the prima facie positive River shown in the Rivers window, it has a double disjunction position, and is subject to disjunction heuristics that ensure that the user may choose only one fanstream to reach a positive result. Contradictory and uncertain answers may be given in other fanstreams and these are still positive pro tem.
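The sorting heuristic just described can be sketched in outline: an answer's adversarial label selects the window, except that within a positive fan a contrary answer is held pro tem in the positive list with a (Neg) or (Unc) prefix. A minimal sketch, with simplified function and window names that are assumptions, not part of eGanges:

```python
def sort_answer(label: str, answer: str, in_fan: bool, windows: dict) -> None:
    """Sort a node label into an adversarial window by its answer label."""
    if answer == "positive":
        windows["positive"].append(label)
    elif in_fan:
        # Pro tem: the fan may still succeed via another fanstream
        prefix = "(Neg) " if answer == "negative" else "(Unc) "
        windows["positive"].append(prefix + label)
    elif answer == "negative":
        windows["negative"].append(label)
    else:
        windows["uncertainties"].append(label)


windows = {"negative": [], "positive": [], "uncertainties": []}
sort_answer("written contract", "positive", False, windows)
sort_answer("oral variation", "negative", True, windows)   # fan: held pro tem
sort_answer("notice posted", "negative", False, windows)
```

The later re-sorting of pro tem labels, once a Final result is established, would operate on these same lists.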

Uncertainties are the risks of losing for the positive and negative cases. They are more likely to support the negative case, because the positive case usually, but not always, carries the burden of proof. For the positive case to win in a civil court action, subject to any fan constraints, every uncertainty must be proved to be positive on the balance of probabilities. In a criminal case, usually the prosecution must prove that every uncertain antecedent is, beyond a reasonable doubt, actually a positive antecedent. As long as the party with the negative case can keep an antecedent uncertain when the standard of proof is applied, the uncertainty will prevent the positive case from winning, even though no negative antecedent has been established. The party with the negative case may win by virtue of the uncertainties that are not found to be positives.

Subject to overriding sorting heuristics for positive fans, negative answer node labels are reported in the Negative case window, positive answer node labels are reported in the Positive case window, and uncertain answer node labels are reported in the Uncertainties window. Because neutral nodes have all positive answers, this sorting gives effect to the unnecessary and insufficient conditions in the River. As soon as a negative answer's node label appears in the Negative case window, a Negative Final result is established, and the positive result label in the prima facie River, appears at the top of the list of labels in the Negative case window, indicating that there is a negative Final result. Likewise, as soon as an uncertain answer's node label appears in the Uncertainties window, an uncertain Final result is established, and the positive result label in the prima facie River, appears at the top of the list of labels in the Uncertainties window, indicating that there is an Uncertain Final result.

Where there is a positive fan, negative answers are pro tem reported in the Positive Case window, with (Neg) in front of the label, and uncertain answers are pro tem reported in the Positive Case window, with (Unc) in front of the label. As soon as the positive fan fails, either for a negative Final result or an uncertain Final result, some or all of the pro tem labels are removed from the Positive case window. Only those negative pro tem labels that complete the set of necessary and sufficient conditions that establish the negative Final result are moved from the Positive case window to the Negative case window, as soon as the negative Final result is established; in the Negative case window, the (Neg) that stood before the label while it was pro tem in the Positive case window is removed. Likewise for the uncertain pro tem labels in the Positive case window; as soon as the uncertain Final result is established, only those uncertain and negative pro tem labels that complete the set of necessary and sufficient conditions that establish the uncertain Final result are moved from the Positive case window to the Uncertainties window, with the (Unc) removed but not the (Neg). Pro tem uncertainties never appear in the Negative case window, but pro tem negative labels may appear in the Uncertainties window, as necessary and sufficient conditions for the uncertain Final result. Negative prevails over uncertain, except where there are disjunctions in the Rivers window, where uncertain prevails over negative, so that the fan as a whole fails for at least one uncertain fanstream. Unless a pro tem label is re-sorted into the Negative case window or the Uncertainties window as a necessary and sufficient condition, it remains in the Positive case window.

If a set of necessary and sufficient conditions for an uncertain Final result is established, an uncertain Final result is given until a negative Final result is established; the Current result button will report uncertain, and the Final result label in the Rivers window will be entered at the top of the Uncertainties list. If a negative Final result is subsequently established, then all the pro tem negative labels in the Positive case window and the Uncertainties window that complete the set of necessary and sufficient conditions for the negative Final result will be re-sorted, without their pro tem (Neg) addition, into the Negative case window; the Final result label will also move from the top of the Uncertainties list to the top of the Negative case window list, indicating a negative Final result. Once a negative Final result is established, the Current result button will report a negative result, and the negative Final result cannot be changed without changing at least one answer. If other sets of necessary and sufficient conditions for a negative Final result are established, either by answers or by the necessary consequents of answers, they are added to the Negative case list.
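The precedence of Final results described in this and the preceding paragraphs reduces to a simple ordering: negative prevails over uncertain, and a positive result requires a complete positive case. A hedged sketch, with invented parameter names that stand in for the states of the three adversarial windows:

```python
def current_result(negative_established: bool,
                   uncertainties_exist: bool,
                   positive_complete: bool) -> str:
    """Current result heuristic: negative > uncertain > positive/pending.

    Illustrative sketch of the precedence described in the text, not the
    eGanges implementation.
    """
    if negative_established:
        return "negative"      # immutable until at least one answer changes
    if uncertainties_exist:
        return "uncertain"
    return "positive" if positive_complete else "unanswered"
```

An uncertain result thus persists only until a negative Final result is established, as the text describes.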

As soon as a Final result is established, the Final result label of the prima facie positive River, is entered at the top of the list in the appropriate Adversarial case window. An uncertain Final result will be reported in this way, indicating that there is no possible Positive Final result, without changing at least one answer. Once a positive Final result is established, it cannot be changed without changing at least one answer.

In eGanges, three dimensional graphics are not used, as they are too complicated to navigate. Instead, there are three Adversarial windows, the Negative case window, the Positive case window and the Uncertainties window, each of which appropriately lists the antecedents that are established in a user's case. If the user left clicks with the mouse on an antecedent in one of the three lists, the River map that has that antecedent appears in the Rivers window. The selection acts as a back button, so that the lists are also navigation aids. The two dimensional river graphics work in tandem with the adversarial system of lists to expedite the implications of the full three dimensional system of adversarial rules.

The Sphere of the adversarial system of Rivers confines the combinatorial explosion of possible cases by the use of Poles. The positive River is placed in the equatorial plane as the central position, as it has no Pole streams. It may be longer than the negative and uncertain River systems if it has neutral antecedents.

The difficulty in designing isomorphic rule-base systems is the specification of all the alternatives in the combinatorial explosion of possible cases, as well as the modelling of overlaps and common parts of alternative possible cases. Ad hoc two dimensional decision trees are not suited to this task; if they represent pathways for both opponents, by two root nodes, contradictories of contraries sometimes have only a pro tem interim consequent. A consequent in one rule may also be an antecedent in another rule. There are alternative overlapping sets of necessary and sufficient conditions, sometimes mixed with unnecessary and insufficient conditions; sometimes the sets that apply to opponents' cases share common conditions. A Negative case is different from failure of the Positive case. Legal knowledge is not statically determined in such a way that its patterns can be matched simply; its patterns are not always mutually exclusive. Definitions, whether the inductive instances or connotative characteristics of a deductive antecedent, are separate from the extended deductive propositions, so that a tree that mixes the three forms of logic would require heuristics that sort the significant consequents of doing so. Inductive and abductive links may be co-conditions for deductive links in extended deduction; they are not themselves deductive links. The epistemology of 3d legal logic provides for these non-monotonics of logical complexities. The deductive complexity is easier to specify for processing without the unnecessary complications of a prioritisation system that manages non-monotonic logic; in any event, non-monotonic legal logic processing is not required for judgment reasoning, the application of law to a user's case, although it may be essential for invention reasoning to deal with new cases that require a modification of law.
The Ramist distinction between judgment and invention is useful in legal knowledge engineering methodology.

The communication system of the eGanges shell epistemology allows the user:

1. to freely navigate the River system to learn about the system of rules and questions; simply clicking on a node makes it the current node, so that its question and possible answers can be seen. A right click on a soccer ball node brings up its sub-map or returns to the map above; the soccer ball right click invokes the toggle between a map and its sub-map. There is also a small map numbering window beside the Current result button, so that a map number can be inserted to retrieve that map. The map numbers shown also indicate the total number of maps in the application.

2. through the interactive visualisation, to freely explore the logic by giving and changing answer input; the transparency of the processing, which warns by answer labelling of the effect of an answer on the case for attaining the Final result of the River, can be checked to see if the sorting of the node label for the answer accords with the answer label. Colour coding on the answer label and the adversarial windows reinforces this expectation and the communication system process of matching and sorting.

3. with the cumulative effect of answer input, to check at any time during a consultation the Current result, which is both reported immediately at the top of the appropriate Adversarial window and confirmed in the Current result window by pressing the Current result button.

4. to learn what is the set of necessary and sufficient conditions to establish a Final result in an Adversarial window, as only those pro tem positive antecedents that are necessary to complete a set of necessary and sufficient conditions for a negative or uncertain Final result are moved from the Positive case window to the Negative or Uncertainties window, respectively, when the corresponding Final result is established.

5. to learn when uncertain overrides negative in fan processing, and when negative overrides uncertain in Final result processing. In a fan, if one fanstream fails for uncertainty and the other fanstreams fail as negative, then the net result is uncertain: uncertain overrides negative. In conjunction Rivers, if one antecedent fails as negative, then the whole rule fails, meaning that the interim consequent fails as an interim consequent, and also as an antecedent in the next rule of which it is part, so that that rule also fails, and so on downstream.
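The two override rules in point 5 can be summarised as a pair of combinators: a fan is a disjunction in which uncertainty overrides negativity, while a conjunctive rule fails outright on one negative antecedent and the failure propagates downstream. A sketch under those assumptions; the function names are illustrative only:

```python
def fan_result(fanstreams: list) -> str:
    """Disjunctive fan: one positive fanstream suffices; otherwise a single
    uncertain fanstream makes the whole fan uncertain (uncertain overrides
    negative)."""
    if "positive" in fanstreams:
        return "positive"
    return "uncertain" if "uncertain" in fanstreams else "negative"


def conjunction_result(antecedents: list) -> str:
    """Conjunctive rule: one negative antecedent fails the rule, so its
    interim consequent also fails as an antecedent downstream; otherwise
    any uncertainty propagates."""
    if "negative" in antecedents:
        return "negative"
    return "uncertain" if "uncertain" in antecedents else "positive"
```

Note the opposite precedence in the two cases: uncertainty dominates within a fan, while negativity dominates within a conjunction.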

4.3.9 Conclusion

As answers to questions are labelled to correspond with the list of points for each case, and the Current result can be seen to depend on the content of the Cases and Uncertainties windows, there is transparency for users. eGanges satisfies the design criteria recommended by Meister (1991, p.350): its interface is simple, easy to learn and concrete; it is also suited to experts and their lay clients. eGanges aims to provide equality of knowledge access for people who may benefit from the expertise, and a framework for an affordable, effective and user-friendly codification of expertise.

An extensive study of effective interface design by Shneiderman (1998) includes a graphic of the cone tree (p.535), developed by Xerox PARC, Palo Alto, California, as a three dimensional hierarchical tree which can be rotated to see its differentiae and species.

4.4 STAGE 3 SPECIFIC META-EPISTEMOLOGICAL METHOD

Retroduction, anticipation and epistemological visualisation are used as meta-epistemological methodology in Stage 3 of the specific meta-epistemological method. There is selection of Stage 1 epistemology for cognition and a communication system that can give effect to the whole of the Stage 2 computational epistemology of 3d legal logic. The use of epistemological visualisation as a Stage 2 specific meta-epistemological method continues in Stage 3, dropping from 3d to 2d, but giving effect to 3d as well as to the communication requirements of legal domain epistemology.

As Stage 2 specific meta-epistemological method, the jurisprudential systems analysis and design that was used to develop the 3d visualisation was extended by legal epistemology to clarify and produce the epistemological visualisation for the computational epistemology of 3d legal logic. Then, as Stage 3 specific meta-epistemological method, the Stage 2 epistemological visualisation was subject to further retroduction, experimentation, deconstruction, extraction and synthesis, as Stage 3 visualisation and processing heuristics in a user-friendly interface, for the eGanges shell epistemology. The Stage 3 eGanges shell epistemology reaches a synthesis between the Stage 1 domain epistemological requirements for user-friendliness and the Stage 1 domain epistemological requirements for adversarial logic processing. This synthesis advances legal domain epistemology to enable legal experts to employ technology with generic legal intelligence, in which they may embed their specialist legal intelligence for their own purposes or directly for the benefit of others. It advances the science of legal choice.

These Stage 3 methods may be applied in the remaining two Stages of knowledge engineering, particularly in the acquisition and representation of application knowledge; they are transformation techniques that portray the metamorphosis of human intelligence to the artefacts of artificial intelligence. The epistemological visualisation of 2d ideographs that has been used in the knowledge engineering of eGanges was developed from the computational epistemology of 3d legal logic and accommodates 3d epistemological complexity.

The visualisation in the computational epistemology of 3d legal logic provides a sound epistemological basis for large scale knowledge engineering; it contains epistemological combinatorial explosion in its extended deductive premises by its provision for alternative overlapping sets of necessary and sufficient conditions, which might be interspersed with neutral consistent conditions that are unnecessary and insufficient conditions, and also by the Poles. The computational epistemology of 3d legal logic also provides differentiated relative visualisation for deduction, induction and abduction. The methodology for large scale knowledge engineering was expanded from 2d visualisation to 3d epistemological complexity. To produce large scale artificial intelligence, some visualisation of human intelligence is an effective technique. Visualisation reifies the expert mystique; it can map the metaphysical domain of logic, ontology and available processing, even if it serves a three dimensional terrain of alternative possible worlds.

The candidate (Gray, 1986) originally posed a systems analysis and design approach to jurisprudence for the purposes of computer-aided learning and business, in 1985.

The candidate’s Master’s thesis (1990, revised 1997) was primarily concerned with jurisprudential systematisation to clarify and manage legal choices: 3d legal logic was posed as a visualisation and then an epistemological solution and paradigm that was potentially methodological for the field of legal expert systems. The jurisprudential system of legal choices, generic to all systems of rules in the legal domain, was modelled graphically by the computational epistemology of 3d legal logic, so that categorisation of the facts of a case, or conformity to the required categorisation for a case, could be managed.

The legal universe and its constituent parts, in the computational epistemology of 3d legal logic, are both a systems flowchart and an object visualisation. The 3d systems visualisation is understood more precisely as:

1. 2d epistemological ideographs and

2. 3d epistemological complexity

As a Stage 2 specific meta-epistemological method, an ideograph may represent an epistemology, and it may itself be an epistemology for determining its processing heuristics; a computational epistemology offers processing heuristics. The epistemological ideographs of the computational epistemology of 3d legal logic represent the structures of knowledge, and indicate heuristics for their processing. As knowledge becomes extensive and complex, ideographs may expedite epistemological communication. They may be designed to do this, and they may assist design.

A two dimensional River ideograph was first used by the candidate as a program design map in 1987, for the development and explanation of a small experimental legal expert system, CLIMS (Contract Law Information Management System) Pilot No.1 (Gray, 1988; Gray, 1990, p.530; Gray, 1997, pp.228-257). The prior analytics in the course of knowledge acquisition that was required to produce this sample of contract rules distinguished rules of contract law from other legal information in the nature of abductive reasons for those rules, and case authorities for the rules. Rules of law apply deductively to a user's case; once the antecedents in a rule are satisfied, then the consequent prescribed by the rule applies to the user's case. In a system of rules, antecedents and consequents overlap, as indicated in Chapter Three; the flow of reasoning is indicated by the inference arrows in the overlapping conditional propositions. The flow of extended deductive reasoning is automatable by heuristics that match user input answers to the relevant pathway of antecedents, with its ultimate Final result.
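The flow of extended deduction described above, where the consequent of one rule serves as an antecedent of another until a Final result is reached, can be sketched as forward chaining over conditional propositions. A minimal illustration; the rule content and function name are hypothetical, not taken from CLIMS or eGanges:

```python
def extended_deduction(facts: set, rules: list) -> set:
    """Forward-chain rules of the form (antecedents, consequent) until no
    further interim consequent can be established."""
    established = set(facts)
    changed = True
    while changed:
        changed = False
        for antecedents, consequent in rules:
            if consequent not in established and set(antecedents) <= established:
                established.add(consequent)  # interim consequent now available
                changed = True
    return established


# A toy contract-law chain: the interim consequent "agreement" feeds the
# next rule, whose consequent "contract" is the Final result of the sample.
rules = [
    (["offer", "acceptance"], "agreement"),
    (["agreement", "consideration"], "contract"),
]
result = extended_deduction({"offer", "acceptance", "consideration"}, rules)
```

If any antecedent in the chain is missing, the downstream consequents are simply never established, which mirrors the failure propagation of a conjunctive River.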

The epistemology of 3d legal logic is also a computational epistemology because it permits object-oriented programming. The eGanges shell epistemology was transformed in Stage 4 by object-oriented programming, using Java. Its programming epistemology is object-oriented programming epistemology, within the constraints of the Java programming language and the constraints of the Stage 3 eGanges shell epistemology. A sample applet can be tested at www.grayske.com

5 CHAPTER FIVE:

FURTHER DEVELOPMENTS:

APPLICATION EPISTEMOLOGY AND ONTOLOGY

5.1 APPLICATION EPISTEMOLOGY

5.1.1 Sample application

A sample draft of the eGanges Vienna Convention application maps is shown in the Appendix. As the Convention is enacted as sale of goods legislation in the Australian state of New South Wales, its full text is accessible on the web at:

http://www.austlii.edu.au/au/legis/nsw/consol_act/sogca1986308/sch1.html

The Appendix submapping extends in some areas to 6 levels of nesting, which is some measure of the size and complexity of the Convention law (cf. Rescher, 2006). Nodes and fans may also be counted to measure size and legal choices in the Convention. Every node offers a choice of adversarial position, and every fan offers a choice of alternative ways to establish the positive Final result of the River. The combinatorial explosion managed in the rule system is, at the same time, the choice of pathways to a choice of three alternative Final results, negative, positive and uncertain; because these pathways overlap, some choices constitute multiple choices, and when selections are made, they may preclude other choices. Where soccer balls are not particularised by sub-maps in the Appendix sample, they indicate from where further development of the application would spring. The Appendix illustrates that the epistemology of the Vienna Convention conforms to the eGanges shell epistemology, which contains generic legal domain epistemology.

5.1.2 Prior analytics, mereology

Preliminary maps in the Appendix were produced to show the layout of the Convention, to assist design of the application. eGanges can be used as a prior analytics aid. The preliminary maps indicate what sort of an application might be feasible and from where in the Convention the rules for it might be drawn. On the basis of the preliminary maps, a closer investigation of the Convention provisions was made.

The overview of the Convention in the preliminary maps, like most commercial law, indicates substance for litigation, namely the relative obligations and remedies of the buyer and seller, based on their concluded contract. Accordingly, enforcement entitlement was selected initially as the Final result of the application. In selecting the mainstream necessary and sufficient conditions for that Final result, recourse was had to the sub-epistemology of litigation in the legal domain. Prior analytics in Stage 5 of the meta-epistemological method may revisit the Stage 1 domain epistemology where required. In that part of the sub-epistemology of litigation which deals with civil actions, counterclaims are a deciding factor in commencing proceedings to enforce a remedy; they may override an enforcement entitlement. A contractual transaction is inherently a matter of the relative position of the parties, defined by the contract; binding agreements make private law that applies only to the contracting parties. Thus, in determining the structure of the mainstream in the Convention application, provision is made for a counterclaim, so that a net claim is considered as a condition of enforcement entitlement.

In the development of some applications, it might be useful to set out all the possible antecedents, as ontological elements to be used, in a scattergram, and then, by reference to the language which lays down the law, initially select the ones that belong to the same Mainstream, the same secondary streams, tertiary streams, and so on, up to the watershed where the rulestreams run out. The one and two dimension graphics in Figure 4.30 illustrate the scattergram approach. An initial scattergram might include positive and negative antecedents, which might then be sorted into two scattergrams, one for positive nodes and one for negative nodes. Only positive nodes will be used to form the eGanges prima facie River. What is positive or negative depends on the Final result selected. The formulation of the contradictories of the express ontology, and the sorting of them into positive and negative, is particularly appropriate when the law is stated in a mix of positive and negative case rules, relative to the adopted positive Final result.

The scattergram will contain antecedents of varying granularity. Some may be related by virtue of the same or different levels in the granularity hierarchy. To assist sorting into Rivers, a scattergram may be arranged in ascending order, from largest at the bottom to finest at the top. The finer definition of an abstract or general legal concept may be found in this way. A scattergram was not used in developing the Convention application, but resort was sometimes had to the possibility of a scattergram, as reinforcement of the nature of that law; partial scattergrams may be useful to build up a complete River. A scattergram captures fully deconstructed ontology, but sometimes it is counter-productive to fully disassemble the antecedents from their natural language position; application maps require rule formulation as conditional propositions, and the River representation is isomorphic to the formalised conditional propositions. The formalisation should minimise departure from the natural language expression of the rules.

Once the mainstream rule is formulated, with a selection of antecedents which may come from any level in the range of granularity, and which are related by conjunction in an appropriate order, then any secondary rules arising from each mainstream antecedent may be formulated, with another selection of antecedents, until all antecedents in the scattergram are correctly located in the River. In the Convention application mapping, the River was built up in that way. There was only one mainstream, although more could be managed in the eGanges shell epistemology.

Inductive and abductive information was included in the preamble of the Convention. The work of the application was limited to the construction of a sample part of the River, and no glosses were formulated. Of particular importance is the text gloss which may set out the specific Convention provision as authority for each antecedent or for its location in a rule. An indication of the relevant Convention provision is set out in the Figure descriptions of each map. The mapping concludes where an adequate sample demonstrates that the River system could accommodate large scale law, however complex and extensive.

Formulation of questions and answers is also required for each node, to complete the application. Interrogation was also excluded from the scope of the dissertation, as extensive work should be undertaken to provide interrogation methodology to reconcile the resources of available literature with the technology. In any event, the processing of eGanges will operate with no questions, on the basis of the default answers.

The arrows in the Rivers, which appear automatically in its construction, indicate the flows that may be activated by input. If these flows are not activated, processing heuristics apply other flows not shown in the Rivers window; these other flows are introduced by the colour changes in River nodes, which indicate a zig-zag input through the Star of rules in the epistemology of 3d legal logic. In the eGanges shell epistemology, the combinatorial explosion of pathways through the Star, arises in the questions and answers logic, where it can be managed by the expert, according to expert practices and techniques, as well as the domain sub-epistemology of the law of evidence.

The River is an interrogation map as well as a rule map. Once large scale complex rule systems are accommodated, large scale interrogation is also thereby accommodated; then large scale evidence to establish the rules can also be provided for, and complex and extensive argument for a particular case can be managed, given the user's evidence. eGanges applications are constructed as if there were three dimensional logic space that extends in depth, breadth and width as far as the expertise requires. The pioneers of artificial intelligence established this space as the search space (Forsyth, 1984, p.5). In this space, the legal ontology of the application law is cast into logical structure by the formalisation of its expression. Nodes are located where they can best be linked to give effect to their formalisation. The dual roles of some nodes, as an antecedent in one rule and an interim consequent in another rule, introduce the linking structure of extended deductive premises which conclude in a Final consequent. The dual role of some nodes, as an interim consequent in one rule and an antecedent in another rule, creates the two Poles. The dual role of some nodes, as the consequent of more than one rule, creates fans. When all the rules of law are linked for extended deduction, the flow directions that might be activated by user input can be seen as alternative pathways to alternative conclusions.

5.1.3 Jurisprudence of legal knowledge engineering

Prior analytics and mereology are major studies in the jurisprudence of legal knowledge engineering for application construction methodology. Legal knowledge engineering methodology could be further developed by drawing on the resources of these studies.

Chapter Five: Further developments: application epistemology and ontology 330

It was noted in the construction of the Convention application that there were differences in the ontology and rules from those of the common law of contract. It would be possible in the construction of a River to reconcile and weave together provisions from legislation and case law through appropriate prior analytics or mereology. The distinction between deductive, inductive and abductive premises provides for initial dichotomies as well as common pools; so too, a single application provides for dichotomies and a common pool. In an application, if a legislative provision converts a case material fact to a deductive antecedent in a rule of law, the River and its glosses may be adjusted accordingly. If legislation introduces new deductive antecedents, new rules, or changes that require deletion of common law antecedents or rules, this too can be quickly effected by a single rationalising application. The eGanges shell epistemology is a codification aid.

5.2 CONSTRAINTS

Although the thesis breaks through many of the limitations that have constrained the advance of legal expert systems technology, it also has its own limitations; further areas of legal knowledge engineering are not explored or resolved by the prototypes. However, the limits of the prototypes, and their further development, suggest further areas of study that are outside the scope of the doctoral work. Apart from demarcating for further study the refinement of prior analytics and the problems of interrogation epistemology and interrogation design for an application, there are five other concerns about the eGanges shell itself:

1. Is there a need to make special provision in the eGanges shell for equity epistemology?

2. Is there a need to provide in the eGanges shell the option of a daemon for the interrogation process, where the same question must be asked to establish an antecedent that occurs in two different nodes in different parts of the same tributary structure? Sometimes rules cannot be conjoined at the point of their common antecedents.

3. Is there a need to modify the eGanges shell to accommodate integrated River deviations, such as an evidentiary system that uses the graphical visualisation of Wigmore charts?

4. Is there a need to make special provision in eGanges for abductive topologies?

5. What are the projected effects of eGanges on the legal system?

These five matters are not explored or resolved in this thesis. However, they are delineated as matters for further study; the prototypes permit their delineation.

5.2.1 Equity epistemology

Equity is not as rigid as the common law in the deductive application of rules of law; it may intervene where there is reason to do so. Equity is a jurisdiction where overriding legal abduction is introduced as discretion. Antecedents may be re-evaluated recursively as the answers that establish subsequent antecedents are processed. A context of circumstances may be gleaned with flexibility, as the facts of a case emerge. It may be that, to accommodate this type of equitable reasoning in gathering evidence, a process of spiral recursion is necessary.

For instance, in matrimonial property disputes, it may be necessary to determine the duration of cohabitation before determining the significance of the contribution of a spouse to the matrimonial assets, in order to finalise a distribution of the property on divorce; as the period of cohabitation of the spouses increases, so the financial contribution factor diminishes. The spread of relevant factors appears to take the form of gauges with degrees; the reading on one gauge affects the reading on another gauge. The gauges could be thought of as inductive spectra that interact. Further study is required to ascertain whether or not adjustments are required to the available eGanges facilities for this sort of equitable reasoning, and if so, what form they should take. However, it may also be possible to provide for alternative possible cases as tributary structures, so that a recursive spiral is not necessary. Case particularisation of this sort would occur at the evidentiary level, not as rules of law. This is a problem of application design that requires further study, from which it may be determined that a modification of the eGanges shell epistemology is necessary.
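The interacting gauges might be sketched, purely as a hypothetical illustration; the weighting formula and the twenty-year horizon below are invented for the sketch, and are not rules of matrimonial property law:

```python
# Hypothetical sketch of one gauge reading affecting another.
# The formula and the horizon are invented illustrations, not legal rules.

def contribution_weight(years_cohabitation, horizon=20.0):
    """Weight given to the direct financial contribution factor,
    diminishing linearly as the duration of cohabitation approaches
    the (invented) horizon."""
    return max(0.0, 1.0 - years_cohabitation / horizon)

# Short cohabitation: the financial contribution gauge reads strongly.
print(contribution_weight(2))
# Long cohabitation: the same gauge is largely discounted.
print(contribution_weight(18))
```

The point of the sketch is only that the reading on the cohabitation gauge scales the reading on the contribution gauge; any real application would derive such an interaction from the case law, not from a fixed formula.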

It may be that interrogation design must make special provision for mixed questions of law and fact. Only some areas of law involve mixed questions of fact and law that seem to require recursive adjustment in the sorting of answers into the adversarial windows. It may be that a question of fact determines the relevant question of law. Inductive spectra may assist in the determination of how to answer, taking into account the resolution of the question of law.

Equity reasoning is best understood as case reasoning, or reasoning about the evidence in a particular case; material facts are considered cumulatively in such a way that there are additions and subtractions that must somehow be calculated as contingent satisfaction of discretionary rule antecedents. These aspects of case reasoning, which are described here as equity reasoning, are Stage 1 matters that may require further development of the Stage 3 shell design, and consequent enhancement of the generic eGanges shell.

5.2.2 Daemons for disparate repetition

Where two antecedents that appear to be the same are located in two different rules which cannot be locked together, then prima facie they are not one and the same antecedent. Their different locations distinguish them. For instance, the offeror must give consideration to the offeree, and the offeree must give consideration to the offeror. The antecedent consideration is not the same consideration in both rules. The direction of movement of the consideration is also different in each rule. The questions to be asked must incorporate this difference. In this regard, it is useful to be guided by the study of legal monads (cf. Leibniz, 1714). It may or may not be concluded, at the end of a thorough study, that the option of a daemon should be available to the application builder, so that an answer to one question automatically provides the same answer to the second question; changes to one automate changes to the other, all with notice to the user. The eGanges shell epistemology would have to be further developed if this problem is not addressed by existing possible solutions; notes on the problem could warn the user of the consequences of manipulating answers invalidly, but this may not be enough.
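The daemon option might be sketched as follows. The question identifiers and the notice mechanism below are hypothetical; this is not a description of the existing eGanges shell, only of what a daemon linking two genuinely identical questions in different tributaries could do:

```python
# Hypothetical sketch of a daemon linking two occurrences of the same
# question in different parts of a tributary structure.

class Interrogation:
    def __init__(self):
        self.answers = {}
        self.links = {}    # question id -> linked question id
        self.notices = []  # notices to the user about automated answers

    def link(self, q1, q2):
        """Declare that two question occurrences must share one answer."""
        self.links[q1] = q2
        self.links[q2] = q1

    def answer(self, q, value):
        """Record an answer; propagate it to the linked twin, with notice."""
        self.answers[q] = value
        twin = self.links.get(q)
        if twin is not None and self.answers.get(twin) != value:
            self.answers[twin] = value
            self.notices.append(
                f"Answer to {twin!r} set to {value!r} automatically, "
                f"following the answer to {q!r}."
            )

session = Interrogation()
session.link("notice-in-writing@tributary-1", "notice-in-writing@tributary-2")
session.answer("notice-in-writing@tributary-1", "yes")
print(session.answers["notice-in-writing@tributary-2"])  # yes
print(session.notices[0])
```

The notice list keeps the propagation transparent to the user, in keeping with the warning above that silent manipulation of answers may not be enough.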

5.2.3 Evidentiary assessment systems

Where the evidence of two witnesses is contradictory, an evaluation of each witness, and of what they say, has to be carried out in the determination of findings of fact. Minor premises of the successful argument are established, directly or indirectly, by the findings of fact. Whether or not this process of resolving the findings of fact can or should be seamless in an eGanges application, in the selection of Minor premises, requires study in order to determine the most appropriate solution. Wigmore's system of evaluation of contradictory evidence provides an advanced scheme of criteria to assess interrogation (cf. Twining, 1985; Anderson and Twining, 1991). However, eGanges might also be used to provide a comparison of evidence, to identify contradictory points that require closer evaluation of testimony. eGanges may be especially useful for the comparison of different cases in the conduct of litigation. Two consultations of the one application by the same party, putting the opposing cases, can be compared to determine if the outcomes are different. Consultations by opposing parties may be compared to determine any issues of fact. Consultations for different past and present cases can be compared to establish the usefulness of a precedent case for argument in a present case. Different precedent cases may be compared by the creation of consultation pathways for each. The hierarchical lists for a consultation that appear in the case windows can be compared to the hierarchical lists for a different consultation. Any inductive instance selected from a spectrum to support an input answer may be noted by the user in the Notes window of the eGanges interface as user evidence; a comparison of inductive instances may be made by comparing the Reports of different consultations that include user Notes.
Lists of Minor deductive premises may be compared, or the users’ pathways through the nested River system may be compared; one verifies the other.
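Such a comparison of consultations could be sketched as a simple difference over the answers recorded for each node; the consultation data below is invented for illustration, and the node labels are hypothetical:

```python
# Illustrative comparison of two consultations of the one application.
# The node labels and answers are hypothetical.

def issues_of_fact(consultation_a, consultation_b):
    """Return, sorted, the node labels on which the two consultations differ."""
    nodes = set(consultation_a) | set(consultation_b)
    return sorted(
        n for n in nodes
        if consultation_a.get(n) != consultation_b.get(n)
    )

plaintiff = {"offer": "yes", "acceptance": "yes", "consideration": "yes"}
defendant = {"offer": "yes", "acceptance": "no",  "consideration": "yes"}

print(issues_of_fact(plaintiff, defendant))  # ['acceptance']
```

Nodes answered identically verify one another; the differing nodes mark the potential issues of fact between the opposing parties' consultations.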

There might be further shell enhancements for handling the implications of the burden of proof, in addition to the use of interrogation epistemology and the existing gloss facilities for this purpose. Questions might ask: can you prove ...? Mostly, these enhancements would unnecessarily complicate the user-friendliness of the shell and are therefore not appropriate in the first instance; they might become appropriate after a widespread paradigm shift in the public mindset that the frequent use of eGanges applications could produce. eGanges is a smart system that could be used to teach, and thereby advance, human intelligence itself.

5.2.4 Precedent retrieval

Parallel Rivers may be created for the retrieval of precedent cases as authorities for argument. Once the rule system is laid out, retrieval of case authorities occurs for each antecedent in the River for which there is a case authority. For each sector in the inductive spectrum of an antecedent, namely, negative, positive and uncertain, there may be one or more case authorities for one or more of the material facts that are the inductive instances in that sector. A case cannot be an authority for more than one sector in a spectrum. However, a case may be an authority for contradictory sectors of different antecedents. Thus case A may be an authority for a material fact 1-A that establishes A. It cannot have a contradictory ratio whereby 1-A is not-A; it may be an authority for the negative sector, not-1-A is not-A, but only as one way to establish not-A. The same case may also be an authority for a material fact that establishes the antecedent not-F, which is the deciding factor for a negative Final result; thus it will also appear in a parallel River for the negative sector of the F node spectrum. It is this deciding factor of a precedent case that must be distinguished by the litigant with the positive case. For retrieval as argument, precedent authorities must be laid out to suit the logic of the rule system. One case may be an authority for one or more points in a system of rules, and, as authorities, cases may overlap even though their material facts are not analogous.
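The layout of authorities by antecedent and spectrum sector might be sketched as an index, with a check for the constraint that one case cannot be an authority for two sectors of the same spectrum; the case and antecedent names below are hypothetical:

```python
# Hypothetical precedent index: antecedent -> spectrum sector -> case authorities.

SECTORS = ("positive", "negative", "uncertain")

class PrecedentIndex:
    def __init__(self):
        self.index = {}  # antecedent -> {sector: set of case names}

    def add(self, antecedent, sector, case):
        assert sector in SECTORS
        sectors = self.index.setdefault(antecedent, {s: set() for s in SECTORS})
        # A case cannot be an authority for two sectors of the same spectrum.
        for s, cases in sectors.items():
            if s != sector and case in cases:
                raise ValueError(
                    f"{case!r} is already an authority for the {s!r} sector "
                    f"of {antecedent!r}"
                )
        sectors[sector].add(case)

    def authorities(self, antecedent, sector):
        return sorted(self.index.get(antecedent, {}).get(sector, ()))

idx = PrecedentIndex()
idx.add("acceptance", "positive", "Case A")
# The same case as authority for a contradictory sector of a DIFFERENT
# antecedent is allowed, as the text above explains.
idx.add("revocation", "negative", "Case A")
print(idx.authorities("acceptance", "positive"))  # ['Case A']
```

Attempting to add "Case A" to the negative sector of "acceptance" would raise an error, enforcing the single-sector-per-spectrum constraint, while its appearance under a different antecedent models the overlap of authorities described above.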

Preparation of large-scale cases for trial requires preparation of evidence in relation to the antecedents of the rule system, and in relation to precedent cases, which are organised logically by the rule system. Precedent case analysis in legal practice is limited to the requirements of the client's case; the prior analytics of legal knowledge engineering that provides case retrieval support must locate authorities in this way for the whole rule system of the application. Case retrieval support is not developed for the Vienna Convention application, as it is outside the limits of the doctoral work; however, eGanges can provide links to the many websites around the world that provide Vienna Convention decisions of courts of various countries. It would be in the interests of consistency if a United Nations civil court could provide an appellate jurisdiction.

5.2.5 Topological abduction

Topology in the 3d legal logic model is represented where there is abductive commentary on an antecedent in an eGanges application, whereby the antecedent in the application is the same as an antecedent in the abductive gloss system that is otherwise quite separate from the concerns of the application. For example, the best interests of the child is an antecedent in disputes between parents about the residence of the child, where parents do not live together. It may also be an antecedent in a system of reasoning about the perceptions of children, in the field of psychology, that is relevant in the parental dispute. The abductive strata logic of the eGanges shell epistemology offers glosses of different sorts that might be viewed as strata topology.

What is meant by non-monotonic logic for the purposes of the Stage 1 legal domain epistemology is considered in Chapter Three, in terms of the mix of deductive, inductive and abductive premises and processes, and in this Chapter, in terms of the further mixing of strata types as topological logic in the abductive part of a legal argument. Topological logic may be admissible in legal argument, and in evidential reasoning where it is admissible in evidence, such as expert evidence that uses a system of logic, such as statistics or fuzzy logic, that is not otherwise used in argument about the application of rules of law.

5.2.6 eGanges effects

Until eGanges is used widely, a study of its effects can only be hypothetical (cf. Casti, 1997; Rescher, 2006; Hendricks, 2006). The area of its effects might be considered in the creation of law, compliance with law, and the evaluation of law. Since eGanges is applicable to some domains other than law, such as management, industrial relations, peace negotiation, communication and education, there may be combined effects of its use; the legal domain may have only one part of a much wider cluster of eGanges effects.

In the creation of law, eGanges applications might be used in legislative drafting to clarify the logic of the proposed rules. A proposed application might be used to examine any differences between the social epistemology and the legal epistemology pertaining to the substance of the application; refinements may be introduced accordingly.

Compliance applications provide an opportunity to examine the full extent and complexity of the law in relation to various goals or outcomes. Even if these applications make vast and complex law more viable for the realities of human life, the details of the law will be exposed to precise criticism. If compliance applications are created as a codification of the law, then there may be a sensible simplification of untenable detail, and other rationalisation, in much the same way as the Justinian codification of Roman law (Gray, 1997, pp.132-4). Techniques might be developed in the construction of compliance applications, to promote compliance and the efficiency and effectiveness of law.

eGanges applications permit a precise evaluation of law. They can provide both the big picture of deductive justice and the microcosms of its inductive and abductive details. Related contexts of the legal system may also be considered where there are appropriate links to this information. Law may be evaluated at any information point. Sometimes law is purely regulatory, such as a determination of the side of the road for traffic directions of travel. An evaluation of regulatory law requires a certain set of considerations. Sometimes law has elements of social justice, which require another set of criteria for evaluation. The facilitation of the evaluation of different law by different sets of criteria permits law to better adapt to the realities of human life, so that it may proactively optimise the quality of human society.

Some constraints in legal knowledge engineering are freed up by eGanges. Thus, it is shown how case reasoning occurs, through the visualisation of eGanges processing in the consultation of an application. There are difficulties in designing case-based reasoning systems, such as Ashley's (1990) HYPO, in isolation from the system of rules already established by precedent cases. The generic computational epistemology of case reasoning may be devised consistently with the computational epistemology of 3d legal logic; HYPO may be evaluated in the light of the eGanges shell epistemology, with its nuances of rule-based reasoning, such as neutral nodes and pro tem negatives and uncertains.

5.3 SIGNIFICANCE

5.3.1 Hypotheses

Within the foregoing framework, three major hypotheses of the thesis are as follows:

1. that the computational domain epistemology of 3d legal logic may be used to devise a cost-effective legal expert system shell with an expert-friendly interface that provides transparency and a communication system for small and large scale legal knowledge engineering;

2. that the Vienna Convention has its own substantive epistemology, based on its ontology, that is subject to the generic computational domain epistemology of 3d legal logic;

3. that the transformation from legal domain epistemology to computational domain epistemology, to shell epistemology, to programming epistemology, and then application epistemology, is an effective legal knowledge engineering method in the nature of a specific meta-epistemological method, suited to the development of small and large scale legal expert systems.

The thesis examines the concepts in these hypotheses, and verifies each hypothesis, partially in the light of conceptual analysis, and also through demonstration of the prototypes. Consideration of the hypotheses is raised throughout the thesis so that relevant points are made as they arise in their context. Mostly, Chapter Three and Chapter Four are concerned with hypothesis 1; this Chapter and the Appendix are concerned with hypothesis 2; and all Chapters are concerned with hypothesis 3. This concluding Chapter, which is concerned with Stage 5, application ontology and epistemology, and Step 5, the testing of the shell by a large scale application sample, also delineates the extent to which the thesis verifies the hypotheses, and the extent to which it falls short of a fuller verification.

Nevertheless, the thesis objective, to clarify and develop certain areas of legal knowledge engineering, is reached. It is not necessary to prove any of the three hypotheses absolutely in order to meet the objective of the thesis; partial verification is established, to the extent of the thesis.

The specific meta-epistemological method and its prototype implementation in this thesis are set out to show what they are, to indicate what further studies are required for a more complete implementation of the method, and, pursuant to the thesis objective, to clarify and develop certain areas of legal knowledge engineering. The thesis does not seek absolute or all possible proof of the hypotheses; to do so would assume that absolute proof, or all possible proof, is achievable within the limits of one doctoral study; the thesis does not enter these philosophical debates. The thesis makes available an adequate sample of available verification pro tem, as much as is necessary to explain the specific meta-epistemological methodology and the prototypes, in order to develop and clarify the areas of legal knowledge engineering to which they pertain.

5.3.2 Epistemology

In this thesis, as specific meta-epistemological method, a selection of epistemological fragments from the work of four Schools of Artificial Intelligence and Law is extracted and synthesised in the prototypes produced by that method. These fragments were appropriate for such synthesis. Further fragments might be introduced to the synthesis by way of further development; however, the extensive study required to do this is outside the scope of the thesis. The specific meta-epistemological method, and the synthesis it produces, are a development of the Jurisprudence of Legal Knowledge Engineering in the Technological School (Gray, 1997) of legal knowledge engineering, which also includes the Computer Science of Legal Knowledge Engineering.

Schools of Artificial Intelligence and Law may be identified differently from the nomenclature adopted in this thesis. The range of Schools identified includes the dominant Schools, except for a fifth School, the Neural Nets School, which does not contribute to the meta-epistemology used to develop the prototypes; it may be peripherally relevant to the cognitive aspects of the eGanges communication system. It is also possible to identify Schools that are hybrids of the four identified Schools.

In addition to the epistemological fragments of the four Schools of Artificial Intelligence and Law, there are many other innovations developed for the prototypes; of major importance is the communication system, including the interactive visualisation of the rules system. Some of the innovations are also developments in the four Schools.

The epistemological fragments of the four Schools of Artificial Intelligence and Law that are used in the development of the prototypes are as follows:

1. Legal Language School

The Legal Language School of Artificial Intelligence and Law began with semantic nets, and then posed the development of a computer language for the legal domain in terms of jurisprudential categories of natural and legal language. It might be thought that dialogue or dialectic epistemologies (Lodder, 1999; see also Petersen, 2002), as the basis for system design, refreshed this School by restoring the free use of natural language, although confining its logic largely to the user's legal abduction. The script theory of Schank and Abelson (1977) finally found its place in Artificial Intelligence and Law, with game heuristics constraining the development by users of their adversarial dialogue. The eGanges communication system might also be regarded as placing emphasis on the dialogue of interrogation and feedback as the basis of application processing, with intellectual artefacts, the rule maps of the interactive visualisation, as potentially a new legal language: a map language like icons or ideograms. Substantive maps of an application may capture jurisprudential categories, rules of law, and natural language, according to the express black letter law.

2. Rule-based School

Automation of the deductive use of rules of law was posed by the Rule-based School. The necessity of deductive reasoning was initially identified as suitable for processing. There is no rule base in eGanges, but the rules of law, as Major deductive premises, are captured isomorphically in the interactive visualisation of the Rivers, for extended deductive application according to the user's input. Implied rules that provide for the ontology of legal possibilities, the combinatorial explosion of cases within the express rules of black letter law, are implemented through the communication system of the interface.

3. Case-based School

Case reasoning is deductive where the case is an authority for a rule, and inductive where the case is an authority for material facts that satisfy the antecedents of a rule of law. In eGanges, case induction and case authorities for a rule, or part thereof, are located in relation to the system of rules, not automated. Case-based reasoning is usually designed to synthesise the authority for a rule and the inductive aspects of cases. This is usually achieved by using factors of a case that are either the antecedents of the rule(s) relied on in the case, or material facts that have established antecedents of the rule(s) relied on; such factors are hybrid antecedents and material facts. This may be problematic in ascertaining how a retrieved case may be used in an argument, especially where the spectrum of material facts that are established is not clear, and where instances on the same inductive spectrum are not treated as such.

In eGanges, case reasoning is accommodated by glosses, and also by the processing of the combinatorial possibilities that produce zig-zagging between positive, negative and uncertain rules. Case authorities at each point of zig-zagging are particularised, and processing of their cumulative significance in the Adversarial windows confirms the cases that determine the Final consequent. Many cases may be relevant, but some have deciding factors; material facts may establish antecedents which trigger Pole rules that are deciding antecedents; Pole rules are deciding rules.

Case reasoning based on a statistical epistemology of nearest neighbours is not within legal epistemology. It raises problems if factors are treated independently of the rule hierarchy in which they occur. Even if cases are given factor patterns as the basis for matching precedents and user cases, this is only reliable for cases on all fours; otherwise, the hierarchical and Spherical position of a factor in the system of rules, which determines deciding factors, may be difficult to process. Cases that are different require a consideration of the significance of the difference, in terms of the rule system, not in terms of the extent of the pattern difference.

It might be thought that there is a Modules School in Artificial Intelligence and Law, which combined a rules module and a case module in one system, so that the rule module could assist the case module or vice versa. In eGanges, there is a combined rule and case system, but only the rule reasoning is processed; the rule system processing carries with it, as glossing, the case information about the rules that are processed. Whatever triggers a Final result indicates the deciding case authority. The deciding antecedent produces the Final result label at the top of the list in the appropriate Adversarial case window; the node label directly beneath it, like any node label in the list, may be used to call up its case glosses. The case glosses for each of the node labels in the Adversarial window may be accessed for precedent support. Spectrum glosses will reveal where material facts in different cases belong to the same spectrum sector for each antecedent; a spectrum may also suggest spectral arguments by analogy, or by the position of a gap in the gradient.

4. Logic School

Following the development of Prolog in 1973, it became possible to write a logic program in code which would effect deductive logic processing (Kowalski and Sergot), rather than use a rule base. With additional programming, this could provide more extensive output, giving reasons for a question or a decision in natural language terms. However, the epistemology of Prolog was not suited to the management of reasons of the deeply nested rule systems implied by the ontology of legal possibilities, which is the complex logic of the legal domain; eGanges provides for the complex legal logic of the ontology of legal possibilities. Moreover, eGanges retains isomorphic rules as the Major deductive premises of syllogistic deduction, and makes separate provision for the accumulation of user-selected Minor premises and Final conclusions of the chosen logic pathway.

It might be thought that there is a Deontic and Action Logic School in Artificial Intelligence and Law, based on the work of von Wright (1951; 1963; 1983). To the extent that these paradigms appear in the substantive law of an application, they are provided for in eGanges, either in the labels of rule maps, or a gloss of the particular antecedent concerned. Any inconsistencies that thereby arise might warrant a warning in the Notes window.

There might also be a Non-monotonic Logic School which arose in the 1990s to address the difficulties of complex legal logic; the solutions posed required specific coding of irregularities. The eGanges shell epistemology distinguishes extended rule deduction which is suited to automation, and phenomena of legal induction and legal abduction. It keeps the three areas of logic separate as each form of legal logic is treated differently in legal argument. There may be a sequence of legal argument which orders parts of deduction, parts of induction and parts of abduction; the arrangement is a matter of advocacy skills. eGanges applications may assist the sequencing required.

5. Ontological School

The legal ontologies of the Ontological School are derived from express black letter law, and allow a determination of legal knowledge acquisition for the design of a legal expert system. In the eGanges shell epistemology, express ontologies are expanded by the implied ontologies of the computational legal epistemology of the thesis; possible legal ontologies also expand the range of teleological use of the law. Legal epistemology and its automation require provision for the possible cases within the scope of the black letter law, which also establish a basis for developing the directions in which it is likely to evolve, as a consistent whole.

The Ontological School of legal knowledge engineering, which emerged during the 1990s (Valente, 1995), assumed that legal ontologies are the express content of laws, so that the meaning of law is contained objectively as data. However, the Ontological School does carry out some analysis of the legal ontology data, particularly in regard to the relations between ontological elements, patterns of abstraction and taxonomies; this analysis amounts to epistemological processing, but it is not a full specification of implied adversarial legal possibilities according to sound computational legal domain epistemology, i.e. how legal experts treat express legal ontologies and apply their knowledge of the law to client cases and client purposes.

The Ontological School, in a sense, superseded the early Semantic School, and drew on the meaning of ontologies to find their processible structures. Search and match that simulates semantic processing, and search and match that simulates logic processing, do not ipso facto ensure sound legal epistemology in a legal expert system. A more holistic approach to the complex logic systems of substantive law is required, in order to retain the coherence and consistency that are necessary to sustain justice. eGanges provides this deep model of legal epistemology through its use of the computational epistemology of 3d legal logic, which incorporates the express and implied ontology of legal possibilities, and through its user-friendly processing in the communication system of the shell epistemology.

5.4 JURISPRUDENCE OF LEGAL KNOWLEDGE ENGINEERING

The Dialogue School (Lodder, 1999), which emerged just after the Ontological School, did not limit users to legal ontologies or legal language; but together these two Schools went to the fringes of jurisprudence, where legal epistemology was abandoned. They reflected the inability of computer science to free up the Feigenbaum bottleneck. Lawyers had worked for many centuries with a duality of idealism and realism; such a duality was anathema to science, and the approach of science was always to subject idealism to realism. This thesis shows that a compromise can be reached whereby law-making power can be retained to design civilisation with technological aids more suited to science. However, the compromise requires an immersion in the ideal, in artificial metaphysics, to find out how it can be enhanced by science and technology. A jurisprudence of legal knowledge engineering is produced from the specific meta-epistemological method, which is a methodology to achieve that compromise.

Overall, the specific meta-epistemological method used the large scale space of its artificial metaphysics to develop major paradigms in most of the existing Schools of Artificial Intelligence and Law, and then synthesised these developments in a new way, containing synergy.

The theoretical overlapping of the four Schools of Artificial Intelligence and Law, and the work on hybrid systems, assisted the task of developing the compromise synthesis. All the Schools of Artificial Intelligence and Law belong to the Technological School of Jurisprudence (Gray, 1997); they each contribute some jurisprudential significance. However, with an appropriate compromise synthesis, the Jurisprudence of Legal Knowledge Engineering can move forward with the major jurisprudential task of a technological codification to assist and extend legal practice. The substantive law has to be mapped and glossed by teams of experts, the interrogation data has to be created, and, along the way, any reforms recommended if the prior analytics and mereology point to problems. Artificial legal intelligence can grow in the space of artificial metaphysics, as a synthesis of computer science and jurisprudence, created by legal knowledge engineering; it may constitute a new collective legal intelligence, in a computer age which Minsky (1986) conceived as the society of mind.

The thesis also uses fragments of philosophy that were determined to be significant for legal knowledge engineering; there are plenty more fragments available for exploitation. The field of philosophy offers advanced human intelligence; it once encompassed science. While artificial intelligence must be user-friendly, expertise is generally advanced intelligence; the task of artificial intelligence is to capture advanced intelligence and convey it in a way friendly to its users. Legal expertise must be friendly to ordinary people, as they are the users of the legal system.

5.5 STAGE 5 SPECIFIC META-EPISTEMOLOGICAL METHOD

The specific substance of the Vienna Convention indicates that the Convention has many international social purposes. The Stage 5 specific meta-epistemological method is concerned with purposes. It is at this Stage that instructions might be taken from the application owner on what the application is to be used for, the nature of the users, and the environment in which it will be made available. These matters may affect the design of the application. The design of the application must accord with the generic epistemology of the shell. Synthesis is an appropriate method in the construction of the application. Retroduction to Stage 1, or via Stage 1 to Stage 2 and Stage 3, is also a method to be used as and when required. A communications expert or a linguist may be consulted for the formulation of the questions or interrogation. A graphic artist may be consulted for shaping the maps. Consideration of the synergy of the application overall might also be advisable.

6 APPENDIX

eGanges interface and eGanges River maps from the United Nations Convention on Contracts for the International Sale of Goods (Vienna Convention) Application

Figure A-1

Figure A-2

Figure A-3

Figure A-4

Figure A-5

Figure A-6

Figure A-7

Figure A-8

Figure A-10

Figure A-11

Figure A-12

Figure A-13

Figure A-14

Figure A-15

Figure A-16

Figure A-17

Figure A-18

Figure A-19

Figure A-20

Figure A-21

Figure A-22

Figure A-23

Figure A-24

Figure A-25

Figure A-26

Figure A-27

Figure A-28

Figure A-29

Figure A-30

LIST OF REFERENCES

Abbott, E.A. (1884, 1992): Flatland: a romance of many dimensions (originally published by Seeley & Co., London, England), Dover Publications Inc., New York, USA.

Aikenhead, M. (1996): Book review, A. Valente, Legal knowledge engineering, International Review of Law, Computers and Technology, Vol. 10, Issue 2, p351, Oct.

Allen, L.E. (1957): Symbolic logic: a razor-edged tool for drafting and interpreting legal documents, 66 Yale Law Journal, 833.

Allen, L.E. (1961): WFF'N PROOF: the game of modern logic, Autotelic Instructional Materials Publishers, Turtle Creek, New Haven, USA.

Allen, L.E. (1968): A language-normalization approach to information retrieval in law, 9 Jurimetrics Journal, 1

Allen, L.E. (1974): Formalizing Hohfeldian analysis to clarify the multiple senses of 'Legal Right': a powerful lens for the electronic age, 48 Southern California Law Review, 428-487.

Allen, L.E. (1980): Language, law and logic: plain legal drafting for the electronic age, in B. Niblett (ed.), Computer science and law, Cambridge University Press, Cambridge, England.

Allen, L.E. (1982): Towards a normalized language to clarify the structure of legal discourse, in A.A. Martino (ed.), Deontic logic, computational linguistics and legal information systems, North-Holland, Amsterdam, Holland.

Allen, L.E. (1983): Two modes of representing sets of legal norms: normalization and an arithmetic model, in Proceedings of the Third International Congress on Juridical Informatics and the National and International Community, Centre for Electronic Documentation, Supreme Court of Italy, Rome, Italy.

Allen, L.E. (1995): Enriching the deontic fundamental legal conceptions of Hohfeld, in J. Bing and O. Torvund (eds), Anniversary anthology in computers and law, TANO-publ., Oslo, Norway.

Allen, L.E. (1996): From the fundamental legal conceptions of Hohfeld to LEGAL RELATIONS: refining the enrichment of solely deontic legal relations, in M.A. Brown and J. Carmo (eds), Deontic logic, agency and normative systems, Springer and The British Computer Society, Sesimbra, Portugal.

Allen, L.E. and Caldwell, M.E. (1963): Modern logic and judicial decision-making: a sketch of one view, in H.W. Baade (ed.), Jurimetrics, Basic Books Inc., New York, USA.

Allen, L.E. and Saxon, C.S. (1986): Analysis of the logical structure of legal rules by a modernised and formalized version of Hohfeld fundamental legal conceptions, in A.A. Martino and F.S. Natali (eds), Automated analysis of legal texts, Elsevier, Amsterdam, Holland.

Allen, L.E. and Saxon, C.S. (1991): More IA needed in AI: Interpretation Assistance for coping with the problem of multiple structural interpretations, in Proceedings of the Third International Conference on Artificial Intelligence and Law, Association for Computing Machinery, New York, USA.

Allen, L.E. and Saxon, C.S. (1993): A-Hohfeld: a language for robust structural representation of knowledge in the legal domain to build interpretation-assistance expert systems, in J-J.C. Meyer and R.J. Wieringa (eds), Deontic logic in computer science: normative system specification, John Wiley and Sons, New York, USA.

Allen, L.E. and Saxon, C.S. (1994): Controlling inadvertent ambiguity in the logical structure of legal drafting by means of the prescribed definitions of the A-Hohfeld structural language, 9 Theoria, 135-172.

Allen, L.E. and Saxon, C.S. (1995): Better language, better thought, better communication: the A-Hohfeld language for legal analysis, in Proceedings of the Fifth International Conference on Artificial Intelligence and Law, Association for Computing Machinery, New York, USA.

Allen, L.E. and Saxon, C.S. (1997): Achieving fluency in modernized and formalized Hohfeld: puzzles and games for the LEGAL RELATIONS Language, in Proceedings of the Sixth International Conference on Artificial Intelligence and Law, Association for Computing Machinery, New York, USA.

Anderson, B.F. (1980): The complete thinker, Prentice-Hall, Englewood Cliffs, N.J., USA.

Anderson, J.R.(1983): The architecture of cognition, Harvard University Press, Cambridge, MA, USA.

Anderson, T. and Twining, W. (1991): Analysis of evidence. How to do things with facts based on Wigmore's Science of judicial proof, Little, Brown and Company, Boston, USA.

Anon. (1952): Biographical note Francis Bacon, 1561-1626, in R. M. Hutchins (ed.) Great books of the Western World, Vol 30, Encyclopaedia Britannica Inc., Chicago, USA.

Aristotle (1952, originally c.335 BC): Organon, in R. M. Hutchins (ed.) Great Books of the Western World, Vol 8, Encyclopaedia Britannica Inc., Chicago, USA.

Aristotle (1952, originally c.330 BC) Metaphysics, in R. M. Hutchins (ed.) Great Books of the Western World, Vol 8, Encyclopaedia Britannica Inc., Chicago, USA.

Ashley, K.D. (1990): Modeling legal argument: reasoning with cases and hypotheticals, MIT Press, Cambridge, Massachusetts, USA.

Atkinson, K., Bench-Capon, T. and McBurney, P. (2005): Arguing about cases as practical reasoning, in Proceedings of the Tenth International Conference on Artificial Intelligence and Law, ACM, New York, USA.

Atkinson, M. (1988): Cognitive science and philosophy of mind, in McTear, M.F., (ed.) Understanding Cognitive Science, Ellis Horwood Ltd, Chichester, England.

Augustynek, Z. and Jadacki, J.J. (1993): Possible ontologies, Editions Rodopi B.V., Amsterdam, Holland.

Austin, J. (1995, originally published in 1832): The province of jurisprudence determined, W.E. Rumble (ed.), Cambridge University Press, Cambridge, England.

Auyang, S.Y. (1998): Foundations of complex-system theories, Cambridge University Press, Cambridge, England.

Bacon, F. (1952, first published 1620): Novum Organum, in R.M. Hutchins (ed.), Great Books of the Western World, Vol. 30, Encyclopaedia Britannica Inc., Chicago, USA.

Baird, D.G., Gertner, R.H. and Picker, R.C. (1994): Game theory and the law, Harvard University Press, Cambridge, MA, USA.

Balkin, J.M. (1993): Understanding legal understanding, 103 Yale Law Journal, 105.

Bankowski, Z. and Mungham, G. (1976): Images of law, Routledge and Kegan Paul, London, England.

Bankowski, Z., White, I. and Hahn, U. (eds) (1995): Introduction, in Z. Bankowski, I. White and U. Hahn (eds), Informatics and the foundations of legal reasoning, Kluwer Academic Publishers, Dordrecht, Holland.

Baral, C. (2003): Knowledge representation, reasoning and declarative problem-solving, Cambridge University Press, Cambridge, England.

Barker, S.F. (1965): The elements of logic, McGraw-Hill Book Co., New York, USA.

Baum, R. (1996): Logic, 4th ed., Harcourt Brace College Publishers, Orlando, USA.

Bay, R. (1992): Roundabout: mind maps and questionnaires for fluency practice, Classic Communication Skills, NSW, Australia.

Belnap, N. (1996): Agents in branching time, in Copeland, B.J. (ed.), Logic and reality, Clarendon Press, Oxford, England.

Benthem, J. van (1996): Modal logic as a theory of information, in Copeland, B.J. (ed.), Logic and reality, Clarendon Press, Oxford, England.

Bergmann, G. (1960): Meaning and existence, University of Wisconsin Press, Madison, USA.

Bertalanffy, L. von (1968): General system theory: foundations, development, applications, Penguin Books, London, England.

Bertalanffy, L.von (1972): The quest for systems philosophy, Metaphilosophy 3, (April), 142-5.

Bex, F.J. and Prakken, H. (2004): Reinterpreting arguments in dialogue: an application to evidential reasoning, in Legal knowledge and information systems, JURIX 2004, IOS Press, Amsterdam, Holland.

Bing, J. (ed.) (1984): Handbook of legal information retrieval, Elsevier, Amsterdam, Holland.

Bix, B. (1996): Jurisprudence: theory and context, Sweet & Maxwell, London, England.

Blanché, R. (1983): L'épistémologie, 3rd edn, Presses Universitaires de France, Paris, France.

Blocker, H. G. (1974): The meaning of meaninglessness, Martinus Nijhoff, The Hague, Holland.

Bochenski, I.M. (1970, first published 1956 in Germany): A history of formal logic, translated by Ivo Thomas, Chelsea Publishing Co., New York, USA.

Bohnert, H.G. (1945): The semiotic status of commands, Philosophy of Science, 12 (July) pp.302-315.

Bolinger, D. (1977): Meaning and form, Longman's, London, England.

Bolzano, B. (1950): Paradoxes of the Infinite, Routledge and Kegan Paul, London, England.

Boole, G. (1952, first published 1847): Mathematical analysis of logic, in Collected Logical Works, Watts, London, England.

Bourbakis, N.G.(ed.) (1992): Artificial intelligence methods and applications, World Scientific Publishing Co. Pte. Ltd., Singapore.

Bourbakis, N.G. (1992): Introduction, in Bourbakis, N.G.(ed.), Artificial intelligence methods and applications, World Scientific Publishing Co. Pte. Ltd., Singapore.

Bowden, B.V. (1953): A brief history of computation, in B.V. Bowden (ed.), Faster than thought, Sir Isaac Pitman and Sons Ltd, London, England.

Bradley, R. and Swartz, N. (1979): Possible Worlds, Basil Blackwell, Oxford, England.

Branting, L. K. (1991): Reasoning with portions of precedents, in Proceedings of the Third International Conference on Artificial Intelligence and Law, Association for Computing Machinery, New York, USA.

Buchanan, B.G., Sutherland, G. and Feigenbaum, E. (1969): Heuristic DENDRAL: a program for generating explanatory hypotheses in organic chemistry, in B. Meltzer and D. Michie (eds), Machine Intelligence 4, Edinburgh University Press, Edinburgh, Scotland.

Buchanan, B.G. and Headrick, T.E. (1970): Some speculation about artificial intelligence and legal reasoning, Stanford Law Review Vol.23, No.1, Nov. pp.40-62.

Buchanan, B.G. and Shortliffe, E. (eds) (1984): Rule-based expert systems: The MYCIN experiments of the Stanford heuristic programming project, Addison- Wesley, London, England.

Buchler, J. (1961): The concept of method, Columbia University Press, New York, USA.

Burn, A.R. (1960, reprinted with amendments, 1967): The Lyric Age of Greece, Edward Arnold, London, England.

Buzan, T. (1974): Use your head, BBC Books, London, England.

Buzan, T. and Buzan, B. (second edition, 1995, first edition, 1993): The Mind Map Book, BBC Books, London, England.

Capper, P. and Susskind, R.E. (1988): Latent Damage Law: The Expert System, Butterworths, London, England.

Carnap, R. (1947): Meaning and necessity: a study in semantics and modal logic, University of Chicago Press, Chicago, USA.

Carney, J.D. and Scheer, R.K. (1974, 2nd ed.): Fundamentals of logic, Macmillan Publishing Co., New York, USA.

Carr, B.(1995): Knowledge theory for computer scientists, Prentice Hall, Upper Saddle River, NJ, USA.

Cassirer, E. (1961): The logic of the humanities, Yale University Press, New Haven, Conn., USA.

Casti, J.L. (1994): Complexification, HarperCollins Publishers, New York, USA.

Casti, J.L. (1997): Would-be worlds: how simulation is changing the frontiers of science, John Wiley and Sons Inc., New York, USA.

Castro, E.H. (1978, 1981, 1985): Understanding income tax law the flowchart way, Butterworths, Sydney, Australia.

Ceci, S.J. (1996): On intelligence, Harvard University Press, Cambridge, MA, USA.

Codd, E.F. (1970): A relational model of data for large shared data banks, Communications of the ACM, 13, 377-87.

Coecke, B., Moore, D.J. and Smets, S. (2004): Logic of dynamics and dynamics of logic: some paradigm examples, in S.Rahman, J. Symons, D.M. Gabbay and J.P.v. Bendegem, Logic, epistemology and the unity of science, Kluwer Academic Publishers, Dordrecht, Holland.

Collingwood, R.G. (1940): Essay on metaphysics, Clarendon Press, Oxford, England.

Conover, M. (1986): Applicability of systems analysis to jurisprudence, in T. Rasmussen (ed.), System science and jurisprudence, Spartan Press, Lansing, Michigan, USA, pp.88-97.

Conover, M. (1988): Applying three-dimensional thinking and systems technologies to jurisprudence and legal management, in T. Rasmusson (ed.), Interactive systems and law, Spartan Press, Lansing, Michigan, USA.

Cook, S., Hafner, C.D., McCarty, L.T., Meldman, J.A., Peterson, M., Sprowl, J.A., Sridharan, N.S. and Waterman, D.A. (1981): The applications of artificial intelligence to law: a survey of six current projects, in Proceedings of the National Computer Conference, Chicago, Il., USA.

Copeland, B.J. (ed.) (1996): Logic and reality, Clarendon Press, Oxford, England.

Copeland, B.J. (ed.) (2005): Alan Turing's automatic computing engine, Oxford University Press, Oxford, England.

Copi, I.M. (1961): Introduction to logic, Macmillan Co., New York, USA.

Copi, I.M. & Gould, J.A. (1967): Contemporary readings in logical theory, Macmillan Co., New York, USA.

Curzon, L.B. (1979): Jurisprudence, Pitman Publishing, London, England.

Daniels, J.J. and Rissland, E. L. (1997): Finding legally relevant passages in case opinions, in Proceedings of the Sixth International Conference on Artificial Intelligence and Law, Association for Computing Machinery, New York, USA.

Davis, R. (1980): Meta-rules: reasoning about control, Artificial Intelligence 15, 179-222.

Davis, R. (1982): Applications of meta-level knowledge to the construction, maintenance and use of large knowledge bases, PhD thesis, Dept. of Computer Science, Stanford University, Stanford, CA., 1976, and as Teiresias: applications of meta-level knowledge, in R. Davis and D. Lenat, Knowledge-based systems in artificial intelligence, Part Two, McGraw-Hill, New York, USA.

Davis, W.H. (1972): Peirce's epistemology, Martinus Nijhoff, The Hague, Holland.

De Bono, E. (1972): PO : Beyond Yes and No, Simon & Schuster, New York, USA.

Deanesly, M. (1929): Medieval schools to c.1300, in J.R. Tanner, C.W. Previté-Orton and Z.N. Brooke (eds), Cambridge medieval history, Vol. V, Cambridge University Press, Cambridge, England.

Deedman, C. (1987): Building rule-based expert systems in case-based law, LL.M thesis, University of British Columbia, Faculty of Law, Canada.

Derrida, J. (1976): Of grammatology, Johns Hopkins University Press, Baltimore, USA.

Derrida, J. (1978): Writing and difference, Routledge, London, England.

Derrida, J. (1988): Limited Inc., Northwestern University Press, Evanston, IL, USA.

Descartes, R. (1955, first published in 1637): Discourse on method, in The Philosophical Works of Descartes, translated by E.S. Haldane and G.R.T. Ross, Dover, New York, USA.

Descartes, R. (1955, first published in 1644): Principles of philosophy, in The philosophical works of Descartes, translated by E.S. Haldane and G.R.T. Ross, Dover, New York, USA.

Detmer, D. (1986): Freedom as a value, Open Court, La Salle, Il., USA.

Detmold, M.J. (1984): The unity of law and morality: a refutation of legal positivism, Routledge and Kegan Paul, London, England.

Dewey, J. (1924): Logical method and law, 10 Cornell Law Quarterly 17.

Dickerson, R. (1963): Some jurisprudential implications of electronic data processing, in H.W. Baade (ed.), Jurimetrics, Basic Books Inc., New York, USA.

Dreyfus, H. (1992): What computers still can't do, MIT Press, Cambridge, MA, USA.

Dreyfus. H., and Rabinow, P. (1982): Michel Foucault: Beyond structuralism and hermeneutics, With an Afterword by Michel Foucault, University of Chicago Press, Chicago, USA

Du Feu, D. (1980): Selecting welfare benefits by computer, in B.Niblett (ed.), Computer science and law, Cambridge University Press, Cambridge, England.

Dunn, C.M. (1969): Introduction, in C.M. Dunn (ed.), The logike of the moste excellent philosopher P. Ramus Martyr translated by Roland MacIlmaine, San Fernando Valley State College, Northbridge, California, USA.

Dworkin, R. (1985): A matter of principle, Harvard University Press, Cambridge, MA, USA.

Easterby, R. and Zwaga, H. (eds), (1978, 1984): Information design, John Wiley and Sons Ltd., New York, USA.

Einstein, A. (1920): Relativity: the special and general theory, translated by R.W. Lawson, Methuen and Co. Ltd, London, England.

Einstein, A. (1922): The meaning of relativity, translated by E.P. Adams, Methuen and Co. Ltd, London, England.

Etherington, D. (1987): Formalizing non-monotonic logic, Artificial Intelligence 31, 41-85.

Everson, S. (ed.) (1990): Epistemology, Cambridge University Press, Cambridge, England.

Everson, S. (1990): Introduction, in Everson, S. (ed.), Epistemology, Cambridge University Press, Cambridge, England.

Fauconnier, G., (1985): Mental spaces: Aspects of meaning construction in natural language, MIT Press, Cambridge, MA, USA.

Fauconnier, G., (1997): Mappings in thought and language, Cambridge University Press, Cambridge, England.

Fauconnier, G. and Turner, M. (2002): The way we think: conceptual blending and the mind's hidden complexities, Basic Books, New York, USA.

Feigenbaum, E.A. (1981): Expert systems in the 1980s, in A.H. Bond (ed.), Machine intelligence, Infotech State of the Art Report Series 9, no.3, Pergamon Infotech, Maidenhead, Berkshire, England.

Feigenbaum, E.A. (1983): Knowledge engineering: the applied side, in J.E. Hayes and D. Michie (eds), Intelligent systems: the unprecedented opportunity, Ellis Horwood, Chichester, England.

Feigenbaum, E. and Feldman, J. (eds) (1963): Computers and thought, McGraw-Hill, New York, USA.

Fine, K. and Schurz, G. (1996): Transfer theorems for multi-modal logics, in B.J. Copeland (ed.), Logic and reality, Clarendon Press, Oxford, England.

Fitzpatrick, M., Keane, T. and Montgomery, A.Y. (2002): Building Information Systems, 2nd ed., Social Science Press, Tuggerah, Australia.

Flugel, J. (1947): An Inquiry as to popular views on intelligence and related topics, British Journal of Educational Psychology, 27, 140-52.

Flegg, H.G. (1974): From geometry to topology, The English Universities Press Ltd, England.

Forsyth, R. (1984): Expert systems, Chapman and Hall, London, England.

Foucault, M. (1966, 1970): The order of things: an archaeology of the human sciences, English translation by A.M. Sheridan-Smith, Random House, New York, USA.

Foucault, M. (1969, 1972): The archaeology of knowledge, English translation by A.M. Sheridan-Smith, Pantheon Books, New York, USA.

Foucault, M. (1983): Beyond structuralism and hermeneutics, University of Chicago Press, Chicago, USA.

Fraunce, A. (1969, originally published 1588): The lawyer's logic (Lawiers Logike), reproduced from the original in the British Museum, first published by William How, London, The Scolar Press Limited, Menston, Yorkshire, England.

Frege, G. (1879): Begriffsschrift, in J. v. Heijenoort (ed.), Frege and Gödel: two fundamental texts in mathematical logic, Harvard University Press, Cambridge, MA, USA.

Freud, S. (1984, originally published 1915): The unconscious, in A. Richards (ed.), On metapsychology, Pelican Freud Library, Vol. II, pp.159-222, Harmondsworth, England.

Furnham, A. (2000): Thinking about intelligence, The Psychologist, Vol. 13, No. 10, October, pp.510-15.

Gardner, A.v.d.L. (1987): An artificial intelligence approach to legal reasoning, MIT Press, Cambridge, USA

Gardner, H. (1999): Intelligence reframed, New Books: Basic Books, New York, USA.

Garnham, A. (1987): Mental models as representations of discourse and text, Ellis Horwood Ltd, Chichester, England.

Gazdar, G. (1981): On syntactic categories, in H.C. Longuet-Higgins, J. Lyons, and D.E. Broadbent, (eds), The psychological mechanisms of language, The Royal Society and the British Academy, London, England.

Gentner, D. and Stevens, A.L., (eds) (1983): Mental Models, Lawrence Erlbaum, Hillsdale, NJ, USA.

Gick, M.L. and Holyoak, K.J. (1983): Schema induction and analogical transfer, Cognitive Psychology, 15, 1-38.

Gillies, D. (1996 ): Artificial intelligence and scientific method, Oxford University Press, Oxford, England.

Ginsberg, M. (1986, 1987): Multi-valued logics, AAAI 86, 243-247, and in M. Ginsberg (ed.), Readings in non-monotonic reasoning, Morgan Kaufmann, Los Altos, CA, USA.

Ginsberg, M. (ed.) (1987), Readings in non-monotonic reasoning, Morgan Kaufmann, Los Altos, CA, USA.

Girle, R. (2003): Possible worlds, Acumen Publishing limited, Chesham, Bucks, England.

Gordon, T.F. (1991): An abductive theory of legal issues, International Journal of Man-Machine Studies, 35, 95-118.

Gordon, T.F. (1994): The pleadings game: an exercise in computational dialectics, Artificial Intelligence and Law, 2, 239-292.

Gordon, T.F. (1995): The pleadings game, Kluwer Academic Publishers, Dordrecht, Holland.

Gordon, W.J.J. (1961): Synectics, Harper, New York, USA.

Gottlieb, G. (1968): The logic of choice: an investigation of the concepts of rule and rationality, Allen & Unwin, London, England.

Grant, M. (1989): The classical Greeks, Charles Scribner's Sons, NY, USA.

Gray, P.N. (1985): Contract law management information system: A Jurisprudential System in J. A. Bowden and S. Lichtenstein (eds.) Student control of learning: computers in tertiary education, Proceedings of the Calite 85 Conference, Melbourne University, December 1985, Centre for the Study of Higher Education, Melbourne University, Melbourne, Australia.

Gray, P.N. (1988): The CLIMS (Contract Law Information Management System) Pilot - automatable law, in Proceedings of the 4th International Congress on Computers and Law, Rome, Italy.

Gray, P.N. (1990): Choice and jurisprudential systems, LL.M. thesis, University of Sydney, Sydney, Australia.

Gray, P.N. (1995): Scaling up to a three dimensional graphic trace, in C. Ciampi, F. Socci Natali, and E.G. Taddei (eds), Verso un sistema esperto giuridico integrale, Vol. 1, Cedam, Padua, Italy.

Gray, P.N. (1997): Artificial legal intelligence, Dartmouth Publishing Co., Aldershot, England.

Gray, P.N. (2002): Explanatory rule maps. Computers and Law Journal for the Australian and New Zealand Societies for Computers and the Law 50:11-13, Dec.

Gray, P.N. (2004): Intellectual artefacts of expert systems meta-epistemology, in J. Weckert and Y. Al-Saggaf (eds), Computers and Philosophy, Australian Computer Science Communications, Vol 37, pp.51-8.

Gray, P.N. (2005): eGanges: epistemology and case reasoning, in P.E. Dunne and T. Bench-Capon (eds), Argumentation in artificial intelligence and law, Wolf Legal Publishers, Nijmegen, Holland.

Gray, P.N. and Gray, X. (2003): A map-based expert-friendly shell, in D. Bourcier, (ed.), Legal Knowledge and Information Systems, IOS Press, Amsterdam, Holland.

Gray, P.N. and Mann, S. (2003): The Fraunce (1588) model of case-based reasoning, in Proceedings of the Ninth International Conference on Artificial Intelligence and Law, Association for Computing Machinery, New York, USA.

Gray, P.N., Gray, X. and Zeleznikow, J., (2007): A negotiating logic: for richer or poorer, to be published in Proceedings of the Eleventh International Conference on Artificial Intelligence and Law, Association for Computing Machinery, New York, USA.

Greenleaf, G., Mowbray, D., and Tyree, A. (1987): Expert systems in law: the Datalex Project, in Proceedings of the First International Conference on Artificial Intelligence and Law, Association for Computing Machinery, New York, USA.

Gupta, A. (1980): The logic of common nouns: an investigation in quantified modal logic, Yale University Press, New Haven, Conn, USA.

Haan, N.d. (1996): Automated legal reasoning, PhD thesis, University of Amsterdam, Holland.

Hafner, C. (1978): An information retrieval system based on a computer model of legal knowledge, PhD thesis, University of Michigan, USA.

Hafner, C. (1987): Conceptual organisation of case law knowledge bases, in Proceedings of the First International Conference on Artificial Intelligence and Law, Association for Computing Machinery, New York, USA.

Hager, P., and Roe, J.(eds) (1995): Out on a lemma, University of Technology, Sydney, Australia.

Hampden-Turner, C. (1981): Maps of the mind, Collier Books, NY, USA.

Hao, W. (1990): Computation, logic, philosophy, Science Press, Beijing, China.

Harre, R. and Madden, E.H. (1975): Causal powers: a theory of natural necessity, B. Blackwell, Oxford, England.

Harre, R. and Krausz, M., (1996): Varieties of relativism, Blackwell, Oxford, England.

Hart, H.L.A.(1953): Definition and theory in jurisprudence, Clarendon Press, Oxford, England.

Hart, H.L.A.(1961): The concept of law, Oxford University Press, London, England.

Hart, P.E. (1982): Directions for AI in the eighties, 79, SIGART Newsletter, 11.

Hayes-Roth, F., Waterman, D.A. and Lenat, D.B. (eds), (1983): Building expert systems, Addison-Wesley Publishing Company Inc., London, England.

Heath, P. (ed.) (1966): Augustus De Morgan on the syllogism and other logical writings, Routledge and Kegan Paul, London, England.

Hegel, G.W.F. (1991, originally published 1830): The encyclopaedia logic, 3rd ed., translated by T.F. Garaets, W.A. Suchting and H.S. Harris, Hackett Publishing Co. Inc., Indianapolis, USA.

Hendricks, V.F. (2006): Mainstream and formal epistemology, Cambridge University Press, Cambridge, England.

Hodgson, D. (1991): Mind matters: consciousness and choice in a quantum world, Oxford University Press, Oxford, England.

Hohfeld, W. N. (1913): Fundamental legal conceptions as applied in judicial reasoning, Yale L.J., (23), 16-59.

Hollis, J.W. and Hollis, L.U. (1969): Personalizing information processes, The Macmillan Company, London; Collier-Macmillan Limited, Toronto, Canada.

Holmes, O.W. (1881): The common law, Little Brown and Co., Boston, USA.

Holmes, O.W. (1897): The path of the law, vol.10, Harv. L.R., 457.

Holmes, O.W. (1899): Law in Science and Science in Law, 12, Harv. L.R., 443.

Holmes, O.W. (1920): Collected legal papers, Constable, London, England.

Horn, A. (1951): On sentences which are true of direct unions of algebras, Journal of Symbolic Logic, 16.

Horrocks, I. (2005): Ontologies and the semantic web, http://www.epsg.org.uk/pub/needham2005/Horrocks_needham2005.pdf.

Howe, C.S. (1961): Foreword, in Cassirer, E., The logic of the humanities, Yale University Press, New Haven, Conn, USA.

Hunt, A. and Wickham, G. (1994): Foucault and law, Pluto Press, London, England.

Hurley, P.J. (1994): A concise introduction to logic, Wadsworth Publishing Company, Belmont, California, USA.

Hutchinson, A.(ed.) (1989): Critical legal studies, Rowman & Littlefield, Totowa, N.J., USA.

Huxley, J. (1974): Evolution – the modern synthesis, 3rd ed., George Allen and Unwin, London, England.

Huxley, J., Hardy, A.C. and Ford, E.B. (eds) (1954): The evolutionary process, George, Allen and Unwin, London, England.

International Organisation for Standardization, (1997): Human-centred design processes for interactive systems, Geneva, Switzerland.

Ishikawa, K. (1985): What is total quality control? The Japanese way, translated by David J. Lu., Prentice-Hall Inc., Englewood Cliffs, N.J., USA.

Johnson-Laird, P.N. (1988): The computer and the mind, Harvard University Press, Cambridge, MA, USA.

Jones, E.E.C. (1911): A new law of thought and its logical bearings, Cambridge University Press, Cambridge, England.

Kant, I. (1955, first published in 1781): Critique of pure reason, in Great Books of the Western World, Vol. 42, translated by J.M.D. Meiklejohn, T.K. Abbott and J.C. Meredith, Encyclopaedia Britannica, Chicago, USA.

Kant, I. (1955, first published in 1788): Critique of practical reason, in Great Books of the Western World, Vol. 42, translated by J.M.D. Meiklejohn, T.K. Abbott and J.C. Meredith, Encyclopaedia Britannica, Chicago, USA.

Kant, I. (1955, first published in 1790): Critique of judgment, in Great Books of the Western World, Vol. 42, translated by J.M.D. Meiklejohn, T.K. Abbott and J.C. Meredith, Encyclopaedia Britannica, Chicago, USA.

Kelsen, H. (1967, originally published 1911): Pure theory of law, translated by M. Knight, University of California Press, Berkeley, Los Angeles, USA.

Kelsen, H. (1945): General theory of law and state, Russell and Russell, NY, USA.

Kelsen, H. (1974): Essays on legal and moral philosophy, Reidel, Dordrecht, Holland.

Kevelson, R. (ed.) (1987): Law and semiotics, Vol. 1, Plenum Press, New York, USA.

Kevelson, R. (1988): The law as a system of signs, Plenum Press, New York, USA.

Kingsley, E., Kopstein, F.F. and Seidel, R.J. (1971): Graph theory as a meta-language of communicable knowledge, in M.D. Rubin (ed.), Man in systems, Gordon and Breach Science Publishers, New York, USA.

Kirschner, P. A. (ed) (2003): Visualizing argumentation, Springer, New York, USA.

Kneale, W. & Kneale, M. (1962): The development of logic, Oxford at the Clarendon Press, Oxford, England.

Knuth, D.E. and Pardo, L.T. (1980): The early development of programming languages, in N. Metropolis, J. Howlett and G-C. Rota (eds), A history of computing in the twentieth century, Academic Press, New York, USA.

Kochen, M. (ed.) (1975): Information for action, Academic Press Inc., New York, USA.

Kornblith, H. (1993): Inductive inference and its natural ground - an essay in naturalistic epistemology, MIT Press, Cambridge, MA, USA.

Korzybski, A. (1958, originally published 1933, second edition 1941): Science and sanity: an introduction to non-Aristotelian systems and general semantics, The International Non-Aristotelian Library Publishing Company, Lakeville, Conn., USA.

Kosslyn, S.M. (1983): Ghosts in the mind's machine, W.W. Norton & Co., New York, USA.

Kowalski, A. (1991): Case-based reasoning and the deep structure approach to knowledge representation, in Proceedings of the Third International Conference on Artificial Intelligence and Law, Association for Computing Machinery, New York, USA.

Kowalski, R.A. (1979a): Algorithm = Logic + Control, Communications of the ACM, 22(7), 424-436.

Kowalski, R.A. (1979b): Logic for problem-solving, American Elsevier/North Holland, New York, USA.

Kowalski, R.A. and Sergot, M. (1985): Computer representation of the law, in Proceedings of the Ninth International Joint Conference on Artificial Intelligence (IJCAI-85), IJCAI, Los Angeles, California, USA.

Kuhn, T.S. (1962, 1970): The structure of scientific revolutions, University of Chicago Press, Chicago, USA.

Lacan, J. (1977): Ecrits, W.W. Norton, NY, USA.

Lacan, J. (1988): The seminar of Jacques Lacan, Book II, W.W. Norton, NY, USA.

Lacan, J. (1993): The seminar of Jacques Lacan, Book III, W.W. Norton, NY, USA.

Laing, R.D. (1970): Knots, Penguin Books, Harmondsworth, England.

Lakoff, G. & Johnson, M. (1980, 2003): Metaphors we live by, University of Chicago Press, Chicago, USA.

Lambert, K. (ed.) (1991): Philosophical applications of free logic, Oxford University Press, Oxford, England.

Lambert, K. (1991): The nature of free logic, in K. Lambert (ed.), Philosophical applications of free logic, Oxford University Press, Oxford, England.

Latta, R. (1951, originally published in 1898): Introduction, and Appendices A-I, in Leibniz, G.W. von, Monadology, translated by R. Latta, Oxford University Press, Oxford, England.

Latta, R. & MacBeath, A. (1956): The elements of logic, MacMillan Co, London, England.

Lee, M-K. (2005): Epistemology after Protagoras: responses to relativism in Plato, Aristotle, and Democritus, Clarendon Press, Oxford, England.

Leibniz, G.W. von (1992, first published in 1714): Monadology, translated and edited by N. Rescher, Routledge, London, England.

Leith, P. (1986a): Fundamental errors in legal logic programming, The Computer Journal, Vol. 29, No. 6.

Leith, P. (1986b): Legal expert systems: misunderstanding the legal process, Computers and Law, No.49, Sept.

Leith, P. and Hoey, A. (1998): The computerised lawyer, Springer, London, England.

Leith, P., (1990): Formalism in AI and computer science, Ellis Horwood, New York, USA.

Leith, P. (1994): The problem with law in books and law in computers: the oral nature of law, in I. Carr and K. Williams (eds), Computers and law, Intellect Ltd, Bristol, England, pp.225-236.

Lenat, D. and Guha, R. (1990): Building large knowledge-based systems: representation and inference in the CYC Project, Addison-Wesley, London, England.

Levine, M. (1988): Effective problem solving, Prentice Hall, Englewood Cliffs, N.J., USA.

Lewin, K. (1936): Principles of topological psychology, McGraw Hill Book Co Inc., NY, USA.

Lietzmann, W. (1969): Visual topology, Chatto and Windus, London, UK.

Lindsay, R., Buchanan, B. G., Feigenbaum, E. A. and Lederberg, J. (1980): Applications of artificial intelligence for chemical inference: The Dendral Project, McGraw-Hill Book Company, New York, USA.

Linne, C. v. (c.2003, originally published in 1751): Philosophia botanica, translated by S. Freer, Oxford University Press, Oxford, England.

Lifschitz, V. (ed.) (1990): Formalizing common sense: papers by John McCarthy. Ablex Publishing Company, Norwood, N.J., USA.

Litowitz, D.E. (1994): Dworkin and critical legal studies on right answers and conceptual holism, 18 Legal Studies Forum 135.

Litowitz, D.E. (1997): Postmodern philosophy and law, University Press of Kansas, Lawrence, Kansas, USA.

Livingston, C. (1993): Knot theory, The Mathematical Association of America, Washington, USA.

Lodder, A.R. (1999): DiaLaw: On legal justification and dialogical models of argumentation, Kluwer Academic Publishers, Dordrecht, Holland.

Lodder, A.R. (2004): Law, logic, rhetoric: a procedural model of legal argumentation, in S. Rahman, J. Symons, D.M. Gabbay and J.P.v. Bendegem (eds), Logic, epistemology and the unity of science, Kluwer Academic Publishers, Dordrecht, Holland.

Lorenz, K. (1977): Behind the mirror, Methuen, London, England.

Luchins, A.S. (1942): Mechanization in problem solving, Psychological Monographs, 54 (whole no. 6).

MacIlmaine, R. (1969, originally published c.1555, translated 1574): The logike of Peter Ramus, San Fernando Valley State College, Northridge, California, USA.

Madell, G. (1988): Mind and materialism, Edinburgh University Press, Edinburgh, Scotland.

Maggs, P.B. and deBessonet, C.G. (1972): Automated logical analysis of systems of legal rules, Jurimetrics Journal 12, 158-169.

Maimonides, M. (originally c. 1180, 1550): Mishneh Thorah: Annotation of Jewish Code of Laws (with Aristotelian and other commentary), D. Pizzighettone and A. Dayyan (eds), C. Adelkind for M.A. Giustiniani, Venice, Italy.

Major-Poetzl, P. (1983): Michel Foucault's archaeology of western culture – toward a new science of history, The University of North Carolina Press, Chapel Hill, USA.

Marcuse, H. (1964): One Dimensional Man, Beacon Press, Boston, USA.

McCarthy, J. (1958): Programs with common sense, in Proceedings of the Symposium on Mechanisation of Thought Processes, vol.1, pp.77-84, Her Majesty’s Stationery Office, London, England.

McCarthy, J. (1968): Programs with common sense, in M.L. Minsky (ed.), Semantic information processing, MIT Press, Cambridge, MA., USA.

McCarthy, J. (1977): Epistemological problems in artificial intelligence, in Proceedings of the Fifth International Joint Conference on Artificial Intelligence (IJCAI-77), IJCAI, Cambridge, MA, USA.

McCarthy, J. (1980): Circumscription: a form of non-monotonic reasoning, Artificial Intelligence, 13(1-2): 27-39.

McCarthy, J. and Hayes, P.J. (1969): Some philosophical problems from the viewpoint of artificial intelligence, in B. Meltzer and D. Michie (eds), Machine intelligence 4, Edinburgh University Press, Edinburgh, Scotland.

McCarty, L.T. (1977): Reflections on Taxman: an experiment in artificial intelligence and legal reasoning, Harvard Law Review, 90: 837-93.

McCarty, L.T. (1980): The Taxman Project: Towards a cognitive theory of legal argument, in B. Niblett (ed.), Computer science and law, Cambridge University Press, Cambridge, England.

McCarty, L.T. (1984): Intelligent legal information systems: Problems and prospects, in C. Campbell (ed.), Data processing and the law, Sweet and Maxwell, London, England.

McCarty, L.T. (1987): Intelligent legal information systems: an update, in H. Fiedler, F. Haft and R. Traunmuller (eds), Expert systems in law: impacts on legal theory and computer law, Attempto-Verlag, Tubingen, Germany.

McCarty, L.T. (1988a): Clausal intuitionistic logic. I. Fixed-point semantics, Journal of Logic Programming, 5(1): 1-31.

McCarty, L.T. (1988b): Clausal intuitionistic logic. II. Tableau proof procedures, Journal of Logic Programming, 5(2): 93-132.

McCarty, L.T. (1989): A language for legal discourse: 1. Basic features, in Proceedings of the Second International Conference on Artificial Intelligence and Law, Association for Computing Machinery, New York, USA.

McCarty, L.T. (1990): Artificial intelligence and law: How to get there from here, Ratio Juris, vol. 3, no. 2, July, pp.189-200.

McCarty, L.T. (1991): Circumscribing embedded implications, in A. Nerode et al. (eds), Proceedings of the First International Workshop on Logic Programming and Non-Monotonic Reasoning, MIT Press, Cambridge, MA, USA.

McCarty, L.T. (1995): An implementation of Eisner v Macomber, in Proceedings of the Fifth International Conference on Artificial Intelligence and Law, The Association for Computing Machinery, New York, USA.

McCarty, L.T. (1997): Some arguments about legal arguments, in Proceedings of the Sixth International Conference on Artificial Intelligence and Law, The Association for Computing Machinery, New York, USA.

McCarty, L.T. and Meyden, R.v.d. (1991): Indefinite reasoning with definite rules, in Proceedings of the Twelfth International Joint Conference on Artificial Intelligence, Morgan Kaufmann Publishers, San Francisco, USA.

McCarty, L.T. and Sridharan, N.S. (1980): The representation of an evolving system of legal concepts: I. Logical templates, in Proceedings of the Third Biennial Conference of the Canadian Society for Computational Studies of Intelligence, Victoria, British Columbia, Canada.

McCarty, L.T. and Sridharan, N.S. (1981): The representation of an evolving system of legal concepts: II. Prototypes and deformations, in Proceedings of the Seventh International Joint Conference on Artificial Intelligence (IJCAI-81), IJCAI, Vancouver, Canada.

McCorduck, P. (1979): Machines who think, W.H. Freeman, San Francisco, USA.

McDermott, D. and Doyle, J. (1980): Non-monotonic logic I, Artificial Intelligence, 13, 41-72.

McTear, M.F. (ed.) (1988): Understanding cognitive science, Ellis Horwood Ltd, Chichester, England.

McTear, M.F. (1988): Introduction, in M.F. McTear (ed.), Understanding cognitive science, Ellis Horwood Ltd, Chichester, England.

Mehl, L. (1958): Automation in the legal world: from the machine processing of legal information to the “Law Machine”, in Mechanisation of thought processes, National Physical Laboratory, England.

Meister, D. (1991): Psychology of System Design, Elsevier, Amsterdam, Holland.

Meldman, J.A. (1975): A preliminary study in computer-aided legal analysis, PhD thesis, MIT., Cambridge, MA, USA.

Meldman, J.A. (1977): A structural model for computer-aided legal analysis, Rutgers Journal of Computers and the Law, vol.6, no.1, pp.27-71.

Mandelbrot, B.B. (1982): The fractal geometry of nature, W.H. Freeman, San Francisco, USA.

Metropolis, N., Howlett, J. and Rota, G-C. (eds) (1980), A history of computing in the twentieth century, Academic Press, NY, USA.

Meyer, J. and Hoek, W. van der (1995): Epistemic logic for AI and computer science, Cambridge University Press, Cambridge, England.

Miller, G.A. (1956): The magical number seven, plus or minus two: some limits on our capacity for processing information, Psychological Review, 63, 81-97.

Miller, P. (1954): The New England mind, Harvard University Press, Cambridge, MA, USA.

Minsky, M. (1967): Computation: finite and infinite machines, Prentice Hall, Englewood Cliffs, NJ, USA.

Minsky, M. (ed.) (1969): Semantic information processing, MIT Press, Cambridge, MA, USA.

Minsky, M. (1981): A framework for representing knowledge, in J. Haugeland (ed.), Mind design, MIT Press, Cambridge, MA, USA.

Minsky, M. (1986): The society of mind, Simon and Schuster, NY, USA.

Minsky, M. (2006): The emotion machine, Simon and Schuster, NY, USA.

Minsky, M. and Papert, S. (1972): Perceptrons: an introduction to computational geometry, MIT Press, Cambridge, MA, USA.

Mithen, S. (1997): The prehistory of the mind: the cognitive origins of art, religion and science, Thames and Hudson, London, England.

Moles, R.N. (c.1987): Definition and rule in legal theory, Basil Blackwell, Oxford, England.

Morgan, A. de (1847): Formal logic, Taylor and Walton, London, England.

Morgan, R. (c.1989): How we use the mind maps: an introduction of O & M Horizons, Ogilvy and Mather Horizons, Melbourne, Australia.

Morgan, T. (2002): Business rules and information systems, Addison-Wesley, Boston, USA.

Mundle, C.W.K. (ed.) (1952, unrevised 5th edition published in 1946): Stebbing, L.S., A modern elementary logic, 5th ed., Methuen & Co. Ltd, London, England.

Neumann, J.v. and Morgenstern, O. (1944, 1953): Theory of games and economic behavior, Princeton University Press, Princeton, NJ, USA.

Newell, A. (1982): The knowledge level, Artificial Intelligence, 18, 87-127.

Niblett, B., (1981): Expert systems for lawyers, Computers and Law 29, 2.

Niblett, B., (1988): Book review, R. E. Susskind, Expert systems in law, Computers and Law 56, p.32, Jun.

Norman, D.A., (ed.) (1981): Perspectives on cognitive science, Ablex, Norwood, NJ., USA.

Norman, D.A. and Draper, S.W. (eds) (1986): User centered system design, Lawrence Erlbaum Associates, Hillsdale, New Jersey, USA.

O'Connor, J. and McDermott, I. (1997): The art of systems thinking, Thorsons, London, England.

Ogden, C.K. (1932): Bentham's theory of fictions, Kegan Paul, Trench, Trubner & Co. Ltd, London, England.

Ogden, C.K. and Richards, I.A. (first published 1923, 1989): The meaning of meaning, Harcourt Brace Jovanovich, New York, USA.

Ong, W.J. (1958): Ramus, method, and the decay of dialogue, Harvard University Press, Cambridge, MA, USA.

Open University (1973): Topology axioms, topological closure, induced topologies, England.

Ortony, A. (ed.) (1979): Metaphor and thought, Cambridge University Press, Cambridge, England.

Ortony, A. (1979): Metaphor: A multidimensional problem, in A. Ortony (ed.), Metaphor and thought, Cambridge University Press, Cambridge, England.

Osborne, A.F. (1963): Applied imagination: principles and procedures of creative problem solving, Scribners, New York, USA.

Payne, J.W., Bettman, J.R. & Johnson, E.J. (1992): Behavioural Decision Research: A Constructive Processing Perspective, Annual Review of Psychology, Vol. 43, pp.87-131.

Peirce, C.S. (1982): Writings of Charles S. Peirce, A chronological edition, E.C. Moore (ed.), Vol. 1 (1857-1866), Indiana University Press, Bloomington and Indianapolis, USA.

Peirce, C.S. (1984): Writings of Charles S. Peirce, A chronological edition, E.C. Moore (ed.), Vol. 2 (1867-1871), Indiana University Press, Bloomington and Indianapolis, USA.

Peirce, C.S. (1986): Writings of Charles S. Peirce, A chronological edition, C.J.W. Kloesel (ed.), Vols 3 (1872-1878) and 4 (1879-1884), Indiana University Press, Bloomington and Indianapolis, USA.

Peirce, C.S. (1931): Principles of Philosophy, Collected Papers Vol.1, C. Hartshorne and P. Weiss (eds), Harvard University Press, Cambridge, MA, USA.

Peirce, C.S. (1932): Elements of logic, Collected Papers Vol.2, C. Hartshorne and P. Weiss (eds), Harvard University Press, Cambridge, MA, USA.

Peirce, C.S. (1933a): Exact logic, Collected Papers Vol.3, C. Hartshorne and P. Weiss (eds), Harvard University Press, Cambridge, MA, USA.

Peirce, C.S. (1933b): The simplest mathematics, Collected Papers Vol.4, C. Hartshorne and P. Weiss (eds), Harvard University Press, Cambridge, MA, USA.

Peirce, C.S. (1934): Pragmatism and pragmaticism, Collected Papers Vol.5, C. Hartshorne and P. Weiss (eds), Harvard University Press, Cambridge, MA, USA.

Peirce, C.S. (1935): Scientific metaphysics, Collected Papers Vol.6, C. Hartshorne and P. Weiss (eds), Harvard University Press, Cambridge, MA, USA.

Peirce, C.S. (1958): Science and philosophy, Collected Papers Vol.7, A.W. Burks (ed.), Harvard University Press, Cambridge, MA, USA.

Peritz, R.J. (1990): Exploring the limits of formalism: AI and legal pedagogy, in Proceedings of the International Conference on Computers in Legal Education, Salzburg, Austria, August 25.

Petersen, U. (2002): Diagonal method and dialectical logic, Der Andere Verlag, Osnabrück, Germany.

Pethe, V.P., Rippey, C.P. and Kale, L.V. (1989): A specialised expert system for judicial decision support, in Proceedings of the Second International Conference on Artificial Intelligence and Law, Association for Computing Machinery Inc., New York, USA.

Piaget, J. (1953): Logic and psychology, Manchester University Press, Manchester, England.

Plato (1936, originally c. 340BC): The works of Plato, translated by B. Jowett, Dial Press, New York, USA.

Pollard, B.W. (1953): The circuit components of digital computers, in B.V. Bowden (ed.), Faster than thought, Sir Isaac Pitman and Sons Ltd, London, England, pp.32-66.

Polya, G. (1957): How to solve it, Princeton University Press, Princeton, N.J., USA.

Poole, D.L. (1988): A logical framework for default reasoning, Artificial Intelligence 36, 27-47.

Popp, W.G. and Schlink, B. (1975): JUDITH, a computer program to advise lawyers in reasoning a case, Jurimetrics Journal 15, 303.

Pople, H. (1973): On the mechanization of abductive logic, in Proceedings of the Third International Joint Conference on Artificial Intelligence (IJCAI-73), IJCAI, Stanford University, California, USA.

Pople, H. (1977): The formation of composite hypotheses in diagnostic problem solving: an exercise in synthetic reasoning, in Proceedings of the Fifth International Joint Conference on Artificial Intelligence (IJCAI-77), IJCAI, Cambridge, MA, USA.

Popper, K.R. (1972): Objective Knowledge, Clarendon Press, Oxford, England.

Popple, J. (1996): A pragmatic legal expert system, Dartmouth Publishing Company Limited, Aldershot, England.

Posner, R. (1990): The problems of jurisprudence, Harvard University Press, Cambridge, MA, USA.

Post, E.L. (1921): A general theory of elementary propositions, American Journal of Mathematics, 43, pp.163-85.

Pound, R. (1908): Mechanical jurisprudence, 8 Colum. L. Rev. 605.

Pound, R. (1921): The spirit of the common law, Marshall Jones Company, Boston, USA.

Pound, R. (1942): Social control through law, Yale University Press, New Haven, USA.

Power, J.W. (1934): The elements of pictorial construction, Antoine Roche, Paris, France.

Prado, C.G. (2006): Searle and Foucault on truth, Cambridge University Press, Cambridge, England.

Prakken, H. (1991): A tool in modelling disagreement in law: preferring the most specific argument, in Proceedings of the Third International Conference on Artificial Intelligence and Law, Association for Computing Machinery, New York, USA.

Prakken, H. (1993): Logical tools for modeling legal argument, PhD thesis, Vrije Universiteit, Amsterdam, Holland.

Pressley, M., Levin, J.R. and Delaney, H. (1982): The mnemonic keyword method, Review of Educational Research, 52, 61-91.

Quillian, M.R. (1968): Semantic memory, in M.L. Minsky (ed.), Semantic information processing, MIT Press, Cambridge, MA, USA.

Quinlan, J.R. (1979): Discovering rules by induction from large collections of examples, in D.E. Michie (ed.), Expert systems in the micro-electronic age. Edinburgh University Press, Edinburgh, Scotland.

Ramus, P. (1543): Dialecticke, translated in 1574 as The logike of Peter Ramus by Roland MacIlmaine (1969), C.A. Dunn (ed.), San Fernando Valley State College, Northridge, California, USA.

Rasmussen, T. (ed.) (1986): System science and jurisprudence, Spartan Press, Lansing, Michigan, USA.

Raz, J. (1980): The concept of a legal system, 2nd ed., Clarendon Press, Oxford, England.

Read, S. (1994): Thinking about logic, Oxford University Press, Oxford, England.

Reichgelt, H. (1991): Knowledge representation, Ablex Pub., Norwood, New Jersey, USA.

Rescher, N. (2006): Epistemetrics, Cambridge University Press, Cambridge, England.

Reiter, R. (1980): A logic for default reasoning, Artificial Intelligence, 13, 81-132.

Restall, G. (2000): An introduction to substructural logics, Routledge, London, England.

Rickards, T. and Moger, S. (1999): Handbook for creative team leaders, Gower, Brookfield, Vt, USA.

Rissland, E.L. (1982): Examples in the legal domain: hypotheticals in contract law, in Proceedings of the Fourth Annual Cognitive Science Society Conference, University of Michigan, Ann Arbor, USA.

Rissland, E.L. (1983): Examples in legal reasoning: legal hypotheticals, in Proceedings of the International Joint Conference on Artificial Intelligence, Karlsruhe, West Germany.

Rissland, E.L., (1984), Hypothetically speaking: experience and reasoning in the law, in Proceedings of the First Annual Conference on Theoretical Issues in Conceptual Information Processing.

Rissland, E.L. (1985): Argument moves and hypotheticals, in C. Walter (ed.), Computing power and legal reasoning, West Publishing Co., St. Paul, USA.

Ritchie, A.D. (1923): Scientific method, Kegan Paul, Trench, Trubner and Co, London, England.

Rogers, W.V.H. (ed.), (1979): Winfield and Jolowicz on tort, 11th edition, Sweet and Maxwell, London, England.

Rosch, E. (1978): Principles of categorization, in E. Rosch and B.B. Lloyd (eds), Cognition and categorization, Erlbaum, Hillsdale, N.J., USA, pp.27-48.

Rourke, N.E. (1986): On the way to experimental jurisprudence, in T. Rasmussen (ed.), System science and jurisprudence, Spartan Press, Lansing, Michigan, USA, pp.187-219.

Routen, T. (1989): Hierarchically organized formalizations, in Proceedings of the Second International Conference on Artificial Intelligence and Law, The Association for Computing Machinery, New York, USA.

Rumelhart, D.E. & McClelland, J.L. (1986): Parallel distributed processing: exploration in the microstructure of cognition, Foundations, Vol.1, MIT Press, Cambridge, MA, USA.

Russell, B. (1961): History of western philosophy, George Allen and Unwin, London, England.

Russell, B. and Whitehead, A.N. (1910): Principia mathematica, Cambridge University Press, Cambridge, England.

Russell, S.J. and Norvig, P. (1995): Artificial intelligence, Prentice-Hall International Inc., London, England.

Sainsbury, M. (1991): Logical forms, Basil Blackwell, Oxford, England.

Samuel, G., (1995): Ontology and dimension in legal reasoning, in Z. Bankowski, I. White, and U. Hahn (eds), Informatics and the foundations of legal reasoning, Kluwer Academic Publishers, Dordrecht, Holland.

Samuel, G. (2003): Epistemology and method in law, Ashgate, Aldershot, England.

Sanford, D.H. (1989, 1992): If P, then Q: conditionals and the foundations of reasoning, Routledge, London, England.

Sartor, G. (1991): The structure of norm conditions and non-monotonic reasoning in law, in Proceedings of the Third International Conference on Artificial Intelligence and Law, The Association for Computing Machinery, New York, USA.

Sartor, G. (1994): A formal model of legal argumentation, Ratio Juris, 7, 177-211.

Sartor, G. and Branting, C. (eds) (1998): Judicial applications of artificial intelligence, Kluwer Academic Publishers, Dordrecht, Holland.

Sarup, M. (1993): An introductory guide to post-structuralism and postmodernism, University of Georgia Press, Athens, USA.

Saussure, F. de, (1959): Course in general linguistics, Philosophical Library, New York, USA.

Savasdisara, P. (1994): Computer-assisted legal analysis systems: Part 1: The origins of computer-aided support systems, Computers and Law, Vol. 5, Issue 2, Jun/Jul, pp.28-30.

Schank, R.C. and Abelson, R.P. (1977): Scripts, plans, goals and understanding, Lawrence Erlbaum, Hillsdale, NJ., USA.

Schelling, F.W.J.v. (originally published 1800, 1993): System of transcendental idealism, translated by P.L. Heath, University of Virginia Press, USA.

Schild, U.J. (1992): Expert systems and case law, Ellis Horwood, New York, USA.

Schmidt, R.W. (1966): The domain of logic according to Saint Thomas Aquinas, (Revised University of Toronto PhD thesis.) Martinus Nijhoff, The Hague, Holland.

Searle, J.R. (1969): Speech acts: an essay in the philosophy of language, Cambridge University Press, Cambridge, England.

Searle, J.R. (1981): Minds, brains, and programs, in D.R. Hofstadter and D.C. Dennett, The mind’s I, Basic Books Inc., New York, USA.

Searle, J.R. (1995): The construction of social reality, Penguin, Harmondsworth, Middlesex, England.

Shadbolt, N. (1988): Models and methods in cognitive science, in McTear, M.F., (ed.) Understanding cognitive science, Ellis Horwood Ltd, Chichester, England.

Shannon, D.T. and Golshani, F. (1988): On the automation of legal reasoning, Jurimetrics Journal, Vol. 28, no. 3, p.305.

Shaw, P. (1997): Logic and its limits, 2nd ed., Oxford University Press, Oxford, England.

Shneiderman, B. (1998): Designing the user interface, Addison-Wesley, Reading, MA, USA.

Shoben, E.J. (1988): The representation of knowledge, in M.F. McTear (ed.), Understanding cognitive science, 3rd ed., John Wiley and Sons, New York, USA, pp.102-119.

Shoham, Y. (1987): Non-monotonic logics: meaning and utility, in Proceedings of the Tenth International Joint Conference on Artificial Intelligence (IJCAI-87), IJCAI, Milan, Italy, and in M. Ginsberg (ed.), Readings in non-monotonic reasoning, Morgan Kaufmann, Los Altos, CA, USA.

Shortliffe, E. (1976): Computer-based medical consultation: MYCIN, American Elsevier, New York, USA.

Siewiorek, D.P., Bell, C.G. and Newell, A. (1982): Computer Structures, McGraw Hill, New York, USA.

Simon, H.A. and Newell, A. (1958): Heuristic problem solving: The next advance in operations research, Operations Research, 6, 1-10.

Simon, H.A. (1966): The logic of heuristic decisionmaking, in N. Rescher (ed.), The logic of decision and action, University of Pittsburgh Press, Pittsburgh, USA.

Simon, H.A. (1969): The sciences of the artificial, MIT Press, Cambridge, MA., USA.

Simons, P. (1987): Parts, A study in ontology, Clarendon Press, Oxford, England.

Skalak, D.B. and Rissland, E.L. (1991): Argument moves in a rule-guided domain, in Proceedings of the Third International Conference on Artificial Intelligence and Law, Association for Computing Machinery, New York, USA.

Skalak, D.B. and Rissland, E.L. (1992): Arguments and cases: an inevitable intertwining, Artificial Intelligence and Law, 1, 3-44.

Sloman, A. (1979): Epistemology and artificial intelligence, in D.E. Michie (ed.), Expert systems in the micro-electronic age, Edinburgh University Press, Edinburgh, Scotland.

Smith, J.C. (1997): The use of lexicons in information retrieval in legal databases, in Proceedings of the Sixth International Conference on Artificial Intelligence and Law, Association for Computing Machinery, New York, USA.

Smith, J.C. and Deedman, C. (1987): The application of expert system technology to case-based law, in Proceedings of the First International Conference on Artificial Intelligence and Law, Association for Computing Machinery, New York, USA.

Smullyan, R.M. (1981): An epistemological nightmare, in D.R. Hofstadter and D.C. Dennett, The mind’s I, Basic Books Inc., New York, USA, pp.415-429.

Soames, S. (1989): Presupposition, in D. Gabbay & F. Guenthner (eds), Handbook of Philosophical Logic, Vol. IV, D. Reidel Pub. Co., Dordrecht, Holland.

Son, T.C. and Tu, P.H. (2006): On the completeness of approximation based reasoning and planning in action theories with incomplete information, in Proceedings of KR 2006.

Stafford, B.M. (1999): Visual analogy, MIT Press, Cambridge, MA, USA.

Staines, P. (1995): Form, Invalidity and the Massey Mistake, in P. Hager and J. Roe (eds), Out on a Lemma, University of Technology, Sydney, Australia.

Stamper, R. (1980): LEGOL: modeling legal rules by computer, in B. Niblett (ed.), Computer science and law, Cambridge University Press, Cambridge, England.

Stamper, R., Tagg, C., Mason, P., Cook, S. and Marks, J. (1982): Developing the LEGOL semantic grammar, in C. Ciampi (ed.), Artificial intelligence and legal information systems, North-Holland, Amsterdam, Holland.

Stebbing, L.S. (1952): A modern elementary logic, 5th ed., rev. by C.W.K. Mundle, Methuen & Co Ltd, London, England.

Steinhart, E.C. (2001): The logic of metaphor: analogous parts of possible worlds, Kluwer Academic Publishers, Dordrecht, Holland.

Sternberg, R.J. (1985): Beyond IQ: a triarchic theory of human intelligence, Cambridge University Press, New York, USA.

Sternberg, R.J. (1990): Metaphors of mind: conceptions of the nature of intelligence, Cambridge University Press, New York, USA.

Sternberg, R.J. (1997): Successful intelligence, Plume, New York, USA.

Stone, J. (1985): Precedent and law: dynamics of common law growth, Butterworths, Sydney, Australia.

Stone, M. (1995): Focusing the law: what legal interpretation is not, in Patterson, D. (ed.), Wittgenstein and Law, Ashgate, Aldershot, England.

Susskind, R.E. (1987): Expert systems in law: a jurisprudential inquiry, Clarendon Press, Oxford, England.

Susskind, R.E. (1989): Pragmatism and purism in artificial intelligence and legal reasoning, AI and Society, vol.3, no.1, Jan-Mar, pp.28-38.

Susskind, R.E., (1996): The future of law, Clarendon Press, Oxford, England.

Tammelo, I. (1955): Sketch for a symbolic juristic logic, 8 Journal of Legal Education 277.

Tammelo, I. (1959): On the logical openness of legal orders, 8 The American Journal of Comparative Law 187.

Tammelo, I. (1978): Modern logic in the service of law, Springer-Verlag, New York, USA.

Tarnas, R. (1991): The passion of the western mind, Ballantine Books, NY, USA.

Teilhard de Chardin, P. (1955): The phenomenon of man, translated by B. Wall, Harper and Row, NY, USA.

Terrell, T.P. (1984): Flatlaw: an essay on the dimensions of legal reasoning and the development of fundamental normative principles, Calif. L.R. 72(3): 288-343.

Thimbleby, H. (1990): User interface design, Addison-Wesley Publishing Company, Wokingham, England.

Toms, E. (1991): Holistic logic: a formalisation of metaphysics, 2nd ed., published by the author, Edinburgh, Scotland.

Toulmin, S.E. (1958): The uses of argument, Cambridge University Press, Cambridge, England.

Traub, J.F., Wasilkowski, G.W.& Wozniakowski, H. (1988): Information-based complexity, Academic Press Inc., Boston, USA.

Tur, R.H.S. (1977): Positivism, principles and rules, in E. Attwooll (ed.), Perspectives in Jurisprudence, University of Glasgow Press, Glasgow, Scotland.

Tur, R.H.S. (1978): What is Jurisprudence?, 28 Philosophical Quarterly, 149.

Turing, A.M. (1981): Computing machinery and intelligence, in D.R. Hofstadter and D.C. Dennett, The Mind’s I, Basic Books Inc., New York, USA, pp.53-68.

Turner, V. (1967): The forest of symbols, Cornell University Press, Ithaca, New York, USA.

Turner, V. (1969): Forms of symbolic action, in R.F. Spencer (ed.), Forms of symbolic action, University of Washington Press, Seattle, USA.

Twining, W. (1985): Theories of evidence: Bentham and Wigmore, Weidenfeld and Nicolson, London, England.

Twining, W. and Miers, D. (1999, 4th Edition): How to do things with rules, Butterworths, London, England.

Tymieniecka, A-T. (1964): Leibniz' cosmological synthesis, Van Gorcum and Comp., Netherlands.

Tyree, A. (1989): Expert systems in law, Prentice Hall, New York, USA.

Vaihinger, H. (1911, 1965): The philosophy of “As if”, translated by C.K. Ogden, Routledge, London, England.

Valente, A. (1995), Legal Knowledge Engineering: A Modelling Approach, Frontiers in Artificial Intelligence and Application, Vol 30, Amsterdam: IOS Press.

Venn, J. (1876): Boole's logical system, Mind, 1, 479-491.

Venn, J. (1881, 1894): Symbolic logic, 2nd ed., Macmillan, London, England.

Vermeesch, R.B. and Lindgren, K.E. (1998): Business law of Australia, Butterworths, Sydney, Australia.

Wacks, R. (1999): Jurisprudence, Blackstone Press Limited, London, England.

Waldrop, M.M. (1987): Man-made minds, Walker and Company, USA.

Waller, A.D. (1912): A contribution to the psychology of logic, University of London Press, London, England.

Waller, L. (1995): Derham, Maher and Waller: An introduction to law, LBC Information Services, Sydney, Australia.

Walton, C. (1999): Ramus, in R. Audi (ed.), Cambridge Dictionary of Philosophy, 2nd ed., Cambridge University Press, Cambridge, England, pp. 770-771.

Warnock, G.J. (ed.) (1967): Philosophical logic, Oxford University Press, Oxford, England.

Watson, J.B. (1919): Psychology from the standpoint of a behaviorist, Lippincott, Philadelphia, USA.

Weinreb, L.L. (2005): Legal reason: the use of analogy in legal argument, Cambridge University Press, Cambridge, England.

Welby, V. (1903): What is meaning?, Macmillan & Co Ltd, New York, USA.

Wellman, C. (1971): Challenge and response: justification in ethics, Southern Illinois University Press, Ill., USA.

Wells, H.G. (1938): World brain, Doubleday, Doran, Garden City, New York, USA.

Whewell, W. (1859): History of the inductive sciences, Vol.1, D. Appleton, NY, USA.

Whitehead, A.N. (1928, 1950): The aims of education and other essays, 2nd ed., Ernest Benn Limited, London, England.

Whitehead, A.N. (1929): Process and reality, Cambridge University Press, London, England.

Whitehead, A.N. (1933), Adventures of ideas, Cambridge University Press, Cambridge, England.

Wigmore, J.H. (1913, 1931): Principles of judicial proof as given by logic, psychology and general experience and illustrated in judicial trials, Little, Brown and Company, Boston, USA.

Wigmore, J.H. (1937): The science of judicial proof as given by logic, psychology and general experience, and illustrated in judicial trials, Little, Brown and Company, Boston, USA.

Winston, P.H. (1984): Artificial intelligence, 2nd ed., Addison-Wesley Publishing Co Inc, London, England.

Winter, S. (2001): A clearing in the forest: law, life, and mind, University of Chicago Press, Chicago, USA.

Wisdom, J. (1951, 1973): Gods, in A. Flew, (ed.), Logic and language, Basil Blackwell, Oxford, England.

Wittgenstein, L. (1922): Tractatus logico-philosophicus, Routledge and Kegan Paul, London, England.

Wittgenstein, L. (1958): Philosophical investigations, translated by G.E.M. Anscombe, Blackwell, Oxford, England.

Woods, M. (1997): Conditionals, edited by D. Wiggins, Clarendon Press, Oxford, England.

Wright, G.H. von (1951): Deontic logic, Mind, 60: 1-15.

Wright, G.H. von (1963): Norm and action, Routledge and Kegan Paul, London, England.

Wright, G.H. von (1983): Practical reason, Basil Blackwell, Oxford, England.

Zhang, Y. (2003): Minimal change and maximal coherence for epistemic logic program updates, in Proceedings of the International Joint Conference on Artificial Intelligence (IJCAI-2003), Morgan Kaufmann Publishers, San Francisco, USA.

Zuse, K. (1980): Some remarks on the history of computing in Germany, in N. Metropolis, J. Howlett and G.-C. Rota (eds), A history of computing in the twentieth century, Academic Press, New York, USA, pp. 611-627.

7 FURTHER READING

Adams, J.L. (1974): Conceptual blockbusting, W.H. Freeman, San Francisco, USA.

Appleby, J., Covington, E., Hoyt, D., Latham, M. and Sneider, A. (eds) (1996): Knowledge and postmodernism in historical perspective, Routledge, London, England.

Arbuthnot, J. (1712, 1976): The history of John Bull, edited by A.W. Bower and R.A. Erickson, Clarendon Press, Oxford, England.

Audi, R. (ed.) (1999): The Cambridge Dictionary of Philosophy, 2nd ed., Cambridge University Press, Cambridge, England.

Black, M. (1946): Critical thinking, Prentice Hall Inc., New York, USA.

Bochenski, J.M. (1962): Philosophy, D. Reidel, Dordrecht, Holland.

Bonjour, L. and Sosa, E. (2003): Epistemic justification, Blackwell Publishing Ltd, Oxford, England.

Bressan, A. (1972): A general interpreted modal calculus, Yale University Press, New Haven, Conn., USA.

Browne, D., Totterdell, P. and Norman, M. (eds) (1990): Adaptive user interfaces, Academic Press Limited, London, England.

Chihara, C.S. (1973): Ontology and the vicious-circle principle, Cornell University Press, Ithaca, NY, USA, and London, England.

Cohen, M.R. and Nagel, E. (1934): Logic and scientific method, Routledge and Kegan Paul, London, England.

Davis, M.H. (1996): Empathy, Westview Press, Harper Collins Publishers, Boulder, Colorado, USA.

Deleuze, G. (1963, 1984): Kant's critical philosophy, translated by H. Tomlinson and B. Habberjam, The Athlone Press, London, England.

Ellenberger, H.F. (1970): The discovery of the unconscious, Basic Books, New York, USA.

Eysenck, M. (1984): A handbook of cognitive psychology, Lawrence Erlbaum Associates, Hillsdale, NJ, USA.

Fawcett, E.D. (1916): The world as imagination, Macmillan & Co Ltd, London, England.

Guttenplan, S. (ed.) (1994): A companion to the philosophy of mind, Blackwell, Oxford, England.

Haldane, Viscount (1921): The reign of relativity, John Murray, London, England.

Hayek, F. A. (1978): The three sources of human values, The London School of Economics and Political Science, London, England.

Joad, C.E.M. (1948): Guide to modern thought, Pan Books Ltd, London, England.

Leiboff, M. and Thomas, M. (2004): Legal theories in principle, Thomson Legal and Regulatory Ltd., Sydney, Australia.

Montgomery, S.L. (1994): Object-oriented information engineering, Academic Press Inc, London, England.

Olafson, F.A. (1979): The dialectic of action, University of Chicago Press, Chicago, USA.

Parsons, J.J. and Oja, D. (2006): Computer concepts, 8th ed., Thomson, Thomson Learning Inc, Boston, MA, USA.

Pinker, S. (1997): How the mind works, W.W. Norton, New York, USA.

Prazak, M. (1963): Language and logic, Philosophical Library, NY, USA.

Priest, G. (2000): Logic, Oxford University Press, Oxford, England.

Sperber, D. (1975): Rethinking symbolism, translated by A.L. Morton, Cambridge University Press, Cambridge, England (originally published in French, 1974).

Sperschneider, V. and Antoniou, G. (1991): Logic – a foundation for computer science, Addison-Wesley, Reading, MA., USA.

Spiegl, F. (2003): Contradictionary, Kyle Cathie Limited, London, England.

Stcherbatsky, F.T. (c.1930, 1962): Buddhist logic, Vols 1-2, Academy of Sciences, Leningrad, USSR, republished by Dover Publications Inc., New York, USA.

Stewart, I. (2001, 2003): Flatterland, Pan Books Ltd, London, England.

Sullivan, J.W. and Tyler, S.W. (eds) (1991): Intelligent user interfaces, ACM Press, New York, USA.

Tamanaha, B.Z. (1997): Realistic socio-legal theory, Oxford University Press, Oxford, England.

Trauth, E.M. (2001): Qualitative research in IS: issues and trends, Idea Group Publishing, Hershey, PA, USA.

Wedberg, A. (1982-4): History of philosophy, Vols 1-3, Clarendon Press, Oxford, England.

Wilber, K. (ed.) (1982): The holographic paradigm, New Science Library, Boston, MA., USA.