Bidirected graph
From Wikipedia, the free encyclopedia

Contents

1 Bidirected graph
  1.1 Other meanings
  1.2 See also
  1.3 References

2 Bipartite double cover
  2.1 Construction
  2.2 Examples
  2.3 Matrix interpretation
  2.4 Properties
  2.5 Other double covers
  2.6 See also
  2.7 Notes
  2.8 References
  2.9 External links

3 Complex question
  3.1 Implication by question
  3.2 Complex question fallacy
    3.2.1 Similar questions and fallacies
  3.3 Notes

4 Directed graph
  4.1 Basic terminology
  4.2 Indegree and outdegree
  4.3 Degree sequence
  4.4 Digraph connectivity
  4.5 Classes of digraphs
  4.6 See also
  4.7 Notes
  4.8 References

5 Downward entailing
  5.1 Strawson-DE
  5.2 See also
  5.3 References

6 Entailment (pragmatics)
  6.1 Types of entailment
  6.2 See also
  6.3 References
  6.4 Further reading

7 Fallacy
  7.1 Formal fallacy
    7.1.1 Common examples
  7.2 Aristotle's Fallacies
  7.3 Whately's grouping of fallacies
  7.4 Intentional fallacies
  7.5 Deductive fallacy
  7.6 Paul Meehl's Fallacies
  7.7 Fallacies of Measurement
  7.8 Other systems of classification
  7.9 Assessment of Fallacies - Pragmatic Theory
  7.10 See also
  7.11 References
  7.12 Further reading
  7.13 External links

8 Fixed point (mathematics)
  8.1 Attractive fixed points
  8.2 Applications
  8.3 Topological fixed point property
  8.4 Generalization to partial orders: prefixpoint and postfixpoint
  8.5 See also
  8.6 Notes
  8.7 External links

9 Graph isomorphism
  9.1 Variations
  9.2 Motivation
  9.3 Whitney theorem
  9.4 Recognition of graph isomorphism
  9.5 See also
  9.6 Notes
  9.7 References

10 Graph theory
  10.1 Definitions
    10.1.1 Graph
  10.2 Applications
  10.3 History
  10.4 Graph drawing
  10.5 Graph-theoretic data structures
  10.6 Problems in graph theory
    10.6.1 Enumeration
    10.6.2 Subgraphs, induced subgraphs, and minors
    10.6.3 Graph coloring
    10.6.4 Subsumption and unification
    10.6.5 Route problems
    10.6.6 Network flow
    10.6.7 Visibility problems
    10.6.8 Covering problems
    10.6.9 Decomposition problems
    10.6.10 Graph classes
  10.7 See also
    10.7.1 Related topics
    10.7.2 Algorithms
    10.7.3 Subareas
    10.7.4 Related areas of mathematics
    10.7.5 Generalizations
    10.7.6 Prominent graph theorists
  10.8 Notes
  10.9 References
  10.10 External links
    10.10.1 Online textbooks

11 Implication graph
  11.1 Applications
  11.2 References

12 Implicational hierarchy
  12.1 Phonology
  12.2 Morphology
  12.3 Syntax
  12.4 Bibliography

13 Implicational propositional calculus
  13.1 Virtual completeness as an operator
  13.2 Axiom system
  13.3 Basic properties of derivation
  13.4 Completeness
    13.4.1 Proof
  13.5 The Bernays–Tarski axiom system
  13.6 Testing whether a formula of the implicational propositional calculus is a tautology
  13.7 Adding an axiom schema
  13.8 An alternative axiomatization
  13.9 See also
  13.10 References

14 Implicature
  14.1 Types of implicature
    14.1.1 Conversational implicature
    14.1.2 Conventional implicature
  14.2 Implicature vs entailment
  14.3 See also
  14.4 References
  14.5 Bibliography
  14.6 Further reading
  14.7 External links

15 Implicit
  15.1 Mathematics
  15.2 Computer science
  15.3 Other uses
  15.4 See also

16 Informal fallacy
  16.1 Formal deductive fallacies and informal fallacies
  16.2 See also
  16.3 References
  16.4 Further reading
  16.5 External links

17 Involution (mathematics)
  17.1 General properties
  17.2 Involution throughout the fields of mathematics
    17.2.1 Euclidean geometry
    17.2.2 Projective geometry
    17.2.3 Linear algebra
    17.2.4 Quaternion algebra, groups, semigroups
    17.2.5 Ring theory
    17.2.6 Group theory
    17.2.7 Mathematical logic
    17.2.8 Computer science
  17.3 References
  17.4 Further reading
  17.5 See also

18 Linguistic universal
  18.1 Terminology
  18.2 In semantics
  18.3 See also
  18.4 Notes
  18.5 References
  18.6 External links

19 Linguistics
  19.1 Nomenclature
  19.2 Variation and Universality
    19.2.1 Lexicon
    19.2.2 Discourse
    19.2.3 Dialect
    19.2.4 Structures
    19.2.5 Relativity
    19.2.6 Style
  19.3 Approach
    19.3.1 Methodology
    19.3.2 Analysis
    19.3.3 Anthropology
    19.3.4 Sources
  19.4 History of linguistic thought
    19.4.1 Early grammarians
    19.4.2 Comparative philology
    19.4.3 Structuralism
    19.4.4 Generativism
    19.4.5 Functionalism
    19.4.6 Cognitivism
  19.5 Areas of research
    19.5.1 Historical linguistics
    19.5.2 Sociolinguistics
    19.5.3 Developmental linguistics
    19.5.4 Neurolinguistics
  19.6 Applied linguistics
  19.7 Inter-disciplinary fields
    19.7.1 Semiotics
    19.7.2 Language documentation
    19.7.3 Translation
    19.7.4 Biolinguistics
    19.7.5 Clinical linguistics
    19.7.6 Computational linguistics
    19.7.7 Evolutionary linguistics
    19.7.8 Forensic linguistics
  19.8 See also
  19.9 References
  19.10 Bibliography
  19.11 External links

20 Loaded question
  20.1 Defense
  20.2 Historical examples
  20.3 See also
  20.4 References
  20.5 External links

21 Material conditional
  21.1 Definitions of the material conditional
    21.1.1 As a truth function
    21.1.2 As a formal connective
  21.2 Formal properties
  21.3 Philosophical problems with material conditional
  21.4 See also
    21.4.1 Conditionals
  21.5 References
  21.6 Further reading
  21.7 External links

22 Material implication
  22.1 See also

23 Modus ponens
  23.1 Formal notation
  23.2 Explanation
  23.3 Justification via truth table
  23.4 See also
  23.5 References
  23.6 Sources
  23.7 External links

24 Presupposition
  24.1 Negation of a sentence containing a presupposition
  24.2 Projection of presuppositions
  24.3 Presupposition triggers
    24.3.1 Definite descriptions
    24.3.2 Factive verbs
    24.3.3 Implicative verbs
    24.3.4 Change of state verbs
    24.3.5 Iteratives
    24.3.6 Temporal clauses
    24.3.7 Cleft sentences
    24.3.8 Comparisons and contrasts
    24.3.9 Counterfactual conditionals
    24.3.10 Questions
    24.3.11 Possessive case
  24.4 Accommodation of presuppositions
  24.5 Presupposition in Critical discourse analysis
  24.6 See also
  24.7 References
  24.8 Further reading

25 Question
  25.1 Uses
    25.1.1 By purpose
    25.1.2 By grammatical form
  25.2 Grammar
  25.3 Responses
  25.4 Learning
  25.5 Philosophical questions
  25.6 Origins of questioning behavior
  25.7 See also
  25.8 References
  25.9 Further reading

26 Question dodging
  26.1 Form
  26.2 See also
  26.3 References

27 Semantics
  27.1 Linguistics
  27.2 Montague grammar
  27.3 Dynamic turn in semantics
  27.4 Prototype theory
  27.5 Theories in semantics
    27.5.1 Model theoretic semantics
    27.5.2 Formal (or truth-conditional) semantics
    27.5.3 Lexical and conceptual semantics
    27.5.4 Lexical semantics
    27.5.5 Computational semantics
  27.6 Computer science
    27.6.1 Programming languages
    27.6.2 Semantic models
  27.7 Psychology
  27.8 See also
    27.8.1 Linguistics and semiotics
    27.8.2 Logic and mathematics
    27.8.3 Computer science
    27.8.4 Psychology
  27.9 References
  27.10 External links

28 Signed graph
  28.1 Examples
  28.2 Adjacency matrix
  28.3 Orientation
  28.4 Incidence matrix
  28.5 Switching
  28.6 Fundamental theorem
  28.7 Frustration
  28.8 Matroid theory
  28.9 Other kinds of “signed graph”
    28.9.1 Signed digraph
  28.10 Coloring
  28.11 Applications
    28.11.1 Social psychology
    28.11.2 Spin glasses
    28.11.3 Data clustering
  28.12 Generalizations
  28.13 Notes
  28.14 References

29 Skew-symmetric graph
  29.1 Definition
  29.2 Examples
  29.3 Polar/switch graphs, double covering graphs, and bidirected graphs
  29.4 Matching
  29.5 Still life theory
  29.6 Satisfiability
  29.7 Recognition
  29.8 References

30 Transpose graph
  30.1 Notation
  30.2 Applications
  30.3 Related concepts
  30.4 References
  30.5 Text and image sources, contributors, and licenses
    30.5.1 Text
    30.5.2 Images
    30.5.3 Content license

Chapter 1

Bidirected graph

[Figure: the different types of edge in a bidirected graph: extraverted, introverted, and directed edges, half-edges, and a loose edge.]

In the mathematical domain of graph theory, a bidirected graph (introduced by Edmonds & Johnson 1970)*[1] is a graph in which each edge is given an independent orientation (or direction, or arrow) at each end. Thus, there are three kinds of bidirected edges: those where the arrows point outward, towards the vertices, at both ends; those where both arrows point inward, away from the vertices; and those in which one arrow points away from its vertex and towards the opposite end, while the other arrow points in the same direction as the first, away from the opposite end and towards its own vertex. Edges of these three types may be called, respectively, extraverted, introverted, and directed. The “directed” edges are the same as ordinary directed edges in a directed graph; thus, a directed graph is a special kind of bidirected graph. It is sometimes desirable also to have edges with only one end (half-edges); these get only one arrow. An edge with no ends (a loose edge) has no arrows. The edges that are neither half nor loose edges may be called ordinary edges. A skew-symmetric graph is the double covering graph of a bidirected graph.
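One convenient concrete representation stores an independent sign at each end of every edge. The following Python sketch is illustrative only (the class and the sign convention are assumptions, not taken from the source): an end is marked +1 when its arrow points toward the incident vertex and -1 when it points away, so the three kinds of ordinary edges fall out of the two signs.

    # Illustrative sketch of a bidirected graph (sign convention assumed):
    # +1 = the arrow at that end points toward the incident vertex,
    # -1 = the arrow at that end points away from it.
    class BidirectedGraph:
        def __init__(self):
            self.edges = []  # each edge is a tuple (u, sign_u, v, sign_v)

        def add_edge(self, u, sign_u, v, sign_v):
            self.edges.append((u, sign_u, v, sign_v))

        @staticmethod
        def edge_type(sign_u, sign_v):
            if sign_u == +1 and sign_v == +1:
                return "extraverted"  # both arrows point towards the vertices
            if sign_u == -1 and sign_v == -1:
                return "introverted"  # both arrows point away from the vertices
            return "directed"         # one arrow of each kind: an ordinary directed edge

    g = BidirectedGraph()
    g.add_edge("a", -1, "b", +1)  # behaves like the directed edge a -> b
    print(BidirectedGraph.edge_type(-1, +1))  # directed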

1.1 Other meanings

A symmetric directed graph (that is, a directed graph in which the reverse of every edge is also an edge) is sometimes also called a “bidirected graph”.*[2]


1.2 See also

• Skew-symmetric graph

• Signed graph

1.3 References

[1] Edmonds, Jack; Johnson, Ellis L. (1970), “Matching: a well-solved class of linear programs”, Combinatorial Structures and their Applications: Proceedings of the Calgary Symposium, June 1969, New York: Gordon and Breach. Reprinted in Combinatorial Optimization – Eureka, You Shrink!, Springer-Verlag, Lecture Notes in Computer Science 2570, 2003, pp. 27–30, doi:10.1007/3-540-36478-1_3.

[2] Mehlhorn, Kurt; Sanders, Peter (2008), Algorithms and Data Structures: The Basic Toolbox, Springer Science & Business Media, pp. 49 and 170–171, ISBN 978-3-540-77978-0.

Chapter 2

Bipartite double cover

In graph theory, the bipartite double cover of an undirected graph G is a bipartite covering graph of G, with twice as many vertices as G. It can be constructed as the tensor product of graphs G × K2. It is also called the Kronecker double cover, canonical double cover or simply the bipartite double of G. It should not be confused with a cycle double cover of a graph, a family of cycles that includes each edge twice.

2.1 Construction

The bipartite double cover of G has two vertices ui and wi for each vertex vi of G. Two vertices ui and wj are connected by an edge in the double cover if and only if vi and vj are connected by an edge in G. For instance, below is an illustration of a bipartite double cover of a non-bipartite graph G. In the illustration, each vertex in the tensor product is shown using a color from the first term of the product (G) and a shape from the second term of the product (K2); therefore, the vertices ui in the double cover are shown as circles while the vertices wi are shown as squares.

[Figure: the bipartite double cover of a non-bipartite graph G, formed as the tensor product G × K2.]

The bipartite double cover may also be constructed using adjacency matrices (as described below) or as the derived graph of a voltage graph in which each edge of G is labeled by the nonzero element of the two-element group.
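The vertex-splitting construction above is mechanical and easy to state as code. Below is a minimal Python sketch (the function name and the edge-list representation are assumptions made for illustration): each vertex v becomes the pair (v, 0) and (v, 1), and each original edge produces two cross edges of the double cover.

    # Build the bipartite double cover of an undirected graph given as a
    # list of edges (vi, vj). Vertex v splits into (v, 0) and (v, 1); the
    # edge {vi, vj} yields the cross edges {(vi,0),(vj,1)} and {(vj,0),(vi,1)}.
    def bipartite_double_cover(edges):
        cover = []
        for u, v in edges:
            cover.append(((u, 0), (v, 1)))
            cover.append(((v, 0), (u, 1)))
        return cover

    # A triangle (odd cycle) lifts to a single six-cycle:
    print(bipartite_double_cover([(0, 1), (1, 2), (2, 0)]))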

2.2 Examples

The bipartite double cover of the Petersen graph is the Desargues graph: K2 × G(5,2) = G(10,3).

The bipartite double cover of a complete graph Kn is a crown graph (a complete bipartite graph Kn,n minus a perfect matching). In particular, the bipartite double cover of the graph of a tetrahedron, K4, is the graph of a cube. The bipartite double cover of an odd-length cycle is a cycle of twice the length, while the bipartite double of any bipartite graph (such as an even-length cycle, shown in the following example) is formed by two disjoint copies of the original graph.


[Figure: the bipartite double cover of a bipartite graph G (an even cycle) is two disjoint copies of G.]

2.3 Matrix interpretation

If an undirected graph G has a matrix A as its adjacency matrix, then the adjacency matrix of the double cover of G is

\[
\begin{pmatrix} 0 & A \\ A^{\mathsf{T}} & 0 \end{pmatrix},
\]

and the biadjacency matrix of the double cover of G is just A itself. That is, the conversion from a graph to its double cover can be performed simply by reinterpreting A as a biadjacency matrix instead of as an adjacency matrix. More generally, the reinterpretation of the adjacency matrices of directed graphs as biadjacency matrices provides a combinatorial equivalence between directed graphs and balanced bipartite graphs.*[1]
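This block-matrix description can be checked numerically. A small sketch using numpy (the variable names and the example graph are assumptions for illustration):

    import numpy as np

    # Adjacency matrix of a triangle, an odd (hence non-bipartite) cycle.
    A = np.array([[0, 1, 1],
                  [1, 0, 1],
                  [1, 1, 0]])

    # Adjacency matrix of the bipartite double cover: A is reinterpreted as
    # a biadjacency matrix, i.e. placed in the off-diagonal blocks.
    Z = np.zeros_like(A)
    cover = np.block([[Z, A],
                      [A.T, Z]])
    print(cover)  # 6x6 adjacency matrix of a six-cycle (up to vertex ordering)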

2.4 Properties

The bipartite double cover of any graph G is a bipartite graph; both parts of the bipartite graph have one vertex for each vertex of G. A bipartite double cover is connected if and only if G is connected and non-bipartite.*[2] The bipartite double cover is a special case of a double cover (a 2-fold covering graph). A double cover in graph theory can be viewed as a special case of a topological double cover. If G is a non-bipartite symmetric graph, the double cover of G is also a symmetric graph; several known cubic symmetric graphs may be obtained in this way. For instance, the double cover of K4 is the graph of a cube; the double cover of the Petersen graph is the Desargues graph; and the double cover of the graph of the dodecahedron is a 40-vertex symmetric cubic graph.*[3] It is possible for two different graphs to have isomorphic bipartite double covers. For instance, the Desargues graph is not only the bipartite double cover of the Petersen graph, but is also the bipartite double cover of a different graph that is not isomorphic to the Petersen graph.*[4] Not every bipartite graph is a bipartite double cover of another graph; for a bipartite graph G to be the bipartite double cover of another graph, it is necessary and sufficient that the automorphisms of G include an involution that maps each vertex to a distinct and non-adjacent vertex.*[4] For instance, the graph with two vertices and one edge is bipartite but is not a bipartite double cover, because it has no non-adjacent pairs of vertices to be mapped to each other by such an involution; on the other hand, the graph of the cube is a bipartite double cover, and has an involution that maps each vertex to the diametrically opposite vertex. An alternative characterization of the bipartite graphs that may be formed by the bipartite double cover construction was obtained by Sampathkumar (1975).

2.5 Other double covers

In general, a graph may have multiple double covers that are different from the bipartite double cover.*[5] In the following figure, the graph C is a double cover of the graph H:

1. The graph C is a covering graph of H: there is a surjective local isomorphism f from C to H, the one indicated by the colours. For example, f maps both blue nodes in C to the blue node in H. Furthermore, let X be the neighbourhood of a blue node in C and let Y be the neighbourhood of the blue node in H; then the restriction of f to X is a bijection from X to Y. In particular, the degree of each blue node is the same. The same applies to each colour.

2. The graph C is a double cover (or 2-fold cover or 2-lift) of H: the preimage of each node in H has size 2. For example, there are exactly 2 nodes in C that are mapped to the blue node in H.

However, C is not a bipartite double cover of H or of any other graph; it is not a bipartite graph. If we replace one triangle by a square in H, the resulting graph has four distinct double covers. Two of them are bipartite but only one of them is the Kronecker cover.

[Figure: the graph C is a double cover of the graph H.]

As another example, the graph of the icosahedron is a double cover of the complete graph K6; to obtain a covering map from the icosahedron to K6, map each pair of opposite vertices of the icosahedron to a single vertex of K6. However, the icosahedron is not bipartite, so it is not the bipartite double cover of K6. Instead, it can be obtained as the orientable double cover of an embedding of K6 on the projective plane.

2.6 See also

2.7 Notes

[1] Dulmage & Mendelsohn (1958); Brualdi, Harary & Miller (1980).

[2] Brualdi, Harary & Miller (1980), Theorem 3.4.

[3] Feng et al. (2008).

[4] Imrich & Pisanski (2008).

[5] Waller (1976).

2.8 References

• Brualdi, Richard A.; Harary, Frank; Miller, Zevi (1980), “Bigraphs versus digraphs via matrices”, Journal of Graph Theory 4 (1): 51–73, doi:10.1002/jgt.3190040107, MR 558453.

• Dulmage, A. L.; Mendelsohn, N. S. (1958), “Coverings of bipartite graphs”, Canadian Journal of Mathematics 10: 517–534, doi:10.4153/CJM-1958-052-0, MR 0097069. The “coverings” in the title of this paper refer to the vertex cover problem, not to bipartite double covers. However, Brualdi, Harary & Miller (1980) cite this paper as the source of the idea of reinterpreting the adjacency matrix as a biadjacency matrix.

• Feng, Yan-Quan; Kutnar, Klavdija; Malnič, Aleksander; Marušič, Dragan (2008), “On 2-fold covers of graphs”, Journal of Combinatorial Theory, Series B 98 (2): 324–341, arXiv:math.CO/0701722, doi:10.1016/j.jctb.2007.07.001, MR 2389602.

• Imrich, Wilfried; Pisanski, Tomaž (2008), “Multiple Kronecker covering graphs”, European Journal of Combinatorics 29 (5): 1116–1122, doi:10.1016/j.ejc.2007.07.001, MR 2419215.

• Sampathkumar, E. (1975), “On tensor product graphs”, Journal of the Australian Mathematical Society, Series A, Pure Mathematics and Statistics 20 (3): 268–273, doi:10.1017/S1446788700020619, MR 0389681.

• Waller, Derek A. (1976), “Double covers of graphs”, Bulletin of the Australian Mathematical Society 14 (2): 233–248, doi:10.1017/S0004972700025053, MR 0406876.

2.9 External links

• Weisstein, Eric W., “Bipartite Double Graph”, MathWorld.

Chapter 3

Complex question

A complex question, trick question, multiple question or plurium interrogationum (Latin, “of many questions”) is a question that has a presupposition that is complex. The presupposition is a proposition that is presumed to be acceptable to the respondent when the question is asked. The respondent becomes committed to this proposition when he gives any direct answer. The presupposition is called “complex” because it is a conjunctive proposition, a disjunctive proposition, or a conditional proposition. It could also be another type of proposition that contains some logical connective in a way that makes it have several parts that are component propositions.*[1] Complex questions can be fallacious, as an informal fallacy, but do not have to be.*[1]

3.1 Implication by question

One form of misleading discourse involves presupposing and implying something without stating it explicitly, by phrasing it as a question. For example, the question “Does Mr. Jones have a brother in the army?" does not claim that he does, but implies that there must be at least some indication that he does, or the question would not need to be asked.*[2] The person asking the question is thus protected from accusations of making false claims, but still manages to make the implication in the form of a hidden compound question. The fallacy isn't in the question itself, but rather in the listener's assumption that the question would not have been asked without some evidence to support the supposition. This example seems harmless, but consider this one: “Does Mr. Jones have a brother in jail?" In order to have the desired effect, the question must imply something uncommon enough not to be asked without some evidence to the fact. For example, the question“Does Mr. Jones have a brother?" would not cause the listener to think there must be some evidence that he does, since this form of general question is frequently asked with no foreknowledge of the answer.

3.2 Complex question fallacy

For more details on this topic, see Loaded question.

The complex question fallacy, or many questions fallacy, is context dependent; a presupposition by itself doesn't have to be a fallacy. It is committed when someone asks a question that presupposes something that has not been proven or accepted by all the people involved.*[1]*[3]*[4]*[5]*[6] For example,“Is Mary wearing a blue or a red dress?" is fallacious because it artificially restricts the possible responses to a blue or red dress. If the person being questioned wouldn't necessarily consent to those constraints, the question is fallacious.*[1]*[4]*[5]*[6] Hence we can distinguish between:

• legitimately complex questions (not a fallacy): A question that assumes something that the hearer would readily agree to. For example, “Who is the monarch of the United Kingdom?" assumes that there is a place called the United Kingdom and that it has a monarch, both true.


• illegitimately complex question: On the other hand,“Who is the King of France?" would commit the complex question fallacy because while it assumes there is a place called France (true), it also assumes France currently has a king (false). But since answering this question does not seem to incriminate or otherwise embarrass the speaker, it is complex but not really a loaded question.*[7]

When a complex question contains controversial presuppositions (often with loaded language – having an unspoken and often emotive implication), it is known as a loaded question.*[3]*[4]*[6] For example, a classic loaded question, containing incriminating assumptions that the questioned persons seem to admit to if they answer the questions instead of challenging them, is “Have you stopped beating your wife?" If the person questioned answers, “Yes”, then that implies that he has previously beaten his wife. A loaded question may be asked to trick the respondent into admitting something that the questioner believes to be true, and which may in fact be true. So the previous question is “loaded”, whether or not the respondent has actually beaten his wife; and if the respondent answers anything other than “yes” or “no” in an attempt to deny having beaten his wife, the questioner can accuse him of “trying to dodge the question". The very same question may be loaded in one context, but not in the other. For example, the previous question would not be loaded were it asked during a trial in which the defendant has already admitted to beating his wife.*[4]

3.2.1 Similar questions and fallacies

A similar fallacy is the double-barreled question. It is committed when someone asks a question that touches upon more than one issue, yet allows only for one answer.*[8]*[9]*[10] This fallacy can also be confused with petitio principii, begging the question,*[11] which offers a premise no more plausible than, and often just a restatement of, the conclusion.*[12]

Closely connected with [petitio principii] is the fallacy of the Complex Question. By a complex question, in the broadest meaning of that term, is meant one that suggests its own answer. Any question, for instance, that forces us to select, and assert in our answer to it, one of the elements of the question itself, while some other possibility is really open, is complex in the sense in which that term is here employed. If, for example, one were to ask whether you were going to New York or London, or if your favourite colour were red or blue, or if you had given up a particular bad habit, he would be guilty of the fallacy of the complex question, if, in each case, the alternatives, as a matter of fact, were more numerous than, or were in any way different from, those stated in the question. Any leading question which complicates an issue by over simplification is fallacious for the same reason… In the petitio principii an assumption with respect to the subject-matter of an argument functions as a premise, in the complex question it is a similar assumption that shuts out some of the material possibilities of a situation and confines an issue within too narrow limits. As in the former case, so here, the only way of meeting the difficulty is to raise the previous question, that is, to call the assumption which lies back of the fallacy into question.*[13]

– Arthur Ernest Davies, “Fallacies” in A Text-Book of Logic

3.3 Notes

[1] Walton, Douglas. “The Fallacy of Many Questions” (PDF). University of Winnipeg. Archived from the original (PDF) on 2006-11-29. Retrieved 2008-01-22.

[2] “compound question, definition”. Legal-dictionary.thefreedictionary.com. Retrieved 2010-02-03.

[3] Michel Meyer, Questions and questioning, Walter de Gruyter, 1988, ISBN 3-11-010680-9, Google Print, p. 198–199

[4] Douglas N. Walton, Fundamentals of critical argumentation, Cambridge University Press, 2006, ISBN 0-521-82319-6, Google Print, p. 194–196

[5] Douglas N. Walton, Informal logic: a handbook for critical argumentation, Cambridge University Press, 1989, ISBN 0-521-37925-3, Google Print, p. 36–37

[6] Douglas N. Walton. Witness testimony evidence: argumentation, artificial intelligence, and law, Cambridge University Press, 2008, ISBN 0-521-88143-9, Google Print, p. 329

[7] Layman, C. Stephen (2003). The Power of Logic. p. 158.

[8] Response bias. SuperSurvey, Ipathia Inc.

[9] Earl R. Babbie, Lucia Benaquisto, Fundamentals of Social Research, Cengage Learning, 2009, Google Print, p. 251

[10] Alan Bryman, Emma Bell, Business research methods, Oxford University Press, 2007, ISBN 0-19-928498-9, Google Print, p. 267–268

[11] Fallacy: Begging the Question The Nizkor Project. Retrieved on: January 22, 2008

[12] Carroll, Robert Todd. The Skeptic's Dictionary. John Wiley & Sons. p. 51. ISBN 0-471-27242-6.

[13] Davies, Arthur Ernest (1915). A Text-Book of Logic. R. G. Adams and company. pp. 572–573. LCCN 15027713.

Chapter 4

Directed graph

[Figure: a directed graph.]

In mathematics, and more specifically in graph theory, a directed graph (or digraph) is a graph, or set of nodes connected by edges, where the edges have a direction associated with them. In formal terms, a digraph is a pair G = (V,A) (sometimes G = (V,E) ) of:*[1]

• a set V, whose elements are called vertices or nodes,

• a set A of ordered pairs of vertices, called arcs, directed edges, or arrows (and sometimes simply edges with the corresponding set named E instead of A).

[Figure: a directed graph with its corresponding incidence matrix.]

It differs from an ordinary or undirected graph, in that the latter is defined in terms of unordered pairs of vertices, which are usually called edges. A digraph is called “simple” if it has no loops and no multiple arcs (arcs with the same starting and ending nodes). A directed multigraph, in which the arcs constitute a multiset, rather than a set, of ordered pairs of vertices, may have loops (that is, “self-loops” with the same starting and ending node) and multiple arcs. Some, but not all, texts allow a digraph, without the qualification simple, to have self-loops, multiple arcs, or both.

4.1 Basic terminology

An arc e = (x, y) is considered to be directed from x to y; y is called the head and x is called the tail of the arc; y is said to be a direct successor of x, and x is said to be a direct predecessor of y. If a path made up of one or more successive arcs leads from x to y, then y is said to be a successor of x, and x is said to be a predecessor of y. The arc (y, x) is called the arc (x, y) inverted. An orientation of a simple undirected graph is obtained by assigning a direction to each edge. Any directed graph constructed this way is called an “oriented graph”. A directed graph is an oriented simple graph if and only if it has neither self-loops nor 2-cycles.*[2] A weighted digraph is a digraph with weights assigned to its arcs, similar to a weighted graph. In the context of graph theory a digraph with weighted edges is called a network. The adjacency matrix of a digraph (with loops and multiple arcs) is the integer-valued matrix with rows and columns corresponding to the nodes, where a nondiagonal entry aij is the number of arcs from node i to node j, and the diagonal entry aii is the number of loops at node i. The adjacency matrix of a digraph is unique up to identical permutation of rows and columns. Another matrix representation for a digraph is its incidence matrix. See direction for more definitions.
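For illustration, here is a short Python sketch (the representation is an assumption, not from the source) that builds the adjacency matrix of a digraph from its arc list, following the convention just stated:

    # Entry M[i][j] counts the arcs from node i to node j; the diagonal
    # entry M[i][i] counts the loops at node i.
    def adjacency_matrix(n, arcs):
        M = [[0] * n for _ in range(n)]
        for i, j in arcs:
            M[i][j] += 1
        return M

    # Two parallel arcs 0 -> 1, one arc 1 -> 2, and a loop at 2:
    print(adjacency_matrix(3, [(0, 1), (0, 1), (1, 2), (2, 2)]))
    # [[0, 2, 0], [0, 0, 1], [0, 0, 1]]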

4.2 Indegree and outdegree

For a node, the number of head endpoints adjacent to it is called the indegree of the node, and the number of tail endpoints adjacent to it is its outdegree (called the "branching factor" in trees).

[Figure: a digraph with vertices labeled (indegree, outdegree).]

Let G = (V, E) and v ∈ V, then the indegree is denoted deg−(v) and the outdegree as deg+(v). A vertex with deg−(v) = 0 is called a source, as it is the origin of each of its incident edges. Similarly, a vertex with deg+(v) = 0 is called a sink. The degree sum formula states that, for a directed graph,

\[
\sum_{v \in V} \deg^{+}(v) = \sum_{v \in V} \deg^{-}(v) = |E|.
\]

If, for every node v ∈ V, we have deg+(v) = deg−(v) , the graph is called a balanced digraph.*[3]
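Indegrees and outdegrees are easy to tally from an arc list, and the degree sum formula can be verified directly. A minimal Python sketch (the example digraph is an assumption for illustration):

    from collections import Counter

    # Arcs of a small digraph, each written (tail, head).
    arcs = [("a", "b"), ("a", "c"), ("b", "c"), ("c", "a")]

    outdeg = Counter(tail for tail, head in arcs)
    indeg = Counter(head for tail, head in arcs)

    # Degree sum formula: both sums equal the number of arcs.
    assert sum(outdeg.values()) == sum(indeg.values()) == len(arcs)
    print(dict(indeg), dict(outdeg))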

4.3 Degree sequence

The degree sequence of a directed graph is the list of its indegree and outdegree pairs; for the above example we have degree sequence ((2,0),(2,2),(0,2),(1,1)). The degree sequence is a directed graph invariant, so isomorphic directed graphs have the same degree sequence. However, the degree sequence does not, in general, uniquely identify a graph; in some cases, non-isomorphic graphs have the same degree sequence. The digraph realization problem is the problem of finding a digraph with the degree sequence being a given sequence of positive integer pairs. (Trailing pairs of zeros may be ignored since they are trivially realized by adding an appropriate number of isolated vertices to the digraph.) A sequence which is the degree sequence of some digraph, i.e. for which the digraph realization problem has a solution, is called a digraphic or digraphical sequence. This problem can either be solved by the Kleitman–Wang algorithm or by the Fulkerson–Chen–Anstee theorem.
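Continuing the sketch above, the degree sequence is simply the list of (indegree, outdegree) pairs, one per vertex (again a hedged illustration; this does not implement the realization algorithms named here):

    from collections import Counter

    arcs = [("a", "b"), ("a", "c"), ("b", "c"), ("c", "a")]
    indeg = Counter(head for tail, head in arcs)
    outdeg = Counter(tail for tail, head in arcs)

    # Degree sequence: (indegree, outdegree) pairs in vertex order.
    vertices = sorted(set(v for arc in arcs for v in arc))
    print([(indeg[v], outdeg[v]) for v in vertices])  # [(1, 2), (1, 1), (2, 1)]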

4.4 Digraph connectivity

Main article: Connectivity (graph theory)

A digraph G is called weakly connected (or just connected*[4]) if the undirected underlying graph obtained by replacing all directed edges of G with undirected edges is a connected graph. A digraph is strongly connected or strong

if it contains a directed path from u to v and a directed path from v to u for every pair of vertices u,v. The strong components are the maximal strongly connected subgraphs.
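Strong connectivity can be tested straight from the definition: every vertex must be reachable from a fixed start vertex both in G and in the reverse of G. The Python sketch below uses plain breadth-first search for clarity; linear-time algorithms such as Tarjan's are what one would normally use (function names are assumptions):

    from collections import defaultdict, deque

    def reachable(adj, start):
        # Set of vertices reachable from start by breadth-first search.
        seen, queue = {start}, deque([start])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    queue.append(v)
        return seen

    def is_strongly_connected(vertices, arcs):
        adj, radj = defaultdict(list), defaultdict(list)
        for tail, head in arcs:
            adj[tail].append(head)
            radj[head].append(tail)  # reversed arc
        start = next(iter(vertices))
        return reachable(adj, start) >= set(vertices) and \
               reachable(radj, start) >= set(vertices)

    # A directed triangle is strongly connected:
    print(is_strongly_connected({"a", "b", "c"}, [("a", "b"), ("b", "c"), ("c", "a")]))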

4.5 Classes of digraphs

A directed graph G is called symmetric if, for every arc that belongs to G, the corresponding reversed arc also belongs to G. A symmetric, loopless directed graph is equivalent to an undirected graph with the edges replaced by pairs of inverse arcs; thus the number of edges is equal to the number of arcs halved.
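The equivalence described in the preceding paragraph is direct to implement; a one-function sketch (illustrative only):

    # Replace each undirected edge {u, v} by the pair of inverse arcs
    # (u, v) and (v, u); the result is a symmetric digraph with twice as
    # many arcs as there were edges.
    def symmetric_digraph(edges):
        return [(u, v) for u, v in edges] + [(v, u) for u, v in edges]

    arcs = symmetric_digraph([(0, 1), (1, 2)])
    assert len(arcs) == 2 * 2  # twice the number of edges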

[Figure: a simple acyclic directed graph.]

An acyclic directed graph, acyclic digraph, or directed acyclic graph is a directed graph with no directed cycles. Special cases of acyclic directed graphs include multitrees (graphs in which no two directed paths from a single starting node meet back at the same ending node), oriented trees or polytrees (digraphs formed by orienting the edges of undirected acyclic graphs), and rooted trees (oriented trees in which all edges of the underlying undirected tree are directed away from the roots). A tournament is an oriented graph obtained by choosing a direction for each edge in an undirected complete graph. In the theory of Lie groups, a quiver Q is a directed graph serving as the domain of, and thus characterizing the shape of, a representation V defined as a functor, specifically an object of the functor category FinVctK^F(Q), where F(Q) is the free category on Q consisting of paths in Q and FinVctK is the category of finite-dimensional vector spaces over a field K. Representations of a quiver label its vertices with vector spaces and its edges (and hence paths) compatibly with linear transformations between them, and transform via natural transformations.

[Figure: a tournament on 4 vertices.]

4.6 See also

• Coates graph

• Flow chart

• Rooted graph

• Flow graph (mathematics)

• Mason graph

• Oriented graph

• Preorder

• Quiver

• Signal-flow graph

• Transpose graph

• Vertical constraint graph

4.7 Notes

[1] Bang-Jensen & Gutin (2000). Diestel (2005), Section 1.10. Bondy & Murty (1976), Section 10.

[2] Diestel (2005), Section 1.10.

[3] Satyanarayana, Bhavanari; Prasad, Kuncham Syam, Discrete Mathematics and Graph Theory, PHI Learning Pvt. Ltd., p. 460, ISBN 978-81-203-3842-5; Brualdi, Richard A. (2006), Combinatorial matrix classes, Encyclopedia of mathematics and its applications 108, Cambridge University Press, p. 51, ISBN 978-0-521-86565-4.

[4] Bang-Jensen & Gutin (2000) p. 19 in the 2007 edition; p. 20 in the 2nd edition (2009).

4.8 References

• Bang-Jensen, Jørgen; Gutin, Gregory (2000), Digraphs: Theory, Algorithms and Applications, Springer, ISBN 1-85233-268-9 (the corrected 1st edition of 2007 is now freely available on the authors' site; the 2nd edition appeared in 2009, ISBN 1-84800-997-6).

• Bondy, John Adrian; Murty, U. S. R. (1976), Graph Theory with Applications, North-Holland, ISBN 0-444-19451-7.

• Diestel, Reinhard (2005), Graph Theory (3rd ed.), Springer, ISBN 3-540-26182-6 (the electronic 3rd edition is freely available on the author's site).

• Harary, Frank; Norman, Robert Z.; Cartwright, Dorwin (1965), Structural Models: An Introduction to the Theory of Directed Graphs, New York: Wiley.

• Number of directed graphs (or digraphs) with n nodes.

Chapter 5

Downward entailing

In linguistic semantics, a downward entailing (DE) propositional operator is one that denotes a monotone decreasing function. A downward entailing operator reverses the relation of semantic strength among expressions. An expression like “run fast” is semantically stronger than the expression “run” since “run fast” is true of fewer things than the latter. Thus the proposition “John ran fast” entails the proposition “John ran”. Examples of DE contexts include “not”, “nobody”, “few people”, “at most two boys”. They reverse the entailment relation of sentences formed with the predicates “run fast” and “run”, for example. The proposition “Nobody ran” entails that “Nobody ran fast”. The proposition “At most two boys ran” entails that “At most two boys ran fast”. Conversely, an upward entailing operator is one that preserves the relation of semantic strength among a set of expressions (for example, “more”). A context that is neither downward nor upward entailing is non-monotone, such as “exactly”. Ladusaw (1980) proposed that downward entailment is the property that licenses polarity items. Indeed, “Nobody saw anything” is downward entailing and admits the negative polarity item anything, while *“I saw anything” is ungrammatical (the upward entailing context does not license such a polarity item). This approach explains many but not all typical cases of polarity item sensitivity. Subsequent attempts to describe the behavior of polarity items rely on a broader notion of nonveridicality.

5.1 Strawson-DE

Downward entailment does not explain the licensing of any in certain contexts such as with only:

Only John ate any vegetables for breakfast.

This is not a downward entailing context because the above proposition does not entail “Only John ate kale for breakfast”(John may have eaten spinach, for example). Von Fintel (1999) claims that although only does not exhibit the classical DE pattern, it can be shown to be DE in a special way. He defines a notion of Strawson-DE for expressions that come with presuppositions. The reasoning scheme is as follows:

1. P → Q

2. [[only John]](P) is defined.

3. [[only John]](Q) is true.

4. Therefore, [[only John]](P) is true.

Here, (2) is the intended presupposition. For example:

1. Kale is a vegetable.


2. Somebody ate kale for breakfast.

3. Only John ate any vegetables for breakfast.

4. Therefore, only John ate kale for breakfast.

Hence only is Strawson-DE and therefore licenses any. Giannakidou (2002) argues that Strawson-DE allows not just the presupposition of the evaluated sentence but any arbitrary proposition to count as relevant. This results in an over-generalization that validates the use of any in contexts where it is, in fact, ungrammatical, such as clefts, preposed exhaustive focus, and each/both:

* It was John who talked to anybody.

* JOHN talked to anybody.

* Each student who saw anything reported to the Dean.

* Both students who saw anything reported to the Dean.

5.2 See also

• Entailment (pragmatics)

• Veridicality

• Polarity item

5.3 References

• Ladusaw, William (1980). Polarity Sensitivity as Inherent Scope Relations. Garland, NY.

• Von Fintel, Kai (1999). “NPI-Licensing, Strawson-Entailment, and Context-Dependency”. Journal of Semantics (16): 97–148.

• Giannakidou, Anastasia (2002). “Licensing and sensitivity in polarity items: from downward entailment to nonveridicality”. In Maria Andronis; Anne Pycha; Keiko Yoshimura. CLS 38: Papers from the 38th Annual Meeting of the Chicago Linguistic Society, Parasession on Polarity and Negation. Retrieved 2011-12-15.

Chapter 6

Entailment (pragmatics)

In pragmatics (linguistics), entailment is the relationship between two sentences where the truth of one (A) requires the truth of the other (B). For example, the sentence (A) The president was assassinated. entails (B) The president is dead. Notice also that if (B) is false, then (A) must necessarily be false. To show entailment, we must show that (A) being true forces (B) to be true, or, equivalently, that (B) being false forces (A) to be false. Entailment differs from implicature (in their definitions for pragmatics), where the truth of one (A) suggests the truth of the other (B), but does not require it. For example, the sentence (A) Mary had a baby and (B) got married implicates that (A) she had a baby before (B) the wedding, but this is cancellable by adding “– not necessarily in that order”. Entailments are not cancellable. Entailment also differs from presupposition in that in presupposition, the truth of what one is presupposing is taken for granted. A simple test to differentiate presupposition from entailment is negation. For example, both The king of France is ill and The king of France is not ill presuppose that there is a king of France. However The president was not assassinated no longer entails The president is dead (nor its opposite, as the president could have died in another way). In this case, presupposition remains under negation, but entailment does not.

6.1 Types of entailment

There are three types of entailment: formal or logical entailment, analytic entailment, and synthetic entailment.

6.2 See also

• Compound question

• Downward entailing

• Loaded question

6.3 References

6.4 Further reading

• Entailment Regimes in SPARQL 1.1

Chapter 7

Fallacy

This article is about errors in reasoning. For the formal concept in philosophy and logic, see formal fallacy. For other uses, see Fallacy (disambiguation).

A fallacy is the use of poor, or invalid, reasoning for the construction of an argument.*[1]*[2] A fallacious argument may be deceptive by appearing to be better than it really is. Some fallacies are committed intentionally to manipulate or persuade by deception, while others are committed unintentionally due to carelessness or ignorance. Fallacies are commonly divided into “formal” and “informal”. A formal fallacy can be expressed neatly in a standard system of logic, such as propositional logic,*[1] while an informal fallacy originates in an error in reasoning other than an improper logical form.*[3] Arguments containing informal fallacies may be formally valid, but still fallacious.*[4]

7.1 Formal fallacy

Main article: Formal fallacy

A formal fallacy is a common error of thinking that can neatly be expressed in a standard system of logic.*[1] An argument that is formally fallacious is rendered invalid due to a flaw in its logical structure. Such an argument is always considered to be wrong. The presence of a formal fallacy in a deductive argument does not imply anything about the argument's premises or its conclusion. Both may actually be true, or may even be more probable as a result of the argument; but the deductive argument is still invalid because the conclusion does not follow from the premises in the manner described. By extension, an argument can contain a formal fallacy even if the argument is not a deductive one: for instance, an inductive argument that incorrectly applies principles of probability or causality can be said to commit a formal fallacy.

7.1.1 Common examples

Main article: List of fallacies § Formal fallacies

7.2 Aristotle's Fallacies

Aristotle was the first to systematize logical errors into a list. Aristotle's "Sophistical Refutations" (De Sophisticis Elenchis) identifies thirteen fallacies. He divided them into two major types: those depending on language and those not depending on language.*[5] These fallacies are called verbal fallacies and material fallacies, respectively. A material fallacy is an error in what the arguer is talking about, while a verbal fallacy is an error in how the arguer is talking. Verbal fallacies are those in which a conclusion is obtained by improper or ambiguous use of words.*[6]

19 20 CHAPTER 7. FALLACY

7.3 Whately's grouping of fallacies

Richard Whately defines a fallacy broadly as “any argument, or apparent argument, which professes to be decisive of the matter at hand, while in reality it is not.”*[7] Whately divided fallacies into two groups: logical and material. According to Whately, logical fallacies are arguments where the conclusion does not follow from the premises. Material fallacies are not logical errors because the conclusion does follow from the premises. He then divided the logical group into two groups: purely logical and semi-logical. The semi-logical group included all of Aristotle's sophisms except ignoratio elenchi, petitio principii, and non causa pro causa, which are in the material group.*[8]

7.4 Intentional fallacies

Sometimes a speaker or writer uses a fallacy intentionally. In any context, including academic debate, a conversation among friends, political discourse, advertising, or for comedic purposes, the arguer may use fallacious reasoning to try to persuade the listener or reader, by means other than offering relevant evidence, that the conclusion is true. Examples of this include the speaker or writer: diverting the argument to unrelated issues with a red herring (ignoratio elenchi); insulting someone's character (argumentum ad hominem); assuming they are right by “begging the question” (petitio principii); making jumps in logic (non sequitur); identifying a false cause and effect (post hoc ergo propter hoc); asserting that everyone agrees (bandwagoning); creating a “false dilemma” (“either-or fallacy”) in which the situation is oversimplified; selectively using facts (card-stacking); making false or misleading comparisons (false equivalence and false analogy); and generalizing quickly and sloppily (hasty generalization).*[9] In humor, errors of reasoning are used for comical purposes. Groucho Marx used fallacies of amphiboly, for instance, to make ironic statements; Gary Larson employs fallacious reasoning in many of his cartoons. Wes Boyer and Samuel Stoddard have written a humorous essay teaching students how to be persuasive by means of a whole host of informal and formal fallacies.*[10]

7.5 Deductive fallacy

Main articles: Deductive fallacy and formal fallacy

In philosophy, the term formal fallacy is used for logical fallacies and is defined formally as a flaw in the structure of a deductive argument which renders the argument invalid. The term is preferred because logic is the use of valid reasoning, and a fallacy is an argument that uses poor reasoning, so the term logical fallacy would be an oxymoron. However, the same terms are used in informal discourse to mean an argument which is problematic for any reason. A logical form such as "A and B" is independent of any particular conjunction of meaningful propositions. Logical form alone can guarantee that given true premises, a true conclusion must follow. However, formal logic makes no such guarantee if any premise is false; the conclusion can be either true or false. Any formal error or logical fallacy similarly invalidates the deductive guarantee. Both the argument and all its premises must be true for a statement to be true.

7.6 Paul Meehl's Fallacies

In Why I Do Not Attend Case Conferences*[11] (1973), psychologist Paul Meehl discusses several fallacies that can arise in medical case conferences that are primarily held to diagnose patients. These fallacies can also be considered more general errors of thinking that all individuals (not just psychologists) are prone to making.

• Barnum effect: Making a statement that is trivial, and true of everyone, e.g. of all patients, but which appears to have special significance to the diagnosis.

• Sick-sick fallacy (“pathological set”): The tendency to generalize from personal experiences of health and ways of being, to the identification of others who are different from ourselves as being “sick”. Meehl emphasizes that though psychologists claim to know about this tendency, most are not very good at correcting it in their own thinking.

• “Me too” fallacy: The opposite of Sick-sick. Imagining that “everyone does this” and thereby minimizing a symptom without assessing the probability of whether a mentally healthy person would actually do it. A variation of this is Uncle George's pancake fallacy. This minimizes a symptom through reference to a friend/relative who exhibited a similar symptom, thereby implying that it is normal. Meehl points out that consideration should be given that the patient is not healthy by comparison but that the friend/relative is unhealthy.

• Multiple Napoleons fallacy: “It's not real to us, but it's 'real' to him.”A relativism that Meehl sees as a waste of time. There is a distinction between reality and delusion that is important to make when assessing a patient and so the consideration of comparative realities can mislead and distract from the importance of a patient's delusion to a diagnostic decision.

• Hidden decisions: Decisions based on factors that we do not own up to or challenge, and for example result in the placing of middle- and upper-class patients in therapy while lower-class patients are given medication. Meehl identifies these decisions as related to an implicit ideal patient who is young, attractive, verbal, intelligent, and successful (YAVIS). He sees YAVIS patients as being preferred by psychotherapists because they can pay for long-term treatment and are more enjoyable to interact with.

• The spun-glass theory of the mind: The belief that the human organism is so fragile that negative events, such as criticism, rejection, or failure, are bound to cause major trauma to the system. Essentially not giving humans, and sometimes patients, enough credit for their resilience and ability to recover.*[11]

7.7 Fallacies of Measurement

Increasing availability and circulation of big data are driving proliferation of new metrics for scholarly authority,*[12]*[13] and there is lively discussion regarding the relative usefulness of such metrics for measuring the value of knowledge production in the context of an “information tsunami.”*[14] Where mathematical fallacies are subtle mistakes in reasoning leading to invalid mathematical proofs, measurement fallacies are unwarranted inferential leaps involved in the extrapolation of raw data to a measurement-based value claim. The ancient Greek Sophist Protagoras was one of the first thinkers to propose that humans can generate reliable measurements through his “human-measure” principle and the practice of dissoi logoi (arguing multiple sides of an issue).*[15]*[16] This history helps explain why measurement fallacies are informed by informal logic and argumentation theory.

• Anchoring fallacy: Anchoring is a cognitive bias, first theorized by Amos Tversky and Daniel Kahneman, that “describes the common human tendency to rely too heavily on the first piece of information offered (the 'anchor') when making decisions.” In measurement arguments, anchoring fallacies can occur when unwarranted weight is given to data generated by metrics that the arguers themselves acknowledge are flawed. For example, limitations of the Journal Impact Factor (JIF) are well documented,*[17] and even JIF pioneer Eugene Garfield notes, “while citation data create new tools for analyses of research performance, it should be stressed that they supplement rather than replace other quantitative and qualitative indicators.”*[18] To the extent that arguers jettison acknowledged limitations of JIF-generated data in evaluative judgments, or leave behind Garfield's “supplement rather than replace” caveat, they court commission of anchoring fallacies.

• Naturalistic Fallacy: In the context of measurement, a naturalistic fallacy can occur in a reasoning chain that makes an unwarranted extrapolation from “is”to “ought,”as in the case of sheer quantity metrics based on the premise “more is better”*[14] or, in the case of developmental assessment in the field of psychology, “higher is better.”*[19]

• False Analogy: In the context of measurement, this error in reasoning occurs when claims are supported by unsound comparisons between data points, hence the false analogy's informal nickname of the “apples and oranges”fallacy.*[20] For example, the Scopus and Web of Science bibliographic databases have difficulty distinguishing between citations of scholarly work that are arms-length endorsements, ceremonial citations, or negative citations (indicating the citing author withholds endorsement of the cited work).*[21] Hence, measurement-based value claims premised on the uniform quality of all citations may be questioned on false analogy grounds.

• Argumentum ex Silentio: An argument from silence features an unwarranted conclusion advanced based on the absence of data. For example, Academic Analytics' Faculty Scholarly Productivity Index purports to measure overall faculty productivity, yet the tool does not capture data based on citations in books. This creates a

possibility that low productivity measurements using the tool may constitute argumentum ex silentio fallacies, to the extent that such measurements are supported by the absence of book citation data.

• Ecological Fallacy: An ecological fallacy is committed when one draws an inference from data based on the premise that qualities observed for groups necessarily hold for individuals; for example,“if countries with more Protestants tend to have higher suicide rates, then Protestants must be more likely to commit suicide.”*[22] In metrical argumentation, ecological fallacies can be committed when one measures scholarly productivity of a sub-group of individuals (e.g. “Puerto Rican”faculty) via reference to aggregate data about a larger and different group (e.g. “Hispanic”faculty).*[23]

7.8 Other systems of classification

Of other classifications of fallacies in general the most famous are those of Francis Bacon and J. S. Mill. Bacon (Novum Organum, Aph. 33, 38 sqq.) divided fallacies into four Idola (Idols, i.e. False Appearances), which summarize the various kinds of mistakes to which the human intellect is prone. With these should be compared the Offendicula of Roger Bacon, contained in the Opus maius, pt. i. J. S. Mill discussed the subject in book v. of his Logic, and Jeremy Bentham's Book of Fallacies (1824) contains valuable remarks. See Rd. Whately's Logic, bk. v.; A. de Morgan, Formal Logic (1847); A. Sidgwick, Fallacies (1883) and other textbooks.

7.9 Assessment of Fallacies - Pragmatic Theory

According to the pragmatic theory,*[24] a fallacy can in some instances be an error, the use of a heuristic (a short version of an argumentation scheme) to jump to a conclusion. However, even more worryingly, in other instances it is a tactic or ploy used inappropriately in argumentation to try to get the best of a speech partner unfairly. There are always two parties to an argument containing a fallacy - the perpetrator and the intended victim. The dialogue framework required to support the pragmatic theory of fallacy is built on the presumption that argumentative dialogue has both an adversarial component and a collaborative component. A dialogue has individual goals for each participant, but also collective (shared) goals that apply to all participants. A fallacy of the second kind is seen as more than simply a violation of a rule of reasonable dialogue. It is also a deceptive tactic of argumentation, based on sleight-of-hand. Aristotle explicitly compared contentious reasoning to unfair fighting in athletic contests. But the roots of the pragmatic theory go back even further in history to the Sophists. The pragmatic theory finds its roots in the Aristotelian conception of a fallacy as a sophistical refutation, but also supports the view that many of the types of arguments traditionally labelled as fallacies are in fact reasonable techniques of argumentation that can be used, in many cases, to support legitimate goals of dialogue. Hence on the pragmatic approach, each case needs to be analyzed individually, to determine by the textual evidence whether the argument is fallacious or reasonable.

7.10 See also

Lists

• List of cognitive biases

• List of fallacies

• List of memory biases

• List of paradoxes

Concepts

• Association fallacy

• Cogency

• Cognitive bias 7.11. REFERENCES 23

• Cognitive distortion

• Demagogy

• Evidence

• Fallacies of definition

• False premise

• False statement

• Invalid proof

• Mathematical fallacy

• Paradox

• Prosecutor's fallacy

• Sophism

• Soundness

• Truth

• Validity

• Victim blaming

Works

• Attacking Faulty Reasoning

• Straight and Crooked Thinking

7.11 References

[1] Harry J. Gensler, The A to Z of Logic (2010:p74). Rowman & Littlefield, ISBN 9780810875968

[2] John Woods, The Death of Argument (2004). Applied Logic Series Volume 32, pp 3-23. ISBN 9789048167005

[3] “Informal Fallacies, Northern Kentucky University”. Retrieved 2013-09-10.

[4] “Internet Encyclopedia of Philosophy, The University of Tennessee at Martin”. Retrieved 2013-09-10.

[5] “Aristotle's original 13 fallacies”. The Non Sequitur. Retrieved 2013-05-28.

[6] “PHIL 495: Philosophical Writing (Spring 2008), Texas A&M University”. Retrieved 2013-09-10.

[7] Frans H. van Eemeren, Bart Garssen, Bert Meuffels (2009). Fallacies and Judgments of Reasonableness: Empirical Research Concerning the Pragma-Dialectical Discussion Rules, p.8. ISBN 9789048126149.

[8] Coffey, P. (1912). The Science of Logic. Longmans, Green, and Company. p. 302. LCCN 12018756.

[9] Ed Shewan (2003). Applications of Grammar: Principles of Effective Communication (2nd ed.). Christian Liberty Press. pp. 92 ff. ISBN 1-930367-28-7.

[10] Boyer, Web. “How to Be Persuasive”. Retrieved 12/05/2012.

[11] Meehl, P.E. (1973). Psychodiagnosis: Selected papers. Minneapolis (MN): University of Minnesota Press, p. 225-302.

[12] Meho, Lokman (2007). “The Rise and Rise of Citation Analysis” (PDF). Physics World. January: 32–36. Retrieved October 28, 2013.

[13] Jensen, Michael (June 15, 2007). “The New Metrics of Scholarly Authority”. Chronicle Review. Retrieved 28 October 2013.

[14] Baveye, Philippe C. (2010). “Sticker Shock and Looming Tsunami: The High Cost of Academic Serials in Perspective”. Journal of Scholarly Publishing 41: 191–215. doi:10.1353/scp.0.0074.

[15] Schiappa, Edward (1991). Protagoras and Logos: A Study in Greek Philosophy and Rhetoric. Columbia, SC: University of South Carolina Press. ISBN 0872497585.

[16] Protagoras (1972). The Older Sophists. Indianapolis, IN: Hackett Publishing Co. ISBN 0872205568.

[17] National Communication Journal (2013). Impact Factors, Journal Quality, and Communication Journals: A Report for the Council of Communication Associations (PDF). Washington, D.C.: National Communication Association.

[18] Garfield, Eugene (1993). “What Citations Tell us About Canadian Research”. Canadian Journal of Library and Information Science 18 (4): 34.

[19] Stein, Zachary (October 2008). “Myth Busting and Metric Making: Refashioning the Discourse about Development”. Integral Leadership Review 8 (5). Retrieved 28 October 2013.

[20] Kornprobst, Markus (2007). “Comparing Apples and Oranges? Leading and Misleading Uses of Historical Analogies”. Millennium - Journal of International Studies 36: 29–49. doi:10.1177/03058298070360010301. Retrieved 29 October 2013.

[21] Meho, Lokman (2007).“The Rise and Rise of Citation Analysis”(PDF). Physics World. January: 32. Retrieved October 28, 2013.

[22] Freedman, David A. (2004). Michael S. Lewis-Beck & Alan Bryman & Tim Futing Liao, ed. Encyclopedia of Social Science Research Methods. Thousand Oaks, CA: Sage. pp. 293–295. ISBN 0761923632.

[23] Allen, Henry L. (1997).“Faculty Workload and Productivity: Ethnic and Gender Disparities”(PDF). NEA 1997 Almanac of Higher Education: 39. Retrieved 29 October 2013.

[24] Walton, Douglas (1995). A Pragmatic Theory of Fallacy. Tuscaloosa: University of Alabama Press.

• Fearnside, W. Ward and William B. Holther, Fallacy: The Counterfeit of Argument, 1959.

• Vincent F. Hendricks, Thought 2 Talk: A Crash Course in Reflection and Expression, New York: Automatic Press / VIP, 2005, ISBN 87-991013-7-8

• D. H. Fischer, Historians' Fallacies: Toward a Logic of Historical Thought, Harper Torchbooks, 1970.

• Warburton, Nigel, Thinking from A to Z, Routledge, 1998.

• T. Edward Damer. Attacking Faulty Reasoning, 5th Edition, Wadsworth, 2005. ISBN 0-534-60516-8

• Sagan, Carl, The Demon-Haunted World: Science As a Candle in the Dark. Ballantine Books, March 1997, ISBN 0-345-40946-9, 480 pgs. 1996 hardback edition: Random House, ISBN 0-394-53512-X, xv+457 pages plus addenda insert (some printings). Ch. 12.

7.12 Further reading

• C. L. Hamblin, Fallacies, Methuen, London, 1970; reprinted by Vale Press in 1998 as ISBN 0-916475-24-7.

• Hans V. Hansen; Robert C. Pinto (1995). Fallacies: classical and contemporary readings. Penn State Press. ISBN 978-0-271-01417-3.

• Frans van Eemeren; Bart Garssen; Bert Meuffels (2009). Fallacies and Judgments of Reasonableness: Empirical Research Concerning the Pragma-Dialectical Discussion Rules. Springer. ISBN 978-90-481-2613-2.

• Douglas N. Walton, Informal logic: A handbook for critical argumentation. Cambridge University Press, 1989.

• Walton, Douglas (1987). Informal Fallacies. Amsterdam: John Benjamins.

• Walton, Douglas (1995). A Pragmatic Theory of Fallacy. Tuscaloosa: University of Alabama Press.

• Walton, Douglas (2010). “Why Fallacies Appear to Be Better Arguments than They Are”. Informal Logic 30 (2): 159–184.

• John Woods (2004). The death of argument: fallacies in agent based reasoning. Springer. ISBN 978-1-4020-2663-8.

Historical texts

• Aristotle, On Sophistical Refutations, De Sophistici Elenchi. library.adelaide.edu.au

• William of Ockham, Summa of Logic (ca. 1323) Part III.4.

• John Buridan, Summulae de dialectica Book VII.

• Francis Bacon, the doctrine of the idols in Novum Organum Scientiarum, Aphorisms concerning The Interpretation of Nature and the Kingdom of Man, XXIIIff. fly.hiwaay.net

• Arthur Schopenhauer, The Art of Controversy | Die Kunst, Recht zu behalten - The Art of Controversy (bilingual), (also known as “Schopenhauer's 38 stratagems”). gutenberg.net

• John Stuart Mill, A System of Logic - Raciocinative and Inductive. Book 5, Chapter 7, Fallacies of Confusion. la.utexas.edu

7.13 External links

• Fallacies entry by Hans Hansen in the Stanford Encyclopedia of Philosophy

• Informal logic entry in the Stanford Encyclopedia of Philosophy

• Fallacy entry in the Internet Encyclopedia of Philosophy

• Fallacy at PhilPapers

• Appeal to Authority Logical Fallacy

• FallacyFiles.org contains a categorization of fallacies with examples.

• 42 informal logical fallacies explained by Dr. Michael C. Labossiere (including examples), nizkor.org

• Humbug! The skeptic's field guide to spotting fallacies in thinking – textbook on fallacies. scribd.com

• List of fallacies with clear examples, infidels.org

• Interactive Syllogistic Machine A web based syllogistic machine for exploring fallacies, figures, and modes of syllogisms.

• Logical Fallacies and the Art of Debate, csun.edu

• LogicalFallacies.Info

• Stephen Downes Guide to the Logical Fallacies, onegoodmove.org

• WebCitation archive.

Chapter 8

Fixed point (mathematics)

Not to be confused with a stationary point where f′(x) = 0, or with fixed-point arithmetic, a form of limited-precision arithmetic in computing.

In mathematics, a fixed point (sometimes shortened to fixpoint, also known as an invariant point) of a function is an element of the function's domain that is mapped to itself by the function. That is to say, c is a fixed point of the function f(x) if and only if f(c) = c. This means f(f(...f(c)...)) = f^n(c) = c, an important terminating consideration when recursively computing f. A set of fixed points is sometimes called a fixed set. For example, if f is defined on the real numbers by f(x) = x^2 − 3x + 4, then 2 is a fixed point of f, because f(2) = 2. Not all functions have fixed points: for example, if f is a function defined on the real numbers as f(x) = x + 1, then it has no fixed points, since x is never equal to x + 1 for any real number. In graphical terms, a fixed point means the point (x, f(x)) is on the line y = x, or in other words the graph of f has a point in common with that line.

A function with three fixed points

Points which come back to the same value after a finite number of iterations of the function are known as periodic points; a fixed point is a periodic point with period equal to one. In projective geometry, a fixed point of a projectivity has been called a double point.*[1]

8.1 Attractive fixed points

The fixed point iteration x(n+1) = cos x(n) with initial value x(1) = −1.

An attractive fixed point of a function f is a fixed point x0 of f such that for any value of x in the domain that is close enough to x0, the iterated function sequence

x, f(x), f(f(x)), f(f(f(x))), ...

converges to x0. A statement of the prerequisites for, and a proof of, the existence of such a solution is given by the Banach fixed-point theorem. The natural cosine function (“natural”means in radians, not degrees or other units) has exactly one fixed point, which is attractive. In this case, “close enough”is not a stringent criterion at all; to demonstrate this, start with any real number and repeatedly press the cos key on a calculator (checking first that the calculator is in “radians”mode). It eventually converges to about 0.739085133, which is a fixed point. That is where the graph of the cosine function intersects the line y = x. Not all fixed points are attractive: for example, x = 0 is a fixed point of the function f(x) = 2x, but iteration of this function for any value other than zero rapidly diverges. However, if the function f is continuously differentiable in an open neighbourhood of a fixed point x0, and |f′(x0)| < 1, attraction is guaranteed. Attractive fixed points are a special case of a wider mathematical concept of attractors. An attractive fixed point is said to be a stable fixed point if it is also Lyapunov stable. A fixed point is said to be a neutrally stable fixed point if it is Lyapunov stable but not attracting. The center of a linear homogeneous differential equation of the second order is an example of a neutrally stable fixed point.
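The calculator experiment is easy to reproduce in code. The following is a minimal Python sketch (the iteration count of 100 is an arbitrary but comfortable choice; nothing here comes from the article itself):

import math

x = -1.0                              # any real starting value will do
for _ in range(100):                  # iterate x -> cos(x)
    x = math.cos(x)

print(x)                              # about 0.7390851332151607
print(math.isclose(math.cos(x), x))   # True: cos(x) = x to machine precision

Since |f′(x0)| = |sin(0.739...)| ≈ 0.67 < 1, convergence is geometric, which is why a hundred presses of the cos key are more than enough.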

8.2 Applications

In many fields, equilibria or stability are fundamental concepts that can be described in terms of fixed points. For example, in economics, a Nash equilibrium of a game is a fixed point of the game's best response correspondence. In physics, more precisely in the theory of phase transitions, linearisation near an unstable fixed point has led to Wilson's Nobel prize-winning work inventing the renormalization group, and to the mathematical explanation of the term "critical phenomenon".

In compilers, fixed point computations are used for program analysis, for example in data-flow analysis, which is often required for code optimization. The vector of PageRank values of all web pages is the fixed point of a linear transformation derived from the World Wide Web's link structure.

Logician Saul Kripke makes use of fixed points in his influential theory of truth. He shows how one can generate a partially defined truth predicate (one which remains undefined for problematic sentences like “This sentence is not true”), by recursively defining“truth”starting from the segment of a language which contains no occurrences of the word, and continuing until the process ceases to yield any newly well-defined sentences. (This will take a denumerable infinity of steps.) That is, for a language L, let L-prime be the language generated by adding to L, for each sentence S in L, the sentence "S is true.”A fixed point is reached when L-prime is L; at this point sentences like“This sentence is not true”remain undefined, so, according to Kripke, the theory is suitable for a natural language which contains its own truth predicate. The concept of fixed point can be used to define the convergence of a function.
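As a small concrete illustration of the PageRank example, here is a hedged Python sketch of power iteration; the four-page link structure and the damping factor 0.85 are invented for the toy example and are not taken from the article:

links = {0: [1, 2], 1: [2], 2: [0], 3: [2]}   # hypothetical web graph
n, d = len(links), 0.85                        # d: the customary damping factor

rank = [1.0 / n] * n
for _ in range(200):                           # iterate toward the fixed point
    new = [(1 - d) / n] * n
    for page, outs in links.items():
        for target in outs:
            new[target] += d * rank[page] / len(outs)
    if max(abs(a - b) for a, b in zip(new, rank)) < 1e-12:
        break                                  # new ≈ rank: a fixed point
    rank = new

print([round(r, 4) for r in rank])             # the PageRank vector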

8.3 Topological fixed point property

Main article: Fixed point property

A topological space X is said to have the fixed point property (briefly FPP) if for any continuous function

f : X → X

there exists x ∈ X such that f(x) = x. The FPP is a topological invariant, i.e. is preserved by any homeomorphism. The FPP is also preserved by any retraction. According to the Brouwer fixed point theorem, every compact and convex subset of a Euclidean space has the FPP. Compactness alone does not imply the FPP and convexity is not even a topological property, so it makes sense to ask how to topologically characterize the FPP. In 1932 Borsuk asked whether compactness together with contractibility could be a necessary and sufficient condition for the FPP to hold. The problem was open for 20 years until the conjecture was disproved by Kinoshita who found an example of a compact contractible space without the FPP.*[2]

8.4 Generalization to partial orders: prefixpoint and postfixpoint

The notion and the terminology are generalized to a partial order. Let ≤ be a partial order over a set X and let f:X → X be a function over X. Then a prefixpoint (also spelled pre-fixpoint) of f is any p such that f(p) ≤ p. Analogously a postfixpoint (or post-fixpoint) of f is any p such that p ≤ f(p).*[3] One way to express the Knaster–Tarski theorem is to say that a monotone function on a complete lattice has a least fixpoint which coincides with its least prefixpoint (and similarly its greatest fixpoint coincides with its greatest postfixpoint). Prefixpoints and postfixpoints have applications in theoretical computer science.*[4]
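To make the order-theoretic definitions concrete, here is a minimal Python sketch (the graph and the function step are invented for illustration). On the finite powerset lattice ordered by inclusion, iterating from the bottom element reaches the least fixpoint of a monotone function, which by the Knaster–Tarski characterization above is also its least prefixpoint:

def least_fixpoint(f, bottom=frozenset()):
    # Iterate f from the bottom element until x = f(x).
    x = bottom
    while True:
        nxt = f(x)
        if nxt == x:
            return x
        x = nxt

edges = {0: {1, 2}, 1: {3}, 2: {3}, 3: set(), 4: {0}}  # hypothetical digraph

def step(s):
    # Monotone on the powerset of vertices: adds vertex 0 and all successors.
    return frozenset({0}) | s | frozenset(v for u in s for v in edges[u])

print(sorted(least_fixpoint(step)))  # [0, 1, 2, 3]: vertices reachable from 0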

8.5 See also

• Eigenvector

• Equilibrium

• Attractor

• Stability theory

• Stationary point

• Normal form of Möbius transformation

• Invariant (mathematics)

• Fixed-point combinator

• Fixed point property

• Idempotent

• Fixed-point theorems

• Least fixed point and greatest fixed point

• Nielsen theory

• Sierpinski triangle

• Koenigs function

• Recurrence relation

8.6 Notes

[1] Coxeter, H. S. M. (1942). Non-Euclidean Geometry. University of Toronto Press. p. 36.

[2] Kinoshita, S. (1953). “On Some Contractible Continua without Fixed Point Property”. Fund. Math. 40 (1): 96–98. ISSN 0016-2736.

[3] B. A. Davey; H. A. Priestley (2002). Introduction to Lattices and Order. Cambridge University Press. p. 182. ISBN 978-0-521-78451-1.

[4] Yde Venema (2008), Lectures on the Modal μ-calculus.

8.7 External links

• Animations for Fixed Point Iteration

• An Elegant Solution for Drawing a Fixed Point

Chapter 9

Graph isomorphism

In graph theory, an isomorphism of graphs G and H is a bijection between the vertex sets of G and H

f : V (G) → V (H)

such that any two vertices u and v of G are adjacent in G if and only if ƒ(u) and ƒ(v) are adjacent in H. This kind of bijection is generally called an “edge-preserving bijection”, in accordance with the general notion of isomorphism being a structure-preserving bijection. If an isomorphism exists between two graphs, then the graphs are called isomorphic and we write G ≃ H. In the case when the bijection is a mapping of a graph onto itself, i.e., when G and H are one and the same graph, the bijection is called an automorphism of G. Graph isomorphism is an equivalence relation on graphs and as such it partitions the class of all graphs into equivalence classes. A set of graphs isomorphic to each other is called an isomorphism class of graphs. The two graphs shown below are isomorphic, despite their different-looking drawings.
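A brute-force check of the definition is straightforward to code. The following Python sketch (the two four-vertex graphs are invented toy data) tries every bijection between the vertex sets and tests whether it preserves adjacency; it is exponential in the number of vertices, so it is a didactic illustration rather than a practical algorithm:

from itertools import permutations

def are_isomorphic(v1, e1, v2, e2):
    # v1, v2: vertex lists; e1, e2: sets of frozenset edges.
    if len(v1) != len(v2) or len(e1) != len(e2):
        return False
    for perm in permutations(v2):
        f = dict(zip(v1, perm))            # candidate bijection V(G) -> V(H)
        if all(frozenset({f[u], f[w]}) in e2 for u, w in map(tuple, e1)):
            return True                    # f is edge-preserving
    return False

# The 4-cycle drawn two different ways:
E1 = {frozenset(e) for e in [(1, 2), (2, 3), (3, 4), (4, 1)]}
E2 = {frozenset(e) for e in [(1, 3), (3, 2), (2, 4), (4, 1)]}
print(are_isomorphic([1, 2, 3, 4], E1, [1, 2, 3, 4], E2))  # True

Because the two edge sets have equal size and f is a bijection, mapping every edge of G into the edge set of H already forces adjacency to be preserved in both directions.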

9.1 Variations

In the above definition, graphs are understood to be undirected non-labeled non-weighted graphs. However, the notion of isomorphism may be applied to all other variants of the notion of graph, by adding the requirement to preserve the corresponding additional elements of structure: arc directions, edge weights, etc., with the following exception. When speaking of graph labeling with unique labels, commonly taken from the integer range 1,...,n, where n is the number of the vertices of the graph, two labeled graphs are said to be isomorphic if the corresponding underlying unlabeled graphs are isomorphic.

9.2 Motivation

The formal notion of “isomorphism”, e.g., of “graph isomorphism”, captures the informal notion that some objects have “the same structure”if one ignores individual distinctions of “atomic”components of objects in question. Whenever individuality of “atomic”components (vertices and edges, for graphs) is important for correct representation of whatever is modeled by graphs, the model is refined by imposing additional restrictions on the structure, and other mathematical objects are used: digraphs, labeled graphs, colored graphs, rooted trees and so on. The isomorphism relation may also be defined for all these generalizations of graphs: the isomorphism bijection must preserve the elements of structure which define the object type in question: arcs, labels, vertex/edge colors, the root of the rooted tree, etc. The notion of “graph isomorphism”allows us to distinguish graph properties inherent to the structures of graphs themselves from properties associated with graph representations: graph drawings, data structures for graphs, graph labelings, etc. For example, if a graph has exactly one cycle, then all graphs in its isomorphism class also have exactly


one cycle. On the other hand, in the common case when the vertices of a graph are (represented by) the integers 1, 2, ..., N, then the expression

∑_{v ∈ V(G)} v · deg(v)

may be different for two isomorphic graphs.

9.3 Whitney theorem

Main article: Whitney graph isomorphism theorem

The exception of Whitney's theorem: these two graphs are not isomorphic but have isomorphic line graphs.

The Whitney graph isomorphism theorem,*[1] shown by H. Whitney, states that two connected graphs are isomorphic if and only if their line graphs are isomorphic, with a single exception: K3, the complete graph on three vertices, and the complete bipartite graph K1,3, which are not isomorphic but both have K3 as their line graph. The Whitney graph theorem can be extended to hypergraphs.*[2]

9.4 Recognition of graph isomorphism

Main article: Graph isomorphism problem

While graph isomorphism may be studied in a classical mathematical way, as exemplified by the Whitney theorem, it is recognized that it is a problem to be tackled with an algorithmic approach. The computational problem of determining whether two finite graphs are isomorphic is called the graph isomorphism problem. Its practical applications include primarily cheminformatics, mathematical chemistry (identification of chemical compounds), and electronic design automation (verification of equivalence of various representations of the design of an electronic circuit). The graph isomorphism problem is one of the few standard problems in computational complexity theory belonging to NP, but not known to belong to either of its well-known (and, if P ≠ NP, disjoint) subsets: P and NP-complete. It is one of only two, out of 12 total, problems listed in Garey & Johnson (1979) whose complexity remains unresolved, the other being integer factorization. It is however known that if the problem is NP-complete then the polynomial hierarchy collapses to a finite level.*[3] Its generalization, the subgraph isomorphism problem, is known to be NP-complete. The main areas of research for the problem are design of fast algorithms and theoretical investigations of its computational complexity, both for the general problem and for special classes of graphs.

9.5 See also

• Graph isomorphism problem

• Graph canonization

9.6 Notes

[1] Whitney, Hassler (January 1932).“Congruent Graphs and the Connectivity of Graphs”. American Journal of Mathematics (The Johns Hopkins University Press) 54 (1): 150–168. doi:10.2307/2371086. Retrieved 17 August 2012.

[2] Dirk L. Vertigan, Geoffrey P. Whittle: A 2-Isomorphism Theorem for Hypergraphs. J. Comb. Theory, Ser. B 71(2): 215–230. 1997.

[3] Schöning, Uwe (1988). “Graph isomorphism is in the low hierarchy”. Journal of Computer and System Sciences 37: 312–323. doi:10.1016/0022-0000(88)90010-4.

9.7 References

• Garey, Michael R.; Johnson, David S. (1979), Computers and Intractability: A Guide to the Theory of NP-Completeness, W. H. Freeman, ISBN 0-7167-1045-5

Chapter 10

Graph theory

This article is about sets of vertices connected by edges. For graphs of mathematical functions, see Graph of a function. For other uses, see Graph (disambiguation).

A drawing of a graph

In mathematics and computer science, graph theory is the study of graphs, which are mathematical structures used to model pairwise relations between objects. A “graph”in this context is made up of "vertices" or “nodes”and lines called edges that connect them. A graph may be undirected, meaning that there is no distinction between the two vertices associated with each edge, or its edges may be directed from one vertex to another; see graph (mathematics) for more detailed definitions and for other variations in the types of graph that are commonly considered. Graphs are one of the prime objects of study in discrete mathematics.

Refer to the glossary of graph theory for basic definitions in graph theory.


10.1 Definitions

Definitions in graph theory vary. The following are some of the more basic ways of defining graphs and related mathematical structures.

10.1.1 Graph

In the most common sense of the term,*[1] a graph is an ordered pair G = (V,E) comprising a set V of vertices or nodes together with a set E of edges or lines, which are 2-element subsets of V (i.e., an edge is related with two vertices, and the relation is represented as an unordered pair of the vertices with respect to the particular edge). To avoid ambiguity, this type of graph may be described precisely as undirected and simple. Other senses of graph stem from different conceptions of the edge set. In one more generalized notion,*[2] V is a set together with a relation of incidence that associates with each edge two vertices. In another generalized notion, E is a multiset of unordered pairs of (not necessarily distinct) vertices. Many authors call this type of object a multigraph or pseudograph. All of these variants and others are described more fully below. The vertices belonging to an edge are called the ends, endpoints, or end vertices of the edge. A vertex may exist in a graph and not belong to an edge. V and E are usually taken to be finite, and many of the well-known results are not true (or are rather different) for infinite graphs because many of the arguments fail in the infinite case. The order of a graph is |V|, the number of vertices. A graph's size is |E|, the number of edges. The degree or valency of a vertex is the number of edges that connect to it, where an edge that connects to the vertex at both ends (a loop) is counted twice. For an edge {u, v}, graph theorists usually use the somewhat shorter notation uv.
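As a small worked illustration of these definitions, here is a Python sketch (the graph is invented, and the encoding of a loop as a one-element frozenset is a choice made for the example; the loop convention follows the paragraph above):

V = {1, 2, 3, 4}
E = {frozenset({1, 2}), frozenset({2, 3}), frozenset({3, 1}), frozenset({1})}
# frozenset({1}) encodes a loop at vertex 1, so this is a multigraph-style example.

order, size = len(V), len(E)   # order |V| = 4, size |E| = 4

def degree(v):
    # An ordinary edge containing v contributes 1; a loop at v contributes 2.
    return sum(2 if e == frozenset({v}) else 1 for e in E if v in e)

print(order, size, {v: degree(v) for v in V})
# 4 4 {1: 4, 2: 2, 3: 2, 4: 0}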

10.2 Applications

Graphs can be used to model many types of relations and processes in physical, biological,*[4] social and information systems. Many practical problems can be represented by graphs. In computer science, graphs are used to represent networks of communication, data organization, computational devices, the flow of computation, etc. For instance, the link structure of a website can be represented by a directed graph, in which the vertices represent web pages and directed edges represent links from one page to another. A similar approach can be taken to problems in travel, biology, computer chip design, and many other fields. The development of algorithms to handle graphs is therefore of major interest in computer science. The transformation of graphs is often formalized and represented by graph rewrite systems. Complementary to graph transformation systems focusing on rule-based in-memory manipulation of graphs are graph databases geared towards transaction-safe, persistent storing and querying of graph-structured data.

The network graph formed by Wikipedia editors (edges) contributing to different Wikipedia language versions (nodes) during one month in summer 2013.*[3]

Graph-theoretic methods, in various forms, have proven particularly useful in linguistics, since natural language often lends itself well to discrete structure. Traditionally, syntax and compositional semantics follow tree-based structures, whose expressive power lies in the principle of compositionality, modeled in a hierarchical graph. More contemporary approaches such as head-driven phrase structure grammar model the syntax of natural language using typed feature structures, which are directed acyclic graphs. Within lexical semantics, especially as applied to computers, modeling word meaning is easier when a given word is understood in terms of related words; semantic networks are therefore important in computational linguistics. Still other methods in phonology (e.g. optimality theory, which uses lattice graphs) and morphology (e.g. finite-state morphology, using finite-state transducers) are common in the analysis of language as a graph. Indeed, the usefulness of this area of mathematics to linguistics has borne organizations such as TextGraphs, as well as various 'Net' projects, such as WordNet, VerbNet, and others.

Graph theory is also used to study molecules in chemistry and physics. In condensed matter physics, the three-dimensional structure of complicated simulated atomic structures can be studied quantitatively by gathering statistics on graph-theoretic properties related to the topology of the atoms. In chemistry a graph makes a natural model for a molecule, where vertices represent atoms and edges bonds. This approach is especially used in computer processing of molecular structures, ranging from chemical editors to database searching. In statistical physics, graphs can represent local connections between interacting parts of a system, as well as the dynamics of a physical process on such systems. Graphs are also used to represent the micro-scale channels of porous media, in which the vertices represent the pores and the edges represent the smaller channels connecting the pores.

Graph theory is also widely used in sociology as a way, for example, to measure actors' prestige or to explore rumor spreading, notably through the use of social network analysis software. Under the umbrella of social networks are many different types of graphs:*[5] Acquaintanceship and friendship graphs describe whether people know each other. Influence graphs model whether certain people can influence the behavior of others. Finally, collaboration graphs model whether two people work together in a particular way, such as acting in a movie together. Likewise, graph theory is useful in biology and conservation efforts where a vertex can represent regions where certain species exist (or habitats) and the edges represent migration paths, or movement between the regions. This information is important when looking at breeding patterns or tracking the spread of disease, parasites or how changes to the movement can affect other species.

In mathematics, graphs are useful in geometry and certain parts of topology such as knot theory. Algebraic graph theory has close links with group theory.

A graph structure can be extended by assigning a weight to each edge of the graph. Graphs with weights, or weighted graphs, are used to represent structures in which pairwise connections have some numerical values. For example, if a graph represents a road network, the weights could represent the length of each road.

10.3 History

The Königsberg Bridge problem

The paper written by Leonhard Euler on the Seven Bridges of Königsberg and published in 1736 is regarded as the first paper in the history of graph theory.*[6] This paper, as well as the one written by Vandermonde on the knight problem, carried on with the analysis situs initiated by Leibniz. Euler's formula relating the number of edges, vertices, and faces of a convex polyhedron was studied and generalized by Cauchy*[7] and L'Huillier,*[8] and is at the origin of topology. More than one century after Euler's paper on the bridges of Königsberg and while Listing introduced topology, Cayley was led by the study of particular analytical forms arising from differential calculus to study a particular class of graphs, the trees.*[9] This study had many implications in theoretical chemistry. The involved techniques mainly concerned the enumeration of graphs having particular properties. Enumerative graph theory then rose from the results of Cayley and the fundamental results published by Pólya between 1935 and 1937 and the generalization of these by De Bruijn in 1959. Cayley linked his results on trees with the contemporary studies of chemical composition.*[10] The fusion of the ideas coming from mathematics with those coming from chemistry is at the origin of a part of the standard terminology of graph theory. In particular, the term“graph”was introduced by Sylvester in a paper published in 1878 in Nature, where he draws an analogy between “quantic invariants”and “co-variants”of algebra and molecular diagrams:*[11]

"[...] Every invariant and co-variant thus becomes expressible by a graph precisely identical with a Kekuléan diagram or chemicograph. [...] I give a rule for the geometrical multiplication of graphs, i.e. for constructing a graph to the product of in- or co-variants whose separate graphs are given. [...]" (italics as in the original).

The first textbook on graph theory was written by Dénes Kőnig, and published in 1936.*[12] Another book by Frank Harary, published in 1969, was “considered the world over to be the definitive textbook on the subject”,*[13] and

enabled mathematicians, chemists, electrical engineers and social scientists to talk to each other. Harary donated all of the royalties to fund the Pólya Prize.*[14]

One of the most famous and stimulating problems in graph theory is the four color problem: “Is it true that any map drawn in the plane may have its regions colored with four colors, in such a way that any two regions having a common border have different colors?" This problem was first posed by Francis Guthrie in 1852 and its first written record is in a letter of De Morgan addressed to Hamilton the same year. Many incorrect proofs have been proposed, including those by Cayley, Kempe, and others. The study and the generalization of this problem by Tait, Heawood, Ramsey and Hadwiger led to the study of the colorings of the graphs embedded on surfaces with arbitrary genus. Tait's reformulation generated a new class of problems, the factorization problems, particularly studied by Petersen and Kőnig. The works of Ramsey on colorations, and more especially the results obtained by Turán in 1941, were at the origin of another branch of graph theory, extremal graph theory.

The four color problem remained unsolved for more than a century. In 1969 Heinrich Heesch published a method for solving the problem using computers.*[15] A computer-aided proof produced in 1976 by Kenneth Appel and Wolfgang Haken makes fundamental use of the notion of “discharging” developed by Heesch.*[16]*[17] The proof involved checking the properties of 1,936 configurations by computer, and was not fully accepted at the time due to its complexity. A simpler proof considering only 633 configurations was given twenty years later by Robertson, Seymour, Sanders and Thomas.*[18]

The autonomous development of topology between 1860 and 1930 fertilized graph theory back through the works of Jordan, Kuratowski and Whitney. Another important factor of common development of graph theory and topology came from the use of the techniques of modern algebra. The first example of such a use comes from the work of the physicist Gustav Kirchhoff, who published in 1845 his Kirchhoff's circuit laws for calculating the voltage and current in electric circuits.

The introduction of probabilistic methods in graph theory, especially in the study of Erdős and Rényi of the asymptotic probability of graph connectivity, gave rise to yet another branch, known as random graph theory, which has been a fruitful source of graph-theoretic results.

10.4 Graph drawing

Main article: Graph drawing

Graphs are represented visually by drawing a dot or circle for every vertex, and drawing an arc between two vertices if they are connected by an edge. If the graph is directed, the direction is indicated by drawing an arrow. A graph drawing should not be confused with the graph itself (the abstract, non-visual structure) as there are several ways to structure the graph drawing. All that matters is which vertices are connected to which others by how many edges and not the exact layout. In practice it is often difficult to decide if two drawings represent the same graph. Depending on the problem domain some layouts may be better suited and easier to understand than others. The pioneering work of W. T. Tutte was very influential in the subject of graph drawing. Among other achievements, he introduced the use of linear algebraic methods to obtain graph drawings. Graph drawing also can be said to encompass problems that deal with the crossing number and its various generalizations. The crossing number of a graph is the minimum number of intersections between edges that a drawing of the graph in the plane must contain. For a planar graph, the crossing number is zero by definition. Drawings on surfaces other than the plane are also studied.

10.5 Graph-theoretic data structures

Main article: Graph (abstract data type)

There are different ways to store graphs in a computer system. The data structure used depends on both the graph structure and the algorithm used for manipulating the graph. Theoretically one can distinguish between list and matrix structures but in concrete applications the best structure is often a combination of both. List structures are often preferred for sparse graphs as they have smaller memory requirements. Matrix structures on the other hand provide faster access for some applications but can consume huge amounts of memory.

List structures include the incidence list, an array of pairs of vertices, and the adjacency list, which separately lists the neighbors of each vertex: much like the incidence list, each vertex has a list of which vertices it is adjacent to.

Matrix structures include the incidence matrix, a matrix of 0's and 1's whose rows represent vertices and whose columns represent edges, and the adjacency matrix, in which both the rows and columns are indexed by vertices. In both cases a 1 indicates two adjacent objects and a 0 indicates two non-adjacent objects. The Laplacian matrix is a modified form of the adjacency matrix that incorporates information about the degrees of the vertices, and is useful in some calculations such as Kirchhoff's theorem on the number of spanning trees of a graph. The distance matrix, like the adjacency matrix, has both its rows and columns indexed by vertices, but rather than containing a 0 or a 1 in each cell it contains the length of a shortest path between two vertices.
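A hedged Python sketch of the two basic representations (the small graph is invented for the example):

V = [0, 1, 2, 3]
E = [(0, 1), (0, 2), (1, 2), (2, 3)]

adj_list = {v: [] for v in V}              # compact for sparse graphs
for u, w in E:
    adj_list[u].append(w)
    adj_list[w].append(u)

n = len(V)
adj_matrix = [[0] * n for _ in range(n)]   # O(n^2) memory, O(1) adjacency test
for u, w in E:
    adj_matrix[u][w] = adj_matrix[w][u] = 1

print(adj_list[2])       # neighbors of vertex 2: [0, 1, 3]
print(adj_matrix[0][3])  # 0: vertices 0 and 3 are not adjacent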

10.6 Problems in graph theory

10.6.1 Enumeration

There is a large literature on graphical enumeration: the problem of counting graphs meeting specified conditions. Some of this work is found in Harary and Palmer (1973).

10.6.2 Subgraphs, induced subgraphs, and minors

A common problem, called the subgraph isomorphism problem, is finding a fixed graph as a subgraph in a given graph. One reason to be interested in such a question is that many graph properties are hereditary for subgraphs, which means that a graph has the property if and only if all subgraphs have it too. Unfortunately, finding maximal subgraphs of a certain kind is often an NP-complete problem.

• Finding the largest complete subgraph is called the clique problem (NP-complete).

A similar problem is finding induced subgraphs in a given graph. Again, some important graph properties are hereditary with respect to induced subgraphs, which means that a graph has a property if and only if all induced subgraphs also have it. Finding maximal induced subgraphs of a certain kind is also often NP-complete. For example,

• Finding the largest edgeless induced subgraph, or independent set, called the independent set problem (NP- complete).

Still another such problem, the minor containment problem, is to find a fixed graph as a minor of a given graph. A minor or subcontraction of a graph is any graph obtained by taking a subgraph and contracting some (or no) edges. Many graph properties are hereditary for minors, which means that a graph has a property if and only if all minors have it too. A famous example:

• A graph is planar if it contains as a minor neither the complete bipartite graph K3,3 (See the Three-cottage problem) nor the complete graph K5 .

Another class of problems has to do with the extent to which various species and generalizations of graphs are determined by their point-deleted subgraphs, for example:

• The reconstruction conjecture.

10.6.3 Graph coloring

Many problems have to do with various ways of coloring graphs, for example:

• The four-color theorem

• The strong perfect graph theorem

• The Erdős–Faber–Lovász conjecture (unsolved)

• The total coloring conjecture, also called Behzad's conjecture (unsolved)

• The list coloring conjecture (unsolved)

• The Hadwiger conjecture (graph theory) (unsolved)

10.6.4 Subsumption and unification

Constraint modeling theories concern families of directed graphs related by a partial order. In these applications, graphs are ordered by specificity, meaning that more constrained graphs, which are more specific and thus contain a greater amount of information, are subsumed by those that are more general. Operations between graphs include evaluating the direction of a subsumption relationship between two graphs, if any, and computing graph unification. The unification of two argument graphs is defined as the most general graph (or the computation thereof) that is consistent with (i.e. contains all of the information in) the inputs, if such a graph exists; efficient unification algorithms are known.

For constraint frameworks which are strictly compositional, graph unification is the sufficient satisfiability and combination function. Well-known applications include automatic theorem proving and modeling the elaboration of linguistic structure.

10.6.5 Route problems

• Hamiltonian path and cycle problems

• Minimum spanning tree

• Route inspection problem (also called the “Chinese Postman Problem”)

• Seven Bridges of Königsberg

• Shortest path problem

• Steiner tree

• Three-cottage problem

• Traveling salesman problem (NP-hard)

10.6.6 Network flow

There are numerous problems arising especially from applications that have to do with various notions of flows in networks, for example:

• Max flow min cut theorem

10.6.7 Visibility problems

• Museum guard problem

10.6.8 Covering problems

Covering problems in graphs are specific instances of subgraph-finding problems, and they tend to be closely related to the clique problem or the independent set problem.

• Set cover problem

• Vertex cover problem

10.6.9 Decomposition problems

Decomposition, defined as partitioning the edge set of a graph (with as many vertices as necessary accompanying the edges of each part of the partition), has a wide variety of questions. Often, it is required to decompose a graph into subgraphs isomorphic to a fixed graph; for instance, decomposing a complete graph into Hamiltonian cycles. Other problems specify a family of graphs into which a given graph should be decomposed, for instance, a family of cycles, or decomposing a complete graph Kn into n − 1 specified trees having, respectively, 1, 2, 3, ..., n − 1 edges. Some specific decomposition problems that have been studied include:

• Arboricity, a decomposition into as few forests as possible

• Cycle double cover, a decomposition into a collection of cycles covering each edge exactly twice

• Edge coloring, a decomposition into as few matchings as possible

• Graph factorization, a decomposition of a regular graph into regular subgraphs of given degrees

10.6.10 Graph classes

Many problems involve characterizing the members of various classes of graphs. Some examples of such questions are below:

• Enumerating the members of a class

• Characterizing a class in terms of forbidden substructures

• Ascertaining relationships among classes (e.g., does one property of graphs imply another)

• Finding efficient algorithms to decide membership in a class

• Finding representations for members of a class.

10.7 See also

• Gallery of named graphs

• Glossary of graph theory

• List of graph theory topics

• Publications in graph theory

10.7.1 Related topics

• Algebraic graph theory

• Citation graph

• Conceptual graph

• Data structure

• Disjoint-set data structure

• Dual-phase evolution

• Entitative graph

• Existential graph

• Graph algebras

• Graph automorphism

• Graph coloring

• Graph database

• Graph data structure

• Graph drawing

• Graph equation

• Graph rewriting

• Graph sandwich problem

• Graph property

• Intersection graph

• Logical graph

• Loop

• Network theory

• Null graph

• Pebble motion problems

• Percolation

• Perfect graph

• Quantum graph

• Random regular graphs

• Semantic networks

• Spectral graph theory

• Strongly regular graphs

• Symmetric graphs

• Transitive reduction

• Tree data structure

10.7.2 Algorithms

• Bellman–Ford algorithm

• Dijkstra's algorithm

• Ford–Fulkerson algorithm

• Kruskal's algorithm

• Nearest neighbour algorithm

• Prim's algorithm

• Depth-first search

• Breadth-first search

10.7.3 Subareas

• Algebraic graph theory

• Geometric graph theory

• Extremal graph theory

• Probabilistic graph theory

• Topological graph theory

10.7.4 Related areas of mathematics

• Combinatorics

• Group theory

• Knot theory

• Ramsey theory

10.7.5 Generalizations

• Hypergraph

• Abstract simplicial complex

10.7.6 Prominent graph theorists

• Alon, Noga

• Berge, Claude

• Bollobás, Béla

• Bondy, Adrian John

• Brightwell, Graham

• Chudnovsky, Maria

• Chung, Fan

• Dirac, Gabriel Andrew

• Erdős, Paul

• Euler, Leonhard

• Faudree, Ralph

• Golumbic, Martin

• Graham, Ronald

• Harary, Frank

• Heawood, Percy John

• Kotzig, Anton

• Kőnig, Dénes

• Lovász, László

• Murty, U. S. R.

• Nešetřil, Jaroslav

• Rényi, Alfréd

• Ringel, Gerhard

• Robertson, Neil

• Seymour, Paul

• Szemerédi, Endre

• Thomas, Robin

• Thomassen, Carsten

• Turán, Pál

• Tutte, W. T.

• Whitney, Hassler

10.8 Notes

[1] See, for instance, Iyanaga and Kawada, 69 J, p. 234 or Biggs, p. 4.

[2] See, for instance, Graham et al., p. 5.

[3] Hale, Scott A. (2013). “Multilinguals and Wikipedia Editing”. arXiv:1312.0976 [cs.CY].

[4] Mashaghi, A. et al. (2004).“Investigation of a protein complex network”. European Physical Journal B 41 (1): 113–121. doi:10.1140/epjb/e2004-00301-0.

[5] Rosen, Kenneth H. Discrete mathematics and its applications (7th ed.). New York: McGraw-Hill. ISBN 978-0-07-338309- 5.

[6] Biggs, N.; Lloyd, E. and Wilson, R. (1986), Graph Theory, 1736-1936, Oxford University Press

[7] Cauchy, A.L. (1813),“Recherche sur les polyèdres - premier mémoire”, Journal de l'École Polytechnique, 9 (Cahier 16): 66–86.

[8] L'Huillier, S.-A.-J. (1861), “Mémoire sur la polyèdrométrie”, Annales de Mathématiques 3: 169–189.

[9] Cayley, A. (1857), “On the theory of the analytical forms called trees”, Philosophical Magazine, Series IV 13 (85): 172–176, doi:10.1017/CBO9780511703690.046.

[10] Cayley, A. (1875), “Ueber die Analytischen Figuren, welche in der Mathematik Bäume genannt werden und ihre An- wendung auf die Theorie chemischer Verbindungen”, Berichte der deutschen Chemischen Gesellschaft 8 (2): 1056–1059, doi:10.1002/cber.18750080252.

[11] Sylvester, James Joseph (1878). “Chemistry and Algebra”. Nature 17: 284. doi:10.1038/017284a0.

[12] Tutte, W.T. (2001), Graph Theory, Cambridge University Press, p. 30, ISBN 978-0-521-79489-3.

[13] Gardner, Martin (1992), Fractal Music, Hypercards, and more...Mathematical Recreations from Scientific American, W. H. Freeman and Company, p. 203.

[14] Society for Industrial and Applied Mathematics (2002), “The George Polya Prize”, Looking Back, Looking Ahead: A SIAM History (PDF), p. 26.

[15] Heinrich Heesch: Untersuchungen zum Vierfarbenproblem. Mannheim: Bibliographisches Institut 1969.

[16] Appel, K. and Haken, W. (1977),“Every planar map is four colorable. Part I. Discharging”, Illinois J. Math. 21: 429–490.

[17] Appel, K. and Haken, W. (1977), “Every planar map is four colorable. Part II. Reducibility”, Illinois J. Math. 21: 491–567.

[18] Robertson, N.; Sanders, D.; Seymour, P. and Thomas, R. (1997), “The four color theorem”, Journal of Combinatorial Theory Series B 70: 2–44, doi:10.1006/jctb.1997.1750.

10.9 References

• Berge, Claude (1958), Théorie des graphes et ses applications, Collection Universitaire de Mathématiques II, Paris: Dunod. English edition, Wiley 1961; Methuen & Co, New York 1962; Russian, Moscow 1961; Spanish, Mexico 1962; Roumanian, Bucharest 1969; Chinese, Shanghai 1963; Second printing of the 1962 first English edition, Dover, New York 2001.

• Biggs, N.; Lloyd, E.; Wilson, R. (1986), Graph Theory, 1736–1936, Oxford University Press.

• Bondy, J.A.; Murty, U.S.R. (2008), Graph Theory, Springer, ISBN 978-1-84628-969-9.

• Bollobás, B.; Riordan, O.M. (2003), Mathematical results on scale-free random graphs in “Handbook of Graphs and Networks” (S. Bornholdt and H.G. Schuster (eds)), Wiley VCH, Weinheim, 1st ed.

• Chartrand, Gary (1985), Introductory Graph Theory, Dover, ISBN 0-486-24775-9.

• Gibbons, Alan (1985), Algorithmic Graph Theory, Cambridge University Press.

• Reuven Cohen, Shlomo Havlin (2010), Complex Networks: Structure, Robustness and Function, Cambridge University Press

• Golumbic, Martin (1980), Algorithmic Graph Theory and Perfect Graphs, Academic Press.

• Harary, Frank (1969), Graph Theory, Reading, MA: Addison-Wesley.

• Harary, Frank; Palmer, Edgar M. (1973), Graphical Enumeration, New York, NY: Academic Press.

• Mahadev, N.V.R.; Peled, Uri N. (1995), Threshold Graphs and Related Topics, North-Holland.

• Mark Newman (2010), Networks: An Introduction, Oxford University Press.

10.10 External links

• Graph theory with examples

• Hazewinkel, Michiel, ed. (2001), “Graph theory”, Encyclopedia of Mathematics, Springer, ISBN 978-1-55608-010-4

• Graph theory tutorial

• A searchable database of small connected graphs

• Image gallery: graphs at the Wayback Machine (archived February 6, 2006)

• Concise, annotated list of graph theory resources for researchers

• rocs – a graph theory IDE

• The Social Life of Routers – non-technical paper discussing graphs of people and computers

• Graph Theory Software – tools to teach and learn graph theory

• Online books, and library resources in your library and in other libraries about graph theory

10.10.1 Online textbooks

• Phase Transitions in Combinatorial Optimization Problems, Section 3: Introduction to Graphs (2006) by Hartmann and Weigt

• Digraphs: Theory Algorithms and Applications 2007 by Jorgen Bang-Jensen and Gregory Gutin

• Graph Theory, by Reinhard Diestel

Chapter 11

Implication graph

An implication graph representing the 2-satisfiability instance (x0∨x2)∧(x0∨¬x3)∧(x1∨¬x3)∧(x1∨¬x4)∧(x2∨¬x4)∧(x0∨¬x5)∧(x1∨¬x5)∧(x2∨¬x5)∧(x3∨x6)∧(x4∨x6)∧(x5∨x6).

In mathematical logic, an implication graph is a skew-symmetric directed graph G(V, E) composed of vertex set V and directed edge set E. Each vertex in V represents the truth status of a Boolean literal, and each directed edge from vertex u to vertex v represents the material implication “If the literal u is true then the literal v is also true”. Implication graphs were originally used for analyzing complex Boolean expressions.


11.1 Applications

A 2-satisfiability instance in conjunctive normal form can be transformed into an implication graph by replacing each of its disjunctions by a pair of implications. An instance is satisfiable if and only if no literal and its negation belong to the same strongly connected component of its implication graph; this characterization can be used to solve 2-satisfiability instances in linear time.*[1]
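A minimal Python sketch of this reduction (the literal encoding and the sample instance are choices made for the example, not taken from the source *[1]): each clause (a ∨ b) contributes the implications ¬a → b and ¬b → a, strongly connected components are found with Kosaraju's two-pass algorithm, and the instance is declared satisfiable exactly when no variable shares a component with its negation:

from collections import defaultdict

def satisfiable(n_vars, clauses):
    # Literal encoding: variable i is node 2*i, its negation is node 2*i + 1.
    neg = lambda lit: lit ^ 1
    graph, rgraph = defaultdict(list), defaultdict(list)
    for a, b in clauses:                    # clause (a OR b)
        for u, v in ((neg(a), b), (neg(b), a)):
            graph[u].append(v)              # the implication u -> v
            rgraph[v].append(u)

    order, seen, comp = [], set(), {}

    def dfs1(u):                            # first pass: finishing order
        seen.add(u)
        for v in graph[u]:
            if v not in seen:
                dfs1(v)
        order.append(u)

    def dfs2(u, c):                         # second pass: label components
        comp[u] = c
        for v in rgraph[u]:
            if v not in comp:
                dfs2(v, c)

    for u in range(2 * n_vars):
        if u not in seen:
            dfs1(u)
    for u in reversed(order):
        if u not in comp:
            dfs2(u, u)
    return all(comp[2 * i] != comp[2 * i + 1] for i in range(n_vars))

# (x OR y) AND (NOT x OR y) AND (x OR NOT y), with x as 0/1 and y as 2/3:
print(satisfiable(2, [(0, 2), (1, 2), (0, 3)]))  # True (x = y = true works)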

11.2 References

[1] Aspvall, Bengt; Plass, Michael F.; Tarjan, Robert E. (1979). “A linear-time algorithm for testing the truth of certain quantified boolean formulas”. Information Processing Letters 8 (3): 121–123. doi:10.1016/0020-0190(79)90002-4.

Chapter 12

Implicational hierarchy

Implicational hierarchy, in linguistics, is a chain of implicational universals. A set of chained universals is schematically shown as in (1):

(1) A < B < C < D

It can be reformulated in the following way: If a language has property D, then it also has properties A, B, and C; if a language has property C, then it also has properties A and B, etc. In other words, the implicational hierarchy defines the possible combinations of properties A, B, C, and D as listed in matrix (2). Implicational hierarchies are a useful tool in capturing linguistic generalizations pertaining to the different components of the language. They are found in all subfields of grammar.

12.1 Phonology

(3) is an example of an implicational hierarchy concerning the distribution of nasal phonemes across languages, which concerns dental/alveolar, bilabial, and palatal voiced nasals, respectively:

(3) /n/ < /m/ < /ɲ/

This hierarchy defines the possible combinations of dental/alveolar, bilabial, and palatal voiced nasals in the phoneme inventory of a language, as listed in (4). In other words, the hierarchy implies that there are no languages with /ɲ/ but without /m/ and /n/, or with /ɲ/ and /m/ but without /n/.

12.2 Morphology

Number marking provides an example of implicational hierarchies in morphology.

(5) Number: singular < plural < dual < trial / paucal

On the one hand, the hierarchy implies that no language distinguishes a trial unless having a dual, and no language has a dual without a plural. On the other hand, the hierarchy provides implications for the morphological marking: if the plural is coded with a certain number of morphemes, then the dual is coded with at least as many morphemes.

12.3 Syntax

Implicational hierarchies also play a role in syntactic phenomena. For instance, in some languages (e.g. Tangut) the transitive verb agrees not with the subject or the object, but with the syntactic argument which is higher on the person hierarchy.


(5) Person: first < second < third

See also: animacy.

12.4 Bibliography

• Comrie, B. (1989). Language universals and linguistic typology: Syntax and morphology. Oxford: Blackwell, 2nd edn.

• Croft, W. (1990). Typology and universals. Cambridge: Cambridge UP.

• Whaley, L.J. (1997). Introduction to typology: The unity and diversity of language. Newbury Park: Sage.

Chapter 13

Implicational propositional calculus

In mathematical logic, the implicational propositional calculus is a version of classical propositional calculus which uses only one connective, called implication or conditional. In formulas, this binary operation is indicated by “implies”, “if ..., then ...”, "→", etc.

13.1 Virtual completeness as an operator

Implication alone is not functionally complete as a logical operator because one cannot form all other two-valued truth functions from it. However, if one has a propositional formula which is known to be false and uses that as if it were a nullary connective for falsity, then one can define all other truth functions. So implication is virtually complete as an operator. If P,Q, and F are propositions and F is known to be false, then:

• ¬P is equivalent to P → F

• P ∧ Q is equivalent to (P → (Q → F)) → F

• P ∨ Q is equivalent to (P → Q) → Q

• P ↔ Q is equivalent to ((P → Q) → ((Q → P) → F)) → F

More generally, since the above operators are known to be functionally complete, it follows that any truth function can be expressed in terms of "→" and "F", if we have a proposition F which is known to be false. It is worth noting that F is not definable from → and arbitrary sentence variables: any formula constructed from → and propositional variables must receive the value true when all of its variables are evaluated to true. It follows as a corollary that {→} is not functionally complete. It cannot, for example, be used to define the two-place truth function that always returns false.
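These equivalences are easy to verify mechanically. A minimal Python sketch (the encoding of → as a Boolean function is the obvious truth-table one; nothing here is specific to the article):

from itertools import product

imp = lambda p, q: (not p) or q   # truth table of the conditional
F = False                          # a proposition known to be false

for P, Q in product((False, True), repeat=2):
    assert (not P) == imp(P, F)
    assert (P and Q) == imp(imp(P, imp(Q, F)), F)
    assert (P or Q) == imp(imp(P, Q), Q)
    assert (P == Q) == imp(imp(imp(P, Q), imp(imp(Q, P), F)), F)

print("all four equivalences hold under every assignment")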

13.2 Axiom system

• Axiom schema 1 is P → (Q → P).

• Axiom schema 2 is (P → (Q → R)) → ((P → Q) → (P → R)).

• Axiom schema 3 (Peirce's law) is ((P → Q) → P) → P.

• The one non-nullary rule of inference (modus ponens) is: from P and P → Q infer Q.

Where in each case, P, Q, and R may be replaced by any formulas which contain only "→" as a connective. If Γ is a set of formulas and A a formula, then Γ ⊢ A means that A is derivable using the axioms and rules above and formulas from Γ as additional hypotheses. Łukasiewicz (1948) found an axiom system for the implicational calculus, which replaces the schemas 1–3 above with a single schema

50 13.3. BASIC PROPERTIES OF DERIVATION 51

• ((P → Q) → R) → ((R → P) → (S → P)).

He also argued that there is no shorter axiom system.

13.3 Basic properties of derivation

Since all axioms and rules of the calculus are schemata, derivation is closed under substitution:

If Γ ⊢ A, then σ(Γ) ⊢ σ(A),

where σ is any substitution (of formulas using only implication). The implicational propositional calculus also satisfies the deduction theorem:

If Γ,A ⊢ B , then Γ ⊢ A → B.

As explained in the deduction theorem article, this holds for any axiomatic extension of the system containing axiom schemas 1 and 2 above and modus ponens.

13.4 Completeness

The implicational propositional calculus is semantically complete with respect to the usual two-valued semantics of classical propositional logic. That is, if Γ is a set of implicational formulas, and A is an implicational formula entailed by Γ, then Γ ⊢ A .

13.4.1 Proof

A proof of the completeness theorem is outlined below. First, using the compactness theorem and the deduction theorem, we may reduce the completeness theorem to its special case with empty Γ, i.e., we only need to show that every tautology is derivable in the system. The proof is similar to completeness of full propositional logic, but it also uses the following idea to overcome the functional incompleteness of implication. If A and F are formulas, then A → F is equivalent to (¬A*) ∨ F, where A* is the result of replacing in A all, some, or none of the occurrences of F by falsity. Similarly, (A → F) → F is equivalent to A* ∨ F. So under some conditions, one can use them as substitutes for saying A* is false or A* is true respectively. We first observe some basic facts about derivability:

(1) A → B, B → C ⊢ A → C. Indeed, we can derive A → (B → C) using Axiom 1, and then derive A → C by modus ponens (twice) from Ax. 2.

(2) A → B ⊢ (B → C) → (A → C). This follows from (1) by the deduction theorem.

(3) A → C, (A → B) → C ⊢ C. If we further assume C → B, we can derive A → B using (1), then we derive C by modus ponens. This shows A → C, (A → B) → C, C → B ⊢ C, and the deduction theorem gives A → C, (A → B) → C ⊢ (C → B) → C. We apply Ax. 3 to obtain (3).

Let F be an arbitrary fixed formula. For any formula A, we define A^0 = (A → F) and A^1 = ((A → F) → F). Let us consider only formulas in propositional variables p1, ..., pn. We claim that for every formula A in these variables and every truth assignment e,

(4) p1^e(p1), ..., pn^e(pn) ⊢ A^e(A).

We prove (4) by induction on A. The base case A = pi is trivial. Let A = (B → C). We distinguish three cases:

1. e(C) = 1. Then also e(A) = 1. We have

(C → F) → F ⊢ ((B → C) → F) → F

by applying (2) twice to the axiom C → (B → C). Since we have derived (C → F) → F by the induction hypothesis, we can infer ((B → C) → F) → F.

2. e(B) = 0. Then again e(A) = 1. The deduction theorem applied to (3) gives

B → F ⊢ ((B → C) → F) → F.

Since we have derived B → F by the induction hypothesis, we can infer ((B → C) → F) → F.

3. e(B) = 1 and e(C) = 0. Then e(A) = 0. We have

(B → F) → F, C → F, B → C ⊢ B → F (by (1)), and hence F (by modus ponens);

thus (B → F) → F, C → F ⊢ (B → C) → F by the deduction theorem. We have derived (B → F) → F and C → F by the induction hypothesis, hence we can infer (B → C) → F. This completes the proof of (4).

Now let A be a tautology in variables p1, ..., pn. We will prove by reverse induction on k = n, ..., 0 that for every assignment e,

(5) p1^e(p1), ..., pk^e(pk) ⊢ A^1.

The base case k = n is a special case of (4). Assume that (5) holds for k + 1; we will show it for k. By applying the deduction theorem to the induction hypothesis, we obtain

p1^e(p1), ..., pk^e(pk) ⊢ (pk+1 → F) → A^1,
p1^e(p1), ..., pk^e(pk) ⊢ ((pk+1 → F) → F) → A^1,

by first setting e(pk+1) = 0 and second setting e(pk+1) = 1. From this we derive (5) using (3). For k = 0 we obtain that the formula A^1, i.e., (A → F) → F, is provable without assumptions. Recall that F was an arbitrary formula, thus we can choose F = A, which gives us provability of the formula (A → A) → A. Since A → A is provable by the deduction theorem, we can infer A. This proof is constructive. That is, given a tautology, one could actually follow the instructions and create a proof of it from the axioms. However, the length of such a proof increases exponentially with the number of propositional variables in the tautology, hence it is not a practical method for any but the very shortest tautologies.

13.5 The Bernays–Tarski axiom system

The Bernays–Tarski axiom system is often used. In particular, Łukasiewicz's paper derives the Bernays–Tarski axioms from Łukasiewicz's sole axiom as a means of showing its completeness. It differs from the axiom schemas above by replacing axiom schema 2, (P→(Q→R))→((P→Q)→(P→R)), with

• Axiom schema 2': (P→Q)→((Q→R)→(P→R))

which is called hypothetical syllogism. This makes derivation of the deduction meta-theorem a little more difficult, but it can still be done. We show that from P→(Q→R) and P→Q one can derive P→R. This fact can be used in lieu of axiom schema 2 to get the meta-theorem.

1. P→(Q→R) given
2. P→Q given
3. (P→Q)→((Q→R)→(P→R)) ax 2'
4. (Q→R)→(P→R) mp 2,3
5. (P→(Q→R))→(((Q→R)→(P→R))→(P→(P→R))) ax 2'
6. ((Q→R)→(P→R))→(P→(P→R)) mp 1,5
7. P→(P→R) mp 4,6
8. (P→(P→R))→(((P→R)→R)→(P→R)) ax 2'
9. ((P→R)→R)→(P→R) mp 7,8
10. (((P→R)→R)→(P→R))→(P→R) ax 3
11. P→R mp 9,10 qed
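The bookkeeping in such derivations can be checked mechanically. The following Python sketch (not part of the original presentation; the nested-pair encoding and names are invented for this illustration) re-encodes the eleven formulas and verifies every modus ponens step:

```python
# Formulas are nested pairs (antecedent, consequent); variables are strings.
imp = lambda p, q: (p, q)
P, Q, R = "P", "Q", "R"

steps = {
    1: imp(P, imp(Q, R)),                                          # given
    2: imp(P, Q),                                                  # given
    3: imp(imp(P, Q), imp(imp(Q, R), imp(P, R))),                  # ax 2'
    4: imp(imp(Q, R), imp(P, R)),                                  # mp 2,3
    5: imp(imp(P, imp(Q, R)),
           imp(imp(imp(Q, R), imp(P, R)), imp(P, imp(P, R)))),     # ax 2'
    6: imp(imp(imp(Q, R), imp(P, R)), imp(P, imp(P, R))),          # mp 1,5
    7: imp(P, imp(P, R)),                                          # mp 4,6
    8: imp(imp(P, imp(P, R)), imp(imp(imp(P, R), R), imp(P, R))),  # ax 2'
    9: imp(imp(imp(P, R), R), imp(P, R)),                          # mp 7,8
    10: imp(imp(imp(imp(P, R), R), imp(P, R)), imp(P, R)),         # ax 3
    11: imp(P, R),                                                 # mp 9,10
}

# Each step "k: mp i,j" must satisfy steps[j] == (steps[i], steps[k]).
mp_justifications = {4: (2, 3), 6: (1, 5), 7: (4, 6), 9: (7, 8), 11: (9, 10)}
for k, (i, j) in mp_justifications.items():
    assert steps[j] == (steps[i], steps[k]), f"step {k} is not a valid mp"
print("all modus ponens steps verified")
```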

13.6 Testing whether a formula of the implicational propositional calculus is a tautology

Main articles: Tautology (logic) § Efficient verification and the Boolean satisfiability problem, and Boolean satisfiability problem § Algorithms for solving SAT

In this case, a useful technique is to presume that the formula is not a tautology and attempt to find a valuation which makes it false. If one succeeds, then it is indeed not a tautology. If every such attempt leads to a contradiction, then it is a tautology.

Example of a non-tautology:

Suppose [(A→B)→((C→A)→E)]→([F→((C→D)→E)]→[(A→F)→(D→E)]) is false. Then (A→B)→((C→A)→E) is true; F→((C→D)→E) is true; A→F is true; D is true; and E is false. Since D is true, C→D is true. So the truth of F→((C→D)→E) is equivalent to the truth of F→E. Then since E is false and F→E is true, we get that F is false. Since A→F is true, A is false. Thus A→B is true and (C→A)→E is true. C→A is false, so C is true. The value of B does not matter, so we can arbitrarily choose it to be true. Summing up, the valuation which sets B, C and D to be true and A, E and F to be false will make [(A→B)→((C→A)→E)]→([F→((C→D)→E)]→[(A→F)→(D→E)]) false. So it is not a tautology.

Example of a tautology:

Suppose ((A→B)→C)→((C→A)→(D→A)) is false. Then (A→B)→C is true; C→A is true; D is true; and A is false. Since A is false, A→B is true. So C is true. Thus A must be true, contradicting the fact that it is false. Thus there is no valuation which makes ((A→B)→C)→((C→A)→(D→A)) false. Consequently, it is a tautology.
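The backtracking search described above can also be replaced by a brute-force truth-table check, which is straightforward to mechanize. The following Python sketch (the formula encoding and function names are our own, invented for this illustration) tests the tautology from the example:

```python
from itertools import product

# A formula is a variable name (str) or a pair (antecedent, consequent)
# standing for an implication.
imp = lambda p, q: (p, q)

def variables(formula):
    """Collect the set of variable names occurring in a formula."""
    if isinstance(formula, str):
        return {formula}
    return variables(formula[0]) | variables(formula[1])

def evaluate(formula, valuation):
    """Evaluate a formula under a valuation mapping variables to booleans."""
    if isinstance(formula, str):
        return valuation[formula]
    return (not evaluate(formula[0], valuation)) or evaluate(formula[1], valuation)

def is_tautology(formula):
    """Check all 2**n valuations of the n variables (exponential, as noted)."""
    names = sorted(variables(formula))
    return all(
        evaluate(formula, dict(zip(names, values)))
        for values in product([False, True], repeat=len(names))
    )

# ((A→B)→C)→((C→A)→(D→A)), the tautology analyzed above:
phi = imp(imp(imp("A", "B"), "C"), imp(imp("C", "A"), imp("D", "A")))
print(is_tautology(phi))  # True
```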

13.7 Adding an axiom schema

What would happen if another axiom schema were added to those listed above? There are two cases: (1) it is a tautology; or (2) it is not a tautology.

If it is a tautology, then the set of theorems remains the set of tautologies as before. However, in some cases it may be possible to find significantly shorter proofs for theorems. Nevertheless, the minimum length of proofs of theorems will remain unbounded; that is, for any natural number n there will still be theorems which cannot be proved in n or fewer steps.

If the new axiom schema is not a tautology, then every formula becomes a theorem (which makes the concept of a theorem useless in this case). What is more, there is then an upper bound on the minimum length of a proof of every formula, because there is a common method for proving every formula. For example, suppose the new axiom schema were ((B→C)→C)→B. Then ((A→(A→A))→(A→A))→A is an instance (one of the new axioms) and also not a tautology. But [((A→(A→A))→(A→A))→A]→A is a tautology and thus a theorem due to the old axioms (using the completeness result above). Applying modus ponens, we get that A is a theorem of the extended system. Then all one has to do to prove any formula is to replace A by the desired formula throughout the proof of A. This proof will have the same number of steps as the proof of A.

13.8 An alternative axiomatization

The axioms listed above primarily work through the deduction metatheorem to arrive at completeness. Here is another axiom system which aims directly at completeness without going through the deduction metatheorem. First we have axiom schemas which are designed to efficiently prove the subset of tautologies which contain only one propositional variable.

• aa 1: ꞈA→A

• aa 2: (A→B)→ꞈ(A→(C→B))

• aa 3: A→((B→C)→ꞈ((A→B)→C))

• aa 4: A→ꞈ(B→A)

The proof of each such tautology would begin with two parts (hypothesis and conclusion) which are the same. Then insert additional hypotheses between them. Then insert additional tautological hypotheses (which are true even when the sole variable is false) into the original hypothesis. Then add more hypotheses outside (on the left). This procedure will quickly give every tautology containing only one variable. (The symbol "ꞈ" in each axiom schema indicates where the conclusion used in the completeness proof begins. It is merely a comment, not a part of the formula.)

Consider any formula Φ which may contain A, B, C1, ..., Cn and ends with A as its final conclusion. Then we take

• aa 5: Φ₋→(Φ₊→ꞈΦ)

as an axiom schema, where Φ₋ is the result of replacing B by A throughout Φ and Φ₊ is the result of replacing B by (A→A) throughout Φ. This is a schema for axiom schemas, since there are two levels of substitution: in the first, Φ is substituted (with variations); in the second, any of the variables (including both A and B) may be replaced by arbitrary formulas of the implicational propositional calculus. This schema allows one to prove tautologies with more than one variable by considering the case when B is false (Φ₋) and the case when B is true (Φ₊). If the variable which is the final conclusion of a formula takes the value true, then the whole formula takes the value true regardless of the values of the other variables. Consequently if A is true, then Φ, Φ₋, Φ₊ and Φ₋→(Φ₊→Φ) are

all true. So without loss of generality, we may assume that A is false. Notice that Φ is a tautology if and only if both Φ₋ and Φ₊ are tautologies. But while Φ has n+2 distinct variables, Φ₋ and Φ₊ both have n+1. So the question of whether a formula is a tautology has been reduced to the question of whether certain formulas with one variable each are all tautologies. Also notice that Φ₋→(Φ₊→Φ) is a tautology regardless of whether Φ is, because if Φ is false then either Φ₋ or Φ₊ will be false depending on whether B is false or true.

Examples:

Deriving Peirce's law

1. [((P→P)→P)→P]→([((P→(P→P))→P)→P]→[((P→Q)→P)→P]) aa 5

2. P→P aa 1

3. (P→P)→((P→P)→(((P→P)→P)→P)) aa 3

4. (P→P)→(((P→P)→P)→P) mp 2,3

5. ((P→P)→P)→P mp 2,4

6. [((P→(P→P))→P)→P]→[((P→Q)→P)→P] mp 5,1

7. P→(P→P) aa 4

8. (P→(P→P))→((P→P)→(((P→(P→P))→P)→P)) aa 3

9. (P→P)→(((P→(P→P))→P)→P) mp 7,8

10. ((P→(P→P))→P)→P mp 2,9

11. ((P→Q)→P)→P mp 10,6 qed
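As a concrete illustration of how the Φ₋/Φ₊ substitutions generate instances of aa 5, the following Python sketch (again using an invented nested-pair encoding, purely for this illustration) reconstructs line 1 of the derivation above from Φ = ((P→Q)→P)→P:

```python
imp = lambda p, q: (p, q)

def subst(formula, var, replacement):
    """Replace every occurrence of variable `var` in `formula`."""
    if isinstance(formula, str):
        return replacement if formula == var else formula
    return (subst(formula[0], var, replacement),
            subst(formula[1], var, replacement))

P, Q = "P", "Q"
phi = imp(imp(imp(P, Q), P), P)        # Φ = ((P→Q)→P)→P, with B taken to be Q
phi_minus = subst(phi, Q, P)           # Φ₋: replace B by A, i.e. Q by P
phi_plus = subst(phi, Q, imp(P, P))    # Φ₊: replace B by A→A, i.e. Q by P→P

# The aa 5 instance Φ₋→(Φ₊→Φ) coincides with line 1 of the derivation:
line_1 = imp(imp(imp(imp(P, P), P), P),
             imp(imp(imp(imp(P, imp(P, P)), P), P),
                 imp(imp(imp(P, Q), P), P)))
assert imp(phi_minus, imp(phi_plus, phi)) == line_1
print("line 1 is the aa 5 instance for Peirce's law")
```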

Deriving Łukasiewicz's sole axiom

1. [((P→Q)→P)→((P→P)→(S→P))]→([((P→Q)→(P→P))→(((P→P)→P)→(S→P))]→[((P→Q)→R)→((R→P)→(S→P))]) aa 5

2. [((P→P)→P)→((P→P)→(S→P))]→([((P→(P→P))→P)→((P→P)→(S→P))]→[((P→Q)→P)→((P→P)→(S→P))]) aa 5

3. P→(S→P) aa 4

4. (P→(S→P))→(P→((P→P)→(S→P))) aa 2

5. P→((P→P)→(S→P)) mp 3,4

6. P→P aa 1

7. (P→P)→((P→((P→P)→(S→P)))→[((P→P)→P)→((P→P)→(S→P))]) aa 3

8. (P→((P→P)→(S→P)))→[((P→P)→P)→((P→P)→(S→P))] mp 6,7

9. ((P→P)→P)→((P→P)→(S→P)) mp 5,8

10. [((P→(P→P))→P)→((P→P)→(S→P))]→[((P→Q)→P)→((P→P)→(S→P))] mp 9,2

11. P→(P→P) aa 4

12. (P→(P→P))→((P→((P→P)→(S→P)))→[((P→(P→P))→P)→((P→P)→(S→P))]) aa 3

13. (P→((P→P)→(S→P)))→[((P→(P→P))→P)→((P→P)→(S→P))] mp 11,12

14. ((P→(P→P))→P)→((P→P)→(S→P)) mp 5,13

15. ((P→Q)→P)→((P→P)→(S→P)) mp 14,10

16. [((P→Q)→(P→P))→(((P→P)→P)→(S→P))]→[((P→Q)→R)→((R→P)→(S→P))] mp 15,1

17. (P→P)→((P→(S→P))→[((P→P)→P)→(S→P)]) aa 3

18. (P→(S→P))→[((P→P)→P)→(S→P)] mp 6,17

19. ((P→P)→P)→(S→P) mp 3,18

20. (((P→P)→P)→(S→P))→[((P→Q)→(P→P))→(((P→P)→P)→(S→P))] aa 4

21. ((P→Q)→(P→P))→(((P→P)→P)→(S→P)) mp 19,20

22. ((P→Q)→R)→((R→P)→(S→P)) mp 21,16 qed

Using a truth table to verify Łukasiewicz's sole axiom would require consideration of 16 = 2⁴ cases, since it contains 4 distinct variables. In this derivation, we were able to restrict consideration to merely 3 cases: R is false and Q is false, R is false and Q is true, and R is true. However, because we are working within the system of logic (instead of outside it, informally), each case required much more effort.

13.9 See also

• Deduction theorem

• List of logic systems#Implicational propositional calculus

• Peirce's law

• Propositional calculus

• Tautology (logic)

• Truth table

• Valuation (logic)

13.10 References

• Mendelson, Elliot (1997) Introduction to Mathematical Logic, 4th ed. London: Chapman & Hall.

• Łukasiewicz, Jan (1948) The shortest axiom of the implicational calculus of propositions, Proc. Royal Irish Academy, vol. 52, sec. A, no. 3, pp. 25–33.

Chapter 14

Implicature

Implicature is a technical term in the pragmatics subfield of linguistics, coined by H. P. Grice, which refers to what is suggested in an utterance, even though neither expressed nor strictly implied (that is, entailed) by the utterance.*[1] For example, the sentence “Mary had a baby and got married” strongly suggests that Mary had the baby before the wedding, but the sentence would still be strictly true if Mary had her baby after she got married. Further, if we add the qualification “not necessarily in that order” to the original sentence, then the implicature is cancelled even though the meaning of the original sentence is not altered. “Implicature” is an alternative to “implication”, which has additional meanings in logic and informal language.

14.1 Types of implicature

14.1.1 Conversational implicature

Paul Grice identified three types of general conversational implicatures:

1. The speaker deliberately flouts a conversational maxim to convey an additional meaning not expressed literally. For instance, a speaker responds to the question “How did you like the guest lecturer?” with the following utterance:

Well, Iʼm sure he was speaking English.

If the speaker is assumed to be following the cooperative principle,*[2] in spite of flouting the Maxim of Quantity, then the utterance must have an additional nonliteral meaning, such as: “The content of the lecturer's speech was confusing.”

2. The speakerʼs desire to fulfill two conflicting maxims results in his or her flouting one maxim to invoke the other. For instance, a speaker responds to the question “Where is John?” with the following utterance:

Heʼs either in the cafeteria or in his office.

In this case, the Maxim of Quantity and the Maxim of Quality are in conflict. A cooperative speaker does not want to be ambiguous but also does not want to give false information by giving a specific answer in spite of his uncertainty. By flouting the Maxim of Quantity, the speaker invokes the Maxim of Quality, leading to the implicature that the speaker does not have the evidence to give a specific location where he believes John is. 3. The speaker invokes a maxim as a basis for interpreting the utterance. In the following exchange:

Do you know where I can get some gas?

Thereʼs a gas station around the corner.

The second speaker invokes the Maxim of Relevance, resulting in the implicature that “the gas station is open and one can probably get gas there”.


Scalar implicature

Another form of conversational implicature identified by Grice (1975) is the scalar implicature. This concerns the conventional uses of words like “all” or “some” in conversation.

I ate some of the pie.

This sentence implies “I did not eat all of the pie.” While the statement “I ate some pie” is still true if the entire pie was eaten, the conventional meaning of the word “some” and the implicature generated by the statement is “not all”.

14.1.2 Conventional implicature

Conventional implicature is independent of the cooperative principle and its four maxims. A statement always carries its conventional implicature.

Donovan is poor but happy.

This sentence implies that poverty and happiness are not compatible, but that in spite of this Donovan is still happy. The conventional interpretation of the word “but” will always create the implicature of a sense of contrast. So “Donovan is poor but happy” will always necessarily imply “Surprisingly, Donovan is happy in spite of being poor”.

14.2 Implicature vs entailment

Implicature can be contrasted with entailment. For example, the statement “The president was assassinated” not only suggests that “The president is dead” is true, but requires that it be true. The first sentence could not be true if the second were not true; if the president were not dead, then whatever it is that happened to him would not have counted as a (successful) assassination. Similarly, unlike implicatures, entailments cannot be cancelled; there is no qualification that one could add to “The president was assassinated” which would cause it to cease entailing “The president is dead” while also preserving the meaning of the first sentence.

14.3 See also

• Allofunctional implicature

• Cooperative principle

• Gricean maxims

• Entailment, or implication, in logic

• Entailment (pragmatics)

• Explicature

• Indirect speech act

• Intrinsic and extrinsic properties

• Presupposition

14.4 References

[1] Blackburn 1996, p. 189.

[2] Kordić 1991, pp. 89–92.

14.5 Bibliography

• Blackburn, Simon (1996). “Implicature,” The Oxford Dictionary of Philosophy, Oxford, pp. 188–89.

• P. Cole (1975) “The synchronic and diachronic status of conversational implicature.” In Syntax and Semantics, 3: Speech Acts (New York: Academic Press) ed. P. Cole & J. L. Morgan, pp. 257–288. ISBN 0-12-785424-X.

• A. Davison (1975) “Indirect speech acts and what to do with them.” ibid, pp. 143–184.

• G. M. Green (1975) “How to get people to do things with words.” ibid, pp. 107–141.

• H. P. Grice (1975) “Logic and conversation.” ibid. Reprinted in Studies in the Way of Words, ed. H. P. Grice, pp. 22–40. Cambridge, MA: Harvard University Press (1989) ISBN 0-674-85270-2.

• Michael Hancher (1978) “Grice's 'Implicature' and Literary Interpretation: Background and Preface”. Twentieth Annual Meeting, Midwest Modern Language Association.

• Kordić, Snježana (1991). “Konverzacijske implikature” [Conversational implicatures]. Suvremena lingvistika (in Serbo-Croatian) 17 (31-32): 87–96. ISSN 0586-0296. Archived from the original (PDF) on 2 September 2012. Retrieved 6 September 2012.

• John Searle (1975) “Indirect speech acts.” ibid. Reprinted in Pragmatics: A Reader, ed. S. Davis, pp. 265–277. Oxford: Oxford University Press. (1991) ISBN 0-19-505898-4.

14.6 Further reading

• Bach, Kent (2006). “The Top 10 Misconceptions about Implicature” (PDF). In: Birner, B.; Ward, G. A Festschrift for Larry Horn. Amsterdam: John Benjamins.

14.7 External links

• “Implicature” in the Stanford Encyclopedia of Philosophy

• The Top 10 Misconceptions about Implicature by Kent Bach (2005)

Chapter 15

Implicit

Implicit may refer to:

15.1 Mathematics

A function defined by an equation in several variables or the equation defining this function, as in

• Implicit function

• Implicit function theorem

• Implicit curve

• Implicit surface

• Implicit differential equation

15.2 Computer science

• Implicit type conversion

15.3 Other uses

• Implicit solvation

• Implicit Association Test

• Implicit learning

• Implicit memory

• Implicit and explicit atheism

15.4 See also

• Implication (disambiguation)

Chapter 16

Informal fallacy

An informal fallacy is an argument whose stated premises may fail to adequately support its proposed conclusion.*[1] The problem with an informal fallacy often stems from reasoning that renders the conclusion unpersuasive. In contrast to a formal fallacy of deduction, the error is not a flaw in logic.

16.1 Formal deductive fallacies and informal fallacies

Formal fallacies of deductive reasoning fail to guarantee that a true conclusion follows from true premises, which renders the argument invalid. Inductive fallacies are not formal in this sense. Their merit is judged in terms of rational persuasiveness, inductive strength or methodology (for example, statistical inference). For instance, the fallacy of hasty generalization can be roughly stated as an invalid syllogism. Hasty generalization often follows a pattern such as:

X is true for A.

X is true for B.

X is true for C.

X is true for D.

Therefore, X is true for E, F, G, etc.

While never a valid logical deduction, if such an inference can be made on statistical grounds, it may nonetheless be convincing. This is to say that informal fallacies are not necessarily incorrect, nor are they logical fallacies. However, they often need the backing of empirical proof to become convincing.

16.2 See also

Main article: List of fallacies

• Argumentation theory

• Argument map

• Critical thinking

• Inference objection

• Inquiry

• Lemma

• Sophism

16.3 References

[1] Kelly, D. (1994) The Art of Reasoning. W W Norton & Company, Inc. ISBN 0-393-96466-3

16.4 Further reading

• Damer, T. Edward (2009), Attacking Faulty Reasoning: A Practical Guide to Fallacy-free Arguments (6th ed.), Wadsworth, ISBN 978-0-495-09506-4

16.5 External links

• Logical fallacies A list of logical fallacies, explained.

• The Fallacy Files: Informal Fallacy

Chapter 17

Involution (mathematics)


An involution is a function f : X → X that, when applied twice, brings one back to the starting point.

In mathematics, an (anti-)involution, or an involutory function, is a function f that is its own inverse,

f(f(x)) = x

for all x in the domain of f.*[1] For x in ℝ, this is often called Babbage's functional equation (1820).*[2]

17.1 General properties

Any involution is a bijection. The identity map is a trivial example of an involution. Common examples in mathematics of nontrivial involutions include multiplication by −1 in arithmetic, the taking of reciprocals, complementation in set theory and complex conjugation. Other examples include circle inversion, rotation by a half-turn, and reciprocal ciphers such as the ROT13 transformation and the Beaufort polyalphabetic cipher. The number of involutions, including the identity involution, on a set with n = 0, 1, 2, … elements is given by a recurrence relation found by Heinrich August Rothe in 1800:

a₀ = a₁ = 1;
aₙ = aₙ₋₁ + (n − 1)aₙ₋₂, for n > 1.


The first few terms of this sequence are 1, 1, 2, 4, 10, 26, 76, 232 (sequence A000085 in OEIS); these numbers are called the telephone numbers, and they also count the number of Young tableaux with a given number of cells.*[3] The composition g ◦ f of two involutions f and g is an involution if and only if they commute: g ◦ f = f ◦ g .*[4] Every involution on an odd number of elements has at least one fixed point. More generally, for an involution on a finite set of elements, the number of elements and the number of fixed points have the same parity.*[5]
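The recurrence above is easy to check numerically; the following Python sketch (the function name is invented for this illustration) reproduces the opening terms of the telephone numbers:

```python
def involution_counts(limit):
    """Number of involutions on an n-element set for n = 0..limit,
    computed with Rothe's recurrence a(n) = a(n-1) + (n-1)*a(n-2)."""
    a = [1, 1]  # a(0) = a(1) = 1
    for n in range(2, limit + 1):
        a.append(a[n - 1] + (n - 1) * a[n - 2])
    return a[:limit + 1]

print(involution_counts(7))  # [1, 1, 2, 4, 10, 26, 76, 232]
```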

17.2 Involution throughout the fields of mathematics

17.2.1 Euclidean geometry

A simple example of an involution of the three-dimensional Euclidean space is reflection through a plane. Performing a reflection twice brings a point back to its original coordinates. Another is the so-called reflection through the origin; this is an abuse of language, as it is not a reflection, though it is an involution. These transformations are examples of affine involutions.

17.2.2 Projective geometry

An involution is a projectivity of period 2, that is, a projectivity that interchanges pairs of points. Coxeter relates three theorems on involutions:

• Any projectivity that interchanges two points is an involution.

• The three pairs of opposite sides of a complete quadrangle meet any line (not through a vertex) in three pairs of an involution.

• If an involution has one fixed point, it has another, and consists of the correspondence between harmonic conjugates with respect to these two points. In this instance the involution is termed “hyperbolic”, while if there are no fixed points it is “elliptic”.

Another type of involution occurring in projective geometry is a polarity, which is a correlation of period 2.*[6]

17.2.3 Linear algebra

For more details on this topic, see Involutory matrix.

In linear algebra, an involution is a linear operator T such that T² = I. Except in characteristic 2, such operators are diagonalizable with 1s and −1s on the diagonal. If the operator is orthogonal (an orthogonal involution), it is orthonormally diagonalizable.

For example, suppose that a basis for a vector space V is chosen, and that e1 and e2 are basis elements. There exists a linear transformation f which sends e1 to e2, and sends e2 to e1, and which is the identity on all other basis vectors. It can be checked that f(f(x))=x for all x in V. That is, f is an involution of V. This definition extends readily to modules. Given a module M over a ring R, an R endomorphism f of M is called an involution if f 2 is the identity homomorphism on M. Involutions are related to idempotents; if 2 is invertible then they correspond in a one-to-one manner.
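The basis-swapping involution just described can be checked numerically; the following sketch (assuming NumPy is available; the matrix is our own three-dimensional instance of the construction) verifies T² = I and the ±1 eigenvalues:

```python
import numpy as np

# T swaps the first two basis vectors and fixes the third.
T = np.array([[0, 1, 0],
              [1, 0, 0],
              [0, 0, 1]])

assert np.array_equal(T @ T, np.eye(3, dtype=int))  # T^2 = I
print(np.linalg.eigvalsh(T))                        # [-1.  1.  1.]
```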

17.2.4 Quaternion algebra, groups, semigroups

In a quaternion algebra, an (anti-)involution is defined by the following axioms: if we consider a transformation x ↦ f(x), then an involution is

• f(f(x)) = x (an involution is its own inverse)

• An involution is linear: f(x1 + x2) = f(x1) + f(x2) and f(λx) = λf(x)

• f(x1x2) = f(x1)f(x2)

An anti-involution does not obey the last axiom but instead

• f(x1x2) = f(x2)f(x1)

This law is sometimes called antidistributive. It also appears in groups as (xy)⁻¹ = y⁻¹x⁻¹. Taken as an axiom, it leads to the notion of semigroup with involution, of which there are natural examples that are not groups, for example square matrix multiplication (i.e. the full linear monoid) with transpose as the involution.

17.2.5 Ring theory

For more details on this topic, see *-algebra.

In ring theory, the word involution is customarily taken to mean an antihomomorphism that is its own inverse function. Examples of involutions in common rings:

• complex conjugation on the complex plane

• multiplication by j in the split-complex numbers

• taking the transpose in a matrix ring.

17.2.6 Group theory

In group theory, an element of a group is an involution if it has order 2; i.e. an involution is an element a such that a ≠ e and a² = e, where e is the identity element.*[7] Originally, this definition agreed with the first definition above, since members of groups were always bijections from a set into itself; i.e., group was taken to mean permutation group. By the end of the 19th century, group was defined more broadly, and accordingly so was involution. A permutation is an involution precisely if it can be written as a product of one or more non-overlapping transpositions. The involutions of a group have a large impact on the group's structure. The study of involutions was instrumental in the classification of finite simple groups. Coxeter groups are groups generated by involutions with the relations determined only by relations given for pairs of the generating involutions. Coxeter groups can be used, among other things, to describe the possible regular polyhedra and their generalizations to higher dimensions.

17.2.7 Mathematical logic

The operation of complement in Boolean algebras is an involution. Accordingly, negation in classical logic satisfies the law of double negation: ¬¬A is equivalent to A. Generally in non-classical logics, negation that satisfies the law of double negation is called involutive. In algebraic semantics, such a negation is realized as an involution on the algebra of truth values. Examples of logics which have involutive negation are Kleene and Bochvar three-valued logics, Łukasiewicz many-valued logic, fuzzy logic IMTL, etc. Involutive negation is sometimes added as an additional connective to logics with non-involutive negation; this is usual, for example, in t-norm fuzzy logics. The involutiveness of negation is an important characterization property for logics and the corresponding varieties of algebras. For instance, involutive negation characterizes Boolean algebras among Heyting algebras. Correspondingly, classical Boolean logic arises by adding the law of double negation to intuitionistic logic. The same relationship holds also between MV-algebras and BL-algebras (and so correspondingly between Łukasiewicz logic and fuzzy logic BL), IMTL and MTL, and other pairs of important varieties of algebras (resp. corresponding logics).

17.2.8 Computer science

The XOR bitwise operation with a given value for one parameter is an involution: XORing twice with the same mask restores the original value. XOR masks were once used to draw graphics on images in such a way that drawing them twice on the background reverts the background to its original state. Another example is a mask-and-shift function operating on color values stored as integers, say in the form RGB, that swaps R and B; since f(f(RGB)) = RGB, applying it twice restores the original value. The RC4 cryptographic cipher is an involution, as the encryption and decryption operations use the same function.
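Both examples are easy to check directly; here is a small Python sketch (the mask value and function names are invented for this illustration):

```python
MASK = 0b10101010

def xor_mask(value):
    """XOR with a fixed mask; applying it twice restores the original value."""
    return value ^ MASK

def swap_r_b(rgb):
    """Swap the R and B bytes of a 24-bit 0xRRGGBB color value."""
    r, g, b = (rgb >> 16) & 0xFF, (rgb >> 8) & 0xFF, rgb & 0xFF
    return (b << 16) | (g << 8) | r

assert xor_mask(xor_mask(0xCA)) == 0xCA          # XOR masking is an involution
assert swap_r_b(swap_r_b(0x112233)) == 0x112233  # f(f(RGB)) = RGB
```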

17.3 References

[1] Russell, Bertrand (1903), Principles of mathematics (2nd ed.), W. W. Norton & Company, Inc, p. 426, ISBN 9781440054167

[2] Ritt, J. F. (1916). “On Certain Real Solutions of Babbage's Functional Equation”. The Annals of Mathematics 17 (3): 113. doi:10.2307/2007270. JSTOR 2007270.

[3] Knuth, Donald E. (1973), The Art of Computer Programming, Volume 3: Sorting and Searching, Reading, Mass.: Addison- Wesley, pp. 48, 65, MR 0445948.

[4] Kubrusly, Carlos S. (2011), The Elements of Operator Theory, Springer Science & Business Media, Problem 1.11(a), p. 27, ISBN 9780817649982.

[5] Zagier, D. (1990),“A one-sentence proof that every prime p ≡ 1 (mod 4) is a sum of two squares”, American Mathematical Monthly 97 (2): 144, doi:10.2307/2323918, MR 1041893.

[6] H. S. M. Coxeter (1969) Introduction to Geometry, pp 244–8, John Wiley & Sons

[7] John S. Rose. “A Course on Group Theory”. p. 10, section 1.13.

• Todd A. Ell; Stephen J. Sangwine (2007),“Quaternion involutions and anti-involutions”, Computers & Math- ematics with Applications 53 (1): 137–143, doi:10.1016/j.camwa.2006.10.029

17.4 Further reading

• Knus, Max-Albert; Merkurjev, Alexander; Rost, Markus; Tignol, Jean-Pierre (1998), The book of involutions, Colloquium Publications 44, With a preface by J. Tits, Providence, RI: American Mathematical Society, ISBN 0-8218-0904-0, Zbl 0955.16001

17.5 See also

• Automorphism

• Idempotence

• ROT13

Chapter 18

Linguistic universal

A linguistic universal is a pattern that occurs systematically across natural languages, potentially true for all of them. For example, All languages have nouns and verbs, or If a language is spoken, it has consonants and vowels. Research in this area of linguistics is closely tied to the study of linguistic typology, and intends to reveal generalizations across languages, likely tied to cognition, perception, or other abilities of the mind. The field was largely pioneered by the linguist Joseph Greenberg, who derived a set of forty-five basic universals, mostly dealing with syntax, from a study of some thirty languages.

18.1 Terminology

Linguists distinguish between two kinds of universals: absolute (opposite: statistical, often called tendencies) and implicational (opposite: non-implicational). Absolute universals apply to every known language and are quite few in number; an example is All languages have pronouns. An implicational universal applies to languages with a particular feature that is always accompanied by another feature, such as If a language has trial grammatical number, it also has dual grammatical number, while non-implicational universals just state the existence (or non-existence) of one particular feature.

Also in contrast to absolute universals are tendencies, statements that may not be true for all languages, but nevertheless are far too common to be the result of chance.*[1] They also have implicational and non-implicational forms. An example of the latter would be The vast majority of languages have nasal consonants.*[2] However, most tendencies, like their universal counterparts, are implicational. For example, With overwhelmingly greater-than-chance frequency, languages with normal SOV order are postpositional. Strictly speaking, a tendency is not a kind of universal, but exceptions to most statements called universals can be found. For example, Latin is an SOV language with prepositions. Often it turns out that these exceptional languages are undergoing a shift from one type of language to another. In the case of Latin, its descendant Romance languages switched to SVO, which is a much more common order among prepositional languages.

Universals may also be bidirectional or unidirectional. In a bidirectional universal two features each imply the existence of the other. For example, languages with postpositions usually have SOV order, and likewise SOV languages usually have postpositions. The implication works both ways, and thus the universal is bidirectional. By contrast, in a unidirectional universal the implication works only one way. Languages that place relative clauses before the noun they modify again usually have SOV order, so pre-nominal relative clauses imply SOV. On the other hand, SOV languages worldwide show little preference for pre-nominal relative clauses, and thus SOV implies little about the order of relative clauses. As the implication works only one way, the proposed universal is a unidirectional one.

Linguistic universals in syntax are sometimes held up as evidence for universal grammar (although epistemological arguments are more common). Other explanations for linguistic universals have been proposed, for example, that linguistic universals tend to be properties of language that aid communication. If a language were to lack one of these properties, it has been argued, it would probably soon evolve into a language having that property.*[3]

Michael Halliday has argued for a distinction between descriptive and theoretical categories in resolving the matter of the existence of linguistic universals, a distinction he takes from J.R. Firth and Louis Hjelmslev. He argues that “theoretical categories, and their inter-relations construe an abstract model of language...; they are interlocking and mutually defining”. Descriptive categories, by contrast, are those set up to describe particular languages. He argues


that “When people ask about 'universals', they usually mean descriptive categories that are assumed to be found in all languages. The problem is there is no mechanism for deciding how much alike descriptive categories from different languages have to be before they are said to be 'the same thing'".*[4]

18.2 In semantics

In the domain of semantics, research into linguistic universals has taken place in a number of ways. Some linguists, starting with Leibniz, have pursued the search for a hypothetical irreducible semantic core of all languages. A modern variant of this approach can be found in the Natural Semantic Metalanguage of Wierzbicka and associates.*[5] Other lines of research suggest cross-linguistic tendencies to use body part terms metaphorically as adpositions,*[6] or tendencies to have morphologically simple words for cognitively salient concepts.*[7] The human body, being a physiological universal, provides an ideal domain for research into semantic and lexical universals. In a seminal study, Cecil H. Brown (1976) proposed a number of universals in the semantics of body part terminology, including the following: in any language, there will be distinct terms for BODY, HEAD, ARM, EYES, NOSE, and MOUTH; if there is a distinct term for FOOT, there will be a distinct term for HAND; similarly, if there are terms for INDIVIDUAL TOES, then there are terms for INDIVIDUAL FINGERS. Subsequent research has shown that most of these features have to be considered cross-linguistic tendencies rather than true universals. Several languages, for example Tidore and Kuuk Thaayorre, lack a general term meaning 'body'. On the basis of such data it has been argued that the highest level in the partonomy of body part terms would be the word for 'person'.*[8]

18.3 See also

• Greenberg's linguistic universals

• Cultural universal

18.4 Notes

[1] Dryer (1998)

[2] Lushootseed and Rotokas are examples of the rare languages which truly lack nasal consonants as normal speech sounds.

[3] Daniel Everett, Language: The Cultural Tool

[4] Halliday, M.A.K. 2002. A personal perspective. In On Grammar, Volume 1 in the Collected Works of M.A.K. Halliday. London and New York: Continuum, p. 12.

[5] see for example Goddard & Wierzbicka (1994) and Goddard (2002).

[6] Heine (1997)

[7] Rosch et al. (1976)

[8] Wilkins (1993), Enfield et al. 2006:17.

18.5 References

• Brown, Cecil H. (1976) “General principles of human anatomical partonomy and speculations on the growth of partonomic nomenclature.”American Ethnologist 3, no. 3, Folk Biology, pp. 400–424

• Comrie, Bernard (1981) Language Universals and Linguistic Typology. Chicago: University of Chicago Press.

• Croft, W. (2002). Typology and Universals. Cambridge: Cambridge UP. 2nd ed. ISBN 0-521-00499-3

• Dryer, Matthew S. (1998) “Why Statistical Universals are Better Than Absolute Universals” Chicago Linguistic Society 33: The Panels, pp. 123–145.

• Enfield, Nick J. & Asifa Majid & Miriam van Staden (2006) 'Cross-linguistic categorisation of the body: Introduction' (special issue of Language Sciences).

• Ferguson, Charles A. (1968) 'Historical background of universals research'. In: Greenberg, Ferguson, & Moravcsik, Universals of human languages, pp. 7–31.

• Goddard, Cliff and Wierzbicka, Anna (eds.). 1994. Semantic and Lexical Universals - Theory and Empirical Findings. Amsterdam/Philadelphia: John Benjamins.

• Goddard, Cliff (2002) “The search for the shared semantic core of all languages”. In Goddard & Wierzbicka (eds.) Meaning and Universal Grammar - Theory and Empirical Findings volume 1, pp. 5–40, Amsterdam/Philadelphia: John Benjamins.

• Greenberg, Joseph H. (ed.) (1963) Universals of Language. Cambridge, Mass.: MIT Press.

• Greenberg, Joseph H. (ed.) (1978a) Universals of Human Language Vol. 4: Syntax. Stanford, California: Stanford University Press.

• Greenberg, Joseph H. (ed.) (1978b) Universals of Human Language Vol. 3: Word Structure. Stanford, California: Stanford University Press.

• Heine, Bernd (1997) Cognitive Foundations of Grammar. New York/Oxford: Oxford University Press.

• Song, Jae Jung (2001) Linguistic Typology: Morphology and Syntax. Harlow, UK: Pearson Education (Longman).

• Song, Jae Jung (ed.) (2011) Oxford Handbook of Linguistic Typology. Oxford: Oxford University Press.

• Rosch, E. & Mervis, C.B. & Gray, W.D. & Johnson, D.M. & Boyes-Braem, P. (1976) 'Basic Objects In Natural Categories', Cognitive Psychology 8-3, 382-439.

• Wilkins, David P. (1993) ʻFrom part to person: natural tendencies of semantic change and the search for cognatesʼ, Working paper No. 23, Cognitive Anthropology Research Group at the Max Planck Institute for Psycholinguistics.

18.6 External links

• Some Universals of Grammar with Particular Reference to the Order of Meaningful Elements by Joseph H. Greenberg

• The Universals Archive by the University of Konstanz

Chapter 19

Linguistics

This article is about the field of study. For the journal, see Linguistics (journal). “Linguist”redirects here. For other uses, see Linguist (disambiguation).

Linguistics is the scientific*[1] study of language.*[2] There are three aspects to this study: language form, language meaning, and language in context.*[3] The earliest activities in the description of language have been attributed to Pāṇini (fl. 4th century BCE),*[4] with his analysis of Sanskrit in Ashtadhyayi.*[5]

Linguistics analyzes human language as a system for relating sounds (or signs in signed languages) and meaning.*[6] Phonetics studies acoustic and articulatory properties of the production and perception of speech sounds and non-speech sounds. The study of language meaning, on the other hand, deals with how languages encode relations between entities, properties, and other aspects of the world to convey, process, and assign meaning, as well as to manage and resolve ambiguity. While the study of semantics typically concerns itself with truth conditions, pragmatics deals with how context influences meanings.*[7]

Grammar is a system of rules which govern the form of the utterances in a given language. It encompasses both sound*[8] and meaning, and includes phonology (how sounds and gestures function together), morphology (the formation and composition of words), and syntax (the formation and composition of phrases and sentences from words).*[9]

In the early 20th century, Ferdinand de Saussure distinguished between the notions of langue and parole in his formulation of structural linguistics. According to him, parole is the specific utterance of speech, whereas langue refers to an abstract phenomenon that theoretically defines the principles and system of rules that govern a language.*[10] This distinction resembles the one made by Noam Chomsky between competence and performance, where competence is an individual's ideal knowledge of a language, while performance is the specific way in which it is used.*[11]

The formal study of language has also led to the growth of fields like psycholinguistics, which explores the representation and function of language in the mind; neurolinguistics, which studies language processing in the brain; and language acquisition, which investigates how children and adults acquire a particular language.

Linguistics also includes nonformal approaches to the study of other aspects of human language, such as social, cultural, historical and political factors.*[12] The study of cultural discourses and dialects is the domain of sociolinguistics, which looks at the relation between linguistic variation and social structures, as well as that of discourse analysis, which examines the structure of texts and conversations.*[13] Research on language through historical and evolutionary linguistics focuses on how languages change, and on the origin and growth of languages, particularly over an extended period of time.

Corpus linguistics takes naturally occurring texts or films (in signed languages) as its primary object of analysis, and studies the variation of grammatical and other features based on such corpora. Stylistics involves the study of patterns of style within written, signed, or spoken discourse.*[14] Language documentation combines anthropological inquiry with linguistic inquiry to describe languages and their grammars. Lexicography covers the study and construction of dictionaries. Computational linguistics applies computer technology to address questions in theoretical linguistics, as well as to create applications for use in parsing, data retrieval, machine translation, and other areas.
People can apply actual knowledge of a language in translation and interpreting, as well as in language education - the teaching of a second or foreign language. Policy makers work with governments to implement new plans in education and teaching which are based on linguistic research.


Areas of study related to linguistics include semiotics (the study of signs and symbols both within language and without), literary criticism, translation, and speech-language pathology.

19.1 Nomenclature

Before the 20th century, the term philology, first attested in 1716,*[15] was commonly used to refer to the science of language, which was then predominantly historical in focus.*[16]*[17] Since Ferdinand de Saussure's insistence on the importance of synchronic analysis, however, this focus has shifted*[18] and the term “philology” is now generally used for the “study of a language's grammar, history, and literary tradition”, especially in the United States*[19] (where philology has never been very popularly considered as the “science of language”).*[20] Although the term “linguist” in the sense of “a student of language” dates from 1641,*[21] the term “linguistics” is first attested in 1847.*[21] It is now the common academic term in English for the scientific study of language. Today, the term linguist applies to someone who studies language or is a researcher within the field, or to someone who uses the tools of the discipline to describe and analyze specific languages.*[22]

19.2 Variation and Universality

While some theories on linguistics focus on the different varieties that language produces among different sections of society, others focus on the universal properties that are common to all human languages. The theory of variation therefore would elaborate on the different usages of popular languages like French and English across the globe, as well as their smaller dialects and regional permutations within national boundaries. The theory of variation looks at the cultural stages that a particular language undergoes, and these include the following.

19.2.1 Lexicon

The lexicon is a catalogue of words and terms that are stored in a speaker's mind. The lexicon consists of words and bound morphemes, which are parts of words that can't stand alone, like affixes. In some analyses, compound words and certain classes of idiomatic expressions and other collocations are also considered to be part of the lexicon. Dictionaries represent attempts at listing, in alphabetical order, the lexicon of a given language; usually, however, bound morphemes are not included. Lexicography, closely linked with the domain of semantics, is the science of mapping the words into an encyclopedia or a dictionary. Newly created words added to the lexicon are called neologisms. It is often believed that a speaker's capacity for language lies in the quantity of words stored in the lexicon. However, this is often considered a myth by linguists. The capacity for the use of language is considered by many linguists to lie primarily in the domain of grammar, and to be linked with competence, rather than with the growth of vocabulary. Even a very small lexicon is theoretically capable of producing an infinite number of sentences.

19.2.2 Discourse

A discourse is a way of speaking that emerges within a certain social setting and is based on a certain subject matter. A particular discourse becomes a language variety when it is used in this way for a particular purpose, and is referred to as a register.*[23] There may be certain lexical additions (new words) that are brought into play because of the expertise of the community of people within a certain domain of specialisation. Registers and discourses therefore differentiate themselves through the use of vocabulary, and at times through the use of style too. People in the medical fraternity, for example, may use some medical terminology in their communication that is specialised to the field of medicine. This is often referred to as being part of the “medical discourse”, and so on.

19.2.3 Dialect

A dialect is a variety of language that is characteristic of a particular group among the language speakers.*[24] The group of people who are the speakers of a dialect are usually bound to each other by social identity. This is what differentiates a dialect from a register or a discourse, where in the latter case, cultural identity does not always play a role. Dialects are speech varieties that have their own grammatical and phonological rules, linguistic features, and stylistic aspects, but have not been given an official status as a language. Dialects often move on to gain the status of a language due to political and social reasons. Differentiation amongst dialects (and subsequently, languages too) is based upon the use of grammatical rules, syntactic rules, and stylistic features, though not always on lexical use or vocabulary. The popular saying that a "language is a dialect with an army and navy" is attributed as a definition formulated by Max Weinreich. Universal grammar takes into account general formal structures and features that are common to all dialects and languages, and the template of which pre-exists in the mind of an infant child. This idea is based on the theory of generative grammar and the formal school of linguistics, whose proponents include Noam Chomsky and those who follow his theory and work.

“We may as individuals be rather fond of our own dialect. This should not make us think, though, that it is actually any better than any other dialect. Dialects are not good or bad, nice or nasty, right or wrong – they are just different from one another, and it is the mark of a civilised society that it tolerates different dialects just as it tolerates different races, religions and sexes.”*[25]

19.2.4 Structures

Linguistic structures are pairings of meaning and form. Any particular pairing of meaning and form is a Saussurean sign. For instance, the meaning “cat” is represented worldwide with a wide variety of different sound patterns (in oral languages), movements of the hands and face (in sign languages), and written symbols (in written languages). Linguists focusing on structure attempt to understand the rules regarding language use that native speakers know (not always consciously). All linguistic structures can be broken down into component parts that are combined according to (sub)conscious rules, over multiple levels of analysis. For instance, consider the structure of the word “tenth” on two different levels of analysis. On the level of internal word structure (known as morphology), the word “tenth” is made up of one linguistic form indicating a number and another form indicating ordinality. The rule governing the combination of these forms ensures that the ordinality marker “th” follows the number “ten.” On the level of sound structure (known as phonology), structural analysis shows that the “n” sound in “tenth” is made differently from the “n” sound in “ten” spoken alone. Although most speakers of English are consciously aware of the rules governing internal structure of the word pieces of “tenth”, they are less often aware of the rule governing its sound structure. Linguists focused on structure find and analyze rules such as these, which govern how native speakers use language.

Linguistics has many sub-fields concerned with particular aspects of linguistic structure. The theory that elucidates these, as propounded by Noam Chomsky, is known as generative theory or universal grammar. These sub-fields range from those focused primarily on form to those focused primarily on meaning. They also run the gamut of level of analysis of language, from individual sounds, to words, to phrases, up to cultural discourse. Sub-fields that focus on a structure-focused study of language:

• Phonetics, the study of the physical properties of speech sound production and perception

• Phonology, the study of sounds as abstract elements in the speaker's mind that distinguish meaning (phonemes)

• Morphology, the study of morphemes, or the internal structures of words and how they can be modified

• Syntax, the study of how words combine to form grammatical phrases and sentences

• Semantics, the study of the meaning of words (lexical semantics) and fixed word combinations (phraseology), and how these combine to form the meanings of sentences

• Pragmatics, the study of how utterances are used in communicative acts, and the role played by context and non-linguistic knowledge in the transmission of meaning

• Discourse analysis, the analysis of language use in texts (spoken, written, or signed)

• Stylistics, the study of linguistic factors (rhetoric, diction, stress) that place a discourse in context

• Semiotics, the study of signs and sign processes (semiosis), indication, designation, likeness, analogy, metaphor, symbolism, signification, and communication.

19.2.5 Relativity

As constructed popularly through the "Sapir-Whorf Hypothesis", relativists believe that the structure of a particular language is capable of influencing the cognitive patterns through which a person shapes his or her world view. Universalists believe that there are commonalities between human perception, just as there are in the human capacity for language, while relativists believe that this varies from language to language and person to person. While the Sapir-Whorf hypothesis is an elaboration of this idea expressed through the writings of American linguists Edward Sapir and Benjamin Lee Whorf, it was Sapir's student Harry Hoijer who termed it thus. The 20th century German linguist Leo Weisgerber also wrote extensively about the theory of relativity. Relativists argue for the case of differentiation at the level of cognition and in semantic domains. The emergence of cognitive linguistics in the 1980s also revived an interest in linguistic relativity. Thinkers like George Lakoff have argued that language reflects different cultural metaphors, while the French philosopher of language Jacques Derrida's writings have been seen to be closely associated with the relativist movement in linguistics, especially through deconstruction,*[26] and he was even heavily criticised in the media at the time of his death for his theory of relativism.*[27]

19.2.6 Style

Stylistics is the study and interpretation of texts for aspects of their linguistic and tonal style. Stylistic analysis entails the analysis and description of particular dialects and registers used by speech communities. Stylistic features include rhetoric,*[28] diction, stress, satire, irony, dialogue, and other forms of phonetic variations. Stylistic analysis can also include the study of language in canonical works of literature, popular fiction, news, advertisements, and other forms of communication in popular culture as well. It is usually seen as a variation in communication that changes from speaker to speaker and community to community. In short, stylistics is the interpretation of text.

19.3 Approach

One major debate in linguistics concerns how language should be defined and understood. Some linguists use the term “language” primarily to refer to a hypothesized, innate module in the human brain that allows people to undertake linguistic behavior, which is part of the formalist approach. This "universal grammar" is considered to guide children when they learn languages and to constrain what sentences are considered grammatical in any language. Proponents of this view, which is predominant in those schools of linguistics that are based on the generative theory of Noam Chomsky, do not necessarily consider that language evolved for communication in particular. They consider instead that it has more to do with the process of structuring human thought (see also formal grammar). Another group of linguists, by contrast, use the term “language” to refer to a communication system that developed to support cooperative activity and extend cooperative networks. Such theories of grammar view language as a tool that emerged and is adapted to the communicative needs of its users, and the role of cultural evolutionary processes is often emphasized over that of biological evolution.*[29]

19.3.1 Methodology

Linguistics is primarily descriptive. Linguists describe and explain features of language without making subjective judgments on whether a particular feature or usage is “good” or “bad”. This is analogous to practice in other sciences: a zoologist studies the animal kingdom without making subjective judgments on whether a particular species is “better” or “worse” than another. Prescription, on the other hand, is an attempt to promote particular linguistic usages over others, often favoring a particular dialect or "acrolect". This may have the aim of establishing a linguistic standard, which can aid communication over large geographical areas. It may also, however, be an attempt by speakers of one language or dialect to exert influence over speakers of other languages or dialects (see Linguistic imperialism). An extreme version of prescriptivism can be found among censors, who attempt to eradicate words and structures that they consider to be destructive to society. Prescription, however, is practiced in the teaching of language, where certain fundamental grammatical rules and lexical terms need to be introduced to a second-language speaker who is attempting to acquire the language.

19.3.2 Analysis

Before the 20th century, linguists analyzed language on a diachronic plane, which was historical in focus. This meant that they would compare linguistic features and try to analyze language from the point of view of how it had changed between then and later. However, with Saussurean linguistics in the 20th century, the focus shifted to a more synchronic approach, where the study was geared towards the analysis and comparison of different language variations that existed at the same given point of time. At another level, the syntagmatic plane of linguistic analysis entails the comparison between the way words are sequenced within the syntax of a sentence. For example, the article “the” is followed by a noun, because of the syntagmatic relation between the words. The paradigmatic plane, on the other hand, focuses on an analysis that is based on the paradigms or concepts that are embedded in a given text. In this case, words of the same type or class may be replaced in the text with each other to achieve the same conceptual understanding.

19.3.3 Anthropology

The objective of describing languages is often to uncover cultural knowledge about communities. The use of anthropological methods of investigation on linguistic sources leads to the discovery of certain cultural traits among a speech community through its linguistic features. It is also widely used as a tool in language documentation, with an endeavor to curate endangered languages. Now, however, linguistic inquiry uses the anthropological method to understand cognitive, historical and sociolinguistic processes that languages undergo as they change and evolve, just as general anthropological inquiry uses the linguistic method to excavate into culture. In all aspects, anthropological inquiry usually uncovers the different variations and relativities that underlie the usage of language.

19.3.4 Sources

Most contemporary linguists work under the assumption that spoken data and signed data are more fundamental than written data. This is because:

• Speech appears to be universal to all human beings capable of producing and perceiving it, while there have been many cultures and speech communities that lack written communication;

• Features appear in speech which aren't always recorded in writing, including phonological rules, sound changes, and speech errors;

• All natural writing systems reflect a spoken language (or potentially a signed one) they are being used to write, with even pictographic scripts like Dongba writing Naxi homophones with the same pictogram, and text in writing systems used for two languages changing to fit the spoken language being recorded;

• Speech evolved before human beings invented writing;

• People learnt to speak and process spoken language more easily and earlier than they did with writing.

Nonetheless, linguists agree that the study of written language can be worthwhile and valuable. For research that relies on corpus linguistics and computational linguistics, written language is often much more convenient for processing large amounts of linguistic data. Large corpora of spoken language are difficult to create and hard to find, and are typically transcribed and written. In addition, linguists have turned to text-based discourse occurring in various formats of computer-mediated communication as a viable site for linguistic inquiry. The study of writing systems themselves is, in any case, considered a branch of linguistics.

19.4 History of linguistic thought

Main article: History of linguistics

19.4.1 Early grammarians

Main articles: Philology and History of English grammars

Ancient Tamil inscription at Thanjavur

The formal study of language began in India with Pāṇini, the 5th century BC grammarian who formulated 3,959 rules of Sanskrit morphology. Pāṇini's systematic classification of the sounds of Sanskrit into consonants and vowels, and word classes, such as nouns and verbs, was the first known instance of its kind. In the Middle East, Sibawayh (سیبویه) made a detailed description of Arabic in 760 AD in his monumental work, Al-kitab fi al-nahw (الكتاب في النحو, The Book on Grammar), the first known author to distinguish between sounds and phonemes (sounds as units of a linguistic system). Western interest in the study of languages began as early as in the East,*[30] but the grammarians of the classical languages did not use the same methods or reach the same conclusions as their contemporaries in the Indic world. Early interest in language in the West was a part of philosophy, not of grammatical description. The first insights into semantic theory were made by Plato in his Cratylus dialogue, where he argues that words denote concepts that are eternal and exist in the world of ideas. This work is the first to use the word etymology to describe the history of a word's meaning. Around 280 BC, one of Alexander the Great's successors founded a university (see Musaeum) in Alexandria, where a school of philologists studied the ancient texts in Greek and taught Greek to speakers of other languages. While this school was the first to use the word "grammar" in its modern sense, Plato had used the word in its original meaning as "téchnē grammatikḗ" (Τέχνη Γραμματική), the “art of writing”, which is also the title of one of the most important works of the Alexandrine school by Dionysius Thrax.*[31] Throughout the Middle Ages, the study of language was subsumed under the topic of philology, the study of ancient languages and texts, practiced by such educators as Roger Ascham, Wolfgang Ratke, and John Amos Comenius.*[32]

19.4.2 Comparative philology

In the 18th century, the first use of the comparative method by William Jones sparked the rise of comparative linguistics.*[33] Bloomfield attributes “the first great scientific linguistic work of the world” to Jacob Grimm, who wrote Deutsche Grammatik.*[34] It was soon followed by other authors writing similar comparative studies on other language groups of Europe. The scientific study of language was broadened from Indo-European to language in general by Wilhelm von Humboldt, of whom Bloomfield asserts:*[34]

This study received its foundation at the hands of the Prussian statesman and scholar Wilhelm von Humboldt (1767–1835), especially in the first volume of his work on Kavi, the literary language of Java, entitled Über die Verschiedenheit des menschlichen Sprachbaues und ihren Einfluß auf die geistige Entwickelung des Menschengeschlechts (On the Variety of the Structure of Human Language and its Influence upon the Mental Development of the Human Race).

19.4.3 Structuralism

Main article: Structuralism (linguistics)

Early in the 20th century, Saussure introduced the idea of language as a static system of interconnected units, defined through the oppositions between them. By introducing a distinction between diachronic and synchronic analyses of language, he laid the foundation of the modern discipline of linguistics. Saussure also introduced several basic dimensions of linguistic analysis that are still foundational in many contemporary linguistic theories, such as the distinctions between syntagm and paradigm, and the langue–parole distinction, distinguishing language as an abstract system (langue) from language as a concrete manifestation of this system (parole).*[35] Substantial additional contributions following Saussure's definition of a structural approach to language came from the Prague school, Leonard Bloomfield, Charles F. Hockett, Louis Hjelmslev, Émile Benveniste and Roman Jakobson.*[36]*[37]

19.4.4 Generativism

Main article: Generative linguistics

During the last half of the 20th century, following the work of Noam Chomsky, linguistics was dominated by the generativist school. While formulated by Chomsky in part as a way to explain how human beings acquire language and the biological constraints on this acquisition, in practice generative theory has largely been concerned with giving formal accounts of specific phenomena in natural languages. Generative theory is modularist and formalist in character. Chomsky built on earlier work of Zellig Harris to formulate the generative theory of language. According to this theory, the most basic form of language is a set of syntactic rules universal for all humans and underlying the grammars of all human languages. This set of rules is called Universal Grammar, and for Chomsky, describing it is the primary objective of the discipline of linguistics. For this reason the grammars of individual languages are of importance to linguistics only insofar as they allow us to discern the universal underlying rules from which the observable linguistic variability is generated. In the classic formalisation of generative grammars first proposed by Noam Chomsky in the 1950s,*[38]*[39] a grammar G consists of the following components:

• A finite set N of nonterminal symbols, none of which appear in strings formed from G.

• A finite set Σ of terminal symbols that is disjoint from N.

• A finite set P of production rules that map from one string of symbols to another.

• A distinguished start symbol S ∈ N from which every derivation begins.

A formal description of language attempts to replicate a speaker's knowledge of the rules of their language, and the aim is to produce a set of rules that is minimally sufficient to successfully model valid linguistic forms.
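As a toy illustration of this definition, the Python sketch below encodes a small grammar in the (N, Σ, P, S) form and enumerates the strings it generates. The particular grammar, its rules, and the helper names are invented here for illustration and are not drawn from the literature.

    # A toy generative grammar in the (N, Σ, P, S) form described above.
    # The grammar itself is a made-up example chosen only for illustration.

    N = {"S", "NP", "VP"}                       # nonterminal symbols
    SIGMA = {"john", "mary", "sleeps", "sees"}  # terminal symbols, disjoint from N
    P = {                                       # production rules
        "S":  [["NP", "VP"]],
        "NP": [["john"], ["mary"]],
        "VP": [["sleeps"], ["sees", "NP"]],
    }
    START = "S"                                 # distinguished start symbol

    def expand(symbols):
        """Yield every terminal string derivable from the given symbol sequence."""
        if all(s in SIGMA for s in symbols):
            yield " ".join(symbols)
            return
        i = next(idx for idx, s in enumerate(symbols) if s in N)
        for rhs in P[symbols[i]]:               # rewrite the leftmost nonterminal
            yield from expand(symbols[:i] + rhs + symbols[i + 1:])

    print(sorted(expand([START])))
    # ['john sees john', 'john sees mary', 'john sleeps',
    #  'mary sees john', 'mary sees mary', 'mary sleeps']

Because this toy grammar is non-recursive, the generated language is finite; adding a recursive rule (say, a VP that embeds an S) would make the rule set generate infinitely many strings, which is the usual situation for natural languages.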

19.4.5 Functionalism

Main article: Functional theories of grammar

Functional theories of language propose that since language is fundamentally a tool, it is reasonable to assume that its structures are best analyzed and understood with reference to the functions they carry out. Functional theories of grammar differ from formal theories of grammar in that the latter seek to define the different elements of language and describe the way they relate to each other as systems of formal rules or operations, whereas the former define the functions performed by language and then relate these functions to the linguistic elements that carry them out. This means that functional theories of grammar tend to pay attention to the way language is actually used, and not just to the formal relations between linguistic elements.*[40] Functional theories describe language in terms of the functions existing at all levels of language.

• Phonological function: the function of the phoneme is to distinguish between different lexical material.

• Semantic function: (Agent, Patient, Recipient, etc.), describing the role of participants in states of affairs or actions expressed.

• Syntactic functions: (e.g. Subject and Object), defining different perspectives in the presentation of a linguistic expression.

• Pragmatic functions: (Theme and Rheme, Topic and Focus, Predicate), defining the informational status of constituents, determined by the pragmatic context of the verbal interaction.

Functional descriptions of grammar strive to explain how linguistic functions are performed in communication through the use of linguistic forms.

19.4.6 Cognitivism

Main article: Cognitive linguistics

In the 1950s, a new school of thought known as cognitivism emerged through the field of psychology. Cognitivists lay emphasis on knowledge and information, as opposed to behaviorism, for instance. Cognitivism emerged in linguistics as a reaction to generativist theory in the 1970s and 1980s. Led by theorists like Ronald Langacker and George Lakoff, cognitive linguists propose that language is an emergent property of basic, general-purpose cognitive processes. In contrast to the generativist school of linguistics, cognitive linguistics is non-modularist and functionalist in character.

Important developments in cognitive linguistics include cognitive grammar, frame semantics, and conceptual metaphor, all of which are based on the idea that form–function correspondences based on representations derived from embodied experience constitute the basic units of language. Cognitive linguistics interprets language in terms of concepts (sometimes universal, sometimes specific to a particular tongue) that underlie its form. It is thus closely associated with semantics but is distinct from psycholinguistics, which draws upon empirical findings from cognitive psychology in order to explain the mental processes that underlie the acquisition, storage, production and understanding of speech and writing.

Unlike generative theory, cognitive linguistics denies that there is an autonomous linguistic faculty in the mind; it understands grammar in terms of conceptualization; and it claims that knowledge of language arises out of language use.*[41] Because of its conviction that knowledge of language is learned through use, cognitive linguistics is sometimes considered to be a functional approach, but it differs from other functional approaches in that it is primarily concerned with how the mind creates meaning through language, and not with the use of language as a tool of communication.

19.5 Areas of research

19.5.1 Historical linguistics

Historical linguists study the history of specific languages as well as general characteristics of language change. The study of language change is also referred to as “diachronic linguistics” (the study of how one particular language has changed over time), which can be distinguished from “synchronic linguistics” (the comparative study of more than one language at a given moment in time without regard to previous stages). Historical linguistics was among the first sub-disciplines to emerge in linguistics, and was the most widely practiced form of linguistics in the late 19th century. However, there was a shift to the synchronic approach in the early twentieth century with Saussure, and this approach became more predominant in Western linguistics with the work of Noam Chomsky.

19.5.2 Sociolinguistics

Sociolinguistics is the study of how language is shaped by social factors. This sub-discipline focuses on the synchronic approach of linguistics, and looks at how a language in general, or a set of languages, displays variation and varieties at a given point in time. The study of language variation and the different varieties of language through dialects, registers, and idiolects can be tackled through a study of style, as well as through analysis of discourse. Sociolinguists research both style and discourse in language, and also study the theoretical factors that are at play between language and society.

19.5.3 Developmental linguistics

Developmental linguistics is the study of the development of linguistic ability in individuals, particularly the acquisition of language in childhood. Some of the questions that developmental linguistics looks into are how children acquire language, how adults can acquire a second language, and what the process of language acquisition is.

19.5.4 Neurolinguistics

Neurolinguistics is the study of the structures in the human brain that underlie grammar and communication. Researchers are drawn to the field from a variety of backgrounds, bringing along a variety of experimental techniques as well as widely varying theoretical perspectives. Much work in neurolinguistics is informed by models in psycholinguistics and theoretical linguistics, and is focused on investigating how the brain can implement the processes that theoretical and psycholinguistics propose are necessary in producing and comprehending language. Neurolinguists study the physiological mechanisms by which the brain processes information related to language, and evaluate linguistic and psycholinguistic theories, using aphasiology, brain imaging, electrophysiology, and computer modeling.

19.6 Applied linguistics

Main article: Applied linguistics

Linguists are largely concerned with finding and describing the generalities and varieties both within particular languages and among all languages. Applied linguistics takes the results of those findings and “applies” them to other areas. Linguistic research is commonly applied to areas such as language education, lexicography, translation, language planning, which involves governmental policy implementation related to language use, and natural language processing. “Applied linguistics” has been argued to be something of a misnomer.*[42] Applied linguists actually focus on making sense of and engineering solutions for real-world linguistic problems, and not literally “applying” existing technical knowledge from linguistics. Moreover, they commonly apply technical knowledge from multiple sources, such as sociology (e.g., conversation analysis) and anthropology. (Constructed language fits under applied linguistics.)

Today, computers are widely used in many areas of applied linguistics. Speech synthesis and speech recognition use phonetic and phonemic knowledge to provide voice interfaces to computers. Applications of computational linguistics in machine translation, computer-assisted translation, and natural language processing are areas of applied linguistics that have come to the forefront. Their influence has had an effect on theories of syntax and semantics, as modeling syntactic and semantic theories on computers imposes constraints on those theories.

Linguistic analysis is a sub-discipline of applied linguistics used by many governments to verify the claimed nationality of people seeking asylum who do not hold the necessary documentation to prove their claim.*[43] This often takes the form of an interview by personnel in an immigration department. Depending on the country, this interview is conducted either in the asylum seeker's native language through an interpreter or in an international lingua franca like English.*[43] Australia uses the former method, while Germany employs the latter; the Netherlands uses either method depending on the languages involved.*[43] Tape recordings of the interview then undergo language analysis, which can be done either by private contractors or within a department of the government. In this analysis, linguistic features of the asylum seeker are used by analysts to make a determination about the speaker's nationality. The reported findings of the linguistic analysis can play a critical role in the government's decision on the refugee status of the asylum seeker.*[43]

19.7 Inter-disciplinary fields

Within the broad discipline of linguistics, various emerging sub-disciplines focus on a more detailed description and analysis of language, and are often organized on the basis of the school of thought and theoretical approach that they presuppose, or the external factors that influence them.

19.7.1 Semiotics

Semiotics is the study of sign processes (semiosis), or signification and communication, signs, and symbols, both individually and grouped into sign systems, including the study of how meaning is constructed and understood. Semioticians often do not restrict themselves to linguistic communication when studying the use of signs but extend the meaning of “sign” to cover all kinds of cultural symbols. Nonetheless, semiotic disciplines closely related to linguistics are literary studies, discourse analysis, text linguistics, and philosophy of language. Semiotics, within the linguistics paradigm, is the study of the relationship between language and culture. Historically, Edward Sapir and Ferdinand de Saussure's structuralist theories influenced the study of signs extensively until the late part of the 20th century, but later, post-modern and post-structural thought, through language philosophers including Jacques Derrida, Mikhail Bakhtin, Michel Foucault, and others, has also been a considerable influence on the discipline in the late part of the 20th century and early 21st century.*[44] These theories emphasise the role of language variation, and the idea of subjective usage, depending on external elements like social and cultural factors, rather than merely on the interplay of formal elements.

19.7.2 Language documentation

Since the inception of the discipline of linguistics, linguists have been concerned with describing and analysing previously undocumented languages. Starting with Franz Boas in the early 1900s, this became the main focus of American linguistics until the rise of formal structural linguistics in the mid-20th century. This focus on language documentation was partly motivated by a concern to document the rapidly disappearing languages of indigenous peoples. The ethnographic dimension of the Boasian approach to language description played a role in the development of disciplines such as sociolinguistics, anthropological linguistics, and linguistic anthropology, which investigate the relations between language, culture, and society.

The emphasis on linguistic description and documentation has also gained prominence outside North America, with the documentation of rapidly dying indigenous languages becoming a primary focus in many university programs in linguistics. Language description is a work-intensive endeavour, usually requiring years of field work in the language concerned, so as to equip the linguist to write a sufficiently accurate reference grammar. Further, the task of documentation requires the linguist to collect a substantial corpus in the language in question, consisting of texts and recordings, both sound and video, which can be stored in an accessible format within open repositories, and used for further research.*[45]

19.7.3 Translation

The sub-field of translation includes the translation of written and spoken texts across mediums, from digital to print and spoken. To translate literally means to transmute the meaning from one language into another. Translators are often employed by organisations such as travel agencies and governmental embassies to facilitate communication between two speakers who do not know each other's language. Translators are also employed to work within computational linguistics setups such as Google Translate, an automated facility for translating words and phrases between any two or more given languages. Translation is also conducted by publishing houses, which convert works of writing from one language to another in order to reach varied audiences. Academic translators specialize in various disciplines such as technology, science, law, and economics.

19.7.4 Biolinguistics

Biolinguistics is the study of natural as well as human-taught communication systems in animals, compared to human language. Researchers in the field of biolinguistics have also over the years questioned the possibility and extent of language in animals.

19.7.5 Clinical linguistics

Clinical linguistics is the application of linguistic theory to the field of speech-language pathology. Speech-language pathologists work on corrective measures to treat communication disorders and swallowing disorders.

19.7.6 Computational linguistics

Computational linguistics is the study of linguistic issues in a way that is “computationally responsible”, i.e., taking careful note of computational considerations such as algorithmic specification and computational complexity, so that the linguistic theories devised can be shown to exhibit certain desirable computational properties, as can their implementations. Computational linguists also work on computer language and software development.

19.7.7 Evolutionary linguistics

Evolutionary linguistics is the interdisciplinary study of the emergence of the language faculty through human evolution, and also the application of evolutionary theory to the study of cultural evolution among different languages. It is also a study of the dispersal of various languages across the globe, through movements among ancient communities.*[46]

19.7.8 Forensic linguistics

Forensic linguistics is the application of linguistic analysis to forensics. Forensic analysis investigates the style, language, lexical use, and other linguistic and grammatical features used in the legal context to provide evidence in courts of law. Forensic linguists have also contributed expertise in criminal cases.

19.8 See also

Main articles: Outline of linguistics and Index of linguistics articles

• Cognitive science

• History of linguistics

• International Linguistics Olympiad

• International Congress of Linguists

• Linguistics Departments at Universities

• Summer schools for linguistics

• List of linguists

Other Terms and Concepts

• Anthroponymy

• Articulatory phonology

• Articulatory synthesis

• Asemic writing

• Axiom of categoricity

• Biolinguistics

• Biosemiotics

• Concept Mining

• Corpus linguistics

• Critical discourse analysis

• Cryptanalysis

• Decipherment

• Developmental linguistics

• Embodied cognition

• Endangered languages

• Global language system

• Glottometrics

• Grammarian (Greco-Roman world)

• Integrational linguistics

• Integrationism

• Intercultural competence

• International Linguistic Olympiad

• Language acquisition

• Language attrition

• Language engineering

• Language geography

• Linguistic typology

• Machine translation

• Metacommunicative competence

• Microlinguistics

• Natural language processing

• Onomastics

• Orthography

• Philology

• Reading

• Rhythm in linguistics

• Second language acquisition

• Sign languages

• Speaker recognition

• Speech processing

• Speech recognition

• Speech synthesis

• Speech-Language Pathology

• Stratificational linguistics

• Text linguistics

• Writing systems

19.9 References

[1] Crystal, David (1990). Linguistics. Penguin Books. ISBN 9780140135312.

[2] Halliday, Michael A.K.; Jonathan Webster (2006). On Language and Linguistics. Continuum International Publishing Group. p. vii. ISBN 0-8264-8824-2.

[3] Martinet, André (1960). Elements of General Linguistics. Tr. Elisabeth Palmer Rubbert (Studies in General Linguistics, vol. i.). London: Faber. p. 15.

[4] Sanskrit Literature The Imperial Gazetteer of India, v. 2 (1909), p. 263.

[5] S.C. Vasu (Tr.) (1996). The Ashtadhyayi of Panini (2 Vols.). Vedic Books. ISBN 9788120804098.

[6] Jakobson, Roman (1937). Six Lectures on Sound and Meaning. MIT Press, Cambridge, Massachusetts. ISBN 0262600102.

[7] Chierchia, Gennaro and Sally McConnell-Ginet (2000). Meaning and Grammar: An Introduction to Semantics. MIT Press, Cambridge, Massachusetts. ISBN 9780262531641.

[8] All references in this article to the study of sound should be taken to include the manual and non-manual signs used in sign languages.

[9] Adrian Akmajian, Richard A. Demers, Ann K. Farmer, Robert M. Harnish (2010). Linguistics (6th ed.). The MIT Press. ISBN 0-262-51370-6. Retrieved 25 July 2012.

[10] de Saussure, F. (1986). Course in general linguistics (3rd ed.). (R. Harris, Trans.). Chicago: Open Court Publishing Company. (Original work published 1972). pp. 9–10, 15.

[11] Chomsky, Noam. (1965). Aspects of the Theory of Syntax. Cambridge, MA: MIT Press.

[12] Journal of Language and Politics

[13] Raymond Mougeon and Terry Nadasdi (1998). Sociolinguistic Discontinuity in Minority Language Communities. pp. 40–55. Linguistic Society of America.

[14] “Stylistics”by Joybrato Mukherjee. Chapter 49. Encyclopedia of Linguistics.

[15] Online Etymological Dictionary Definition of Philology

[16] JSTOR preview: Introduction: Philology in a Manuscript Culture by Stephen G. Nichols.

[17] McMahon, A. M. S. (1994). Understanding Language Change. Cambridge University Press. p. 19. ISBN 0-521-44665-1.

[18] McMahon, A. M. S. (1994). Understanding Language Change. Cambridge University Press. p. 9. ISBN 0-521-44665-1.

[19] A. Morpurgo Davies Hist. Linguistics (1998) 4 I. 22.

[20] Online Etymological Dictionary of Philology

[21] Online Etymological Dictionary Definition of Linguist

[22] “Linguist”. The American Heritage Dictionary of the English Language. Houghton Mifflin Harcourt. 2000. ISBN 978-0-395-82517-4.

[23] Helen Leckie-Tarry, Language and Context: a Functional Linguistic Theory of Register, Continuum International Publishing Group, 1995, p6. ISBN 1-85567-272-3

[24] Oxford English dictionary.

[25] Trudgill, P. (1994). Dialects. Ebooks Online Routledge. Florence, KY.

[26] Jacques Derrida (Author) and Alan Bass (translator) (1978). Writing and Difference. University of Chicago Press. ISBN 9780226143293.

[27] “Relative Thinking.” The Guardian. November 2004.

[28] IA Richards (1965). The Philosophy of Rhetoric. Oxford University Press (New York).

[29] Isac, Daniela; Charles Reiss (2013). I-language: An Introduction to Linguistics as Cognitive Science, 2nd edition. Oxford University Press. ISBN 978-0199660179.

[30] Bloomfield 1914, p. 307.

[31] Seuren, Pieter A. M. (1998). Western linguistics: An historical introduction. Wiley-Blackwell. pp. 2–24. ISBN 0-631-20891-7.

[32] Bloomfield 1914, p. 308.

[33] Bloomfield 1914, p. 310.

[34] Bloomfield 1914, p. 311.

[35] Clarke, David S. (1990). Sources of semiotic: readings with commentary from antiquity to the present. Carbondale: Southern Illinois University Press. pp. 143–144.

[36] Holquist 1981, pp. xvii–xviii.

[37] de Saussure, Ferdinand. Course in General Linguistics. McGraw Hill, New York. ISBN 9780802214935.

[38] Chomsky, Noam (1956). “Three Models for the Description of Language”. IRE Transactions on Information Theory 2 (2): 113–123. doi:10.1109/TIT.1956.1056813.

[39] Chomsky, Noam (1957). Syntactic Structures. The Hague: Mouton.

[40] Nichols, Johanna (1984). “Functional Theories of Grammar”. Annual Review of Anthropology 13: 97–117. doi:10.1146/annurev.an.13.100184.000525. [Functional grammar] “analyzes grammatical structure, as do formal and structural grammar; but it also analyses the entire communicative situation: the purpose of the speech event, its participants, its discourse context. Functionalists maintain that the communicative situation motivates, constrains, explains, or otherwise determines grammatical structure, and that a structural or formal approach is not merely limited to an artificially restricted data base, but is inadequate as a structural account. Functional grammar, then, differs from formal and structural grammar in that it purports not to model but to explain; and the explanation is grounded in the communicative situation.”

[41] Croft, William and D. Alan Cruse (2004). Cognitive Linguistics. Cambridge: Cambridge University Press. p. 1.

[42] Barbara Seidlhofer (2003). Controversies in Applied Linguistics (pp. 288). Oxford University Press. ISBN 0194374440.

[43] Eades, Diana (2005). “Applied Linguistics and Language Analysis in Asylum Seeker Cases” (PDF). Applied Linguistics 26 (4): 503–526. doi:10.1093/applin/ami021.

[44] Paul Allen Miller (1998). “The Classical Roots of Post-Structuralism: Lacan, Derrida and Foucault”. International Journal of the Classical Tradition 5 (2). Springer. JSTOR 30222818.

[45] Himmelmann, Nikolaus. “Language documentation: What is it and what is it good for?" in Gippert, Jost, Nikolaus P. Himmelmann & Ulrike Mosel (2006). Essentials of Language Documentation. Mouton de Gruyter, Berlin & New York.

[46] Croft, William (October 2008). “Evolutionary Linguistics”. Annual Review of Anthropology (Annual Reviews) 37: 219–234. doi:10.1146/annurev.anthro.37.081407.085156.

19.10 Bibliography

• Akmajian, Adrian; Demers, Richard; Farmer, Ann; Harnish, Robert (2010). Linguistics: An Introduction to Language and Communication. Cambridge, MA: The MIT Press. ISBN 0-262-51370-6.

• Isac, Daniela; Charles Reiss (2013). I-language: An Introduction to Linguistics as Cognitive Science, 2nd edition. Oxford University Press. ISBN 978-0199660179.

• Pinker, Steven (1994). The Language Instinct. William Morrow and Company. ISBN 9780140175295.

• Chomsky, Noam (1998). On Language. The New Press, New York. ISBN 978-1565844759.

• Derrida, Jacques (1967). Of Grammatology. The Johns Hopkins University Press. ISBN 0801858305.

• Crystal, David (1990). Linguistics. Penguin Books. ISBN 9780140135312.

19.11 External links

• The Linguist List, a global online linguistics community with news and information updated daily

• Glossary of linguistic terms by SIL International (last updated 2004)

• Language Log, a linguistics blog maintained by prominent (popular science) linguists

• Glottopedia, MediaWiki-based encyclopedia of linguistics, under construction

• Linguistic sub-fields – according to the Linguistic Society of America

• Linguistics and language-related wiki articles on Scholarpedia and Citizendium

• “Linguistics”section – A Bibliography of Literary Theory, Criticism and Philology, ed. J. A. García Landa (University of Zaragoza, Spain)

• An Academic Linguistics Forum (currently some technical problems, Feb 2013)

• Linguistics Contents for Non-English World

• Computerized comparative linguistics Calculator to compare the relatedness (genetic proximity) for over 160 languages (from Afar to Zulu)

• Linguistics at DMOZ

Chapter 20

Loaded question

A loaded question or complex question fallacy is a question which contains a controversial or unjustified assumption (e.g., a presumption of guilt).*[1] Aside from being an informal fallacy depending on usage, such questions may be used as a rhetorical tool: the question attempts to limit direct replies to those that serve the questioner's agenda.*[2] The traditional example is the question “Have you stopped beating your wife?" Whether the respondent answers yes or no, they will admit to having a wife and having beaten her at some time in the past. Thus, these facts are presupposed by the question, and in this case an entrapment, because it narrows the respondent to a single answer, and the fallacy of many questions has been committed.*[2]

The fallacy relies upon context for its effect: the fact that a question presupposes something does not in itself make the question fallacious. Only when some of these presuppositions are not necessarily agreed to by the person who is asked the question does the argument containing them become fallacious.*[2] Hence the same question may be loaded in one context but not in another. For example, the previous question would not be loaded if it were asked during a trial in which the defendant had already admitted to beating his wife.*[2]

This fallacy should be distinguished from that of begging the question (not to be confused with raising the question),*[3] which offers a premise whose plausibility depends on the truth of the proposition asked about, and which is often an implicit restatement of the proposition.*[4]

The term “loaded question” is sometimes used to refer to loaded language that is phrased as a question. This type of question does not necessarily contain a fallacious presupposition, but rather this usage refers to the question having an unspoken and often emotive implication. For example, “Are you a murderer?" would be such a loaded question, as “murder” has a very negative connotation. Such a question may be asked merely to harass or upset the respondent with no intention of listening to their reply, or asked with the full expectation that the respondent will predictably deny it.

20.1 Defense

A common way out of this argument is not to answer the question (e.g. with a simple 'yes' or 'no'), but to challenge the assumption behind the question. To use an earlier example, a good response to the question “Have you stopped beating your wife?" would be “I have never beaten my wife”.*[5] This removes the ambiguity of the expected response, therefore nullifying the tactic. However, the askers of such questions have learned to get around this tactic by accusing the one who answers of dodging the question. A rhetorical question such as “Then please explain, how could I possibly have beaten a wife that I've never had?" can be an effective antidote to this further tactic, placing the burden on the deceptive questioner either to expose their tactic or stop the line of inquiry. In many cases a short answer is important: “I neither did nor do I now” is a good example of how to answer the question without letting the asker interrupt and misshape the response.

20.2 Historical examples

Madeleine Albright (U.S. Ambassador to the U.N.) claims to have answered a loaded question (and later regretted not challenging it instead) on 60 Minutes on 12 May 1996. Lesley Stahl asked, regarding the effects of UN sanctions against Iraq, “We have heard that a half million children have died. I mean, that is more children than died in Hiroshima. And, you know, is the price worth it?" Madeleine Albright: “I think that is a very hard choice, but the price, we think, the price is worth it.”*[6] She later wrote of this response:

I must have been crazy; I should have answered the question by reframing it and pointing out the inherent flaws in the premise behind it. …As soon as I had spoken, I wished for the power to freeze time and take back those words. My reply had been a terrible mistake, hasty, clumsy, and wrong. …I had fallen into a trap and said something that I simply did not mean. That is no oneʼs fault but my own.*[7]

President Bill Clinton, the moderator in a town meeting discussing the topic “Race In America”, in response to a participant argument that the issue was not affirmative action but “racial preferences”, asked the participant a loaded question: “Do you favor the United States Army abolishing the affirmative-action program that produced Colin Powell? Yes or no?"*[8]

For another example, the New Zealand corporal punishment referendum, 2009 asked: “Should a smack as part of good parental correction be a criminal offence in New Zealand?" Murray Edridge, of Barnardos New Zealand, criticized the question as “loaded and ambiguous” and claimed “the question presupposes that smacking is a part of good parental correction”.*[9]

20.3 See also

• Complex question

• Entailment (pragmatics)

• False dilemma

• Gotcha journalism

• Implicature

• Leading question

• Mu (negative)

• Presupposition

• Suggestive question

20.4 References

[1] Gregory Bassham (2004), Critical Thinking, McGraw-Hill

[2] Douglas N. Walton, Informal logic: a handbook for critical argumentation, Cambridge University Press, 1989, ISBN 0-521-37925-3, pp. 36–37

[3] Fallacy: Begging the Question The Nizkor Project. Retrieved on: January 22, 2008

[4] Carroll, Robert Todd. The Skeptic's Dictionary. John Wiley & Sons. p. 51. ISBN 0-471-27242-6.

[5] Layman, C. Stephen (2003). The Power of Logic. p. 158.

[6] “Albright's Blunder”. Irvine Review. 2002. Archived from the original on 2003-06-03. Retrieved 2008-01-04.

[7] Albright, Madeleine (2003). Madam Secretary: A Memoir. p. 275. ISBN 0-7868-6843-0.

[8] “Colin Powell Promotion: the Real Story”. New York Times.

[9] “Anti-smacking debate goes to referendum - Story - National”. 3 News. Retrieved 2010-02-03.

20.5 External links

• Fallacy: Loaded Questions and Complex Claims Critical Thinking exercises. San Jose State University.

• Logical Fallacy: Loaded Question The Fallacy Files

Chapter 21

Material conditional

“Logical conditional” redirects here. For other related meanings, see Conditional statement. Not to be confused with material inference.

Venn diagram of A → B . If a member of the set described by this diagram (the red areas) is a member of A , it is in the intersection of A and B , and it therefore is also in B .

The material conditional (also known as "material implication", "material consequence", or simply "implication", "implies" or "conditional") is a logical connective (or a binary operator) that is often symbolized by a forward arrow "→". The material conditional is used to form statements of the form "p→q" (termed a conditional statement), which is read as “if p then q” or “p only if q” and conventionally compared to the English construction “If...then...”. But unlike the English construction, the material conditional statement "p→q" does not specify a causal relationship between p and q; it is to be understood to mean “if p is true, then q is also true”, such that the statement "p→q" is false only when p is true and q is false.*[1] Intuitively, a given p being true and q being false would prove an “if p is true, q is always also true” statement false, even when the “if p then q” does not represent a causal relationship between p and q. The statement only rules out the case in which p is true and q is false, and makes no claim that p causes q. However, such a general and informal way of thinking about the material conditional is not always acceptable, as will be discussed. As such, the material conditional is also to be distinguished from logical consequence.

The material conditional is also symbolized using:

1. p ⊃ q (although this symbol may also be used for the superset symbol in set theory);

2. p ⇒ q (although this symbol is often used for logical consequence, i.e. logical implication, rather than for the material conditional).

With respect to the material conditionals above, p is termed the antecedent, and q the consequent of the conditional. Conditional statements may be nested such that either or both of the antecedent or the consequent may themselves be conditional statements. In the example "(p→q) → (r→s)" both the antecedent and the consequent are conditional statements.

In classical logic p → q is logically equivalent to ¬(p ∧ ¬q) and, by De Morgan's law, logically equivalent to ¬p ∨ q .*[2] In minimal logic (and therefore also intuitionistic logic), by contrast, p → q only logically entails ¬(p ∧ ¬q) ; and in intuitionistic logic (but not minimal logic) ¬p ∨ q entails p → q .

21.1 Definitions of the material conditional

Logicians have many different views on the nature of material implication and approaches to explain its sense.*[3]

21.1.1 As a truth function

In classical logic, the compound p→q is logically equivalent to the negative compound: not both p and not q. Thus the compound p→q is false if and only if both p is true and q is false. By the same stroke, p→q is true if and only if either p is false or q is true (or both). Thus → is a function from pairs of truth values of the components p, q to truth values of the compound p→q, whose truth value is entirely a function of the truth values of the components. Hence, this interpretation is called truth-functional. The compound p→q is logically equivalent also to ¬p∨q (either not p, or q (or both)), and to ¬q→¬p (if not q then not p). But it is not equivalent to ¬p→¬q, which is equivalent to q→p.

Truth table

The truth table associated with the material conditional p→q is identical to that of ¬p∨q and is also denoted by Cpq. It is as follows:

p | q | p → q
T | T | T
T | F | F
F | T | T
F | F | T

It may also be useful to note that in Boolean algebra, true and false can be denoted as 1 and 0 respectively, with an equivalent table.
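Because the table is fully determined by the ¬p∨q equivalence, it can also be generated mechanically. The short Python sketch below does so; the function name implies is a label chosen here for illustration, not standard library code.

    from itertools import product

    def implies(p: bool, q: bool) -> bool:
        # Material conditional, truth-functionally: "not p, or q".
        return (not p) or q

    print("p      q      p -> q")
    for p, q in product([True, False], repeat=2):
        print(f"{p!s:<6} {q!s:<6} {implies(p, q)!s}")
    # Only the assignment p=True, q=False yields False.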

21.1.2 As a formal connective

The material conditional can be considered as a symbol of a formal theory, taken as a set of sentences, satisfying all the classical inferences involving →, in particular the following characteristic rules:

1. Modus ponens;

2. Conditional proof;

3. Classical contraposition;

4. Classical reductio ad absurdum.

Unlike the truth-functional one, this approach to logical connectives permits the examination of structurally identical propositional forms in various logical systems, where somewhat different properties may be demonstrated. For example, in intuitionistic logic, which rejects proofs by contraposition as valid rules of inference, (p → q) ⇒ ¬p ∨ q is not a propositional theorem, but the material conditional is used to define negation.

21.2 Formal properties

When studying logic formally, the material conditional is distinguished from the semantic consequence relation |= . We say A |= B if every interpretation that makes A true also makes B true. However, there is a close relationship between the two in most logics, including classical logic. For example, the following principles hold:

• If Γ |= ψ then ∅ |= (φ1 ∧ · · · ∧ φn → ψ) for some φ1, . . . , φn ∈ Γ . (This is a particular form of the deduction theorem. In words, it says that if Γ models ψ, then ψ can be deduced from just some finite subset of the formulas in Γ.)

• The converse of the above

• Both → and |= are monotonic; i.e., if Γ |= ψ then ∆ ∪ Γ |= ψ , and if φ → ψ then (φ ∧ α) → ψ for any α, Δ. (In terms of structural rules, this is often referred to as weakening or thinning.)

These principles do not hold in all logics, however. Obviously they do not hold in non-monotonic logics, nor do they hold in relevance logics. Other properties of implication (the following expressions are always true, for any logical values of variables):

• distributivity: (s → (p → q)) → ((s → p) → (s → q))

• transitivity: (a → b) → ((b → c) → (a → c))

• reflexivity: a → a

• totality: (a → b) ∨ (b → a)

• truth preserving: The interpretation under which all variables are assigned a truth value of 'true' produces a truth value of 'true' as a result of material implication.

• commutativity of antecedents: (a → (b → c)) ≡ (b → (a → c))

Note that a → (b → c) is logically equivalent to (a ∧ b) → c ; this property is sometimes called un/currying. Because of these properties, it is convenient to adopt a right-associative notation for → where a → b → c denotes a → (b → c) . Comparison of Boolean truth tables shows that a → b is equivalent to ¬a ∨ b , and one is an equivalent replacement for the other in classical logic. See material implication (rule of inference).
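Each of the listed properties is a claimed tautology over two or three propositional variables, so it can be verified by brute force over all truth assignments. The sketch below is an illustrative check; the helper imp and the particular selection of properties checked are choices made here, not part of the original text.

    from itertools import product

    def imp(a: bool, b: bool) -> bool:
        # Material conditional as a Boolean truth function.
        return (not a) or b

    triples = list(product([True, False], repeat=3))

    # distributivity: (s -> (p -> q)) -> ((s -> p) -> (s -> q))
    assert all(imp(imp(s, imp(p, q)), imp(imp(s, p), imp(s, q))) for s, p, q in triples)

    # transitivity: (a -> b) -> ((b -> c) -> (a -> c))
    assert all(imp(imp(a, b), imp(imp(b, c), imp(a, c))) for a, b, c in triples)

    # totality: (a -> b) or (b -> a)
    assert all(imp(a, b) or imp(b, a) for a, b, _ in triples)

    # un/currying: a -> (b -> c) has the same truth table as (a and b) -> c
    assert all(imp(a, imp(b, c)) == imp(a and b, c) for a, b, c in triples)

    print("all listed properties hold for every assignment")

Exhaustive checking works here because a formula in n propositional variables has only 2^n assignments; for two or three variables this is at most eight cases.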

21.3 Philosophical problems with material conditional

Outside of mathematics, it is a matter of some controversy as to whether the truth function for material implication provides an adequate treatment of conditional statements in English (a sentence in the indicative mood with a conditional clause attached, i.e., an indicative conditional, or false-to-fact sentences in the subjunctive mood, i.e., a counterfactual conditional).*[4] That is to say, critics argue that in some non-mathematical cases, the truth value of a compound statement, “if p then q", is not adequately determined by the truth values of p and q.*[4] Examples of non-truth-functional statements include: "q because p", "p before q" and “it is possible that p".*[4]

“[Of] the sixteen possible truth-functions of A and B, material implication is the only serious candidate. First, it is uncontroversial that when A is true and B is false, “If A, B" is false. A basic rule of inference is modus ponens: from “If A, B" and A, we can infer B. If it were possible to have A true, B false and “If A, B" true, this inference would be invalid. Second, it is uncontroversial that “If A, B" is sometimes true when A and B are respectively (true, true), or (false, true), or (false, false)… Non-truth-functional accounts agree that “If A, B" is false when A is true and B is false; and they agree that the conditional is sometimes true for the other three combinations of truth-values for the components; but they deny that the conditional is always true in each of these three cases. Some agree with the truth-functionalist that when A and B are both true, “If A, B" must be true. Some do not, demanding a further relation between the facts that A and that B.”*[4]

The truth-functional theory of the conditional was integral to Frege's new logic (1879). It was taken up enthusiastically by Russell (who called it “material implication”), Wittgenstein in the Tractatus, and the logical positivists, and it is now found in every logic text. It is the first theory of conditionals which students encounter. Typically, it does not strike students as obviously correct. It is logic's first surprise. Yet, as the textbooks testify, it does a creditable job in many circumstances. And it has many defenders. It is a strikingly simple theory: “If A, B" is false when A is true and B is false. In all other cases, “If A, B" is true. It is thus equivalent to "~(A&~B)" and to "~A or B". "A ⊃ B" has, by stipulation, these truth conditions.

– Dorothy Edgington, The Stanford Encyclopedia of Philosophy, “Conditionals”*[4]

The meaning of the material conditional can sometimes be used in the natural language English “if condition then consequence" construction (a kind of conditional sentence), where condition and consequence are to be filled with English sentences. However, this construction also implies a “reasonable” connection between the condition (protasis) and consequence (apodosis) (see Connexive logic).

The material conditional can yield some unexpected truths when expressed in natural language. For example, any material conditional statement with a false antecedent is true (see vacuous truth). So the statement “if 2 is odd then 2 is even” is true. Similarly, any material conditional with a true consequent is true. So the statement “if I have a penny in my pocket then Paris is in France” is always true, regardless of whether or not there is a penny in my pocket. These problems are known as the paradoxes of material implication, though they are not really paradoxes in the strict sense; that is, they do not elicit logical contradictions.

These unexpected truths arise because speakers of English (and other natural languages) are tempted to equivocate between the material conditional and the indicative conditional, or other conditional statements, like the counterfactual conditional and the material biconditional. It is not surprising that a rigorously defined truth-functional operator does not correspond exactly to all of the notions of implication otherwise expressed by 'if...then...' sentences in English (or their equivalents in other natural languages). For an overview of some of the various analyses, formal and informal, of conditionals, see the “References” section below.

21.4 See also

21.4.1 Conditionals

• Counterfactual conditional

• Indicative conditional

• Corresponding conditional

• Strict conditional

21.5 References

[1] Magnus, P.D (January 6, 2012).“forallx: An Introduction to Formal Logic”(PDF). Creative Commons. p. 25. Retrieved 28 May 2013.

[2] Teller, Paul (January 10, 1989). “A Modern Formal Logic Primer: Sentence Logic Volume 1” (PDF). Prentice Hall. p. 54. Retrieved 28 May 2013.

[3] Clarke, Matthew C. (March 1996). “A Comparison of Techniques for Introducing Material Implication”. Cornell University. Retrieved March 4, 2012.

[4] Edgington, Dorothy (2008). Edward N. Zalta, ed. “Conditionals”. The Stanford Encyclopedia of Philosophy (Winter 2008 ed.).

21.6 Further reading

• Brown, Frank Markham (2003), Boolean Reasoning: The Logic of Boolean Equations, 1st edition, Kluwer Academic Publishers, Norwell, MA. 2nd edition, Dover Publications, Mineola, NY, 2003.

• Edgington, Dorothy (2001), “Conditionals”, in Lou Goble (ed.), The Blackwell Guide to Philosophical Logic, Blackwell.

• Quine, W.V. (1982), Methods of Logic, (1st ed. 1950), (2nd ed. 1959), (3rd ed. 1972), 4th edition, Harvard University Press, Cambridge, MA.

• Stalnaker, Robert, “Indicative Conditionals”, Philosophia, 5 (1975): 269–286.

21.7 External links

• Conditionals entry by Edgington, Dorothy in the Stanford Encyclopedia of Philosophy

Chapter 22

Material implication

Material implication may refer to:

• Material conditional, a logical connective

• Material implication (rule of inference), a rule of replacement for some propositional logic

22.1 See also

• Implication (disambiguation)

• Conditional statement (disambiguation)

Chapter 23

Modus ponens

In propositional logic, modus ponendo ponens (Latin for “the way that affirms by affirming”; often abbreviated to MP or modus ponens*[1]*[2]*[3]*[4]) or implication elimination is a valid, simple argument form and rule of inference.*[5] It can be summarized as “P implies Q; P is asserted to be true, so therefore Q must be true.” The history of modus ponens goes back to antiquity.*[6] While modus ponens is one of the most commonly used concepts in logic, it must not be mistaken for a logical law; rather, it is one of the accepted mechanisms for the construction of deductive proofs that includes the “rule of definition” and the “rule of substitution”.*[7]

Modus ponens allows one to eliminate a conditional statement from a logical proof or argument (the antecedents) and thereby not carry these antecedents forward in an ever-lengthening string of symbols; for this reason modus ponens is sometimes called the rule of detachment.*[8] Enderton, for example, observes that “modus ponens can produce shorter formulas from longer ones”,*[9] and Russell observes that “the process of the inference cannot be reduced to symbols. Its sole record is the occurrence of ⊦q [the consequent] . . . an inference is the dropping of a true premise; it is the dissolution of an implication”.*[10] A justification for the “trust in inference is the belief that if the two former assertions [the antecedents] are not in error, the final assertion [the consequent] is not in error”.*[11]

In other words: if one statement or proposition implies a second one, and the first statement or proposition is true, then the second one is also true. If P implies Q and P is true, then Q is true.*[12] An example is:

If it is raining, I will meet you at the theater. It is raining. Therefore, I will meet you at the theater.

Modus ponens can be stated formally as:

P → Q, P ∴ Q

where the rule is that whenever an instance of "P → Q" and "P" appear by themselves on lines of a logical proof, Q can validly be placed on a subsequent line; furthermore, the premise P and the implication “dissolve”, their only trace being the symbol Q that is retained for use later, e.g. in a more complex deduction. It is closely related to another valid form of argument, modus tollens. Both have apparently similar but invalid forms such as affirming the consequent, denying the antecedent, and evidence of absence. Constructive dilemma is the disjunctive version of modus ponens. Hypothetical syllogism is closely related to modus ponens and sometimes thought of as “double modus ponens.”

23.1 Formal notation

The modus ponens rule may be written in sequent notation:

94 23.2. EXPLANATION 95

P → Q, P ⊢ Q

where ⊢ is a metalogical symbol meaning that Q is a syntactic consequence of P → Q and P in some logical system; or as the statement of a truth-functional tautology or theorem of propositional logic:

((P → Q) ∧ P ) → Q

where P and Q are propositions expressed in some formal system.

23.2 Explanation

The argument form has two premises (hypotheses). The first premise is the “if–then” or conditional claim, namely that P implies Q. The second premise is that P, the antecedent of the conditional claim, is true. From these two premises it can be logically concluded that Q, the consequent of the conditional claim, must be true as well.

In artificial intelligence, modus ponens is often called forward chaining. An example of an argument that fits the form modus ponens:

If today is Tuesday, then John will go to work. Today is Tuesday. Therefore, John will go to work.

This argument is valid, but this has no bearing on whether any of the statements in the argument are actually true; for modus ponens to be a sound argument, the premises must also be true. An argument can be valid but nonetheless unsound if one or more premises are false; if an argument is valid and all the premises are true, then the argument is sound. For example, John might be going to work on Wednesday. In this case, the reasoning for John's going to work (because it is Wednesday) is unsound. The argument is sound only on Tuesdays (when John goes to work), but valid on every day of the week. A propositional argument using modus ponens is said to be deductive.

In single-conclusion sequent calculi, modus ponens is the Cut rule. The cut-elimination theorem for a calculus says that every proof involving Cut can be transformed (generally, by a constructive method) into a proof without Cut, and hence that Cut is admissible.

The Curry–Howard correspondence between proofs and programs relates modus ponens to function application: if f is a function of type P → Q and x is of type P, then f x is of type Q.
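To make the proofs-as-programs reading concrete, here is a minimal Python sketch using the standard typing module, with type variables standing in for the propositions; the function and variable names are invented for this illustration.

    from typing import Callable, TypeVar

    P = TypeVar("P")  # stands in for the proposition P
    Q = TypeVar("Q")  # stands in for the proposition Q

    def modus_ponens(f: Callable[[P], Q], x: P) -> Q:
        # A "proof" f of P -> Q applied to a "proof" x of P yields a proof of Q.
        return f(x)

    # Toy usage, echoing the raining example above.
    rule: Callable[[str], str] = lambda p: f"Given that {p}, I will meet you at the theater."
    print(modus_ponens(rule, "it is raining"))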

23.3 Justification via truth table

The validity of modus ponens in classical two-valued logic can be clearly demonstrated by use of a truth table.

p | q | p → q
T | T | T
T | F | F
F | T | T
F | F | T

In instances of modus ponens we assume as premises that p → q is true and p is true. Only one line of the truth table, the first, satisfies these two conditions (p and p → q). On this line, q is also true. Therefore, whenever p → q is true and p is true, q must also be true.
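The same truth-table argument can be mechanized by searching for a counterexample row, i.e., an assignment that makes both premises true and the conclusion false. The snippet below is a sketch of that check; the variable names are chosen here for illustration.

    from itertools import product

    # A row is a counterexample iff both premises (p -> q, and p) are true
    # while the conclusion q is false.
    counterexamples = [(p, q)
                       for p, q in product([True, False], repeat=2)
                       if ((not p) or q) and p and (not q)]

    print("modus ponens is valid" if not counterexamples else counterexamples)
    # Prints: modus ponens is valid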

23.4 See also

• Condensed detachment

• What the Tortoise Said to Achilles 96 CHAPTER 23. MODUS PONENS

23.5 References

[1] Stone, Jon R. (1996). Latin for the Illiterati: Exorcizing the Ghosts of a Dead Language. London, UK: Routledge: 60.

[2] Copi and Cohen

[3] Hurley

[4] Moore and Parker

[5] Enderton 2001:110

[6] Susanne Bobzien (2002). The Development of Modus Ponens in Antiquity, Phronesis 47.

[7] Alfred Tarski 1946:47. Also Enderton 2001:110ff.

[8] Tarski 1946:47

[9] Enderton 2001:111

[10] Whitehead and Russell 1927:9

[11] Whitehead and Russell 1927:9

[12] Jago, Mark (2007). Formal Logic. Humanities-Ebooks LLP. ISBN 978-1-84760-041-7.

23.6 Sources

• Alfred Tarski 1946 Introduction to Logic and to the Methodology of the Deductive Sciences 2nd Edition, reprinted by Dover Publications, Mineola NY. ISBN 0-486-28462-X (pbk).

• Alfred North Whitehead and Bertrand Russell 1927 Principia Mathematica to *56 (Second Edition) paperback edition 1962, Cambridge at the University Press, London UK. No ISBN, no LCCN.

• Herbert B. Enderton, 2001, A Mathematical Introduction to Logic Second Edition, Harcourt Academic Press, Burlington MA, ISBN 978-0-12-238452-3.

23.7 External links

• Hazewinkel, Michiel, ed. (2001), “Modus ponens”, Encyclopedia of Mathematics, Springer, ISBN 978-1-55608-010-4

• Modus ponens at PhilPapers

• Modus ponens at Wolfram MathWorld

Chapter 24

Presupposition

For other uses, see Presupposition (disambiguation).

In the branch of linguistics known as pragmatics, a presupposition (or ps) is an implicit assumption about the world or background belief relating to an utterance whose truth is taken for granted in discourse. Examples of presuppositions include:

• Jane no longer writes fiction.

• Presupposition: Jane once wrote fiction.

• Have you stopped eating meat?

• Presupposition: you had once eaten meat.

• Have you talked to Hans?

• Presupposition: Hans exists.

A presupposition must be mutually known or assumed by the speaker and addressee for the utterance to be considered appropriate in context. It will generally remain a necessary assumption whether the utterance is placed in the form of an assertion, denial, or question, and can be associated with a specific lexical item or grammatical feature (presupposition trigger) in the utterance.

Crucially, negation of an expression does not change its presuppositions: I want to do it again and I don't want to do it again both presuppose that the subject has done it already one or more times; My wife is pregnant and My wife is not pregnant both presuppose that the subject has a wife. In this respect, presupposition is distinguished from entailment and implicature. For example, The president was assassinated entails that The president is dead, but if the expression is negated, the entailment is not necessarily true.

24.1 Negation of a sentence containing a presupposition

If presuppositions of a sentence are not consistent with the actual state of affairs, then one of two approaches can be taken. Given the sentences My wife is pregnant and My wife is not pregnant when one has no wife, then either:

1. Both the sentence and its negation are false; or

2. Strawson's approach: Both “my wife is pregnant” and “my wife is not pregnant” use a wrong presupposition (i.e. that there exists a referent which can be described with the noun phrase my wife) and therefore can not be assigned truth values.

Bertrand Russell tries to solve this dilemma with two interpretations of the negated sentence:

1. “There exists exactly one person, who is my wife and who is not pregnant.”


2. “There does not exist exactly one person, who is my wife and who is pregnant.”

For the first phrase, Russell would claim that it is false, whereas the second would be true according to him.

24.2 Projection of presuppositions

A presupposition of a part of an utterance is sometimes also a presupposition of the whole utterance, and sometimes not. For instance, the phrase my wife triggers the presupposition that I have a wife. The first sentence below carries that presupposition, even though the phrase occurs inside an embedded clause. In the second sentence, however, it does not. John might be mistaken about his belief that I have a wife, or he might be deliberately trying to misinform his audience, and this has an effect on the meaning of the second sentence, but, perhaps surprisingly, not on the first one.

1. John thinks that my wife is beautiful.

2. John said that my wife is beautiful.

Thus, this seems to be a property of the main verbs of the sentences, think and say, respectively. After work by Lauri Karttunen,*[1] verbs that allow presuppositions to “pass up” to the whole sentence (“project”) are called holes, and verbs that block such passing up, or projection, of presuppositions are called plugs. Some linguistic environments are intermediate between plugs and holes: they block some presuppositions and allow others to project. These are called filters. An example of such an environment is indicative conditionals (“If-then” clauses). A conditional sentence contains an antecedent and a consequent. The antecedent is the part preceded by the word “if,” and the consequent is the part that is (or could be) preceded by “then.” If the consequent contains a presupposition trigger, and the triggered presupposition is explicitly stated in the antecedent of the conditional, then the presupposition is blocked. Otherwise, it is allowed to project up to the entire conditional. Here is an example:

If I have a wife, then my wife is blonde.

Here, the presupposition triggered by the expression my wife (that I have a wife) is blocked, because it is stated in the antecedent of the conditional: That sentence doesn't imply that I have a wife. In the following example, it is not stated in the antecedent, so it is allowed to project, i.e. the sentence does imply that I have a wife.

If it's already 4am, then my wife is probably angry.

Hence, conditional sentences act as filters for presuppositions that are triggered by expressions in their consequent. A significant amount of current work in semantics and pragmatics is devoted to a proper understanding of when and how presuppositions project.
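The filtering behaviour just described is mechanical enough to state as a toy computation. In the following minimal Python sketch, the function name project_conditional and the proposition labels are illustrative assumptions rather than any standard linguistic formalism: presuppositions and antecedent content are modelled as sets of proposition labels, and a presupposition projects unless it is stated in the antecedent.

def project_conditional(antecedent_content, consequent_presuppositions):
    """Toy Karttunen-style filter: presuppositions of the consequent project
    to the whole conditional unless they are stated in the antecedent."""
    return consequent_presuppositions - antecedent_content

# "If I have a wife, then my wife is blonde." -> the presupposition is blocked
print(project_conditional({"I have a wife"}, {"I have a wife"}))  # set()

# "If it's already 4am, then my wife is probably angry." -> it projects
print(project_conditional({"it is 4am"}, {"I have a wife"}))  # {'I have a wife'}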

24.3 Presupposition triggers

A presupposition trigger is a lexical item or linguistic construction which is responsible for the presupposition.*[2] The following is a selection of presuppositional triggers following Stephen C. Levinson's classic textbook on Pragmatics, which in turn draws on a list produced by Lauri Karttunen. As is customary, the presuppositional triggers themselves are italicized, and the symbol » stands for 'presupposes'.*[3]

24.3.1 Definite descriptions

Main article: Definite description

Definite descriptions are phrases of the form “the X” where X is a noun phrase. The description is said to be proper when the phrase applies to exactly one object, and conversely, it is said to be improper when there exists either more than one potential referent, as in “the senator from Ohio”, or none at all, as in “the king of France”. In conventional speech, definite descriptions are implicitly assumed to be proper, hence such phrases trigger the presupposition that the referent is unique and exists.

• John saw the man with two heads. »there exists a man with two heads.

24.3.2 Factive verbs

See also: Epistemology § Truth

In Western epistemology, there is a tradition originating with Plato of defining knowledge as justified true belief. On this definition, for someone to know X, it is required that X be true. A linguistic question thus arises regarding the usage of such phrases: does a person who states “John knows X” implicitly claim the truth of X? Steven Pinker explored this question in a popular science format in a 2007 book on language and cognition, using a widely publicized example from a speech by a U.S. president.*[4] A 2003 speech by George W. Bush included the line, “British Intelligence has learned that Saddam Hussein recently sought significant quantities of uranium from Africa.”*[5] Over the next few years, it became apparent that this intelligence lead was incorrect. But the way the speech was phrased, using a factive verb, implicitly framed the lead as truth rather than hypothesis. However, the factivity thesis, the proposition that relational predicates having to do with knowledge, such as knows, learns, remembers, and realizes, presuppose the factual truth of their object, was subject to notable criticism by Allan Hazlett.*[6]

• Martha regrets drinking John's home brew. »Martha drank John's home brew.

• Frankenstein was aware that Dracula was there. »Dracula was there.

• John realized that he was in debt. »John was in debt.

• It was odd how proud he was. »He was proud.

Some further factive predicates: know; be sorry that; be proud that; be indifferent that; be glad that; be sad that.

24.3.3 Implicative verbs

• John managed to open the door. »John tried to open the door.

• John forgot to lock the door. »John ought to have locked, or intended to lock, the door.

Some further implicative predicates: X happened to V»X didn't plan or intend to V; X avoided Ving»X was expected to, or usually did, or ought to V, etc.

24.3.4 Change of state verbs

• John stopped teasing his wife. »John had been teasing his wife.

• Joan began teasing her husband. »Joan hadn't been teasing her husband.

Some further change of state verbs: start; finish; carry on; cease; take (as in X took Y from Z » Y was at/in/with Z); leave; enter; come; go; arrive; etc.

24.3.5 Iteratives

• The flying saucer came again. »The flying saucer came before.

• You can't get gobstoppers anymore. »You once could get gobstoppers.

• Carter returned to power. »Carter held power before.

Further iteratives: another time; to come back; restore; repeat; for the nth time.

24.3.6 Temporal clauses

• Before Strawson was even born, Frege noticed presuppositions. »Strawson was born.

• While Chomsky was revolutionizing linguistics, the rest of social science was asleep. »Chomsky was revolutionizing linguistics.

• Since Churchill died, we've lacked a leader. »Churchill died.

Further temporal clause constructors: after; during; whenever; as (as in As John was getting up, he slipped).

24.3.7 Cleft sentences

• Cleft construction: It was Henry that kissed Rosie. »Someone kissed Rosie.

• Pseudo-cleft construction: What John lost was his wallet. »John lost something.

24.3.8 Comparisons and contrasts

Comparisons and contrasts may be marked by stress (or by other prosodic means), by particles like “too”, or by comparative constructions.

• Marianne called Adolph a male chauvinist, and then HE insulted HER. »For Marianne to call Adolph a male chauvinist would be to insult him.

• Carol is a better linguist than Barbara. »Barbara is a linguist.

24.3.9 Counterfactual conditionals

• If the notice had only said 'mine-field' in Welsh as well as in English, we would never have lost poor Llewellyn. »The notice didn't say 'mine-field' in Welsh.

24.3.10 Questions

Questions presuppose that what they ask about exists or is sought; a wh-question such as Who came? presupposes that someone came.

24.3.11 Possessive case

• John's children are very noisy. »John has children.

24.4 Accommodation of presuppositions

A presupposition of a sentence must normally be part of the common ground of the utterance context (the shared knowledge of the interlocutors) in order for the sentence to be felicitous. Sometimes, however, sentences may carry presuppositions that are not part of the common ground and nevertheless be felicitous. For example, upon being introduced to someone, I can explain out of the blue that my wife is a dentist, even though my addressee has never heard, and has no reason to believe, that I have a wife. In order to be able to interpret my utterance, the addressee must assume that I have a wife. This process of an addressee assuming that a presupposition is true, even in the absence of explicit information that it is, is usually called presupposition accommodation. We have just seen that presupposition triggers like my wife (definite descriptions) allow for such accommodation.

In “Presupposition and Anaphora: Remarks on the Formulation of the Projection Problem”,*[7] the philosopher Saul Kripke noted that some presupposition triggers do not seem to permit such accommodation. An example is the presupposition trigger too. This word triggers the presupposition that, roughly, something parallel to what is stated has happened. For example, if pronounced with emphasis on John, the following sentence triggers the presupposition that somebody other than John had dinner in New York last night.

John had dinner in New York last night, too.

But that presupposition, as stated, is completely trivial, given what we know about New York. Several million people had dinner in New York last night, and that in itself doesn't satisfy the presupposition of the sentence. What is needed for the sentence to be felicitous is really that somebody relevant to the interlocutors had dinner in New York last night, and that this has been mentioned in the previous discourse, or that this information can be recovered from it. Presupposition triggers that disallow accommodation are called anaphoric presupposition triggers.

24.5 Presupposition in critical discourse analysis

Critical discourse analysis (CDA) seeks to identify presuppositions of an ideological nature. CDA is critical, not only in the sense of being analytical, but also in the ideological sense.*[8] Van Dijk (2003) says CDA “primarily studies the way social power abuse, dominance, and inequality” operate in speech acts (including written text), that is, in “text and talk”.*[8] Van Dijk describes CDA as written from a particular point of view:*[8] “dissident research” aimed to “expose” and “resist social inequality”.*[8] One notable feature of ideological presuppositions researched in CDA is a concept termed synthetic personalisation.

24.6 See also

• Fallacy of many questions

• Loaded question

• Performative contradiction

• Exception that proves the rule

• Assumption/Presumption (similar words)

24.7 References

[1] Karttunen, Lauri (1974). “Presupposition and Linguistic Context”. Theoretical Linguistics 1: 181–194. Also in Pragmatics: A Reader, Steven Davis (ed.), pages 406–415, Oxford University Press, 1991.

[2] Kadmon, Nirit. Formal pragmatics: semantics, pragmatics, presupposition, and focus. Great Britain: Wiley-Blackwell, 2001, page 10.

[3] Levinson, Stephen C. Pragmatics. Cambridge: Cambridge University Press, 1983, pp. 181–184.

[4] Pinker, Steven (2007), The Stuff of Thought: Language as a Window into Human Nature, Penguin Books, ISBN 978-0-670-06327-7, pp. 6–9.

[5] Bush, George W., State of the Union Address, January 28th, 2003.

[6] Hazlett, A. (2010). “The Myth of Factive Verbs”. Philosophy and Phenomenological Research 80: 497. doi:10.1111/j.1933-1592.2010.00338.x.

[7] Kripke, Saul (2009) “Presupposition and Anaphora: Remarks on the Formulation of the Projection Problem,”Linguistic Inquiry, Vol. 40, No. 3, Pages 367-386.

[8] “Critical discourse analysis (CDA) is a type of discourse analytical research that primarily studies the way social power abuse, dominance, and inequality are enacted, reproduced, and resisted by text and talk in the social and political context. With such dissident research, critical discourse analysts take explicit position, and thus want to understand, expose, and ultimately resist social inequality.” Teun Adrianus van Dijk, “Critical Discourse Analysis”, chapter 18 in Deborah Schiffrin, Deborah Tannen and Heidi E. Hamilton (eds.), The Handbook of Discourse Analysis (Wiley-Blackwell, 2003): pp. 352–371.

24.8 Further reading

• Beaver, David. 1997. Presupposition. In J. van Benthem and A. ter Meulen (eds.), The Handbook of Logic and Language, Elsevier, pp. 939–1008.

• Henk Zeevat. To appear. Accommodation. In Ramchand, G. and C. Reiss (eds.), Oxford Handbook of Linguistic Interfaces, Oxford University Press.

Chapter 25

Question

For other uses, see Question (disambiguation). To ask questions about Wikipedia, see Wikipedia:Questions.

A question is a linguistic expression used to make a request for information, or the request made using such an expression. The information requested should be provided in the form of an answer. Questions have developed a range of uses that go beyond the simple eliciting of information from another party. Rhetorical questions, for example, are used to make a point, and are not expected to be answered. Many languages have special grammatical forms for questions (for example, in the English sentence “Are you happy?", the inversion of the subject you and the verb are shows it to be a question rather than a statement). However, questions can also be asked without using these interrogative grammatical structures; for example, one may use an imperative, as in “Tell me your name”.

For detailed information about the grammar of question formation, see Interrogative, and for English specifically, English grammar: Questions.

25.1 Uses

The principal use of questions is to elicit information from the person being addressed, by indicating, more or less precisely, the information which the speaker (or writer) desires. However, questions can also be used for a number of other purposes. Questions may be asked for the purpose of testing someone's knowledge, as in a quiz or examination. Raising a question may guide the questioner along an avenue of research (see Socratic method).

A research question is an interrogative statement that expresses the objective or line of scholarly or scientific inquiry designed to address a specific gap in knowledge. Research questions are expressed in a language that is appropriate for the academic community that has the greatest interest in answers that would address said gap. These interrogative statements serve as launching points for the academic pursuit of new knowledge by directing and delimiting an investigation of a topic, a set of studies, or an entire program of research.

A rhetorical question is asked to make a point, and does not expect an answer (often the answer is implied or obvious). Some questions are used principally as polite requests, as with “Would you pass the salt?" Pre-suppositional or loaded questions, such as “Have you stopped beating your wife?", may be used as a joke or to embarrass an audience, because any answer a person could give would imply more information than he was willing to affirm.

Questions can also be used as titles of works of literature, art and scholarship. Examples include Leo Tolstoy's short story How Much Land Does a Man Need?, the painting And When Did You Last See Your Father?, the movie What About Bob?, and the academic work Who Asked the First Question?.

25.1.1 By purpose

Various categorizations of questions have been proposed. With regard to research projects, one system distinguishes:*[2]


• descriptive questions, used primarily with the aim of describing the existence of some thing or process

• relational questions, designed to look at the relationships between two or more variables

• causal questions, designed to determine whether certain variables affect one or more outcome variables

For the purpose of surveys, one type of question asked is the closed-ended (also closed or dichotomous) question, usually requiring a yes/no answer or the choice of an option(s) from a list (see also multiple choice). There are also nominal questions, designed to inquire about a level of quantitative measure, usually making connections between a number and a concept (as in “1 = Moderate; 2 = Severe; 3 = ...”).*[3] Open-ended or open questions give the respondent greater freedom to provide information or opinions on a topic. (The distinction between closed and open questions is applied in a variety of other contexts too, such as job interviewing.) Surveys also often contain qualifying questions (also called filter questions or contingency questions), which serve to determine whether the respondent needs to continue on to answer subsequent questions.

Some types of questions that may be used in an educational context are listed in Bloom's Taxonomy of educational objectives. These include questions designed to test and promote:

• Knowledge: Who, what, when, where, why, how . . . ? Describe . . . ?

• Comprehension: Retell . . .

• Application: How is . . . an example of . . . ?; How is . . . related to . . . ?; Why is . . . significant?

• Analysis: What are the parts or features of . . . ? Classify . . . according to . . . ;

• Synthesis: What would you infer from . . . ? What ideas can you add to . . . ? How would you design a new . . . ? What would happen if you combined . . . ? What solutions would you suggest for . . . ?

• Evaluation: Do you agree that . . . ? What do you think about . . . ? What is the most important . . . ? Place the following in order of priority . . . ? How would you decide about . . . ? What criteria would you use to assess . . . ? *[4]

McKenzie's“Questioning Toolkit”*[5] lists 17 types of questions, and suggests that thinkers need to orchestrate and combine these types.*[6] Examples of these question types include the irreverent question, the apparently irrelevant question, the hypothetical question and the unanswerable question. Questions can also be infelicitous, being based on incorrect and illogical premises (e.g. “Why do cats have green wings?").

25.1.2 By grammatical form

Questions that ask whether or not some statement is true are called yes–no questions (or polar questions), since they can in principle be answered by a “yes” or “no” (or similar words or expressions in other languages). Examples include “Do you take sugar?", “Should they be believed?" and “Am I the loneliest person in the world?"

A type of question that is similar in form to a yes–no question, but is not intended to be answered with a “yes” or “no”, is the alternative question*[7] (or choice question). This presents two or more alternative answers, as in “Do you want fish or lamb?", or “Are you supporting England, Ireland or Wales?". The expected response is one of the alternatives, or some other indication such as “both” or “neither” (questionnaire forms sometimes contain an option “none of the above” or similar for such questions). Because of their similarity in form to yes–no questions, they may sometimes be answered “yes” or “no”, possibly humorously or as a result of misunderstanding.

The other main type of question (other than yes–no questions) is those called wh-questions (or non-polar questions). These use interrogative words (wh-words) such as when, which, who, how, etc. to specify the information that is desired. (In some languages the formation of such questions may involve wh-movement – see the section below for grammatical description.) The name derives from the fact that most of the English interrogative words (with the exception of how) begin with the letters wh. These are the types of question sometimes referred to in journalism and other investigative contexts as the Five Ws.

Tag questions are a grammatical structure in which a declarative statement or an imperative is turned into a question by adding an interrogative fragment (the “tag”), such as right in “You remembered the eggs, right?", or isn't it in “It's cold today, isn't it?" Tag questions can be answered with a yes or no.

As well as direct questions (such as Where are my keys?), there also exist indirect questions (also called interrogative content clauses), such as where my keys are. These are used as subordinate clauses in sentences such as “I wonder where my keys are”and “Ask him where my keys are.”Indirect questions do not necessarily follow the same rules of grammar as direct questions. For example, in English and some other languages, indirect questions are formed without inversion of subject and verb (compare the word order in“where are they?" and "(I wonder) where they are” ). Indirect questions may also be subject to the changes of tense and other changes that apply generally to indirect speech.

25.2 Grammar

Main article: Interrogative

Languages may use both syntax and prosody to distinguish interrogative sentences (which pose questions) from declarative sentences (which state propositions). Syntax refers to grammatical changes, such as moving words around or adding question words; prosody refers here to changes in intonation while speaking.

In English, German, French and various other languages, questions are marked by a distinct word order featuring inversion – the subject is placed after the verb rather than before it: “You are cold” becomes “Are you cold?" However, English allows such inversion only with a particular class of verbs (called auxiliary or special verbs), and thus sometimes requires the addition of an auxiliary do, does or did before inversion can take place (“He sings” → “Does he sing?") – for details see do-support.

In some languages, yes–no questions are marked by an interrogative particle, such as the Japanese か ka, Mandarin 吗 ma and Polish czy. Also, in languages generally, wh-questions are marked by an interrogative word (wh-word) such as what, where or how. In languages such as English this word generally moves to the front of the sentence (wh-fronting), and subject–verb inversion occurs as in yes–no questions, but in some other languages these changes in word order are not necessary (e.g. Mandarin 你要什么? nǐ yào shénme, meaning “what do you want?" is literally “you want what?").

Intonation patterns characteristic of questions often involve a raised pitch near the end of the sentence. In English this occurs especially for yes–no questions; it may also be used for sentences that do not have the grammatical form of questions, but are nonetheless intended to elicit information (declarative questions), as in “You're not using this?"

In languages written in Latin, Cyrillic or certain other scripts, a question mark at the end of a sentence identifies questions in writing. (In Spanish an additional inverted mark is placed at the beginning: ¿Cómo está usted? “How are you?".) As with intonation, this feature is not restricted to sentences having the grammatical form of questions – it may also indicate a sentence's pragmatic function.

25.3 Responses

The most typical response to a question is an answer that provides the information indicated as being sought by the questioner. This may range from a simple yes or no (in the case of yes–no questions) to a more complex or detailed answer. (An answer may be correct or incorrect, depending on whether the information it presents is true or false.) Of course other responses to a question are also possible, such as “I don't know” or some other indication of inability or unwillingness to provide a direct answer to the question.

“Negative questions” are interrogative sentences which contain negation in their phrasing, such as “Shouldn't you be working?". These can have different ways of expressing affirmation and denial from the standard form of question, and they can be confusing, since it is sometimes unclear whether the answer should be the opposite of the answer to the non-negated question. For example, if one does not have a passport, both “Do you have a passport?" and “Don't you have a passport?" are properly answered with “No”, despite apparently asking opposite questions. The Japanese language avoids this ambiguity. Answering “No” to the second of these in Japanese would mean, “I do have a passport”.

A similar ambiguous question in English is “Do you mind if...?" The responder may reply unambiguously “Yes, I do mind,” if they do mind, or “No, I don't mind,” if they don't, but a simple “No” or “Yes” answer can lead to confusion, as a single “No” can seem like a “Yes, I do mind” (as in “No, please don't do that”), and a “Yes” can seem like a “No, I don't mind” (as in “Yes, go ahead”). An easy way to bypass this confusion would be to ask a non-negative question, such as “Is it all right with you if...?"

Some languages have different particles (for example the French "si", the German "doch" or the Danish and Norwegian "jo") to answer negative questions (or negative statements) in an affirmative way; they provide a means to express contradiction. More information on these issues can be found in the articles Yes–no question, Yes and no, and Answer ellipsis.

25.4 Learning

Questions are used from the most elementary stage of learning to original research. In the scientific method, a question often forms the basis of the investigation and can be considered a transition between the observation and hypothesis stages. Students of all ages use questions in their learning of topics, and the skill of having learners create “investigatable” questions is a central part of inquiry education. The Socratic method of questioning student responses may be used by a teacher to lead the student towards the truth without direct instruction, and also helps students to form logical conclusions. A widespread and accepted use of questions in an educational context is the assessment of students' knowledge through exams.

25.5 Philosophical questions

Philosophical questions are conceptual, not factual, questions; some of them are not fully answered by any other discipline. Philosophy deals with questions that arise when people reflect on their lives and their world. Some philosophical questions are practical: for example, “Is euthanasia justifiable?", “Does the state have the right to censor pornography or restrict tobacco advertising?", “To what extent are Māori and Pākehā today responsible for decisions made by their ancestors?".

Other philosophical questions are more theoretical, although they often arise through thinking about practical issues. The questions just listed, for example, may prompt more general philosophical questions about the circumstances under which it may be morally justifiable to take a life, or about the extent to which the state may restrict the liberty of the individual. Some fascinating, classic questions of philosophy are speculative and theoretical and concern the nature of knowledge, reality and human existence: for example, “What, if anything, can be known with certainty?", “Is the mind essentially non-physical?", “Are values absolute or relative?", “Does the universe need explanation in terms of a Supreme Intelligence?", “What, if anything, is the meaning or purpose of human existence?".

Finally, philosophical questions are typically about conceptual issues; they are often questions about our concepts and the relation between our concepts and the world they represent. Every question implies a statement and every statement implies a question.*[8]

25.6 Origins of questioning behavior

Enculturated apes Kanzi, Washoe, Sarah and a few others who underwent extensive language training programs (with the use of gestures and other visual forms of communication) successfully learned to answer quite complex questions and requests (including the question words “who”, “what”, “where”), although so far they have failed to learn how to ask questions themselves. For example, David and Anne Premack wrote: “Though she [Sarah] understood the question, she did not herself ask any questions – unlike the child who asks interminable questions, such as What that? Who making noise? When Daddy come home? Me go Granny's house? Where puppy? Sarah never delayed the departure of her trainer after her lessons by asking where the trainer was going, when she was returning, or anything else.”*[9] The ability to ask questions is often assessed in relation to comprehension of syntactic structures. It is widely accepted that the first questions are asked by humans during their early infancy, at the pre-syntactic, one-word stage of language development, with the use of question intonation.*[10]

25.7 See also

• Answer

• Debate

• Doubt

• Phrasal exclamation

• Inquiry

• Interrobang

• Interrogation

• Interrogative word

• Interrogatory

• Leading question

• Logic

• Problem

• Proposition

• Question mark

• Rhetorical question

• Sentence (linguistics)

• Sentence function

• Truth

• Twenty Questions

• Who Asked the First Question?

25.8 References

[1] Source for quotation

[2] “Research Methods Knowledge Base”. Socialresearchmethods.net. 2006-10-20. Retrieved 2012-06-06.

[3] Research Methods Knowledge Base. Types of Questions. Socialresearchmethods.net

[4] Types of Questions Based on Bloom's Taxonomy. (Bloom, et al., 1956).

[5] Questioning Toolkit

[6] “Punchy Question Combinations”

[7] Loos, Eugene E.; Susan Anderson; Dwight H. Day, Jr.; Paul C. Jordan; J. Douglas Wingate. “What is an alternative question?". Glossary of linguistic terms. SIL International.

[8] Paul, Richard and Elder, Linda. (2006) Critical Thinking Tools for Taking Charge of Your Learning and Your Life, New Jersey: Prentice Hall Publishing. ISBN 0-13-114962-8

[9] Premack, David; Premack, Ann J. (1983). The mind of an ape. New York, London: W. W. Norton & Company. p. 29.

[10] Crystal, David (1987). The Cambridge Encyclopedia of Language. Cambridge: Cambridge University Press. pp. 143, 241.

25.9 Further reading

• Berti, Enrico, Soggetti di responsabilita: questioni di filosofia pratica, Reggio Emilia, 1993.

• C. L. Hamblin, “Questions”, in: Paul Edwards (ed.), Encyclopedia of Philosophy.

• Georg Stahl, “Un développement de la logique des questions”, in: Revue Philosophique de la France et de l'Etranger 88 (1963), 293–301.

• Fieser, James, Lillegard, Norman (eds), Philosophical questions: readings and interactive guides, 2005.

• McKenzie, Jamie, Leading questions: From Now On: The Educational Technology Journal, 2007.

• McKenzie, Jamie, Learning to question to wonder to learn, From Now On: The Educational Technology Journal, 2005.

• McKenzie, Jamie, “The Question Mark”

• Muratta Bunsen, Eduardo, “Lo erotico en la pregunta”, in: Aletheia 5 (1999), 65–74.

• Smith, Joseph Wayne, Essays on ultimate questions: critical discussions of the limits of contemporary philosophical inquiry, Aldershot: Avebury, 1988.

Chapter 26

Question dodging

Question dodging is the intentional avoidance of answering a question. This may happen either when the person questioned does not know the answer and wants to avoid embarrassment, or when the person is being interrogated or questioned in debate and wants to avoid giving a direct response.*[1]

Overt question dodging can sometimes be employed humorously, in order to sidestep giving a public answer in a political discussion: when a reporter asked Mayor Richard J. Daley why Hubert Humphrey had lost the state of Illinois in the 1968 presidential election, Daley replied “He lost it because he didn't get enough votes.”*[2]

A false accusation of question dodging can sometimes be made as a disingenuous tactic in debate, in the informal fallacy of the loaded question. A common way out of this argument is not to answer the question (e.g. with a simple 'yes' or 'no'), but to challenge the assumption behind the question. This can lead the person questioned to be accused of “dodging the question”.

26.1 Form

Often the aim of dodging a question is to make it seem as though the question was answered, leaving the person who asked it feeling satisfied with the response, unaware that the question was not properly answered. Taking the example question “Why are you here?", a dodge could take the form of:

• Refusing to answer (“No comment.”)

• Stalling (“Give me a minute.”)

• Changing the subject (“Your shoelace is undone.”)

• Explaining redundant things to distract one's focus (“Well I arrived here 10 minutes ago and I decided that...”)

• Creating an excuse not to answer (“I'm feeling sick, I can't answer now.”)

• Repeating the question (“Why are you here?")

• Answering the question with another question (“Why do you think I'm here?")

• Answering things that weren't asked (“I'm in the corridor.”)

• Questioning the question (“Are you sure that's relevant?")

• Challenging the question (“You assume I am here for a reason.”)

• Giving an answer in the wrong context (“Because I was born.”)


26.2 See also

• Question

• Begging the question

• Evasion (ethics)

26.3 References

[1] “Why Dodging the Question Works in Debates (and Job Interviews)". BNET. 2008-10-07.

[2] Engel, S. Morris; Soldan. The Study of Philosophy. Rowman & Littlefield. p. 135. ISBN 978-0-7425-4892-3. Retrieved 2010-11-17.

Chapter 27

Semantics

Semantics (from Ancient Greek: σημαντικός sēmantikós, “significant”)*[1]*[2] is the study of meaning. It focuses on the relation between signifiers, like words, phrases, signs, and symbols, and what they stand for; their denotation. Linguistic semantics is the study of meaning that is used for understanding human expression through language. Other forms of semantics include the semantics of programming languages, formal logics, and semiotics. In international scientific vocabulary semantics is also called semasiology.

The word semantics itself denotes a range of ideas, from the popular to the highly technical. It is often used in ordinary language for denoting a problem of understanding that comes down to word selection or connotation. This problem of understanding has been the subject of many formal enquiries, over a long period of time, especially in the field of formal semantics. In linguistics, it is the study of the interpretation of signs or symbols used in agents or communities within particular circumstances and contexts.*[3] Within this view, sounds, facial expressions, body language, and proxemics have semantic (meaningful) content, and each comprises several branches of study. In written language, things like paragraph structure and punctuation bear semantic content; other forms of language bear other semantic content.*[3]

The formal study of semantics intersects with many other fields of inquiry, including lexicology, syntax, pragmatics, etymology and others. Independently, semantics is also a well-defined field in its own right, often with synthetic properties.*[4] In the philosophy of language, semantics and reference are closely connected. Further related fields include philology, communication, and semiotics. The formal study of semantics can therefore be manifold and complex.

Semantics contrasts with syntax, the study of the combinatorics of units of a language (without reference to their meaning), and pragmatics, the study of the relationships between the symbols of a language, their meaning, and the users of the language.*[5] Semantics as a field of study also has significant ties to various representational theories of meaning including truth theories of meaning, coherence theories of meaning, and correspondence theories of meaning. Each of these is related to the general philosophical study of reality and the representation of meaning.

27.1 Linguistics

In linguistics, semantics is the subfield that is devoted to the study of meaning, as inherent at the levels of words, phrases, sentences, and larger units of discourse (termed texts, or narratives). The study of semantics is also closely linked to the subjects of representation, reference and denotation. The basic study of semantics is oriented to the examination of the meaning of signs, and the study of relations between different linguistic units and compounds: homonymy, synonymy, antonymy, hypernymy, hyponymy, meronymy, metonymy, holonymy, paronyms. A key concern is how meaning attaches to larger chunks of text, possibly as a result of the composition from smaller units of meaning. Traditionally, semantics has included the study of sense and denotative reference, truth conditions, argument structure, thematic roles, discourse analysis, and the linkage of all of these to syntax.


27.2 Montague grammar

In the late 1960s, Richard Montague proposed a system for defining semantic entries in the lexicon in terms of the lambda calculus. In these terms, the syntactic parse of the sentence John ate every bagel would consist of a subject (John) and a predicate (ate every bagel); Montague demonstrated that the meaning of the sentence altogether could be decomposed into the meanings of its parts and relatively few rules of combination. The logical predicate thus obtained would be elaborated further, e.g. using truth theory models, which ultimately relate meanings to a set of Tarskian universals, which may lie outside the logic. The notion of such meaning atoms or primitives is basic to the language of thought hypothesis from the 1970s.

Despite its elegance, Montague grammar was limited by the context-dependent variability in word sense, and led to several attempts at incorporating context, such as:

• Situation semantics (1980s): truth-values are incomplete, they get assigned based on context

• Generative lexicon (1990s): categories (types) are incomplete, and get assigned based on context
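To make the compositional idea concrete, here is a minimal Python sketch that mimics a Montague-style analysis of John ate every bagel over a tiny hand-built model. The model, the lexicon, and all names are illustrative assumptions, not Montague's own notation; the point is only that the sentence's truth value is computed from the meanings of its parts by function application.

# A tiny model: three entities, two of which are bagels, and an "ate" relation.
entities = {"john", "bagel1", "bagel2"}
bagels = {"bagel1", "bagel2"}
ate = {("john", "bagel1"), ("john", "bagel2")}

# Lexical entries as higher-order functions (lambda terms):
john = lambda p: p("john")                    # generalized quantifier: λP.P(john)
bagel = lambda x: x in bagels                 # predicate: λx.bagel(x)
every = lambda noun: lambda p: all(p(x) for x in entities if noun(x))
ate_tv = lambda obj: lambda subj: obj(lambda y: (subj, y) in ate)

# Composition mirrors the syntactic parse [John [ate [every bagel]]]:
vp = ate_tv(every(bagel))   # λx. for every bagel y, x ate y
print(john(vp))             # True: in this model, John ate every bagel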

27.3 Dynamic turn in semantics

In Chomskyan linguistics there was no mechanism for the learning of semantic relations, and the nativist view considered all semantic notions as inborn. Thus, even novel concepts were proposed to have been dormant in some sense. This view was also thought unable to address many issues such as metaphor or associative meanings, and semantic change, where meanings within a linguistic community change over time, and qualia or subjective experience. Another issue not addressed by the nativist model was how perceptual cues are combined in thought, e.g. in mental rotation.*[6]

This view of semantics, as an innate finite meaning inherent in a lexical unit that can be composed to generate meanings for larger chunks of discourse, is now being fiercely debated in the emerging domain of cognitive linguistics*[7] and also in the non-Fodorian camp in philosophy of language.*[8] The challenge is motivated by:

• factors internal to language, such as the problem of resolving indexicals or anaphora (e.g. this x, him, last week). In these situations context serves as the input, but the interpreted utterance also modifies the context, so it is also the output. Thus, the interpretation is necessarily dynamic and the meaning of sentences is viewed as context-changing potentials instead of propositions.

• factors external to language, i.e. language is not a set of labels stuck on things, but“a toolbox, the importance of whose elements lie in the way they function rather than their attachments to things.”*[8] This view reflects the position of the later Wittgenstein and his famous game example, and is related to the positions of Quine, Davidson, and others.

A concrete example of the latter phenomenon is semantic underspecification – meanings are not complete without some elements of context. To take an example of one word, red, its meaning in a phrase such as red book is similar to many other usages, and can be viewed as compositional.*[9] However, the colours implied in phrases such as red wine (very dark), and red hair (coppery), or red soil, or red skin are very different. Indeed, these colours by themselves would not be called red by native speakers. These instances are contrastive, so red wine is so called only in comparison with the other kind of wine (which also is not white for the same reasons). This view goes back to de Saussure:

Each of a set of synonyms like redouter ('to dread'), craindre ('to fear'), avoir peur ('to be afraid') has its particular value only because they stand in contrast with one another. No word has a value that can be identified independently of what else is in its vicinity.*[10]

This view may also go back to earlier Indian views on language, especially the Nyaya view of words as indicators and not carriers of meaning.*[11] An attempt to defend a system based on propositional meaning for semantic underspecification can be found in the generative lexicon model of James Pustejovsky, who extends contextual operations (based on type shifting) into the lexicon. Thus meanings are generated “on the fly” (as you go), based on finite context.

27.4 Prototype theory

Another set of concepts related to fuzziness in semantics is based on prototypes. The work of Eleanor Rosch in the 1970s led to a view that natural categories are not characterizable in terms of necessary and sufficient conditions, but are graded (fuzzy at their boundaries) and inconsistent as to the status of their constituent members. One may compare this with Jung's archetype, though the archetype is a static concept. Some post-structuralists are against the fixed or static meaning of words. Derrida, following Nietzsche, talked about slippages in fixed meanings.

Systems of categories are not objectively out there in the world but are rooted in people's experience. These categories evolve as learned concepts of the world – meaning is not an objective truth, but a subjective construct, learned from experience, and language arises out of the “grounding of our conceptual systems in shared embodiment and bodily experience”.*[12] A corollary of this is that the conceptual categories (i.e. the lexicon) will not be identical for different cultures, or indeed, for every individual in the same culture. This leads to another debate (see the Sapir–Whorf hypothesis or Eskimo words for snow).

27.5 Theories in semantics

27.5.1 Model theoretic semantics

Main article: formal semantics (linguistics)

This approach originates from Montague's work (see above). It is a highly formalized theory of natural language semantics in which expressions are assigned denotations (meanings) such as individuals, truth values, or functions from one of these to another. The truth of a sentence, and more interestingly, its logical relation to other sentences, is then evaluated relative to a model.

27.5.2 Formal (or truth-conditional) semantics

Main article: truth-conditional semantics

Pioneered by the philosopher Donald Davidson, this formalized theory aims to associate each natural language sentence with a meta-language description of the conditions under which it is true, for example: 'Snow is white' is true if and only if snow is white. The challenge is to arrive at the truth conditions for any sentence from fixed meanings assigned to the individual words and fixed rules for how to combine them. In practice, truth-conditional semantics is similar to model-theoretic semantics; conceptually, however, they differ in that truth-conditional semantics seeks to connect language with statements about the real world (in the form of meta-language statements), rather than with abstract models.

27.5.3 Lexical and conceptual semantics

Main article: conceptual semantics

This theory is an effort to explain properties of argument structure. The assumption behind this theory is that syntactic properties of phrases reflect the meanings of the words that head them.*[13] With this theory, linguists can better deal with the fact that subtle differences in word meaning correlate with other differences in the syntactic structure that the word appears in.*[13] This is done by looking at the internal structure of words.*[14] The small parts that make up the internal structure of words are termed semantic primitives.*[14]

27.5.4 Lexical semantics

Main article: lexical semantics

A linguistic theory that investigates word meaning. This theory understands that the meaning of a word is fully reflected by its context. Here, the meaning of a word is constituted by its contextual relations.*[15] Therefore, a distinction between degrees of participation as well as modes of participation is made.*[15] In order to accomplish this distinction, any part of a sentence that bears a meaning and combines with the meanings of other constituents is labeled as a semantic constituent. Semantic constituents that cannot be broken down into more elementary constituents are labeled minimal semantic constituents.*[15]

27.5.5 Computational semantics

Main article: computational semantics

Computational semantics is focused on the processing of linguistic meaning. In order to do this, concrete algorithms and architectures are described. Within this framework the algorithms and architectures are also analyzed in terms of decidability, time/space complexity, the data structures they require, and communication protocols.*[16]

27.6 Computer science

Main article: Semantics (computer science)

In computer science, the term semantics refers to the meaning of languages, as opposed to their form (syntax). According to Euzenat, semantics “provides the rules for interpreting the syntax which do not provide the meaning directly but constrains the possible interpretations of what is declared.”*[17] In other words, semantics is about interpretation of an expression. Additionally, the term is applied to certain types of data structures specifically designed and used for representing information content.

27.6.1 Programming languages

The semantics of programming languages and other languages is an important issue and area of study in computer science. Like the syntax of a language, its semantics can be defined exactly. For instance, statements such as x += y (C, Java), x := x + y (Pascal), and LET X = X + Y (early BASIC) use different syntaxes but cause the same instructions to be executed: generally, these operations would all perform an arithmetical addition of 'y' to 'x' and store the result in a variable called 'x'. Various ways have been developed to describe the semantics of programming languages formally, building on mathematical logic:*[18]

• Operational semantics: The meaning of a construct is specified by the computation it induces when it is executed on a machine. In particular, it is of interest how the effect of a computation is produced.

• Denotational semantics: Meanings are modelled by mathematical objects that represent the effect of executing the constructs. Thus only the effect is of interest, not how it is obtained.

• Axiomatic semantics: Specific properties of the effect of executing the constructs are expressed as assertions. Thus there may be aspects of the executions that are ignored.
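As a rough illustration of how the first two styles differ in practice, the following Python sketch gives an operational and a denotational reading of the single construct discussed above, assignment of x + y to x. The function names and the representation of machine states as dictionaries are assumptions made for this example, not notation from the formal semantics literature.

# Operational view: executing the construct *steps* a machine state.
def step_assign_add(state, target, source):
    new_state = dict(state)                   # the machine moves to a new state
    new_state[target] = state[target] + state[source]
    return new_state

# Denotational view: the construct's *meaning* is a mathematical function
# from states to states; how a machine produces the effect is irrelevant.
def denote_assign_add(target, source):
    return lambda state: {**state, target: state[target] + state[source]}

s = {"x": 2, "y": 3}
print(step_assign_add(s, "x", "y"))      # {'x': 5, 'y': 3}
print(denote_assign_add("x", "y")(s))    # the same effect, as a function value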

27.6.2 Semantic models

Terms such as semantic network and semantic data model are used to describe particular types of data models characterized by the use of directed graphs in which the vertices denote concepts or entities in the world, and the arcs denote relationships between them. The Semantic Web refers to the extension of the World Wide Web via embedding added semantic metadata, using semantic data modelling techniques such as Resource Description Framework (RDF) and Web Ontology Language (OWL).

27.7 Psychology

In psychology, semantic memory is memory for meaning – in other words, the aspect of memory that preserves only the gist, the general significance, of remembered experience – while episodic memory is memory for the ephemeral details – the individual features, or the unique particulars of experience. The term 'episodic memory' was introduced by Tulving and Schacter in the context of 'declarative memory', which involved simple association of factual or objective information concerning its object. The meaning of a word is measured by the company it keeps, i.e. the relationships among words themselves in a semantic network. The memories may be transferred intergenerationally or isolated in one generation due to a cultural disruption. Different generations may have different experiences at similar points in their own time-lines. This may then create a vertically heterogeneous semantic net for certain words in an otherwise homogeneous culture.*[19] In a network created by people analyzing their understanding of the word (such as Wordnet) the links and decomposition structures of the network are few in number and kind, and include part of, kind of, and similar links. In automated ontologies the links are computed vectors without explicit meaning. Various automated technologies are being developed to compute the meaning of words: latent semantic indexing and support vector machines as well as natural language processing, neural networks and predicate calculus techniques.

Ideasthesia is a psychological phenomenon in which activation of concepts evokes sensory experiences. For example, in synesthesia, activation of a concept of a letter (e.g., that of the letter A) evokes sensory-like experiences (e.g., of red color).

27.8 See also

27.8.1 Linguistics and semiotics

• Asemic writing

• Cognitive semantics

• Colorless green ideas sleep furiously

• Computational semantics

• Discourse representation theory

• General semantics

• Generative semantics

• Hermeneutics

• Natural semantic metalanguage

• Onomasiology

• Phono-semantic matching

• Pragmatic maxim

• Pragmaticism

• Pragmatism

• Problem of universals

• Semantic change or progression

• Semantic class

• Semantic feature

• Semantic field

• Semantic lexicon

• Semantic primes

• Semantic property

• Sememe

• Semiosis

• Semiotics

• SPL notation

27.8.2 Logic and mathematics

• Formal logic

• Game semantics

• Model theory

• Gödel's incompleteness theorems

• Proof-theoretic semantics

• Semantic consequence

• Semantic theory of truth

• Semantics of logic

• Truth-value semantics

27.8.3 Computer science

• Formal semantics of programming languages

• Knowledge representation

• Semantic networks

• Semantic transversal

• Semantic analysis

• Semantic compression

• Semantic HTML

• Semantic integration

• Semantic interpretation

• Semantic link

• Semantic reasoner

• Semantic service oriented architecture

• Semantic spectrum

• Semantic unification

• Semantic Web

27.8.4 Psychology

• Ideasthesia

27.9 References

[1] σημαντικός. Liddell, Henry George; Scott, Robert; A Greek–English Lexicon at the Perseus Project

[2] The word is derived from the Ancient Greek word σημαντικός (semantikos), “related to meaning, significant”, from σημαίνω semaino,“to signify, to indicate”, which is from σῆμα sema,“sign, mark, token”. The plural is used in analogy with words similar to physics, which was in the neuter plural in Ancient Greek and meant “things relating to nature”.

[3] Neurath, Otto; Carnap, Rudolf; Morris, Charles F. W. (Editors) (1955). International Encyclopedia of Unified Science. Chicago, IL: University of Chicago Press.

[4] Cruse, Alan; Meaning and Language: An Introduction to Semantics and Pragmatics, Chapter 1, Oxford Textbooks in Linguistics, 2004; Kearns, Kate; Semantics, Palgrave MacMillan 2000; Cruse, D. A.; Lexical Semantics, Cambridge, MA, 1986.

[5] Kitcher, Philip; Salmon, Wesley C. (1989). Scientific Explanation. Minneapolis, MN: University of Minnesota Press. p. 35.

[6] Barsalou, L.; Perceptual Symbol Systems, Behavioral and Brain Sciences, 22(4), 1999

[7] Langacker, Ronald W. (1999). Grammar and Conceptualization. Berlin/New York: Mouton de Gruyter. ISBN 3-11-016603-8.

[8] Peregrin, Jaroslav (2003). Meaning: The Dynamic Turn. Current Research in the Semantics/Pragmatics Interface. London: Elsevier.

[9] Gärdenfors, Peter (2000). Conceptual Spaces: The Geometry of Thought. MIT Press/Bradford Books. ISBN 978-0-585-22837-2.

[10] de Saussure, Ferdinand (1916). The Course of General Linguistics (Cours de linguistique générale).

[11] Matilal, Bimal Krishna (1990). The Word and the World: India's Contribution to the Study of Language. Oxford. The Nyaya and Mimamsa schools in Indian vyākaraṇa tradition conducted a centuries-long debate on whether sentence meaning arises through composition on word meanings, which are primary; or whether word meanings are obtained through analysis of sentences where they appear. (Chapter 8).

[12] Lakoff, George; Johnson, Mark (1999). Philosophy in the Flesh: The embodied mind and its challenge to Western thought. Chapter 1. New York, NY: Basic Books. OCLC 93961754.

[13] Levin, Beth; Pinker, Steven; Lexical & Conceptual Semantics, Blackwell, Cambridge, MA, 1991

[14] Jackendoff, Ray; Semantic Structures, MIT Press, Cambridge, MA, 1990

[15] Cruse, D.; Lexical Semantics, Cambridge University Press, Cambridge, MA, 1986

[16] Nerbonne, J.; The Handbook of Contemporary Semantic Theory (ed. Lappin, S.), Blackwell Publishing, Cambridge, MA, 1996

[17] Euzenat, Jerome. Ontology Matching. Springer-Verlag Berlin Heidelberg, 2007, p. 36

[18] Nielson, Hanne Riis; Nielson, Flemming (1995). Semantics with Applications, A Formal Introduction (1st ed.). Chicester, England: John Wiley & Sons. ISBN 0-471-92980-8.

[19] Giannini, A. J.; Semiotic and Semantic Implications of “Authenticity”, Psychological Reports, 106(2):611–612, 2010

27.10 External links

• semanticsarchive.net

• Teaching page for A-level semantics

• Chomsky, Noam; On Referring, Harvard University, 30 October 2007 (video)

• Jackendoff, Ray; Conceptual Semantics, Harvard University, 13 November 2007 (video)

• Semantics: an interview with Jerry Fodor (ReVEL, vol. 5, no. 8 (2007))

Chapter 28

Signed graph

In the area of graph theory in mathematics, a signed graph is a graph in which each edge has a positive or negative sign. Formally, a signed graph Σ is a pair (G, σ) that consists of a graph G = (V, E) and a sign mapping or signature σ from E to the sign group {+,−}. The graph may have loops and multiple edges as well as half-edges (with only one endpoint) and loose edges (with no endpoints). Half and loose edges do not receive signs. (In the terminology of the article on graphs, it is a multigraph, but we say graph because in signed graph theory it is usually unnatural to restrict to simple graphs.)

The sign of a circle (the edge set of a simple cycle) is defined to be the product of the signs of its edges; in other words, a circle is positive if it contains an even number of negative edges and negative if it contains an odd number of negative edges. The fundamental fact about a signed graph is its set of positive circles, denoted by B(Σ). A signed graph, or a subgraph or edge set, is called balanced if every circle in it is positive (and it contains no half-edges).

Two fundamental questions about a signed graph are: Is it balanced? What is the largest size of a balanced edge set in it? The first question is not difficult; the second is computationally intractable (technically, it is NP-hard).

Signed graphs were first introduced by Harary to handle a problem in social psychology (Cartwright and Harary, 1956). They have been rediscovered many times because they come up naturally in many unrelated areas. For instance, they enable one to describe and analyze the geometry of subsets of the classical root systems. They appear in topological graph theory and group theory. They are a natural context for questions about odd and even cycles in graphs. They appear in computing the ground state energy in the non-ferromagnetic Ising model; for this one needs to find a largest balanced edge set in Σ. They have been applied to data classification in correlation clustering.

28.1 Examples

• The complete signed graph on n vertices with loops, denoted by ±Kn°, has every possible positive and negative edge including negative loops, but no positive loops. Its edges correspond to the roots of the root system Cn; the column of an edge in the incidence matrix (see below) is the vector representing the root.

• The complete signed graph with half-edges, ±Kn', is ±Kn with a half-edge at every vertex. Its edges correspond to the roots of the root system Bn, half-edges corresponding to the unit basis vectors.

• The complete signed link graph, ±Kn, is the same but without loops. Its edges correspond to the roots of the root system Dn.

• An all-positive signed graph has only positive edges. If the underlying graph is G, the all-positive signing is written +G.

• An all-negative signed graph has only negative edges. It is balanced if and only if it is bipartite because a circle is positive if and only if it has even length. An all-negative graph with underlying graph G is written −G.

• A signed complete graph has as underlying graph G the ordinary complete graph Kn. It may have any signs. Signed complete graphs are equivalent to two-graphs, which are of value in finite group theory. A two-graph can be defined as the class of vertex sets of negative triangles (having an odd number of negative edges) in a signed complete graph.

28.2 Adjacency matrix

The adjacency matrix of a signed graph Σ on n vertices is an n × n matrix A(Σ). It has a row and column for each vertex. The entry avw in row v and column w is the number of positive vw edges minus the number of negative vw edges. On the diagonal, avv = 0 if there are no loops or half-edges; the correct definition when such edges exist depends on the circumstances.
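As a small illustration, the following Python sketch builds A(Σ) for a loopless signed graph; the function name and the representation of edges as (u, v, sign) triples are assumptions made for the example.

def signed_adjacency(n, edges):
    """Adjacency matrix of a loopless signed graph on vertices 0..n-1,
    where edges is a list of (u, v, sign) triples with sign in {+1, -1}."""
    A = [[0] * n for _ in range(n)]
    for u, v, sign in edges:
        A[u][v] += sign    # positive edges contribute +1, negative edges -1,
        A[v][u] += sign    # so parallel edges of opposite sign cancel out
    return A

# A triangle with one negative edge (a negative, hence unbalanced, circle):
print(signed_adjacency(3, [(0, 1, +1), (1, 2, +1), (0, 2, -1)]))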

28.3 Orientation

A signed graph is oriented when each end of each edge is given a direction, so that in a positive edge the ends are both directed from one endpoint to the other, and in a negative edge either both ends are directed outward, to their own vertices, or both are directed inward, away from their vertices. Thus, an oriented signed graph is the same as a bidirected graph. (It is very different from a signed digraph.)

28.4 Incidence matrix

The (more correctly,“an”) incidence matrix of a signed graph with n vertices and m edges is an n × m matrix, with a row for each vertex and a column for each edge. It is obtained by orienting the signed graph in any way. Then its entry ηij is +1 if edge j is oriented into vertex i, −1 if edge j is oriented out of vertex i, and 0 if vertex i and edge j are not incident. This rule applies to a link, whose column will have two nonzero entries with absolute value 1, a half-edge, whose column has a single nonzero entry +1 or −1, and a loose edge, whose column has only zeroes. The column of a loop, however, is all zero if the loop is positive, and if the loop is negative it has entry ±2 in the row corresponding to its incident vertex. Any two incidence matrices are related by negating some subset of the columns. Thus, for most purposes it makes no difference which orientation we use to define the incidence matrix, and we may speak of the incidence matrix of Σ without worrying about exactly which one it is. Negating a row of the incidence matrix corresponds to switching the corresponding vertex.
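The construction is easy to carry out for ordinary links. The sketch below (Python, same illustrative (u, v, sign) input format as above; loops, half-edges and loose edges are omitted for brevity) fixes one arbitrary orientation: each positive edge is directed out of its first endpoint and into its second, and each negative edge has both ends directed inward.

def signed_incidence(n, edges):
    """One incidence matrix of a signed graph whose edges are all links,
    under one arbitrary choice of orientation."""
    H = [[0] * len(edges) for _ in range(n)]
    for j, (u, v, sign) in enumerate(edges):
        if sign > 0:
            H[u][j], H[v][j] = -1, +1   # positive edge: out of u, into v
        else:
            H[u][j], H[v][j] = +1, +1   # negative edge: both ends directed inward
    return H

print(signed_incidence(3, [(0, 1, +1), (1, 2, +1), (0, 2, -1)]))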

28.5 Switching

Switching a vertex in Σ means negating the signs of all the edges incident to that vertex. Switching a set of vertices means negating all the edges that have one end in that set and one end in the complementary set. Switching a series of vertices, once each, is the same as switching the whole set at once.

Switching of signed graphs (signed switching) is generalized from Seidel (1976), where it was applied to graphs (graph switching), in a way that is equivalent to switching of signed complete graphs. Switching equivalence means that two graphs are related by switching, and an equivalence class of signed graphs under switching is called a switching class. Sometimes these terms are applied to equivalence of signed graphs under the combination of switching and isomorphism, especially when the graphs are unlabeled; but to distinguish the two concepts the combined equivalence may be called switching isomorphism and an equivalence class under switching isomorphism may be called a switching isomorphism class.

Switching a set of vertices affects the adjacency matrix by negating the rows and columns of the switched vertices. It affects the incidence matrix by negating the rows of the switched vertices.
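At the matrix level, then, an adjacency entry changes sign exactly when one endpoint lies in the switched set and the other does not. A minimal Python sketch of this operation (illustrative names, matrix as a list of lists):

def switch(A, X):
    """Adjacency matrix after switching the vertex set X: entry (i, j) is
    negated exactly when one of i, j lies in X and the other does not."""
    n = len(A)
    return [[-A[i][j] if (i in X) != (j in X) else A[i][j] for j in range(n)]
            for i in range(n)]

A = [[0, 1, -1], [1, 0, 1], [-1, 1, 0]]   # the signed triangle from above
print(switch(A, {2}))   # edges 1-2 and 0-2 change sign; edge 0-1 is unchanged

Note that switching never changes the sign of any circle, which is why balance is invariant under switching.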

28.6 Fundamental theorem

A signed graph is balanced if and only if its vertex set can be divided into two sets (either of which may be empty), X and Y, so that each edge between the sets is negative and each edge within a set is positive. This is the first theorem of signed graphs (Harary, 1953). It generalizes the theorem that an ordinary (unsigned) graph is bipartite if and only if every cycle has even length. A simple proof uses the method of switching: to prove Harary's theorem, one shows by induction that Σ can be switched to be all positive if and only if it is balanced.

A weaker theorem, but one with a simpler proof, is that if every 3-cycle in a signed complete graph is balanced, then either all nodes are connected by positive edges, or the nodes can be divided into two groups A and B such that every pair of nodes in A is connected by a positive edge, every pair of nodes in B is connected by a positive edge, and all edges going between A and B are negative. For the proof, pick an arbitrary node n and place it, together with all nodes linked to n by a positive edge, in one group, called A, and place all nodes linked to n by a negative edge in the other group, called B. Since this is a complete graph, every two nodes in A must be joined by a positive edge, and likewise for B, since a single negative edge within a group would create an unbalanced 3-cycle. Likewise, all negative edges must go between the two groups.*[1]
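The switching proof translates directly into a linear-time balance test: propagate a tentative side for each vertex and look for a conflict. A sketch under the same illustrative conventions:

```python
# Sketch: a signed graph is balanced iff its vertices can be 2-sided so
# that positive edges join equal sides and negative edges opposite sides.
from collections import deque

def is_balanced(n, signed_edges):
    adj = [[] for _ in range(n)]
    for v, w, sign in signed_edges:
        adj[v].append((w, sign))
        adj[w].append((v, sign))
    side = [0] * n                      # 0 = not yet assigned
    for start in range(n):
        if side[start]:
            continue
        side[start] = 1
        queue = deque([start])
        while queue:
            v = queue.popleft()
            for w, sign in adj[v]:
                want = side[v] * sign   # equal side on +, opposite on -
                if side[w] == 0:
                    side[w] = want
                    queue.append(w)
                elif side[w] != want:
                    return False        # an unbalanced cycle was found
    return True

print(is_balanced(3, [(0, 1, +1), (1, 2, +1), (0, 2, -1)]))  # False
print(is_balanced(3, [(0, 1, -1), (1, 2, -1), (0, 2, +1)]))  # True
```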

28.7 Frustration

Give each vertex a value of +1 or −1; we call this a state of Σ. An edge is called satisfied if it is positive and both endpoints have the same value, or if it is negative and the endpoints have opposite values. An edge that is not satisfied is called frustrated. The smallest number of frustrated edges over all states is called the frustration index (or line index of balance) of Σ.

Finding the frustration index is NP-hard. One can see this by observing that the frustration index of an all-negative signed graph is equivalent to the maximum cut problem in graph theory, which is NP-hard. The reason for the equivalence is that the frustration index equals the smallest number of edges whose negation (or, equivalently, deletion; a theorem of Harary) makes Σ balanced. (This can be proved easily by switching.)

The frustration index is important in a model of spin glasses, the mixed Ising model. In this model, the signed graph is fixed. A state consists of giving a “spin”, either “up” or “down”, to each vertex. We think of spin up as +1 and spin down as −1. Thus, each state has a number of frustrated edges. The energy of a state is larger when it has more frustrated edges, so a ground state is a state with the fewest frustrated edges. Thus, to find the ground state energy of Σ one has to find the frustration index.
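Since computing the frustration index is NP-hard, the only fully general approach is exhaustive search; a brute-force sketch (same illustrative conventions, feasible only for small n):

```python
# Sketch: frustration index by trying all 2^n states.
from itertools import product

def frustration_index(n, signed_edges):
    best = len(signed_edges)
    for state in product([+1, -1], repeat=n):
        # An edge is satisfied iff state[v] * state[w] equals its sign.
        frustrated = sum(1 for v, w, sign in signed_edges
                         if state[v] * state[w] != sign)
        best = min(best, frustrated)
    return best

# The unbalanced triangle cannot do better than one frustrated edge:
print(frustration_index(3, [(0, 1, +1), (1, 2, +1), (0, 2, -1)]))  # 1
```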

28.8 Matroid theory

There are two matroids associated with a signed graph, called the signed-graphic matroid (also called the frame matroid or sometimes bias matroid) and the lift matroid, both of which generalize the cycle matroid of a graph. They are special cases of the same matroids of a biased graph. The frame matroid (or signed-graphic matroid) M(G) (Zaslavsky, 1982) has for its ground set the edge set E. An edge set is independent if each component contains either no circles or just one circle, which is negative. (In matroid theory a half-edge acts exactly like a negative loop.) A circuit of the matroid is either a positive circle, or a pair of negative circles together with a connecting simple path, such that the two circles are either disjoint (then the connecting path has one end in common with each circle and is otherwise disjoint from both) or share just a single common vertex (in this case the connecting path is that single vertex). The rank of an edge set S is n − b, where n is the number of vertices of G and b is the number of balanced components of S, counting isolated vertices as balanced components. This matroid is the column matroid of the incidence matrix of the signed graph. That is why it describes the linear dependencies of the roots of a classical root system.

The extended lift matroid L0(G) has for its ground set the set E0 = E ∪ {e0}, the union of the edge set E with an extra point e0. The lift matroid L(G) is the extended lift matroid restricted to E. The extra point acts exactly like a negative loop, so we describe only the lift matroid. An edge set is independent if it contains either no circles or just one circle, which is negative. (This is the same rule that is applied separately to each component in the signed-graphic matroid.) A matroid circuit is either a positive circle or a pair of negative circles that are either disjoint or have just a common vertex. The rank of an edge set S is n − c + ε, where c is the number of components of S, counting isolated vertices, and ε is 0 if S is balanced and 1 if it is not.

28.9 Other kinds of “signed graph”

Sometimes the signs are taken to be +1 and −1. This is only a difference of notation, as long as the signs are still multiplied around a circle and the sign of the product is what matters. However, there are two other ways of treating the edge labels that do not fit into signed graph theory.

The term signed graph is applied occasionally to graphs in which each edge has a weight, w(e) = +1 or −1. These are not the same kind of signed graph; they are weighted graphs with a restricted weight set. The difference is that weights are added, not multiplied. The problems and methods are completely different.

The name is also applied to graphs in which the signs function as colors on the edges. The significance of a color is that it determines various weights applied to the edge, not that the sign is intrinsically significant. This is the case in knot theory, where the only significance of the signs is that they can be interchanged by the two-element group, but there is no intrinsic difference between positive and negative. The matroid of a sign-colored graph is the cycle matroid of the underlying graph; it is not the frame or lift matroid of the signed graph. The sign labels, instead of changing the matroid, become signs on the elements of the matroid.

In this article we discuss only signed graph theory in the strict sense. For sign-colored graphs see colored matroids.

28.9.1 Signed digraph

A signed digraph is a directed graph with signed arcs. Signed digraphs are far more complicated than signed graphs, because only the signs of directed cycles are significant. For instance, there are several definitions of balance, each of which is hard to characterize, in strong contrast with the situation for signed undirected graphs. Signed digraphs should not be confused with oriented signed graphs. The latter are bidirected graphs, not directed graphs (except in the trivial case of all positive signs).

28.10 Coloring

As with unsigned graphs, there is a notion of signed graph coloring. Whereas a coloring of a graph is a mapping from the vertex set to the natural numbers, a coloring of a signed graph is a mapping from the vertex set to the integers. The constraints on proper colorings come from the edges of the signed graph: the integers assigned to two vertices must be distinct if the vertices are connected by a positive edge, and must not be additive inverses of each other if the vertices are connected by a negative edge. There can be no proper coloring of a signed graph with a positive loop.

When the vertex labels are restricted to the set of integers with magnitude at most a natural number k, the set of proper colorings of a signed graph is finite, and the number of such proper colorings is a polynomial in k. This is analogous to the chromatic polynomial of unsigned graphs.
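A brute-force sketch of counting proper colorings with labels in {−k, …, k} (the function name and conventions are ours); evaluating the count for several values of k and interpolating would recover the polynomial:

```python
# Sketch: count proper colorings of a signed graph with labels -k..k.
from itertools import product

def count_signed_colorings(n, signed_edges, k):
    count = 0
    for c in product(range(-k, k + 1), repeat=n):
        # Positive edges forbid equal labels; negative edges forbid
        # labels that are additive inverses of each other.
        if all((c[v] != c[w]) if sign == +1 else (c[v] != -c[w])
               for v, w, sign in signed_edges):
            count += 1
    return count

# A single positive edge: (2k+1) * 2k proper colorings; k = 1 gives 6.
print(count_signed_colorings(2, [(0, 1, +1)], 1))  # 6
```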

28.11 Applications

28.11.1 Social psychology

In social psychology, signed graphs have been used to model social situations, with positive edges representing friendships and negative edges enmities between nodes, which represent people (Cartwright and Harary 1956). Then, for example, a positive 3-cycle is either three mutual friends, or two friends with a common enemy; while a negative 3-cycle is either three mutual enemies, or two enemies who share a mutual friend. Positive cycles are supposed to be stable social situations, whereas negative cycles are supposed to be unstable. According to the theory, in the case of three mutual enemies, this is because sharing a common enemy is likely to cause two of the enemies to become friends. In the case of two enemies sharing a friend, the shared friend is likely to choose one over the other and turn one of his or her friendships into an enmity.*[2] This approach to social groups may be called structural balance theory. It has been applied not only in small group psychology but also to large social systems.

Structural balance has been severely challenged, especially in its application to large systems, on the theoretical ground that friendly relations tie a society together, while a society divided into two camps of enemies would be highly unstable.*[3] Experimental studies have also provided only weak confirmation of the predictions of structural balance theory.*[4]

28.11.2 Spin glasses

In physics, signed graphs are a natural context for the general, nonferromagnetic Ising model, which is applied to the study of spin glasses.

28.11.3 Data clustering

Correlation clustering looks for natural clustering of data by similarity. The data points are represented as the vertices of a graph, with a positive edge joining similar items and a negative edge joining dissimilar items.

28.12 Generalizations

A signed graph is a special kind of gain graph, where the gain group has order 2. The pair (G, B(G)) is a special kind of biased graph.

28.13 Notes

[1] Luis von Ahn, Science of the Web, lecture 3, p. 28.

[2] Luis von Ahn, Science of the Web, lecture 3, p. 17.

[3] B. Anderson, in Perspectives on Social Network Research, ed. P. W. Holland and S. Leinhardt. New York: Academic Press, 1979.

[4] Julian O. Morrissette and John C. Jahnke, “No relations and relations of strength zero in the theory of structural balance”, Human Relations, vol. 20 (1967), pp. 189–195.

28.14 References

• Cartwright, D.; Harary, F. (1956), “Structural balance: a generalization of Heider's theory”, Psychological Review 63: 277–293, doi:10.1037/h0046049.

• Harary, Frank (1953), “On the notion of balance of a signed graph”, Michigan Mathematical Journal 2: 143–146, MR 0067468.

• Seidel, J. J. (1976), “A survey of two-graphs”, Colloquio Internazionale sulle Teorie Combinatorie (Rome, 1973), Tomo I, Atti dei Convegni Lincei 17, Rome: Accademia Nazionale dei Lincei, pp. 481–511, MR 0550136.

• Zaslavsky, Thomas (1982), “Signed graphs”, Discrete Applied Mathematics 4 (1): 47–74, doi:10.1016/0166-218X(82)90033-6, MR 676405. Erratum: Discrete Applied Mathematics, 5 (1983), 248.

• Zaslavsky, Thomas (1998), “A mathematical bibliography of signed and gain graphs and allied areas”, Electronic Journal of Combinatorics 5, Dynamic Surveys 8, 124 pp., MR 1744869.

Chapter 29

Skew-symmetric graph

In graph theory, a branch of mathematics, a skew-symmetric graph is a directed graph that is isomorphic to its own transpose graph, the graph formed by reversing all of its edges, under an isomorphism that is an involution without any fixed points. Skew-symmetric graphs are identical to the double covering graphs of bidirected graphs. Skew-symmetric graphs were first introduced under the name of antisymmetrical digraphs by Tutte (1967), later as the double covering graphs of polar graphs by Zelinka (1976b), and still later as the double covering graphs of bidirected graphs by Zaslavsky (1991). They arise in modeling the search for alternating paths and alternating cycles in algorithms for finding matchings in graphs, in testing whether a still life pattern in Conway's Game of Life may be partitioned into simpler components, in graph drawing, and in the implication graphs used to efficiently solve the 2-satisfiability problem.

29.1 Definition

As defined, e.g., by Goldberg & Karzanov (1996), a skew-symmetric graph G is a directed graph, together with a function σ mapping vertices of G to other vertices of G, satisfying the following properties:

1. For every vertex v, σ(v) ≠ v,

2. For every vertex v, σ(σ(v)) = v,

3. For every edge (u,v), (σ(v),σ(u)) must also be an edge.

One may use the third property to extend σ to an orientation-reversing function on the edges of G. The transpose graph of G is the graph formed by reversing every edge of G, and σ defines a graph isomorphism from G to its transpose. However, in a skew-symmetric graph it is additionally required that the isomorphism pair each vertex with a different vertex, rather than allowing a vertex to be mapped to itself or grouping more than two vertices in a cycle of the isomorphism.

A path or cycle in a skew-symmetric graph is said to be regular if, for each vertex v of the path or cycle, the corresponding vertex σ(v) is not part of the path or cycle.
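For illustration, the three defining properties can be checked mechanically; a Python sketch with the graph given as a set of directed edges and σ as a dictionary (this encoding is our assumption):

```python
# Sketch: verify that (G, sigma) satisfies the skew-symmetry properties.
def is_skew_symmetric(vertices, edges, sigma):
    if any(sigma[v] == v for v in vertices):         # 1. no fixed point
        return False
    if any(sigma[sigma[v]] != v for v in vertices):  # 2. involution
        return False
    # 3. for every edge (u, v), (sigma(v), sigma(u)) is also an edge
    return all((sigma[v], sigma[u]) in edges for u, v in edges)

# A directed 4-cycle with the reflection that swaps 0<->1 and 2<->3:
edges = {(0, 1), (1, 2), (2, 3), (3, 0)}
sigma = {0: 1, 1: 0, 2: 3, 3: 2}
print(is_skew_symmetric(range(4), edges, sigma))  # True
```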

29.2 Examples

Every directed path graph with an even number of vertices is skew-symmetric, via a symmetry that swaps the two ends of the path. However, path graphs with an odd number of vertices are not skew-symmetric, because the orientation-reversing symmetry of these graphs maps the center vertex of the path to itself, something that is not allowed for skew-symmetric graphs. Similarly, a directed cycle graph is skew-symmetric if and only if it has an even number of vertices. In this case, the number of different mappings σ that realize the skew symmetry of the graph equals half the length of the cycle.


29.3 Polar/switch graphs, double covering graphs, and bidirected graphs

A skew-symmetric graph may equivalently be defined as the double covering graph of a polar graph (introduced by Zelinka (1974), Zelinka (1976), and called a switch graph by Cook (2003)), which is an undirected graph in which the edges incident to each vertex are partitioned into two subsets. Each vertex of the polar graph corresponds to two vertices of the skew-symmetric graph, and each edge of the polar graph corresponds to two edges of the skew-symmetric graph. This equivalence is the one used by Goldberg & Karzanov (1996) to model problems of matching in terms of skew-symmetric graphs; in that application, the two subsets of edges at each vertex are the unmatched edges and the matched edges.

Zelinka (following F. Zitek) and Cook visualize the vertices of a polar graph as points where multiple tracks of a railway come together: if a train enters a switch via a track that comes in from one direction, it must exit via a track in the other direction. The problem of finding non-self-intersecting smooth curves between given points in a train track arises in testing whether certain kinds of graph drawings are valid (Hui, Schaefer & Štefankovič 2004) and may be modeled as the search for a regular path in a skew-symmetric graph.

A closely related concept is the bidirected graph of Edmonds & Johnson (1970) (“polarized graph” in the terminology of Zelinka (1974), Zelinka (1976)), a graph in which each of the two ends of each edge may be either a head or a tail, independently of the other end. A bidirected graph may be interpreted as a polar graph by letting the partition of edges at each vertex be determined by the partition of endpoints at that vertex into heads and tails; however, swapping the roles of heads and tails at a single vertex (“switching” the vertex, in the terminology of Zaslavsky (1991)) produces a different bidirected graph but the same polar graph. For the correspondence between bidirected graphs and skew-symmetric graphs (i.e., their double covering graphs) see Zaslavsky (1991), Section 5, or Babenko (2006).

To form the double covering graph (i.e., the corresponding skew-symmetric graph) from a polar graph G, create for each vertex v of G two vertices v0 and v1, and let σ(v0) = v1 and σ(v1) = v0. For each edge e = (u,v) of G, create two directed edges in the covering graph, one oriented from u to v and one oriented from v to u. If e is in the first subset of edges at v, these two edges are from u0 into v0 and from v1 into u1, while if e is in the second subset, the edges are from u0 into v1 and from v0 into u1. In the other direction, given a skew-symmetric graph G, one may form a polar graph that has one vertex for every corresponding pair of vertices in G and one undirected edge for every corresponding pair of edges in G. The undirected edges at each vertex of the polar graph may be partitioned into two subsets according to which vertex of the polar graph they go out of and come in to. A regular path or cycle of a skew-symmetric graph corresponds to a path or cycle in the polar graph that uses at most one edge from each subset of edges at each of its vertices.
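A sketch of this double covering construction, following the rule just stated; as an assumption of the sketch, each polar-graph edge is tagged with the subset (0 or 1) it belongs to at its second endpoint:

```python
# Sketch: double covering graph of a polar graph.  Each polar vertex v
# becomes (v, 0) and (v, 1), with sigma((v, i)) = (v, 1 - i).
def double_cover(polar_edges):
    """polar_edges: iterable of (u, v, cls), where cls in {0, 1} is the
    subset that the edge belongs to at v."""
    directed = set()
    for u, v, cls in polar_edges:
        if cls == 0:   # first subset at v: u0 -> v0 and v1 -> u1
            directed.add(((u, 0), (v, 0)))
            directed.add(((v, 1), (u, 1)))
        else:          # second subset at v: u0 -> v1 and v0 -> u1
            directed.add(((u, 0), (v, 1)))
            directed.add(((v, 0), (u, 1)))
    return directed

# Each pair of directed edges is swapped by sigma, so the result is
# skew-symmetric by construction.
print(sorted(double_cover([(0, 1, 0), (1, 2, 1)])))
```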

29.4 Matching

In constructing matchings in undirected graphs, it is important to find alternating paths: paths of vertices that start and end at unmatched vertices, in which the edges at odd positions in the path are not part of a given partial matching and the edges at even positions are part of it. By removing the matched edges of such a path from a matching, and adding the unmatched edges, one can increase the size of the matching. Similarly, cycles that alternate between matched and unmatched edges are of importance in weighted matching problems.

As Goldberg & Karzanov (1996) showed, an alternating path or cycle in an undirected graph may be modeled as a regular path or cycle in a skew-symmetric directed graph. To create a skew-symmetric graph from an undirected graph G with a specified matching M, view G as a switch graph in which the edges at each vertex are partitioned into matched and unmatched edges; an alternating path in G is then a regular path in this switch graph and an alternating cycle in G is a regular cycle in the switch graph.

Goldberg & Karzanov (1996) generalized alternating path algorithms to show that the existence of a regular path between any two vertices of a skew-symmetric graph may be tested in linear time. Given additionally a non-negative length function on the edges of the graph that assigns the same length to any edge e and to σ(e), the shortest regular path connecting a given pair of nodes in a skew-symmetric graph with m edges and n vertices may be found in time O(m log n). If the length function is allowed to have negative lengths, the existence of a negative regular cycle may be tested in polynomial time. Along with the path problems arising in matchings, skew-symmetric generalizations of the max-flow min-cut theorem have also been studied (Goldberg & Karzanov 2004; Tutte 1967).

29.5 Still life theory

Cook (2003) shows that a still life pattern in Conway's Game of Life may be partitioned into two smaller still lifes if and only if an associated switch graph contains a regular cycle. As he shows, for switch graphs with at most three edges per vertex, this may be tested in polynomial time by repeatedly removing bridges (edges the removal of which disconnects the graph) and vertices at which all edges belong to a single partition until no more such simplifications may be performed. If the result is an empty graph, there is no regular cycle; otherwise, a regular cycle may be found in any remaining bridgeless component. The repeated search for bridges in this algorithm may be performed efficiently using a dynamic graph algorithm of Thorup (2000). Similar bridge-removal techniques in the context of matching were previously considered by Gabow, Kaplan & Tarjan (1999).

29.6 Satisfiability

[Figure: an implication graph on the literals x0, …, x6 and their negations. Its skew symmetry can be realized by rotating the drawing through 180 degrees and reversing all edges.]

An instance of the 2-satisfiability problem, that is, a Boolean expression in conjunctive normal form with two variables or negations of variables per clause, may be transformed into an implication graph by replacing each clause u∨v by the two implications (¬u)⇒v and (¬v)⇒u. This graph has a vertex for each variable or negated variable, and a directed edge for each implication; it is, by construction, skew-symmetric, with a correspondence σ that maps each variable to its negation.

As Aspvall, Plass & Tarjan (1979) showed, a satisfying assignment to the 2-satisfiability instance is equivalent to a partition of this implication graph into two subsets of vertices, S and σ(S), such that no edge starts in S and ends in σ(S). If such a partition exists, a satisfying assignment may be formed by assigning a true value to every variable in S and a false value to every variable in σ(S). This may be done if and only if no strongly connected component of the graph contains both some vertex v and its complementary vertex σ(v). If two vertices belong to the same strongly connected component, the corresponding variables or negated variables are constrained to equal each other in any satisfying assignment of the 2-satisfiability instance. The total time for testing strong connectivity and finding a partition of the implication graph is linear in the size of the given 2-CNF expression.
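A compact sketch of this reduction (the literal encoding is our own; the test itself needs only the strongly connected components, computed here with Kosaraju's algorithm):

```python
# Sketch: 2-SAT via the implication graph.  Variable i is literal 2*i,
# its negation is 2*i + 1; XOR with 1 negates a literal.  A clause
# (a OR b) adds the implications (not a -> b) and (not b -> a).
def two_sat(num_vars, clauses):
    n = 2 * num_vars
    adj, radj = [[] for _ in range(n)], [[] for _ in range(n)]
    for a, b in clauses:
        for tail, head in ((a ^ 1, b), (b ^ 1, a)):
            adj[tail].append(head)
            radj[head].append(tail)
    seen, order = [False] * n, []
    def dfs1(v):                       # first pass: finish-time order
        seen[v] = True
        for w in adj[v]:
            if not seen[w]:
                dfs1(w)
        order.append(v)
    for v in range(n):
        if not seen[v]:
            dfs1(v)
    comp = [-1] * n
    def dfs2(v, c):                    # second pass: reverse graph
        comp[v] = c
        for w in radj[v]:
            if comp[w] == -1:
                dfs2(w, c)
    c = 0
    for v in reversed(order):
        if comp[v] == -1:
            dfs2(v, c)
            c += 1
    # Satisfiable iff no variable shares a component with its negation.
    return all(comp[2 * i] != comp[2 * i + 1] for i in range(num_vars))

# (x0 or x1) and (not x0 or x1): satisfiable by setting x1 true.
print(two_sat(2, [(0, 2), (1, 2)]))  # True
```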

29.7 Recognition

It is NP-complete to determine whether a given directed graph is skew-symmetric, by a result of Lalonde (1981) that it is NP-complete to find a color-reversing involution in a bipartite graph. Such an involution exists if and only if the directed graph given by orienting each edge from one color class to the other is skew-symmetric, so testing skew-symmetry of this directed graph is hard. This complexity does not affect path-finding algorithms for skew-symmetric graphs, because these algorithms assume that the skew-symmetric structure is given as part of the input to the algorithm rather than requiring it to be inferred from the graph alone.

29.8 References

• Aspvall, Bengt; Plass, Michael F.; Tarjan, Robert E. (1979), “A linear-time algorithm for testing the truth of certain quantified boolean formulas”, Information Processing Letters 8 (3): 121–123, doi:10.1016/0020-0190(79)90002-4.

• Babenko, Maxim A. (2006), “Acyclic bidirected and skew-symmetric graphs: algorithms and structure”, Computer Science – Theory and Applications, Lecture Notes in Computer Science 3967, Springer-Verlag, pp. 23–34, doi:10.1007/11753728_6, ISBN 978-3-540-34166-6.

• Biggs, Norman (1974), Algebraic Graph Theory, London: Cambridge University Press.

• Cook, Matthew (2003), “Still life theory”, New Constructions in Cellular Automata, Santa Fe Institute Studies in the Sciences of Complexity, Oxford University Press, pp. 93–118.

• Edmonds, Jack; Johnson, Ellis L. (1970), “Matching: a well-solved class of linear programs”, Combinatorial Structures and their Applications: Proceedings of the Calgary Symposium, June 1969, New York: Gordon and Breach. Reprinted in Combinatorial Optimization – Eureka, You Shrink!, Springer-Verlag, Lecture Notes in Computer Science 2570, 2003, pp. 27–30, doi:10.1007/3-540-36478-1_3.

• Gabow, Harold N.; Kaplan, Haim; Tarjan, Robert E. (1999), “Unique maximum matching algorithms”, Proc. 31st ACM Symp. Theory of Computing (STOC), pp. 70–78, doi:10.1145/301250.301273, ISBN 1-58113-067-8.

• Goldberg, Andrew V.; Karzanov, Alexander V. (1996), “Path problems in skew-symmetric graphs”, Combinatorica 16 (3): 353–382, doi:10.1007/BF01261321.

• Goldberg, Andrew V.; Karzanov, Alexander V. (2004), “Maximum skew-symmetric flows and matchings”, Mathematical Programming 100 (3): 537–568, doi:10.1007/s10107-004-0505-z.

• Hui, Peter; Schaefer, Marcus; Štefankovič, Daniel (2004), “Train tracks and confluent drawings”, Proc. 12th Int. Symp. Graph Drawing, Lecture Notes in Computer Science 3383, Springer-Verlag, pp. 318–328.

• Lalonde, François (1981), “Le problème d'étoiles pour graphes est NP-complet”, Discrete Mathematics 33 (3): 271–280, doi:10.1016/0012-365X(81)90271-5, MR 602044.

• Thorup, Mikkel (2000), “Near-optimal fully-dynamic graph connectivity”, Proc. 32nd ACM Symposium on Theory of Computing, pp. 343–350, doi:10.1145/335305.335345, ISBN 1-58113-184-4.

• Tutte, W. T. (1967), “Antisymmetrical digraphs”, Canadian Journal of Mathematics 19: 1101–1117, doi:10.4153/CJM-1967-101-8.

• Zaslavsky, Thomas (1982), “Signed graphs”, Discrete Applied Mathematics 4: 47–74, doi:10.1016/0166-218X(82)90033-6.

• Zaslavsky, Thomas (1991), “Orientation of signed graphs”, European Journal of Combinatorics 12: 361–375, doi:10.1016/s0195-6698(13)80118-7.

• Zelinka, Bohdan (1974), “Polar graphs and railway traffic”, Aplikace Matematiky 19: 169–176.

• Zelinka, Bohdan (1976a), “Isomorphisms of polar and polarized graphs”, Czechoslovak Mathematical Journal 26: 339–351.

• Zelinka, Bohdan (1976b), “Analoga of Menger's theorem for polar and polarized graphs”, Czechoslovak Mathematical Journal 26: 352–360.

Chapter 30

Transpose graph

In the mathematical and algorithmic study of graph theory, the converse,*[1] transpose*[2] or reverse*[3] of a directed graph G is another directed graph on the same set of vertices with all of the edges reversed compared to the orientation of the corresponding edges in G. That is, if G contains an edge (u,v) then the converse/transpose/reverse of G contains an edge (v,u) and vice versa.

30.1 Notation

The name “converse” arises because the reversal of arrows corresponds to taking the converse of an implication in logic. The name “transpose” is used because the adjacency matrix of the transpose directed graph is the transpose of the adjacency matrix of the original directed graph. There is no general agreement on preferred terminology. The converse is denoted symbolically as G′, G^T, G^R, or by other notations, depending on which terminology is used and which book or article is the source for the notation.

30.2 Applications

Although there is little difference mathematically between a graph and its transpose, the difference may be larger in computer science, depending on how a given graph is represented. For instance, for the web graph, it is easy to determine the outgoing links of a vertex, but hard to determine the incoming links, while in the reversal of this graph the opposite is true. In graph algorithms, therefore, it may sometimes be useful to construct the reversal of a graph, in order to put the graph into a form more suitable for the operations being performed on it. An example of this is Kosaraju's algorithm for strongly connected components, which applies depth-first search twice, once to the given graph and a second time to its reversal.
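The reversal step itself is straightforward; a Python sketch (the adjacency-list dictionary format is our assumption):

```python
# Sketch: the transpose (reversal) of a directed graph in O(V + E),
# the preprocessing step used, e.g., by Kosaraju's algorithm.
def transpose(adj):
    """adj: dict mapping each vertex to a list of out-neighbors."""
    rev = {v: [] for v in adj}
    for u, neighbors in adj.items():
        for v in neighbors:
            rev[v].append(u)   # edge (u, v) becomes (v, u)
    return rev

web = {'a': ['b', 'c'], 'b': ['c'], 'c': []}
print(transpose(web))  # {'a': [], 'b': ['a'], 'c': ['a', 'b']}
```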

30.3 Related concepts

A skew-symmetric graph is a graph that is isomorphic to its own transpose graph, via a special kind of isomorphism that pairs up all of the vertices. The inverse relation of a binary relation is the relation that reverses the ordering of each pair of related objects. If the relation is interpreted as a directed graph, this is the same thing as the transpose of the graph. In particular, the order dual of a partial order can be interpreted in this way as the transposition of a transitively closed directed acyclic graph.


30.4 References

[1] Harary, Frank; Norman, Robert Z.; Cartwright, Dorwin (1965), Structural Models: An Introduction to the Theory of Directed Graphs, New York: Wiley.

[2] Cormen, Thomas H.; Leiserson, Charles E.; Rivest, Ronald L., Introduction to Algorithms, MIT Press and McGraw-Hill, ex. 22.1–3, p. 530.

[3] Essam, John W.; Fisher, Michael E. (1970), “Some basic definitions in graph theory”, Reviews of Modern Physics 42 (2): 271–288, doi:10.1103/RevModPhys.42.271, entry 2.24, p. 275.


Rosoft, Discospinster, Rich Farmbrough, TomPreuss, Nard the Bard, Ihatefile007, ESkog, Kjoonlee, Mr. Billion, Kwamikagami, Art LaPella, Peter Greenwell, Chirag, Mdd, Mark Dingemanse, Kurt Shaped Box, Sobolewski, Wtmitchell, SidP, VivaEmilyDavies, Arthexis, Velho, Woohookitty, Jonathan de Boyne Pollard, Bellenion, -Ril-, Umofomia, Marudubshinki, Tjbk tjb, Graham87, BD2412, DePiep, Dpv, MarSch, Quiddity, MJGR, FlaBot, DaGizza, Visor, YurikBot, Lighterside, RussBot, Piet Delport, Schoen, Wimt, Matia.gr, Pt- camn, Bdhamilton, Misza13, CLW, Jkelly, Closedmouth, E Wing, SmackBot, Tumbleman, Edgar181, HalfShadow, Gilliam, Ohnoit- sjamie, JordeeBec, SchfiftyThree, Neo-Jay, Octahedron80, Sct72, Snowmanradio, GeorgeMoney, Jon Awbrey, Rivers99, Ben Moore, Alatius, Twas Now, Esurnir, Tawkerbot2, Dhammapal, The ed17, Floridi~enwiki, Leevanjackson, Orderinchaos, Gregbard, FilipeS, Dusty relic, Tawkerbot4, Zalgo, FrancoGG, Letranova, Epbr123, Escarbot, AntiVandalBot, Seaphoto, Beta16, Drakonicon, Fayenatic london, Modernist, PhJ, Qwerty Binary, JAnDbot, MER-C, Eurobas, Acroterion, Bongwarrior, VoABot II, Nimic86, Catgut, DerHexer, JaGa, MartinBot, Mermaid from the Baltic Sea, R'n'B, J.delanoy, Pharaoh of the Wizards, Bogey97, Gzkn, I Love MediaWiki, Smile- sALot, SJP, Cometstyles, JavierMC, Halmstad, Meaningful Username, DSRH, TheGreenFog, TXiKiBoT, Deleet, Vipinhari, GDonato, Hawk fan2, Psyche825, Davin, Madhero88, Brianga, The Random Editor, SieBot, Caltas, RJaguar3, Yintan, Txshldem07, MWLit- tleGuy, Josephjordania, Oxymoron83, OKBot, Tuatarian, Escape Orbit, ClueBot, Sjlewis55, The Thing That Should Not Be, Secret (renamed), Brewcrewer, Lartoven, SchreiberBike, ClanCC, XLinkBot, JinJian, Ejosse1, Aucassin, Addbot, Gc1mak, Jojhutton, Captain- tucker, AndersBot, Debresser, Favonian, Mhvahdat, Luckas-bot, Yobot, 2D, Ptbotgourou, Fraggle81, Anypodetos, CinchBug, Maxí, IW.HG, Nardia0, AnomieBOT, DemocraticLuntz, Rjanag, Kingpin13, Neurolysis, ArthurBot, DSisyphBot, MY MOM WONT LET ME EAT AT THE TABLE WITH A SWORD., Maddie!, Abce2, RibotBOT, Pugduke, LucienBOT, Winterst, JKDw, Pinethicket, El estremeñu, Reconsider the static, Leasnam, NYMFan69-86, Lotje, Kjlovescats, Virginexplorer, Bento00, Sneffel, EmausBot, Wikitanvir- Bot, Ajraddatz, Tommy2010, Irteza adam, Thoughts in space, ZéroBot, The Nut, Midas02, Monkeybutt50, Rcsprinter123, Chuispaston- Bot, Lom Konkreta, ClueBot NG, Jeposadap, Jarviknarvik, Rex4445, Widr, Amy Zitzelberger, HMSSolent, DBigXray, Coheninmontana, Alirezabot, Crazycrazycrazygrrl, Little miss tyra, Jaden4244, Victor Yus, Justify265, Redredred21, ChrisGualtieri, Commontern, Night- timeDriver50000, Codename Lisa, CASEYGARTLAND, Lugia2453, Hillbillyholiday, Tentinator, Reedjc, Leoniewild, EarnSomeRe- spect, Viholi, Dj28brandy, Lakun.patra, Bigfatballs23, Ceosad, Mamamajama, Vvvgcfff, Azealia911, KasparBot, Slifer274, Dr.Hua and Anonymous: 260 • Question dodging Source: https://en.wikipedia.org/wiki/Question_dodging?oldid=634005896 Contributors: ChessA4, JKDw, EmCat24, Top Jim, Helpful Pixie Bot, Monkbot, Ihaveacatonmydesk and Anonymous: 4 • Semantics Source: https://en.wikipedia.org/wiki/Semantics?oldid=668129588 Contributors: The Anome, Youssefsan, Vaganyik, Or- tolan88, Ben-Zin~enwiki, Hannes Hirzel, Heron, Ryguasu, Netesq, Stevertigo, Michael Hardy, Pit~enwiki, Gdarin, Rp, Kku, Looxix~enwiki, Glenn, Rossami, Andres, Hectorthebat, Jitse Niesen, Mjklin, Haukurth, Shizhao, Fvw, Jens Meiert, Jon Roland, Seriv, Robbot, Lambda, Pigsonthewing, Jakohn, Kiwibird, Sverdrup, Rursus, Moink, 
Spellbinder, Marc Venot, Gwalla, Markus Krötzsch, Jpta~enwiki, HHirzel, Everyking, Zhen Lin, Eequor, Khalid hassani, Jackol, Javier Carro, JoJan, Mukerjee, Augur, Kntg, Bornslippy, Urhixidur, Yuriz, Lu- cidish, Rich Farmbrough, Cacycle, Rama, Slipstream, Kzzl, Dbachmann, Paul August, Jaberwocky6669, Evice, El C, Chalst, Joan- joc~enwiki, Linkoman, Enric Naval, Nortexoid, Jonsafari, Jooyoonchung, Helix84, Anthony Appleyard, Mark Dingemanse, Sligocki, Cdc, Sabrebattletank, Ish ishwar, Tycho, EvenT, Jason L. Gohlke, Redvers, Simlorie, Galaxiaad, Ott, Jtauber, Velho, Woohookitty, Mindmatrix, Kokoriko, Kelisi, Analogisub, SDC, Mandarax, Graham87, Imersion, Rjwilmsi, Mayumashu, Koavf, Jivecat, Dmccreary, Brighterorange, Mlinar~enwiki, NeoAmsterdam, FlaBot, Sinatra, Isotope23, Ben Babcock, Vonkje, Comiscuous, Lambyuk, Chobot, Sonic Mew, Roboto de Ajvol, YurikBot, Wavelength, Hairy Dude, Retodon8, Stephenb, Anomalocaris, NawlinWiki, Maunus, Mark- Brooks, JECompton, WAS 4.250, Light current, G. Lakoff, Lt-wiki-bot, Donald Albury, SMcCandlish, JuJube, Pred, AGToth, Nick- elShoe, Sardanaphalus, SmackBot, Zerida, Unyoyega, Shamalyguy, Lindosland, Chris the speller, MasterofUnvrs314, MK8, MalafayaBot, Droll, Jerome Charles Potts, A. B., Scwlong, Zsinj, Frap, Ioscius, Chlewbot, SundarBot, Khoikhoi, Cybercobra, Iblardi, Battamer, Jon Awbrey, Byelf2007, SashatoBot, 16@r, Hvn0413, Nabeth, Kvng, Hu12, Gandalf1491, J Di, DEddy, Ziusudra, George100, Stifynse- mons, Wolfdog, Sir Vicious, Kensall, Gregbard, FilipeS, Cydebot, Warhorus, ST47, Quibik, Nickleus, Gimmetrow, Thijs!bot, Wikid77, Runch, Mbell, Dalahäst, Azymuthca, X201, Nick Number, Mentifisto, AntiVandalBot, Shawn wiki, Gioto, Widefox, TimVickers, Dylan Lake, Danny lost, JAnDbot, MER-C, Shermanmonroe, Jmchambers90, Dcooper, .anacondabot, Daveh1, AndriesVanRenssen, Tmus- grove, Nicodemus13, Mahitgar, Revery~enwiki, Mechanismic, Ekotkie, MartinBot, J.delanoy, Cyborg Ninja, Piercetheorganist, Dbiel, Rod57, AKA MBG, Lygophile, Erick.Antezana, Lrunge, RasputinJSvengali, Macedonian, LokiClock, Philip Trueman, Amos Han, TXiKiBoT, Purpose Observatory, Aaeamdar, Goberiko~enwiki, HillarySco, Merijn2, Synthebot, Lova Falk, Cnilep, Jimbo2222, Lo- gan, Botev, SieBot, Nubiatech, Kgoarany, Asderff, PaulColby, Jerryobject, Yerpo, ScAvenger lv, Strife911, Bguest, MiNombreDeGuerra, Doc honcho, CharlesGillingham, Emptymountains, Martarius, ClueBot, Bbadree, Tanglewood4, Eklir, Niceguyedc, DragonBot, Awi007, PixelBot, Vanisheduser12345, Rhododendrites, MacedonianBoy, Cenarium, Aleksd, Micmachete, MystBot, Alanthehat, Addbot, Rdan- neskjold, The singapore ministry of education sucks, AVand, Guoguo12, Landon1980, Friginator, K1US, Aboctok, Ayatniazi, Cana- dianLinuxUser, Pirtskhalava, CarsracBot, Numbo3-bot, Erutuon, Tide rolls, JAHendler, Krixou, Legobot, Luckas-bot, TaBOT-zerem, Vanished user rt41as76lk, AnakngAraw, 8ung3st, Molewood6, Rockypedia, Rjanag, Govindmaheswaran, Jim1138, Materialscientist, Citation bot, LilHelpa, Xqbot, Hyggelig, Lynch9000s, Aenioc, JustinCope82, Omnipaedista, Benjamin Dominic, FrescoBot, Levalley, ,Jonkerz ,کاشف عقیل ,Citation bot 1, Mundart, Smithonian, Harold Philby, Pinethicket, Joost.b, RedBot, MastiBot, Nora lives, FoxBot Lotje, Theyetiman12345, RobotQuistnix, 2bluey, Mchcopl, Zegarad, EmausBot, Jefffi, Active Banana, Hpvpp, Alexey.kudinkin, Lla- mas4drama'10, Unreal7, SporkBot, Gabnh, Eric Biggs, Edunoramus, Kgsbot, Ready, Odysseus1479, Tijfo098, Manytexts, ClueBot NG, Squarrels, Aniketdalal, Movses-bot, Helpful Pixie Bot, BG19bot, 
BenSmak, Boblibr, Lawandeconomics1, Davidiad, Tom Pippens, Se- mantia, UnconsciousInferno, Darylgolden, Suraduttashandilya, Dave5702, Kevin12xd, Faizan, Bienmanchot, Ahernandez33, Didigodot, Noizy Boy, Sarahjane212013, Pavel Stankov, Csusarah, FelixRosch, Good afternoon, Nøkkenbuer, Spyker247, KasparBot, Vjpand and Anonymous: 277 • Signed graph Source: https://en.wikipedia.org/wiki/Signed_graph?oldid=624715547 Contributors: Michael Hardy, Giftlite, Mike Rosoft, ArnoldReinhold, Zaslav, Natalya, Rjwilmsi, Davepape, 1dragon, Jcarroll, Beetstra, Harrigan, Khamar, Blathnaid, Hermel, David Epp- stein, !dea4u, Addbot, Luckas-bot, Yobot, Twri, Labbottcmu, Sibbles, FrescoBot, Kiefer.Wolfowitz, RjwilmsiBot, EmausBot, Wcherowi, Briansdumb and Anonymous: 6 • Skew-symmetric graph Source: https://en.wikipedia.org/wiki/Skew-symmetric_graph?oldid=609519537 Contributors: Giftlite, Zaslav, Oliphaunt, Ketiltrout, Rjwilmsi, Chris the speller, TheTito, David Eppstein, Addbot, Citation bot, Twri, Xqbot, Miym, Citation bot 1 and Helpful Pixie Bot • Transpose graph Source: https://en.wikipedia.org/wiki/Transpose_graph?oldid=541280031 Contributors: Giftlite, Zaslav, Karelj, Cara- binieri, Chaotic Mind, Falnom1987, Vanish2, David Eppstein, R'n'B, VolkovBot, Demize, Addbot, Luckas-bot, Yobot, Twri, Makecat-bot and Anonymous: 1 30.5. TEXT AND IMAGE SOURCES, CONTRIBUTORS, AND LICENSES 135

30.5.2 Images
• File:4-tournament.svg Source: https://upload.wikimedia.org/wikipedia/commons/8/89/4-tournament.svg License: Public domain Contributors: Own work Original artist: Booyabazooka
• File:6n-graf.svg Source: https://upload.wikimedia.org/wikipedia/commons/5/5b/6n-graf.svg License: Public domain Contributors: Image:6n-graf.png, similar input data Original artist: User:AzaToth
• File:Ancient_Tamil_Script.jpg Source: https://upload.wikimedia.org/wikipedia/commons/6/69/Ancient_Tamil_Script.jpg License: CC BY 2.0 Contributors: Ancient Tamil Script Original artist: Symphoney Symphoney from New York, US
• File:Bidirected_graph_features.svg Source: https://upload.wikimedia.org/wikipedia/commons/6/68/Bidirected_graph_features.svg License: CC0 Contributors: Own work Original artist: David Eppstein
• File:Brain.png Source: https://upload.wikimedia.org/wikipedia/commons/7/73/Nicolas_P._Rougier%27s_rendering_of_the_human_brain.png License: GPL Contributors: http://www.loria.fr/~rougier Original artist: Nicolas Rougier
• File:Commons-logo.svg Source: https://upload.wikimedia.org/wikipedia/en/4/4a/Commons-logo.svg License: ? Contributors: ? Original artist: ?
• File:Cosine_fixed_point.svg Source: https://upload.wikimedia.org/wikipedia/commons/e/ea/Cosine_fixed_point.svg License: CC-BY-SA-3.0 Contributors: ? Original artist: ?
• File:Covering-graph-1.svg Source: https://upload.wikimedia.org/wikipedia/commons/c/ce/Covering-graph-1.svg License: CC BY-SA 3.0 Contributors: Own work Original artist: Miym
• File:Covering-graph-2.svg Source: https://upload.wikimedia.org/wikipedia/commons/2/2d/Covering-graph-2.svg License: CC BY-SA 3.0 Contributors: Own work Original artist: Miym
• File:Covering-graph-4.svg Source: https://upload.wikimedia.org/wikipedia/commons/9/98/Covering-graph-4.svg License: CC BY-SA 3.0 Contributors: Own work Original artist: Miym
• File:Directed.svg Source: https://upload.wikimedia.org/wikipedia/commons/a/a2/Directed.svg License: Public domain Contributors: ? Original artist: ?
• File:DirectedDegrees.svg Source: https://upload.wikimedia.org/wikipedia/commons/7/77/DirectedDegrees.svg License: GFDL Contributors: Own work Original artist: Melchoir
• File:Directed_acyclic_graph_2.svg Source: https://upload.wikimedia.org/wikipedia/commons/0/03/Directed_acyclic_graph_2.svg License: Public domain Contributors: Own work Original artist: Johannes Rössel (talk)
• File:Disambig_gray.svg Source: https://upload.wikimedia.org/wikipedia/en/5/5f/Disambig_gray.svg License: Cc-by-sa-3.0 Contributors: ? Original artist: ?
• File:Edit-clear.svg Source: https://upload.wikimedia.org/wikipedia/en/f/f2/Edit-clear.svg License: Public domain Contributors: The Tango! Desktop Project. Original artist: The people from the Tango! project. And according to the meta-data in the file, specifically: “Andreas Nilsson, and Jakub Steiner (although minimally).”
• File:Fixed_point_example.svg Source: https://upload.wikimedia.org/wikipedia/commons/2/20/Fixed_point_example.svg License: CC0 Contributors: Own work Original artist: Krishnavedala
• File:Folder_Hexagonal_Icon.svg Source: https://upload.wikimedia.org/wikipedia/en/4/48/Folder_Hexagonal_Icon.svg License: Cc-by-sa-3.0 Contributors: ? Original artist: ?
• File:Globelang.png Source: https://upload.wikimedia.org/wikipedia/commons/2/2f/Globelang.png License: Public domain Contributors: Original artist: User:Ikiroid
• File:Graph_isomorphism_a.svg Source: https://upload.wikimedia.org/wikipedia/commons/9/9a/Graph_isomorphism_a.svg License: CC-BY-SA-3.0 Contributors: ? Original artist: ?
• File:Graph_isomorphism_b.svg Source: https://upload.wikimedia.org/wikipedia/commons/8/84/Graph_isomorphism_b.svg License: CC-BY-SA-3.0 Contributors: ? Original artist: ?
• File:Implication_graph.svg Source: https://upload.wikimedia.org/wikipedia/commons/2/2f/Implication_graph.svg License: Public domain Contributors: Own work Original artist: David Eppstein
• File:Incidence_matrix_-_directed_graph.svg Source: https://upload.wikimedia.org/wikipedia/commons/5/50/Incidence_matrix_-_directed_graph.svg License: CC BY-SA 3.0 Contributors: Own work Original artist: Pistekjakub
• File:Internet_map_1024.jpg Source: https://upload.wikimedia.org/wikipedia/commons/d/d2/Internet_map_1024.jpg License: CC BY 2.5 Contributors: Originally from the English Wikipedia; description page is/was here. Original artist: The Opte Project
• File:Involution.svg Source: https://upload.wikimedia.org/wikipedia/commons/9/98/Involution.svg License: Public domain Contributors: self-made, with en:Inkscape Original artist: Oleg Alexandrov
• File:Konigsberg_bridges.png Source: https://upload.wikimedia.org/wikipedia/commons/5/5d/Konigsberg_bridges.png License: CC-BY-SA-3.0 Contributors: Public domain (PD), based on the image Original artist: Bogdan Giuşcă
• File:Logic_portal.svg Source: https://upload.wikimedia.org/wikipedia/commons/7/7c/Logic_portal.svg License: CC BY-SA 3.0 Contributors: Own work Original artist: Watchduck (a.k.a. Tilman Piesk)
• File:Logical_connectives_Hasse_diagram.svg Source: https://upload.wikimedia.org/wikipedia/commons/3/3e/Logical_connectives_Hasse_diagram.svg License: Public domain Contributors: Own work Original artist: Watchduck (a.k.a. Tilman Piesk)
• File:Mergefrom.svg Source: https://upload.wikimedia.org/wikipedia/commons/0/0f/Mergefrom.svg License: Public domain Contributors: ? Original artist: ?
• File:ParseTree.svg Source: https://upload.wikimedia.org/wikipedia/commons/6/6e/ParseTree.svg License: Public domain Contributors: en:Image:ParseTree.jpg Original artist: Traced by User:Stannered
• File:People_icon.svg Source: https://upload.wikimedia.org/wikipedia/commons/3/37/People_icon.svg License: CC0 Contributors: OpenClipart Original artist: OpenClipart
• File:Portal-puzzle.svg Source: https://upload.wikimedia.org/wikipedia/en/f/fd/Portal-puzzle.svg License: Public domain Contributors: ? Original artist: ?
• File:Psi2.svg Source: https://upload.wikimedia.org/wikipedia/commons/6/6c/Psi2.svg License: Public domain Contributors: ? Original artist: ?
• File:Question_book-new.svg Source: https://upload.wikimedia.org/wikipedia/en/9/99/Question_book-new.svg License: Cc-by-sa-3.0 Contributors: Created from scratch in Adobe Illustrator. Based on Image:Question book.png created by User:Equazcion Original artist: Tkgd2007
• File:Socrates.png Source: https://upload.wikimedia.org/wikipedia/commons/c/cd/Socrates.png License: Public domain Contributors: Transferred from en.wikipedia to Commons. Original artist: The original uploader was Magnus Manske at English Wikipedia. Later versions were uploaded by Optimager at en.wikipedia.
• File:Symbol_list_class.svg Source: https://upload.wikimedia.org/wikipedia/en/d/db/Symbol_list_class.svg License: Public domain Contributors: ? Original artist: ?
• File:Text_document_with_red_question_mark.svg Source: https://upload.wikimedia.org/wikipedia/commons/a/a4/Text_document_with_red_question_mark.svg License: Public domain Contributors: Created by bdesham with Inkscape; based upon Text-x-generic.svg from the Tango project. Original artist: Benjamin D. Esham (bdesham)
• File:Venn1011.svg Source: https://upload.wikimedia.org/wikipedia/commons/1/1e/Venn1011.svg License: Public domain Contributors: ? Original artist: ?
• File:Whitneys_theorem_exception.svg Source: https://upload.wikimedia.org/wikipedia/commons/e/ec/Whitneys_theorem_exception.svg License: Public domain Contributors: Complete_graph_K3.svg Original artist: User:Dcoetzee; derivative work: Dcoetzee (talk)
• File:Wikibooks-logo-en-noslogan.svg Source: https://upload.wikimedia.org/wikipedia/commons/d/df/Wikibooks-logo-en-noslogan.svg License: CC BY-SA 3.0 Contributors: Own work Original artist: User:Bastique, User:Ramac et al.
• File:Wikibooks-logo.svg Source: https://upload.wikimedia.org/wikipedia/commons/f/fa/Wikibooks-logo.svg License: CC BY-SA 3.0 Contributors: Own work Original artist: User:Bastique, User:Ramac et al.
• File:Wikinews-logo.svg Source: https://upload.wikimedia.org/wikipedia/commons/2/24/Wikinews-logo.svg License: CC BY-SA 3.0 Contributors: This is a cropped version of Image:Wikinews-logo-en.png. Original artist: Vectorized by Simon 01:05, 2 August 2006 (UTC). Updated by Time3000 17 April 2007 to use official Wikinews colours and appear correctly on dark backgrounds. Originally uploaded by Simon.
• File:Wikipedia_multilingual_network_graph_July_2013.svg Source: https://upload.wikimedia.org/wikipedia/commons/5/5b/Wikipedia_multilingual_network_graph_July_2013.svg License: CC BY-SA 3.0 Contributors: Own work Original artist: Computermacgyver
• File:Wikiquote-logo.svg Source: https://upload.wikimedia.org/wikipedia/commons/f/fa/Wikiquote-logo.svg License: Public domain Contributors: ? Original artist: ?
• File:Wikisource-logo.svg Source: https://upload.wikimedia.org/wikipedia/commons/4/4c/Wikisource-logo.svg License: CC BY-SA 3.0 Contributors: Rei-artur Original artist: Nicholas Moreau
• File:Wikiversity-logo-Snorky.svg Source: https://upload.wikimedia.org/wikipedia/commons/1/1b/Wikiversity-logo-en.svg License: CC BY-SA 3.0 Contributors: Own work Original artist: Snorky
• File:Wiktionary-logo-en.svg Source: https://upload.wikimedia.org/wikipedia/commons/f/f8/Wiktionary-logo-en.svg License: Public domain Contributors: Vector version of Image:Wiktionary-logo-en.png. Original artist: Vectorized by Fvasconcellos (talk · contribs), based on original logo tossed together by Brion Vibber

30.5.3 Content license

• Creative Commons Attribution-Share Alike 3.0