
Logic and Probability
Lecture 1: Probability as Logic

Wesley Holliday & Thomas Icard
UC Berkeley & Stanford

August 11, 2014
ESSLLI, Tübingen

Wesley Holliday & Thomas Icard: Logic and Probability, Lecture 1: Probability as Logic

Overview

Logic as a theory of:

- truth-preserving inference / deduction
- consistency
- rationality
- definability
- ...

Probability as a theory of:

- uncertain inference
- induction
- learning
- rationality
- information
- ...

Some questions and points of contact:

- In what ways might probability be said to extend logic?

- How do probability and various logical systems differ on what they say about rational inference?

- Can logic be used to gain a better understanding of probability? Say, through standard techniques of formalization, definability, questions of completeness, etc.?

- Can the respective advantages of logical representation and probabilistic learning and inference be combined to create more powerful reasoning systems?

Course Outline

- Day 1: Probability as (Extended) Logic

- Day 2: Probability, Nonmonotonicity, and Graphical Models

- Day 3: Beyond Boolean Logic

- Day 4: Qualitative Probability

- Day 5: Logical Dynamics

Interpretations of Probability

- Frequentist: Probabilities are about ‘limiting frequencies’ of events.

- Propensity: Probabilities are about physical dispositions, or propensities, of events.

- Logical: Probabilities are determined objectively via a logical language and some additional principles, e.g., of ‘symmetry’.

- Bayesian: Probabilities are subjective and reflect an agent’s degree of confidence concerning some event.

- ...

We will mostly remain neutral, focusing on mathematical questions, but will note when the interpretation is critical.

A measurable space is a pair (W, E), where:

- W is an arbitrary set;
- E is a σ-algebra over W, i.e., a subset of ℘(W) closed under complement and countable union.

A probability space is a triple (W, E, µ), with (W, E) a measurable space and µ : E → [0, 1] a function satisfying:

- µ(W) = 1 ;
- µ(E ∪ F) = µ(E) + µ(F), whenever E ∩ F = ∅ .

A random variable from (W, E, µ) to some other measurable space (V, F) is a function X : W → V such that: whenever X⁻¹(F) = E and F ∈ F, we have E ∈ E .

Then µ induces a natural distribution for X , typically written

P(X = v) = µ({w ∈ W : X (w) = v}) .
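When W is finite and E = ℘(W), these definitions can be checked mechanically. A minimal Python sketch (the die example, the uniform measure, and all names here are illustrative assumptions, not from the slides):

```python
from fractions import Fraction

# Finite sample space: outcomes of one die roll, with E = ℘(W).
W = {1, 2, 3, 4, 5, 6}

def mu(E):
    """Uniform probability measure on subsets of W."""
    return Fraction(len(E), len(W))

# A random variable X : W -> V, here the parity of the roll, V = {0, 1}.
def X(w):
    return w % 2

def pushforward(v):
    """Induced distribution: P(X = v) = mu({w in W : X(w) = v})."""
    return mu({w for w in W if X(w) == v})

assert mu(W) == 1                        # normalization
assert pushforward(0) == Fraction(1, 2)  # three even faces out of six
```

Since E here is the full power set, every preimage X⁻¹(F) is automatically measurable.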

Suppose we have a propositional logical language L generated as follows

ϕ ::= A | B | ... | ϕ ∧ ϕ | ¬ϕ

We can define a probability function P : L → [0, 1] directly on L, requiring:

- P(ϕ) = 1, whenever ⊨ ϕ ;

- P(ϕ ∨ ψ) = P(ϕ) + P(ψ), whenever ⊨ ¬(ϕ ∧ ψ) .

Equivalent set of requirements:

- P(ϕ) = 1 for any tautology ϕ ;

- P(ϕ) ≤ P(ψ) whenever ⊨ ϕ → ψ ;

- P(ϕ) = P(ϕ ∧ ψ) + P(ϕ ∧ ¬ψ) .

It is then easy to show:

- P(ϕ) = 0, whenever ⊨ ¬ϕ ;

- P(¬ϕ) = 1 − P(ϕ) ;

- P(ϕ ∨ ψ) = P(ϕ) + P(ψ) − P(ϕ ∧ ψ) ;

- A propositional valuation, sending each formula to 1 or 0, is a special case of a probability function ;

- A probability function on L gives rise to a standard probability measure over ‘world-states’, i.e., maximally consistent sets of formulas from L .
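The last point can be seen concretely: any distribution over the valuations of finitely many atoms induces a P on L satisfying the requirements above. A small Python sketch, assuming a home-made tuple encoding of formulas (the encoding and the weights are invented for illustration):

```python
from itertools import product

ATOMS = ['A', 'B']

def holds(phi, val):
    """Evaluate a formula at a valuation. Formulas are: an atom name,
    ('not', phi), or ('and', phi, psi)."""
    if isinstance(phi, str):
        return val[phi]
    if phi[0] == 'not':
        return not holds(phi[1], val)
    if phi[0] == 'and':
        return holds(phi[1], val) and holds(phi[2], val)
    raise ValueError(phi)

# A distribution over the four valuations ('world-states') of A, B.
valuations = [dict(zip(ATOMS, bits)) for bits in product([True, False], repeat=2)]
weights = [0.4, 0.3, 0.2, 0.1]  # arbitrary, sums to 1

def P(phi):
    """P(phi) = total weight of the valuations making phi true."""
    return sum(w for v, w in zip(valuations, weights) if holds(phi, v))

# Finite additivity on the exclusive pair A∧B and A∧¬B; their disjunction
# is encoded De Morgan-style as ¬(¬(A∧B) ∧ ¬(A∧¬B)).
disj = ('not', ('and', ('not', ('and', 'A', 'B')),
                ('not', ('and', 'A', ('not', 'B')))))
assert abs(P(disj) - (P(('and', 'A', 'B')) + P(('and', 'A', ('not', 'B'))))) < 1e-9
assert abs(P(disj) - P('A')) < 1e-9  # (A∧B) ∨ (A∧¬B) is equivalent to A
```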

With this setup we can easily define, e.g., conditional probability:

P(ϕ|ψ) = P(ϕ ∧ ψ) / P(ψ) ,

provided P(ψ) > 0. With this it is easy to prove Bayes’ Theorem:

Theorem (Bayes)

P(ϕ|ψ) = P(ψ|ϕ)P(ϕ) / P(ψ) .
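Both the definition and the theorem are easy to check numerically. A sketch with an arbitrary joint distribution over world-states (all numbers invented):

```python
# Joint distribution over four world-states, keyed by (phi, psi) truth values.
joint = {(True, True): 0.2, (True, False): 0.1,
         (False, True): 0.4, (False, False): 0.3}

def prob(pred):
    """Probability of the event picked out by pred."""
    return sum(w for state, w in joint.items() if pred(state))

P_phi = prob(lambda s: s[0])
P_psi = prob(lambda s: s[1])
P_phi_and_psi = prob(lambda s: s[0] and s[1])

# Conditional probability: P(phi|psi) = P(phi ∧ psi) / P(psi), P(psi) > 0.
P_phi_given_psi = P_phi_and_psi / P_psi
P_psi_given_phi = P_phi_and_psi / P_phi

# Bayes: P(phi|psi) = P(psi|phi) P(phi) / P(psi).
assert abs(P_phi_given_psi - P_psi_given_phi * P_phi / P_psi) < 1e-12
```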

Today’s Theme: Probability as Logic

Three senses in which probability has been claimed as part of logic:

1. Logical Deduction and Probability Preservation

2. Consistency and Fair Odds

3. Patterns of Plausible Reasoning

PART I: Logical Deduction and Probability Preservation

Logical Deduction and Probability Preservation

Suppose we can show Γ ⊢ ϕ in propositional logic, but the assumptions γ ∈ Γ themselves are uncertain.

Knowing about the probabilities of formulas in Γ, what can we conclude about the probability of ϕ?

Example (Adams & Levine)
A telephone survey is made of 1,000 people as to their political party affiliation. Suppose 629 people report being Democrats, with some possibility of error in each case. What can we say about the probability of the following conclusion?

At least 600 people are Democrats.


Proposition (Suppes 1966)

If Γ ⊢ ϕ, then for all P we have P(¬ϕ) ≤ ∑_{γ∈Γ} P(¬γ).
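The proposition can be spot-checked by brute force. A sketch assuming the instance Γ = {A, A → B} ⊢ B (the instance and the grid of test distributions are chosen here for illustration, not taken from the slides):

```python
from itertools import product

# World-states: truth values (A, B). Gamma = {A, A -> B} entails B.
states = list(product([True, False], repeat=2))

def check(weights):
    """Suppes bound P(¬B) <= P(¬A) + P(¬(A → B)) for one distribution."""
    P = dict(zip(states, weights))
    p_not_B = sum(w for (a, b), w in P.items() if not b)
    p_not_A = sum(w for (a, b), w in P.items() if not a)
    p_not_imp = sum(w for (a, b), w in P.items() if a and not b)  # ¬(A → B)
    return p_not_B <= p_not_A + p_not_imp + 1e-12

# Sweep a coarse grid of probability vectors (w1..w4 summing to 1).
grid = [i / 4 for i in range(5)]
ok = all(check((w1, w2, w3, 1 - w1 - w2 - w3))
         for w1, w2, w3 in product(grid, repeat=3)
         if w1 + w2 + w3 <= 1)
assert ok
```

In this instance the bound reduces to 0 ≤ P(¬A ∧ B), so it holds for every distribution, not just those on the grid.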


Example (Kyburg’s 1965 ‘Lottery Paradox’)
Imagine a fair lottery with n tickets. Let γ1, ... , γn be sentences:

γi : ticket i is the winner.

Each sentence ¬γi might seem reasonable, since it has probability (n − 1)/n. However, putting these all together rapidly decreases probability. Indeed,

P(⋀_{i≤n} ¬γi) = 0 .

This demonstrates that the (logical) conclusion of all these individually reasonable sentences—that none of the tickets will win the lottery—has probability 0.
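A direct computation, for an arbitrarily chosen n = 10:

```python
from fractions import Fraction

n = 10
worlds = range(n)          # world w: ticket w is the winner
p_win = Fraction(1, n)     # fair lottery: each world has probability 1/n

def gamma(i, w):
    """γi at world w: ticket i is the winner."""
    return w == i

# Each ¬γi individually has high probability (n-1)/n ...
p_not_gamma = 1 - p_win
assert p_not_gamma == Fraction(n - 1, n)

# ... but their conjunction says no ticket wins, which holds in no world:
p_conj = sum(p_win for w in worlds if all(not gamma(i, w) for i in worlds))
assert p_conj == 0
```

Suppes’ bound is consistent with this, but only barely informative here: P of the negated conclusion is 1, and the bound gives 1 ≤ n · (1/n) = 1.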


Definition (Suppes Entailment)
Γ ⊨s ϕ iff for all P:

P(¬ϕ) ≤ ∑_{γ∈Γ} P(¬γ) .

Definition (Adams Entailment)
Γ ⊨a ϕ iff for all ε > 0, there is a δ > 0, such that for all P:

if P(¬γ) < δ for all γ ∈ Γ, then P(¬ϕ) < ε .

Lemma
Γ ⊨s ϕ iff Γ ⊨a ϕ.


Theorem (Adams 1966)
Classical propositional logic is sound and complete for Suppes/Adams entailment.


The following are equivalent:

(a) Γ ⊢ ϕ   (b) Γ ⊨ ϕ   (c) Γ ⊨s ϕ   (d) Γ ⊨a ϕ .

Proof. (a) ⇔ (b) is just standard completeness.




For (b) ⇒ (c), suppose Γ ⊭s ϕ, so there is a P such that

1 − P(ϕ) > ∑_{γ∈Γ} P(¬γ) .

From this it follows that

1 > ∑_{γ∈Γ} P(¬γ) + P(ϕ) ≥ P(γ1 ∧ · · · ∧ γn → ϕ) ,

for any subset {γ1, ... , γn} ⊆ Γ. Since all tautologies must have probability 1, no such implication γ1 ∧ · · · ∧ γn → ϕ is a tautology. By compactness and the deduction theorem, it follows that Γ ⊬ ϕ, and hence, by completeness, Γ ⊭ ϕ.


For (c) ⇒ (d), suppose Γ ⊨s ϕ, and let ε > 0. If Γ is finite, choose

δ = ε / |Γ| .

Then, whenever P(¬γ) < δ for all γ ∈ Γ, we have

P(¬ϕ) ≤ ∑_{γ∈Γ} P(¬γ) < ∑_{γ∈Γ} δ = ε .

Hence P(¬ϕ) < ε, and Γ ⊨a ϕ. The case of infinite Γ is left as an exercise.


Finally, for (d) ⇒ (b), suppose Γ ⊭ ϕ. We can simply take a valuation witnessing Γ ⊭ ϕ, which is itself a probability function P making P(ϕ) = 0 and P(γ) = 1, for all γ ∈ Γ. Then P(¬γ) = 0 < δ for every δ > 0, while P(¬ϕ) = 1 ≥ ε for any ε ≤ 1. Hence Γ ⊭a ϕ.

(Note that essentially the same proof works to show (c) ⇒ (b).)

Thus we have shown (a) ⇔ (b) ⇒ (c) ⇒ (d) ⇒ (b). ∎


Example
In the telephone survey example, suppose each person’s statement has probability 0.999 of being true. Then the best upper bound we can get on the probability of the negation of the conclusion is

629 ∗ 0.001 = 0.629 .

That is, our best lower bound on the probability that there are at least 600 Democrats is 0.371. This seems weak.


Definition
Where Γ ⊨ ϕ, we say Γ′ ⊆ Γ is essential if Γ − Γ′ ⊭ ϕ.

Definition
Given Γ ⊨ ϕ, the degree of essentialness of γ ∈ Γ is given by

E(γ) = 1 / |Sγ| ,

where |Sγ| is the size of the smallest essential set containing γ.

Theorem (Adams & Levine 1975)
If Γ ⊢ ϕ, then

P(¬ϕ) ≤ ∑_{γ∈Γ} E(γ)P(¬γ) .


Example
Back to the telephone survey example, we can now obtain a much better lower bound on what seems to be a strong conclusion. The upper bound on the negation of the conclusion is now

629 ∗ 0.001 ∗ (1/30) ≈ 0.02 ,

making the lower bound on its probability just below 0.98.
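The arithmetic behind both bounds can be sketched as follows; the size-30 essential set reflects that removing any 29 of the 629 reports still leaves 600 of them, enough to entail the conclusion (variable names are invented for illustration):

```python
n_reports = 629    # people reporting Democrat
p_error = 0.001    # probability each individual report is wrong
threshold = 600    # conclusion: at least this many Democrats

# Plain Suppes bound: sum of the error probabilities.
suppes_bound = n_reports * p_error
assert abs(suppes_bound - 0.629) < 1e-12

# Adams-Levine: the entailment only breaks once 30 reports are removed
# (629 - 30 = 599 < 600), so the smallest essential set has 30 members.
smallest_essential = n_reports - threshold + 1
degree = 1 / smallest_essential  # E(γ) = 1/|Sγ|
adams_levine_bound = n_reports * p_error * degree

assert smallest_essential == 30
assert adams_levine_bound < 0.021  # lower bound on the conclusion ≈ 0.979
```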

Part II: Consistency and Fair Odds

Consistency and Fair Odds

We strive to make judgments as dispassionate, reflective, and wise as possible by a doctrine that shows where and how they intervene and lays bare possible inconsistencies between judgments. There is an instructive analogy between [deductive] logic, which convinces one that acceptance of some opinions as ‘certain’ entails the acceptance of others, and the theory of subjective probabilities, which similarly connects uncertain opinions.

–Bruno de Finetti, 1974


de Finetti’s central idea: Probabilities as chance-based odds

Example (Shakespeare, via Girotto & Gonzalez; Howson)
Speed: Sir Proteus, save you! Saw you my master?
Proteus: But now he parted hence, to embark for Milan.
Speed: Twenty to one, then, he is shipp’d already.

(Two Gentlemen of Verona, 1592)

Gloss: A bet that collects Y if he is gone, and pays X if not, will be fair, provided X : Y = 20 : 1. These are fair betting odds. The normalized betting odds, 20/21 and 1/21, are called betting quotients.


- S: stake
- 1ϕ: indicator function (= 1 if ϕ holds, 0 otherwise)
- p: betting quotient for ϕ

A bet on ϕ at (positive) stake S:
- Pay S ∗ p
- Receive S if ϕ is true, nothing otherwise

A bet against ϕ at (negative) stake S:
- Receive S ∗ p
- Pay S if ϕ is true, nothing otherwise

In both cases the value of the gamble is given by:

S(1ϕ − p)
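A sketch of the gamble’s value, checking that the bet has zero expected value exactly when the quotient p matches the probability of ϕ (function names and numbers are invented for illustration):

```python
def gamble_value(stake, phi_true, p):
    """Value S(1φ − p) of a bet at stake S and betting quotient p.
    Positive stake = bet on φ; negative stake = bet against φ."""
    indicator = 1 if phi_true else 0
    return stake * (indicator - p)

def expected_value(stake, prob_phi, p):
    """Expected value of the gamble when φ has probability prob_phi."""
    return (prob_phi * gamble_value(stake, True, p)
            + (1 - prob_phi) * gamble_value(stake, False, p))

# Fair (zero expected value) exactly when the quotient matches the probability:
assert abs(expected_value(stake=21, prob_phi=20/21, p=20/21)) < 1e-12
# Otherwise one side profits in expectation:
assert expected_value(stake=21, prob_phi=20/21, p=0.5) > 0
```

Algebraically, the expectation collapses to S(prob − p), which is why the sign of the stake decides which side benefits from a mismatched quotient.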


The probability calculus is about probabilities of logically complex propositions. What can we say about fair betting odds in this setting?

de Finetti’s critical assumption (cf. Skyrms 1975, Howson 2007, etc.):

(A) Fair gambles do not become collectively unfair when combined into a joint gamble.

Assumption (A)—a kind of agglomerativity assumption—will allow us to reason about probabilities, by reasoning about which gambles are fair on the basis of the assumed fairness of other gambles.


Proposition

1. If p is a fair betting quotient for ϕ, then 1 − p is fair for ¬ϕ.

2. If ⊨ ¬(ϕ ∧ ψ), with fair betting quotients p and q on ϕ and ψ, respectively, then p + q is a fair betting quotient for ϕ ∨ ψ.

Proof. For 1., note that

S(1ϕ − p) = −S[1¬ϕ − (1 − p)] .

For 2., note that

S(1ϕ − p) + S(1ψ − q) = S[1ϕ∨ψ − (p + q)] .

We now invoke principle (A). ∎


Fundamental idea:

Take fair betting quotients p for formulas ϕ to be probabilities P(ϕ).

The axioms of probability are thus axioms of consistency.


Dutch Book Theorem

If we conceive of principle (A) as telling us what combinations of bets a person ought to accept, then we can prove:

Theorem (de Finetti 1937)
Someone with judgments P : L → [0, 1] is subject to a sure loss, iff P is inconsistent with the probability axioms.



Example
Suppose ⊨ ¬(ϕ ∧ ψ), and P(ϕ ∨ ψ) = r, while P(ϕ) = p, P(ψ) = q, and r > p + q. Such a person would then sell bets:

−1(1ϕ − p) and − 1(1ψ − q)

while buying 1(1ϕ∨ψ − r) which results in a sure loss:

−r + p + q < 0 .
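The sure loss can be verified state by state; a sketch with the arbitrary values p = q = 0.2 and r = 0.5:

```python
def bet_value(stake, holds, quotient):
    """Value S(1φ − quotient) of a bet; negative stake = selling the bet."""
    return stake * ((1 if holds else 0) - quotient)

p, q, r = 0.2, 0.2, 0.5  # r > p + q: judgments violating additivity
assert r > p + q

# Since ⊨ ¬(φ ∧ ψ), the possible world-states are: only φ, only ψ, neither.
states = [(True, False), (False, True), (False, False)]

for phi, psi in states:
    net = (bet_value(-1, phi, p)           # sell the bet on φ
           + bet_value(-1, psi, q)         # sell the bet on ψ
           + bet_value(1, phi or psi, r))  # buy the bet on φ ∨ ψ
    # The net value is p + q − r < 0 in every state: a sure loss.
    assert abs(net - (p + q - r)) < 1e-12

assert p + q - r < 0
```

Mutual exclusivity matters here: it is what makes 1ϕ∨ψ = 1ϕ + 1ψ, so the three bets cancel to the constant p + q − r.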


Suppose P : L0 → [0, 1] is defined on some finite sublanguage L0 ⊆ L.

Theorem (de Finetti 1974)
The following are equivalent:

1. P does not conflict with the probability axioms (i.e., is consistent).
2. P is the restriction of a probability function on all of L.
3. No system of bets with odds as given by P leads to certain loss.

In fact, de Finetti proved something stronger, for general (potentially uncountable) probability spaces.

Many have argued that 3. is merely for dramatic effect, with consistency the fundamental notion. (See especially papers by Colin Howson.)

Part III: Patterns of Plausible Reasoning

Patterns of Plausible Reasoning

Our theme is simply: Probability Theory as Extended Logic. The “new” perception amounts to the recognition that the mathematical rules of probability theory are not merely rules for calculating frequencies of “random variables”; they are also the unique consistent rules for conducting inference (i.e. plausible reasoning) of any kind . . .

–E. T. Jaynes, 1995


Deductive inference

All wombats are wild animals
No wild animals snore
∴ No wombats snore

José is either in Frankfurt or Paris
José is not in Paris
∴ José is in Frankfurt

...


Plausible Inference

Example (Jaynes 1995)
Suppose some dark night a policeman walks down a street, apparently deserted; but suddenly he hears a burglar alarm, looks across the street, and sees a jewelry store with a broken window. Then a gentleman wearing a mask comes crawling out through the broken window, carrying a bag which turns out to be full of expensive jewelry. The policeman doesn’t hesitate at all in deciding that this gentleman is dishonest.

ϕ being true makes ψ more plausible
ψ is true
∴ ϕ becomes more plausible

(N.B. This is one of Pólya’s (1954) five patterns of plausible inference.)


Jaynes’ three assumptions guiding a theory of probability:

1. Degrees of plausibility should be real numbers.

2. We must respect logic in requiring consistency of assessment.

3. The aim is to capture intuitive reasoning patterns about plausibility.

ϕ|ψ : How plausible is ϕ, given that ψ is true?


ϕ ∧ ψ|χ

- In order for ϕ ∧ ψ to be true, ψ should be true. Thus ψ|χ should somehow be involved.

- If ψ is true, we still need ϕ to be true as well. So ϕ|ψ ∧ χ should somehow be involved.

- Thus, we conclude that there should be some monotonic function F : R × R → R, for which

(ϕ ∧ ψ|χ) = F((ψ|χ), (ϕ|ψ ∧ χ)) .

- Consistency requires that ϕ ∧ ψ|χ and ψ ∧ ϕ|χ coincide, and thus

F((ψ|χ), (ϕ|ψ ∧ χ)) = F((ϕ|χ), (ψ|ϕ ∧ χ)) .


As a matter of logic,

⊨ ((ϕ ∧ ψ) ∧ χ) ↔ (ϕ ∧ (ψ ∧ χ)) .

Hence consistency requires that

(ϕ ∧ ψ) ∧ χ | δ = ϕ ∧ (ψ ∧ χ) | δ ,

i.e., that

F(F((χ|δ), (ψ|χ ∧ δ)), (ϕ|ψ ∧ χ ∧ δ)) = F((χ|δ), F((ψ|χ ∧ δ), (ϕ|ψ ∧ χ ∧ δ))) .

Or more generally,

F(F(x, y), z) = F(x, F(y, z)) .


Lemma (Cox 1946; Jaynes 1995)
Under certain further assumptions about F—e.g., that it is continuous—there is a unique set of solutions to these requirements. Namely, there must be a function w : R → R such that

w(ϕ ∧ ψ|χ) = w(ψ|χ)w(ϕ|ψ ∧ χ) .


w(ϕ ∧ ψ|χ) = w(ψ|χ)w(ϕ|ψ ∧ χ) (1)

Evidently, if ϕ is certain, then ϕ ∧ ψ|χ = ψ|χ and ϕ|ψ ∧ χ = ϕ|χ . Hence, plugging into (1) we obtain w(ψ|χ) = w(ϕ|χ)w(ψ|χ), i.e.,

w(ϕ|χ) = 1.

Likewise, if ϕ is impossible, then ϕ ∧ ψ|χ = ϕ|χ and ϕ|ψ ∧ χ = ϕ|χ. Hence, again appealing to (1), we obtain w(ϕ|χ) = w(ψ|χ)w(ϕ|χ); since this must hold for arbitrary ψ, it follows that

w(ϕ|χ) = 0.

That is, in any case, we always have 0 ≤ w(ϕ|χ) ≤ 1.


Interim Summary

- Under basic assumptions, we have shown that there should always be weights w for plausibilities ϕ|ψ between 0 and 1.

- These weights should obey the following equation:

w(ϕ ∧ ψ|χ) = w(ψ|χ)w(ϕ|ψ ∧ χ) .

- What about w(¬ϕ|ψ)?


Jaynes argues that there should be some function S such that:

w(¬ϕ|ψ) = S(w(ϕ|ψ)) ,

with S continuous and monotone decreasing.

Furthermore, by consistency we must have

w(ϕ ∧ ψ|χ) = w(ϕ|χ)S(w(¬ψ|ϕ ∧ χ)) .

Lemma (Cox 1946; Jaynes 1995)
The only way of satisfying these requirements is if

w(¬ϕ|ψ) = 1 − w(ϕ|ψ) .


Theorem (Cox 1946; Jaynes 1995)
The requirements discussed above ensure there is a function P : L × L → [0, 1] satisfying:

- 0 ≤ P(ϕ|ψ) ≤ 1 ;

- P(¬ϕ|ψ) = 1 − P(ϕ|ψ) ;

- P(ϕ ∧ ψ|χ) = P(ψ|χ)P(ϕ|ψ ∧ χ) .

As Jaynes says, we can now use logic and the rules above to assign plausibilities/probabilities to any complex sentence on the basis of those assigned to simpler sentences.


Example (Sum Rule)

P(ϕ ∨ ψ|χ) = 1 − P(¬ϕ ∧ ¬ψ|χ)
           = 1 − P(¬ϕ|χ)P(¬ψ|¬ϕ ∧ χ)
           = 1 − P(¬ϕ|χ)[1 − P(ψ|¬ϕ ∧ χ)]
           = P(ϕ|χ) + P(¬ϕ ∧ ψ|χ)
           = P(ϕ|χ) + P(ψ|χ)P(¬ϕ|ψ ∧ χ)
           = P(ϕ|χ) + P(ψ|χ)[1 − P(ϕ|ψ ∧ χ)]
           = P(ϕ|χ) + P(ψ|χ) − P(ψ|χ)P(ϕ|ψ ∧ χ)
           = P(ϕ|χ) + P(ψ|χ) − P(ϕ ∧ ψ|χ) .

∎
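Each identity in the chain holds for conditional probabilities computed from any joint distribution, so the sum rule can be checked numerically (the random joint over (ϕ, ψ, χ)-states is just a testing device, not part of the derivation):

```python
from itertools import product
import random

random.seed(0)
# A random joint distribution over the eight (phi, psi, chi) truth-value states.
states = list(product([True, False], repeat=3))
raw = [random.random() for _ in states]
total = sum(raw)
joint = {s: x / total for s, x in zip(states, raw)}

def P(event, given=lambda s: True):
    """Conditional probability of event given the conditioning set."""
    num = sum(p for s, p in joint.items() if event(s) and given(s))
    den = sum(p for s, p in joint.items() if given(s))
    return num / den

phi = lambda s: s[0]
psi = lambda s: s[1]
chi = lambda s: s[2]

# Sum rule: P(φ ∨ ψ|χ) = P(φ|χ) + P(ψ|χ) − P(φ ∧ ψ|χ).
lhs = P(lambda s: phi(s) or psi(s), given=chi)
rhs = (P(phi, given=chi) + P(psi, given=chi)
       - P(lambda s: phi(s) and psi(s), given=chi))
assert abs(lhs - rhs) < 1e-12
```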


Example (Jewelry Thief)
Returning to the other example, recall the pattern we wanted to capture:

ϕ being true makes ψ more plausible
ψ is true
∴ ϕ becomes more plausible

This now comes out as a natural result of the calculus. For, suppose

P(ψ|ϕ ∧ χ) > P(ψ|χ) .

Then, because it is now derivable that

P(ϕ|ψ ∧ χ) = P(ϕ|χ) P(ψ|ϕ ∧ χ) / P(ψ|χ) ,

it follows that

P(ϕ|ψ ∧ χ) > P(ϕ|χ) .
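The qualitative pattern can be confirmed on any joint distribution in which ϕ raises the probability of ψ; a sketch with invented numbers (the background χ is left implicit):

```python
# Joint distribution over (phi, psi), chosen so that phi raises the
# probability of psi.
joint = {(True, True): 0.30, (True, False): 0.10,
         (False, True): 0.15, (False, False): 0.45}
assert abs(sum(joint.values()) - 1) < 1e-12

P_phi = sum(w for (a, b), w in joint.items() if a)        # 0.40
P_psi = sum(w for (a, b), w in joint.items() if b)        # 0.45
P_both = joint[(True, True)]                              # 0.30

P_psi_given_phi = P_both / P_phi
P_phi_given_psi = P_both / P_psi

# phi makes psi more plausible ...
assert P_psi_given_phi > P_psi
# ... and by P(φ|ψ) = P(φ) P(ψ|φ) / P(ψ), psi then makes phi more plausible:
assert abs(P_phi_given_psi - P_phi * P_psi_given_phi / P_psi) < 1e-12
assert P_phi_given_psi > P_phi
```

The inequality is forced by the derived identity: multiplying P(ϕ|χ) by a ratio greater than 1 can only increase it.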

Summary and Preview

- Many have claimed that probability should be seen as a part of (extended) logic. We have seen three sources of such claims:

  • Logical inference preserves not just truth, but also probability.

  • Probability, like logic, is about consistency.

  • Probability is the theory of reasonable inference, going beyond logic.

- Next time we will look more closely at these ‘patterns of reasonable inference’, focusing on the relation between probabilistic notions and logical systems concerned with nonmonotonicity.
