Probabilistic logic (also probability logic and probabilistic reasoning) involves the use of probability and logic to deal with uncertain situations. Probabilistic logic extends traditional logic truth tables with probabilistic expressions. A difficulty of probabilistic logics is their tendency to multiply the computational complexities of their probabilistic and logical components. Other difficulties include the possibility of counter-intuitive results, such as in the case of belief fusion in Dempster–Shafer theory. Source trust and epistemic uncertainty about the probabilities sources provide, as modelled for example in subjective logic, are additional elements to consider. The need to deal with a broad variety of contexts and issues has led to many different proposals.
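As a small, generic illustration of how truth-table entries become probabilistic expressions (a sketch not tied to any particular system discussed below): when only the marginal probabilities of two sentences are known, the probability of a compound sentence is constrained to an interval rather than fixed to a single value.

    # Generic illustration: bounds on compound sentences given only marginals.
    def conjunction_bounds(p_a, p_b):
        # P(A and B) lies between max(0, P(A)+P(B)-1) and min(P(A), P(B)).
        return max(0.0, p_a + p_b - 1.0), min(p_a, p_b)

    def disjunction_bounds(p_a, p_b):
        # P(A or B) lies between max(P(A), P(B)) and min(1, P(A)+P(B)).
        return max(p_a, p_b), min(1.0, p_a + p_b)

    print(conjunction_bounds(0.7, 0.5))  # (0.2, 0.5)
    print(disjunction_bounds(0.7, 0.5))  # (0.7, 1.0)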

Logical background

There are numerous proposals for probabilistic logics. Very roughly, they can be categorized into two different classes: those logics that attempt to make a probabilistic extension to logical entailment, such as Markov logic networks, and those that attempt to address the problems of uncertainty and lack of evidence (evidentiary logics).

That the concept of probability can have different meanings may be understood by noting that, despite the mathematization of probability in the Enlightenment, mathematical probability theory remains, to this very day, entirely unused in criminal courtrooms when evaluating the "probability" of the guilt of a suspected criminal.[1]

More precisely, in evidentiary logic there is a need to distinguish the objective truth of a statement from our decision about the truth of that statement, which in turn must be distinguished from our confidence in its truth: thus, a suspect's actual guilt is not necessarily the same as the judge's decision on guilt, which in turn is not the same as assigning a numerical probability to the commission of the crime and deciding whether it exceeds a numerical threshold of guilt. The verdict on a single suspect may be guilty or not guilty with some uncertainty, just as the flip of a coin may be predicted as heads or tails with some uncertainty. Given a large collection of suspects, a certain percentage may be guilty, just as the probability of flipping "heads" is one-half. However, it is incorrect to apply this law of averages to a single criminal (or a single coin flip): the criminal is no more "a little bit guilty" than a single coin flip is "a little bit heads and a little bit tails"; we are merely uncertain as to which it is. Expressing uncertainty as a numerical probability may be acceptable when making scientific measurements of physical quantities, but it is merely a mathematical model of the uncertainty we perceive in the context of "common sense" reasoning and logic. Just as in courtroom reasoning, the goal of employing uncertain inference is to gather evidence to strengthen the confidence in a proposition, rather than to perform some sort of probabilistic entailment.

Historical context

Historically, attempts to quantify probabilistic reasoning date back to antiquity. There was a particularly strong interest starting in the 12th century with the work of the Scholastics: the invention of the half-proof (so that two half-proofs are sufficient to prove guilt), the elucidation of moral certainty (sufficient certainty to act upon, but short of absolute certainty), the development of Catholic probabilism (the idea that it is always safe to follow the established rules of doctrine or the opinion of experts, even when they are less probable), the case-based reasoning of casuistry, and the scandal of Laxism (whereby probabilism was used to support almost any statement at all, it being possible to find an expert opinion in favour of almost any proposition).[1]

Modern proposals

Below is a list of proposals for probabilistic and evidentiary extensions to classical and predicate logic.

  • The term "probabilistic logic" was first used by John von Neumann in a series of Caltech lectures in 1952 and in the 1956 paper "Probabilistic logics and the synthesis of reliable organisms from unreliable components", and subsequently in a 1986 paper by Nils Nilsson, where the truth values of sentences are probabilities.[2] The proposed semantic generalization induces a probabilistic logical entailment, which reduces to ordinary logical entailment when the probabilities of all sentences are either 0 or 1. This generalization applies to any logical system for which the consistency of a finite set of sentences can be established (a minimal sketch of this style of entailment appears after this list).
  • The central concept in the theory of subjective logic[3] is that of opinions about some of the propositional variables involved in the given logical sentences. A binomial opinion applies to a single proposition and is represented as a 3-dimensional extension of a single probability value, expressing both probabilistic and epistemic uncertainty about the truth of the proposition (an illustrative sketch of this representation appears after this list). For the computation of derived opinions based on a structure of argument opinions, the theory proposes operators for the various logical connectives, such as multiplication (AND), comultiplication (OR), division (UN-AND) and co-division (UN-OR) of opinions,[4] conditional deduction (MP) and abduction (MT),[5] as well as Bayes' theorem.[6]
  • The approximate reasoning formalism proposed by fuzzy logic can be used to obtain a logic in which the models are the probability distributions and the theories are the lower envelopes.[7] In such a logic, the question of the consistency of the available information is strictly related to that of the coherence of a partial probabilistic assignment, and therefore to Dutch book phenomena.
  • Markov logic networks implement a form of uncertain inference based on the maximum entropy principle: the idea that probabilities should be assigned in such a way as to maximize entropy, in analogy with the way that Markov chains assign probabilities to finite-state machine transitions (see the log-linear sketch after this list).
  • Systems such as Ben Goertzel's Probabilistic Logic Networks (PLN) add an explicit confidence ranking, as well as a probability, to atoms and sentences. The rules of deduction and induction incorporate this uncertainty, thus side-stepping difficulties in purely Bayesian approaches to logic (including Markov logic), while also avoiding the paradoxes of Dempster–Shafer theory. The implementation of PLN attempts to use and generalize algorithms from logic programming, subject to these extensions.
  • In the field of probabilistic argumentation, various formal frameworks have been put forward. The framework of "probabilistic labellings",[8] for example, refers to probability spaces where a sample space is a set of labellings of argumentation graphs. In the framework of "probabilistic argumentation systems",[9][10] probabilities are not directly attached to arguments or logical sentences. Instead, it is assumed that a particular subset of the variables involved in the sentences defines a probability space over the corresponding sub-σ-algebra. This induces two distinct probability measures, called degree of support and degree of possibility, respectively (a toy example appears after this list). Degrees of support can be regarded as non-additive probabilities of provability, which generalizes both ordinary logical entailment (when none of the variables are probabilistic) and classical posterior probabilities (when all of them are). Mathematically, this view is compatible with Dempster–Shafer theory.
  • The theory of evidential reasoning[11] also defines non-additive probabilities of probability (or epistemic probabilities) as a general notion for both logical entailment (provability) and probability. The idea is to augment standard propositional logic by considering an epistemic operator K that represents the state of knowledge that a rational agent has about the world. Probabilities are then defined over the resulting epistemic universe Kp of all propositional sentences p, and it is argued that this is the best information available to an analyst. From this view, Dempster–Shafer theory appears to be a generalized form of probabilistic reasoning.
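As a minimal sketch of the possible-worlds reading of Nilsson-style probabilistic entailment mentioned above (the function and variable names are illustrative, and SciPy is assumed for the linear programming step; none of this is prescribed by the paper): premise probabilities constrain a distribution over possible worlds, and the tightest entailed bounds on a query sentence are obtained by minimizing and maximizing its probability over all distributions satisfying those constraints.

    # Illustrative sketch, not Nilsson's own code: bound the probability of a
    # query sentence given probabilities of premise sentences, by linear
    # programming over the probabilities of possible worlds.
    from itertools import product
    from scipy.optimize import linprog

    def entailed_bounds(atoms, premises, query):
        # premises: list of (sentence, probability); each sentence and the
        # query map a world (dict of atom -> bool) to True/False.
        worlds = [dict(zip(atoms, vals)) for vals in product([False, True], repeat=len(atoms))]
        # Constraints: sum_w P(w)*[sentence true in w] = p, and sum_w P(w) = 1.
        a_eq = [[1.0 if s(w) else 0.0 for w in worlds] for s, _ in premises]
        a_eq.append([1.0] * len(worlds))
        b_eq = [p for _, p in premises] + [1.0]
        q = [1.0 if query(w) else 0.0 for w in worlds]
        lo = linprog(q, A_eq=a_eq, b_eq=b_eq, bounds=(0, 1))
        hi = linprog([-v for v in q], A_eq=a_eq, b_eq=b_eq, bounds=(0, 1))
        return lo.fun, -hi.fun

    # Example: P(a) = 0.7 and P(a -> b) = 0.9 entail 0.6 <= P(b) <= 0.9.
    print(entailed_bounds(
        atoms=["a", "b"],
        premises=[(lambda w: w["a"], 0.7), (lambda w: (not w["a"]) or w["b"], 0.9)],
        query=lambda w: w["b"]))

When all premise probabilities are 0 or 1, the feasible worlds are exactly the classical models of the premises, and the bounds collapse to ordinary entailment.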
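The binomial opinions of subjective logic mentioned above can be illustrated with the following sketch (the class and field names are illustrative, not taken from the cited book, and only the representation and its projection to a probability are shown; the operator definitions are given in Jøsang's book):

    from dataclasses import dataclass

    @dataclass
    class BinomialOpinion:
        belief: float       # mass supporting the proposition being true
        disbelief: float    # mass supporting the proposition being false
        uncertainty: float  # uncommitted mass; belief + disbelief + uncertainty == 1
        base_rate: float    # prior probability used in the absence of evidence

        def projected_probability(self):
            # The probability expectation of the opinion: P = b + a*u.
            return self.belief + self.base_rate * self.uncertainty

    # A dogmatic opinion (uncertainty 0) behaves like an ordinary probability,
    # while a vacuous opinion (uncertainty 1) falls back on the base rate alone.
    print(BinomialOpinion(0.6, 0.2, 0.2, 0.5).projected_probability())  # 0.7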
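The maximum-entropy inference of Markov logic networks mentioned above rests on a log-linear model over possible worlds: the probability of a world is proportional to the exponential of the weighted count of satisfied formula groundings, P(x) ∝ exp(Σ_i w_i n_i(x)). The toy domain below is purely illustrative:

    import math
    from itertools import product

    # Illustrative toy network: two ground atoms and one weighted ground formula
    # "Smokes(A) => Cancer(A)" with weight 1.5 (the domain and weight are made up).
    atoms = ["Smokes(A)", "Cancer(A)"]
    formulas = [(lambda w: (not w["Smokes(A)"]) or w["Cancer(A)"], 1.5)]

    def total_weight(world):
        # Sum of the weights of the formulas satisfied in this world.
        return sum(wt for f, wt in formulas if f(world))

    worlds = [dict(zip(atoms, vals)) for vals in product([False, True], repeat=len(atoms))]
    z = sum(math.exp(total_weight(w)) for w in worlds)   # partition function
    probs = {tuple(w.items()): math.exp(total_weight(w)) / z for w in worlds}
    # The single world violating the implication is penalized by a factor exp(1.5)
    # relative to the others, but it is not ruled out as it would be in hard logic.
    print(probs)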
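Degrees of support and possibility in probabilistic argumentation systems, mentioned above, can be illustrated with a deliberately simple toy example (the scenario encoding and all names are illustrative): probabilistic assumptions define scenarios, scenarios that contradict the knowledge base are conditioned away, the degree of support of a hypothesis is the probability that a remaining scenario entails it, and the degree of possibility is one minus the degree of support of its negation.

    from itertools import product

    # Toy knowledge base (illustrative): witness 1 asserts h, witness 2 asserts
    # not-h, and a reliable witness's testimony holds. The witnesses'
    # reliabilities are the probabilistic assumptions.
    assumptions = {"rel1": 0.8, "rel2": 0.3}

    def consequences(scenario):
        # Literals entailed by the knowledge base under this scenario; a scenario
        # entailing both "h" and "not h" is inconsistent.
        out = set()
        if scenario["rel1"]:
            out.add("h")
        if scenario["rel2"]:
            out.add("not h")
        return out

    def scenario_prob(scenario):
        p = 1.0
        for name, prob in assumptions.items():
            p *= prob if scenario[name] else 1.0 - prob
        return p

    scenarios = [dict(zip(assumptions, vals)) for vals in product([False, True], repeat=len(assumptions))]
    p_inconsistent = sum(scenario_prob(s) for s in scenarios if {"h", "not h"} <= consequences(s))

    def degree_of_support(literal):
        supporting = sum(scenario_prob(s) for s in scenarios
                         if literal in consequences(s) and not {"h", "not h"} <= consequences(s))
        return supporting / (1.0 - p_inconsistent)   # condition on consistency

    print(degree_of_support("h"))            # about 0.737
    print(1.0 - degree_of_support("not h"))  # degree of possibility of h, about 0.921

The resulting pair behaves like belief and plausibility in Dempster–Shafer theory, which is the compatibility noted above.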

References

  1. James Franklin, The Science of Conjecture: Evidence and Probability before Pascal, The Johns Hopkins Press, 2001. ISBN 0-8018-7109-3.
  2. Nilsson, N. J., 1986, "Probabilistic logic," Artificial Intelligence 28(1): 71–87.
  3. Jøsang, A., Subjective Logic: A Formalism for Reasoning under Uncertainty, Springer Verlag, 2016.
  4. Jøsang, A. and McAnally, D., 2004, "Multiplication and Comultiplication of Beliefs," International Journal of Approximate Reasoning, 38(1): 19–51.
  5. Jøsang, A., 2008, "Conditional Reasoning with Subjective Logic," Journal of Multiple-Valued Logic and Soft Computing, 15(1): 5–38.
  6. Jøsang, A., "Generalising Bayes' Theorem in Subjective Logic," 2016 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI 2016), Baden-Baden, Germany, 2016.
  7. Gerla, G., 1994, "Inferences in Probability Logic," Artificial Intelligence 70(1–2): 33–52.
  8. Riveret, R.; Baroni, P.; Gao, Y.; Governatori, G.; Rotolo, A.; Sartor, G., 2018, "A Labelling Framework for Probabilistic Argumentation," Annals of Mathematics and Artificial Intelligence, 83: 221–287.
  9. Kohlas, J. and Monney, P.A., 1995, A Mathematical Theory of Hints: An Approach to the Dempster–Shafer Theory of Evidence, Vol. 425 in Lecture Notes in Economics and Mathematical Systems, Springer Verlag.
  10. Haenni, R., 2005, "Towards a Unifying Theory of Logical and Probabilistic Reasoning," ISIPTA'05, 4th International Symposium on Imprecise Probabilities and Their Applications: 193–202.
  11. Ruspini, E.H., Lowrance, J., and Strat, T., 1992, "Understanding evidential reasoning," International Journal of Approximate Reasoning, 6(3): 401–424.
