Frans Voorbraak
My name is Frans Voorbraak, and I am a postdoc working at the department of Mathematics, Computer Science, Physics and Astronomy (WINS) of the University of Amsterdam. The main topic of my research is reasoning with uncertainty in AI. Presently, this research is carried out in the context of the NWO Pionier-project Reasoning with Uncertainty. I started my research in this field by writing a PhD thesis at Utrecht University on epistemic logic and uncertainty. After joining the Pionier-project, I became specifically interested in reasoning with uncertainty in robotics. I am currently studying what I call Partial Probability Theory, which is a formalism that generalizes probability theory by allowing the probability measures to be only partially determined.
This page contains abstracts of and links to recent papers, and a list of my main publications. Some of my technical reports can be found here.
The topics of my PhD thesis are epistemic logic and reasoning with uncertainty. The thesis appeared in 1993 as volume 7 of the series Quaestiones Infinitae, published by the Department of Philosophy, Utrecht University. A short description of its contents can be found below. If you want to have a copy of the thesis, please mail me.
As Far as I Know. Epistemic Logic and Uncertainty
Abstract: This thesis is about epistemic logic and reasoning
with uncertainty. Epistemic logic is the logical study of notions
such as knowledge and belief. The study of these notions has a long
history in philosophy, and, more recently, researchers in computer
science and artificial intelligence (AI) have become interested in
the formal description of the properties of epistemic notions.
Possible applications are the formal specification of distributed
systems and the formal description of knowledge-based systems.
Since the available information is often insufficient to arrive at
certain knowledge or convictions, intelligent behaviour typically
requires the handling of uncertainty. Hence it is no surprise that AI
researchers pay much attention to formalisms for reasoning with
uncertainty. In this thesis we study (some aspects of) reasoning with
numeric degrees of certainty (Probabilistic Reasoning) and reasoning
with defeasible, plausible conclusions based on incomplete
information (Nonmonotonic Reasoning).
This thesis consists of four chapters which can in principle be read
independently. Chapter 1 contains some preliminary remarks and an
overview of the three research areas which are discussed in this
thesis, namely Epistemic Logic, Probabilistic Reasoning, and
Nonmonotonic Reasoning. This chapter is primarily intended for
readers who are not familiar with (all) the research areas mentioned
above, and can be skipped by those who are.
In chapter 2 (Epistemic Logic) we argue that there are many
interesting epistemic notions and we propose generalized Kripke
models as a tool for a systematic, semantic approach to these
different notions. In addition to, or in a sense as a result of, this
more systematic approach to epistemic logic we obtain as our main
results
Chapter 3 (Probabilistic Reasoning) includes a discussion of modal treatments of probabilistic belief, and some arguments in favour of interval-valued probability theory and against the popular Dempster-Shafer theory. The main results are
In chapter 4 (Nonmonotonic Reasoning) we discuss an epistemic variant of the preferential semantics for nonmonotonic logics and we draw attention to interesting variants of two major nonmonotonic formalisms. The main results are
At ESSLLI'96, the European Summer School on Logic, Language, and Information (August 12-23, Prague), I presented the following paper at the workshop Quantitative and Symbolic Approaches to Uncertainty.
Epistemic Logic and Uncertainty
(ps-file)
Abstract: The paper discusses the application of epistemic
logic to the logical study of reasoning with uncertain beliefs.
Following a brief overview of standard epistemic logic, which
typically deals with full or certain belief, several extensions to
weaker notions of belief are introduced. Various proposals to
formalize probabilistic reasoning as a modal logic are reviewed.
Further, the relation between inner measures and belief functions
from Dempster-Shafer theory is clarified.
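To make this relation concrete (a standard formulation of a well-known result, not necessarily the exact one used in the paper): if a probability measure P is defined only on a subalgebra of measurable events, its inner measure

\[
P_*(A) \;=\; \sup\{\, P(B) : B \subseteq A,\ B \text{ measurable} \,\}
\]

assigns to an arbitrary set A the largest probability guaranteed by the measurable evidence contained in A, and P_* so defined is a belief function in the sense of Dempster-Shafer theory.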
In December 1995, members of the Pionier-project Reasoning with Uncertainty and researchers from the robotics group of the University of Amsterdam organized the international workshop RUR '95 (Reasoning with Uncertainty in Robotics). The proceedings of the workshop were published by Springer:
Reasoning with Uncertainty in Robotics. Proceedings International Workshop RUR '95, Leo Dorst, Michiel van Lambalgen, Frans Voorbraak (eds.). Lecture Notes in Artificial Intelligence Vol. 1093. Springer, Berlin. VIII, 387 pages. 1996. (Contents.)
At this workshop, I presented the following tutorial paper on reasoning with uncertainty in AI, which is published on pages 52-90 of the proceedings.
Reasoning with Uncertainty in AI
(ps-file)
Abstract: This paper provides an introduction to the field of
reasoning with uncertainty in Artificial Intelligence (AI), with an
emphasis on reasoning with numeric uncertainty. The considered
formalisms are Probability Theory and some of its generalizations,
the Certainty Factor Model, Dempster-Shafer Theory, and Probabilistic
Networks.
In the ideal case, the available evidence allows an ideally rational agent to represent his degrees of belief by means of a probability function. In general, the situation is less ideal, and the evidence only partially determines the probabilities.
In Partial Probability Theory (PPT), belief states of ideally rational agents are represented by means of constraints on probability measures, without assuming that the constraints determine a unique probability function. The assumption of a uniquely determined probability function has been attacked by many other researchers, in particular by the proponents of Dempster-Shafer theory (DS theory). However, PPT retains a clear probabilistic interpretation, and it is more general than the lower and upper probability interpretation of DS theory.
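As a minimal illustration of a partially determined probability (a toy example, not one taken from the papers below): let Ω = {a, b, c} and suppose the evidence yields only the constraints P({a}) ≥ 0.5 and P({c}) ≤ 0.2. These constraints single out a set of admissible probability measures rather than a unique one, and every event A then carries a lower and an upper probability

\[
\underline{P}(A) \;=\; \inf_{P} P(A), \qquad \overline{P}(A) \;=\; \sup_{P} P(A),
\]

where P ranges over the admissible measures; in this example \underline{P}(\{a,b\}) = 0.8 and \overline{P}(\{a,b\}) = 1.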
The following paper on combining evidence in PPT was presented at NAIC '96 (Eighth Dutch Conference on Artificial Intelligence, November 21-22, 1996, Utrecht University). It is published on pages 369-379 of the proceedings of the conference edited by J.-J.Ch. Meyer and L.C. van der Gaag.
Evidence Combination in AI and Robotics
(ps-file)
Abstract: In this paper, we discuss the problem of combining
several pieces of uncertain evidence, such as those provided by
symptoms, expert opinions, or sensor readings. Several of the proposed methods
for combining evidence are reviewed and criticized. We argue for the
position that (1) in general these proposed methods are inadequate,
(2) strictly speaking, the only justifiable solution is to carefully
model the situation, (3) a careful modelling of the situation
requires a distinction between ignorance and uncertainty, and (4)
drawing useful conclusions in the presence of ignorance may require
additional assumptions which are not derivable from the available
evidence.
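For reference, one of the best-known rules of this kind is Dempster's rule of combination, analysed in several of the publications listed below. Given two basic probability assignments m_1 and m_2 over subsets of a frame of discernment, it is defined (for nonempty A) by

\[
(m_1 \oplus m_2)(A) \;=\; \frac{\sum_{B \cap C = A} m_1(B)\, m_2(C)}{1 - K},
\qquad
K \;=\; \sum_{B \cap C = \emptyset} m_1(B)\, m_2(C),
\]

where K is the mass assigned to conflicting combinations of evidence; the adequacy of such rules in the presence of ignorance is the kind of question at issue in the paper.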
The following paper on decision analysis in PPT was presented at the AAAI 1997 Spring Symposium on Qualitative Preferences in Deliberation and Practical Reasoning (March 24-26, 1997, Stanford University).
Decision Analysis using Partial Probability Theory
(ps-file)
Abstract: We study the problem of making decisions under
partial ignorance, or partially quantified uncertainty. To represent
partial ignorance, we propose partial probability theory (PPT). The
natural extension to PPT of the ordinary maximum expected utility
(MEU) decision rule is characterized by the presence of a partial preference order on the alternatives. We argue that decision analysis should not focus exclusively on optimizing, but should also pay serious attention to satisficing and reasoning with assumptions.
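The partial preference order mentioned above is easy to compute once the admissible probability measures are (approximately) enumerated. The sketch below (an illustration with invented utilities and measures, not the procedure from the paper) prefers one alternative to another only if its expected utility is at least as high under every admissible measure:

from itertools import combinations

# Hypothetical decision problem: two states, three alternatives.
states = ["s1", "s2"]

# Invented utilities of each alternative in each state.
utility = {
    "act_A": {"s1": 10, "s2": 0},
    "act_B": {"s1": 4,  "s2": 4},
    "act_C": {"s1": 2,  "s2": 1},
}

# Suppose the evidence only tells us 0.3 <= P(s1) <= 0.7; we enumerate a few
# admissible measures (in a real analysis the whole constraint set matters).
measures = [{"s1": p, "s2": 1 - p} for p in (0.3, 0.5, 0.7)]

def expected_utility(act, P):
    return sum(P[s] * utility[act][s] for s in states)

def weakly_preferred(a, b):
    # a is weakly preferred to b iff its expected utility is at least as high
    # under every admissible probability measure.
    return all(expected_utility(a, P) >= expected_utility(b, P) for P in measures)

for a, b in combinations(utility, 2):
    if weakly_preferred(a, b):
        print(a, "is weakly preferred to", b)
    elif weakly_preferred(b, a):
        print(b, "is weakly preferred to", a)
    else:
        print(a, "and", b, "are incomparable (the preference order is partial)")

With these numbers, act_A and act_B come out incomparable, while both are weakly preferred to act_C, which is exactly the sense in which the resulting preference order is only partial.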
Main publications
Independence assumptions underlying Dempster's rule of combination. NAIC-88. Proceedings First Dutch Conference on Artificial Intelligence, M. van Someren and G. Schreiber (eds.). University of Amsterdam (1988) pp. 162-172.
The logic of actual obligation. An alternative approach to deontic logic. Philosophical Studies 55 (1989) 173-194.
A simplification of the completeness proofs for Guaspari and Solovay's R. Notre Dame Journal of Formal Logic 31 (1990) 44-63.
A computationally efficient approximation of Dempster-Shafer theory. International Journal of Man-Machine Studies 30 (1989) 525-536. Reprinted in Machine Learning and Uncertain Reasoning, B. Gaines and J. Boose (eds.), London: Academic Press (1990) pp. 461-472.
Conditionals, probability, and belief revision. M. Stokhof and L. Torenvliet (eds.), Proceedings Seventh Amsterdam Colloquium, Amsterdam: Institute for Language, Logic and Information, (1990) pp. 597-613.
The logic of objective knowledge and rational belief. Logics in AI - JELIA'90, J. van Eijck (ed.), LNCS 478, Berlin: Springer (1991) pp. 499-515.
On the justification of Dempster's rule of combination, Artificial Intelligence 48 (1991) 171-197.
A preferential model semantics for default logic, Symbolic and Quantitative Approaches to Uncertainty, R. Kruse and P. Siegel (eds.), LNCS 548, Berlin: Springer (1991) pp. 344-351.
Generalized Kripke models for epistemic logic. Theoretical Aspects of Reasoning About Knowledge: Proceedings of the Fourth Conference, Y. Moses (ed.). San Mateo CA: Morgan Kaufmann (1992) pp. 214-228.
Preference-based semantics for nonmonotonic logics. IJCAI-93 Proceedings of the Thirteenth International Joint Conference on Artificial Intelligence. San Mateo CA: Morgan Kaufmann (1993) pp. 584-589.
As Far as I Know. Epistemic Logic and Uncertainty. Dissertation, Utrecht University (1993). Published by the Department of Philosophy, Utrecht University as volume 7 of the series Quaestiones Infinitae. (abstract)
Reasoning with Uncertainty in AI. Reasoning with Uncertainty in Robotics. Proceedings International Workshop RUR '95, Leo Dorst, Michiel van Lambalgen, Frans Voorbraak (eds.). LNAI 1093. Berlin: Springer (1996) pp. 52-90. (abstract, ps-file)
Evidence Combination in AI and Robotics. NAIC '96. Proceedings Eighth Dutch Conference on Artificial Intelligence, J.-J.Ch. Meyer and L.C. van der Gaag (eds.). Utrecht University (1996) pp. 369-379. (abstract, ps-file)
Last update: April 10, 1997.