Blog Archives
These are the items posted in this seminar, currently ordered by post date rather than by event date. We will create improved views in the future. In the meantime, please click on the Seminar menu item above to find the page associated with this seminar, which has a more useful ordering.

Heart of DARCness
Abstract. Alan Hajek has recently criticised the thesis that Deliberation Crowds Out Prediction (renaming it the DARC thesis, for ‘Deliberation Annihilates Reflective Credence’). Hajek’s paper has reinforced my sense that proponents and opponents of this thesis often talk past one another. To avoid confusions of this kind we need to dissect our way to the heart of DARCness, and to distinguish it from various claims for which it is liable to be mistaken. In this talk, based on joint work with Yang Liu, I do some of this anatomical work. Properly understood, I argue, the heart is in good shape, and untouched by Hajek’s jabs at surrounding tissue. Moreover, a feature that Hajek takes to be a problem for the DARC thesis – that it commits us to widespread ‘credal gaps’ – turns out to be a common and benign feature of a broad class of cases, of which deliberation is easily seen to be one.
The rise and fall of accuracy-first epistemology
Accuracy-first epistemology aims to supply non-pragmatic justifications for a variety of epistemic norms. The contemporary roots of accuracy-first epistemology are found in Jim Joyce’s program to reinterpret de Finetti’s scoring-rule arguments in terms of a “purely epistemic” notion of “gradational accuracy.” On Joyce’s account, scoring rules are conceived as measuring the accuracy of an agent’s belief state with respect to the true state of the world, and Joyce argues that this notion of accuracy is a purely epistemic good. Joyce’s non-pragmatic vindication of probabilism, then, is an argument to the effect that a measure of gradational accuracy so conceived satisfies conditions close enough to those needed to run a de Finetti-style coherence argument. A number of philosophers, including Hannes Leitgeb and Richard Pettigrew, have joined Joyce’s program and gone whole hog. Leitgeb and Pettigrew, for instance, have argued that Joyce’s conditions are too lax and have put forward conditions that narrow down the class of admissible gradational accuracy functions, while Pettigrew and his collaborators have extended the list of epistemic norms receiving an accuracy-first treatment, a program sometimes called epistemic utility theory.
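As a point of reference (this example is standard in the literature rather than drawn from the talk itself), the best-known scoring rule is the Brier score, and the accuracy-dominance argument for probabilism can be stated in a line:

\[
\mathfrak{B}(c, w) \;=\; \sum_{i=1}^{n} \bigl(c(X_i) - w(X_i)\bigr)^2,
\]

where \(c(X_i)\) is the agent’s credence in proposition \(X_i\) and \(w(X_i) \in \{0,1\}\) is the truth value of \(X_i\) at world \(w\). If \(c\) violates the probability axioms, there is a coherent credence function \(c^{*}\) with \(\mathfrak{B}(c^{*}, w) < \mathfrak{B}(c, w)\) at every world \(w\), whereas coherent credence functions are never dominated in this way.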
In this talk I report on joint work with Conor Mayo-Wilson that aims to challenge the core assumption of epistemic utility theory, namely the very idea of supplying a truly non-pragmatic justification for anything resembling the von Neumann–Morgenstern axioms for a numerical epistemic utility function. Indeed, we argue that none of the axioms has a satisfactory non-pragmatic justification, and we point to reasons to suspect that not all of the axioms could be given one. Our argument, if sound, has ramifications for recent discussions of “pragmatic encroachment”, too. For if pragmatic encroachment is a debate about whether there is a pragmatic component to the justification condition of knowledge, our arguments may be viewed as attacking the true belief condition of (fallibilist) accounts of knowledge.
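For reference, the axioms in question are the usual von Neumann–Morgenstern conditions on a preference relation \(\succeq\) over lotteries \(p, q, r\), stated here in their familiar practical form rather than in the epistemic reinterpretation at issue in the talk:

\[
\begin{aligned}
&\text{Completeness:} && p \succeq q \ \text{ or } \ q \succeq p;\\
&\text{Transitivity:} && p \succeq q \ \text{ and } \ q \succeq r \ \Rightarrow \ p \succeq r;\\
&\text{Continuity:} && p \succeq q \succeq r \ \Rightarrow \ \exists\, \alpha \in [0,1]:\ \alpha p + (1-\alpha) r \sim q;\\
&\text{Independence:} && p \succeq q \ \Leftrightarrow \ \alpha p + (1-\alpha) r \succeq \alpha q + (1-\alpha) r \quad \text{for all } \alpha \in (0,1] \text{ and all } r.
\end{aligned}
\]

The representation theorem then says that \(\succeq\) satisfies these conditions just in case there is a utility function \(u\), unique up to positive affine transformation, such that \(p \succeq q\) iff \(\mathbb{E}_{p}[u] \geq \mathbb{E}_{q}[u]\).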
Knowledge-Theoretic Aspects of Strategic Voting
It has long been noted that a voter can sometimes achieve a preferred election outcome by misrepresenting his or her actual preferences. In fact, the classic Gibbard-Satterthwaite Theorem shows that under very mild conditions, every voting method that is not a dictatorship is susceptible to manipulation by a single voter. One standard response to this important theorem is to note that a voter must possess information about the other voters’ preferences in order to decide to vote strategically. This seems to limit the “applicability” of the theorem. In this talk, I will survey some recent literature that aims at making this observation precise. This includes models of voting under uncertainty (about other voters’ preferences) and models that take into account how voters may respond to poll information.
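To make the phenomenon concrete, here is a small, self-contained sketch (the candidate names, the three ballots, and the borda_winner helper are invented for this post, not taken from the talk): under the Borda count, a voter who knows the other ballots can sometimes elect a candidate they prefer by “burying” the sincere winner.

# Hypothetical example: single-voter manipulation of the Borda count.
CANDIDATES = ["a", "b", "c", "d"]

def borda_winner(profile):
    """Return the Borda winner of a profile (each ballot lists candidates
    best first); ties are broken alphabetically."""
    scores = {c: 0 for c in CANDIDATES}
    for ballot in profile:
        for points, candidate in enumerate(reversed(ballot)):
            scores[candidate] += points
    return min(CANDIDATES, key=lambda c: (-scores[c], c))

# Voter 1's sincere ranking is a > b > c > d; the other two ballots are fixed.
sincere = ["a", "b", "c", "d"]
others = [["b", "a", "d", "c"], ["c", "b", "a", "d"]]
print(borda_winner([sincere] + others))    # b wins (scores: a 6, b 7, c 4, d 1)

# By burying b (reporting a > c > d > b), voter 1 elects a, whom they truly prefer to b.
insincere = ["a", "c", "d", "b"]
print(borda_winner([insincere] + others))  # a wins (scores: a 6, b 5, c 5, d 2)

The Gibbard-Satterthwaite theorem guarantees that, with at least three candidates, every non-dictatorial voting rule whose range covers all candidates admits some profile with an opportunity of this kind; exploiting it, however, requires exactly the sort of knowledge of the other ballots that the talk examines.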
The Humean Thesis on Belief
I am going to make precise, and assess, the following thesis on (all-or-nothing) belief and degrees of belief: it is rational to believe a proposition just in case it is rational to have a stably high degree of belief in it. I will start with some historical remarks, which will motivate calling this postulate the “Humean thesis on belief”. Once the thesis has been formulated in formal terms, it is possible to derive conclusions from it. I will highlight three of its consequences in particular: doxastic logic; an instance of what is sometimes called the Lockean thesis on belief; and a simple qualitative decision theory.
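To give a sense of the intended precisification (roughly following Leitgeb’s published formulation; the details in the talk may differ), the thesis can be written as:

\[
\mathrm{Bel}(X) \quad\Longleftrightarrow\quad P(X \mid Y) > r \ \text{ for every } Y \text{ consistent with what is believed such that } P(Y) > 0,
\]

for some fixed threshold \(r\) with \(\tfrac{1}{2} \leq r < 1\). On this reading, “stably high” means that the degree of belief in \(X\) stays above \(r\) under conditionalisation on any possibility not already ruled out by one’s beliefs. A Lockean-style consequence, namely that \(\mathrm{Bel}(X)\) only if \(P(X) > r\), follows immediately by taking \(Y\) to be a tautology.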
Savage’s Subjectivism and Countable Additivity
This talk is mainly about the issue of additivity in Savage’s theory of subjective expected utility. It is divided into three parts. First, I will comment, by way of a brief historical survey, on Savage’s reasons for adopting a finitely additive probability measure in his decision model, and I will argue that Savage’s set-theoretic argument for rejecting countable additivity is inconclusive. In the second part, I will discuss some defects of finite additivity in Savage’s system. This will be followed, in the last part, by a detailed reconstruction and revision of Savage’s theory. It will be shown that Savage’s final representation theorem, which extends the derived utility function from simple acts to general acts, is derivable from the first six of his seven postulates provided countable additivity is in place, a conjecture made by Savage himself.
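For readers who want the distinction at issue spelled out: a probability measure \(P\) on an algebra of events is finitely additive if

\[
P(A \cup B) = P(A) + P(B) \quad \text{whenever } A \cap B = \emptyset,
\]

and countably additive if, in addition,

\[
P\Bigl(\bigcup_{n=1}^{\infty} A_n\Bigr) = \sum_{n=1}^{\infty} P(A_n) \quad \text{for every sequence of pairwise disjoint events } A_1, A_2, \ldots
\]

Savage’s postulates, as he stated them, secure only the first, weaker property.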
Causal Decision Theory and Intrapersonal Nash Equilibria
Most philosophers today prefer ‘Causal Decision Theory’ to Bayesian or other non-Causal Decision Theories. What explains this is the fact that in certain Newcomb-like cases, only Causal theories recommend an option on which you would have done better, whatever the state of the world had been. But if so, there are cases of sequential choice in which the same difficulty arises for Causal Decision Theory. Worse: under further mild assumptions the Causal Theory faces a money pump in these cases. It may be illuminating to consider rational sequential choice as an intrapersonal game between one’s stages, and if time permits I will do this. In that light the difficulty for Causal Decision Theory appears to be that it allows, as its non-causal rivals do not, for Nash equilibria in such games that are Pareto inefficient.
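For orientation, the contrast between the causal and the evidential recommendation is often put in terms of two expected-value formulas (this is one standard partition formulation, not necessarily the framework used in the talk): where the \(K_i\) partition the relevant states,

\[
U_{\mathrm{causal}}(A) \;=\; \sum_i P(K_i)\, u(A \wedge K_i),
\qquad
V_{\mathrm{evidential}}(A) \;=\; \sum_i P(K_i \mid A)\, u(A \wedge K_i).
\]

In a Newcomb-like case the states are causally independent of the act, so \(U\) ranks the acts by dominance and recommends the option that does better whatever the state, while \(V\) is sensitive to the evidential correlation between act and state and can recommend otherwise.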
The Columbia-CUNY Workshop in Logic, Probability, and Games
There will be a meeting of this seminar on October 18 from 2 to 4 PM in room 4419. Haim Gaifman (Columbia) and Rohit Parikh (CUNY) will speak. Details will be announced next week.
This is a meeting of a joint CUNY-Columbia research group on Logic, Probability, and Games.
Description: This workshop is concerned with applying formal methods to fundamental issues, with an emphasis on probabilistic reasoning, decision theory, and games. In this context “logic” is broadly interpreted as covering applications that involve formal representations. The topics of interest have been researched within a very broad spectrum of disciplines, including philosophy (logic and epistemology), statistics, economics, and computer science. The workshop is intended to bring together scholars from different fields of research so as to illuminate problems of common interest from different perspectives. Throughout each academic year, talks are regularly presented by members of the workshop and by distinguished guest speakers, and meetings are held alternately at Columbia University and the CUNY Graduate Center.