We are happy to announce a workshop on Explanation, Relevance, and Understanding, to take place June 1 via Zoom.
Theme: What makes information relevant to explaining why something is the case? A common objection to classical accounts such as the DN account, the SR account, and simple causal accounts is that they wrongly classify irrelevant information as explanatorily relevant. But how should we approach the topic of explanatory relevance if not in terms of causes, laws, or statistical patterns? In view of the recent interest in the notion of understanding, a promising idea is that information is explanatorily relevant only if it helps one understand why a given phenomenon occurred. For instance, only those parts of the causal history of an event that contribute to understanding why the event occurred are explanatorily relevant. This exciting suggestion gives rise to a number of challenges relating, among other things, to the alleged objectivity of explanation and the connection between explanation and epistemic virtues.
The workshop brings together experts on explanation and understanding to present their latest work on these questions.
Schedule:
3:00pm Finnur Dellsén (University of Iceland): Gaining Understanding
Abstract. This talk develops and defends the dependency modelling account of understanding, along with an associated probability-based epistemology. On this account, an agent understands a phenomenon to the extent that they accurately and comprehensively model the dependence relations (e.g. causal relations) that the phenomenon stands in, or fails to stand in, to other phenomena. I argue that this correctly implies that it is possible to increase one’s understanding without learning an explanation. I further argue that even if learning an explanation necessarily increases one’s understanding, this turns out to be a rather trivial thesis. Given time, I will end with a discussion of the epistemology of understanding, in which I develop a probabilistic theory of when a given proposition is ‘acceptable’ for the purposes of understanding.
4:45pm Angela Potochnik (University of Cincinnati): It’s All Relevant
Abstract. Two questions have been targeted in discussions of explanatory relevance: (1) the question of what sorts of dependence are explanatory, and (2) the question of which of those dependencies should feature in a given explanation. In this talk, I’ll aim to set aside (1) in order to focus on (2). I think this is possible insofar as current views about (1) tend to range from advocates of causal explanation to pluralists about explanatory dependence who also want to admit other forms of explanatory dependence, such as constitutive dependence and mathematical relationships, both of which leave (2) as a live issue. The question of which potentially explanatory dependencies should feature in an explanation crops up variously in philosophical debates about levels of explanation, structural explanation, causes vs. background conditions, and more. In this talk, I’ll argue that there is no way to determine which dependencies should be included in a given explanation without reference to the cognitive needs of those seeking explanation. This does not make what explains what a subjective matter, as it is consistent with an objective account of explanatory dependence. But it does make it necessary to accept the coexistence of multiple explanations, and it paves the way for recognizing the explanatory value of a wider range of structural, large-scale, and stable features of the world.
6:30pm Kareem Khalifa (Middlebury College): Teasing Apart Explanatory Relevance
Abstract. The traditional project of analyzing explanation sought to provide necessary and sufficient conditions for explanation, typically under the further assumption that the explanatory relation was monistic and mind-independent. The most recent continuation of this project holds that counterfactual dependence plays the role of a monistic, mind-independent explanatory relation in a classical definition of explanation. I argue that counterfactual dependence is neither sufficient nor necessary for explanation. As an alternative, I propose a pluralistic account of explanation in which there are multiple mind-independent explanatory relations, each of which is sufficient for explanation, with only a weak necessary condition common to all explanations. This deflates the traditional project at the level of semantics. I conclude by showing how a monistic account of understanding can provide an epistemology and pragmatics of explanation that complements this pluralistic approach to the semantics of explanation.
Attendance is free, but please send an email to hamburgrelevance@gmail.com if you would like to attend.