104. 'Old AI Meets New AI in the Logic of Scientific Discovery' – Scientific discovery is, for the most part, a neglected topic in the philosophy of science (Langley and Arvay 2019). Since around the middle of the last century, the received view has been that discovery is not governed by logic or, more generally, by rationality, but is a largely elusive and inscrutable process (Popper 1959). Thankfully, not every philosopher has been persuaded by this pessimistic view (Nersessian 2010). Given the recent cascade of developments in neural nets, we now need, more than ever, to carefully re-evaluate our attitude towards this view. This talk aims to do precisely that by exploring how such developments affect, and how they ought to affect, the debate over the nature of scientific discovery. It will be argued that, their raw potential to make significant contributions to science notwithstanding, neural net techniques are unlikely to single-handedly reinstate the rationalist model of scientific discovery or indeed lead to mass automation. Rather, a more promising approach on both counts involves the combination of ‘old AI’ methods like automated theorem proving with neural nets. (Presented at the AAAI Spring Symposium Series, San Francisco (CA), USA, March 27-29, 2023).
103. 'Critical Thinking Education: Automated Theorem Proving to the Rescue' – It is difficult to overestimate the importance of critical thinking and more specifically logical reasoning in everyday life as well as in science. Logic plays both an implicit and an explicit role in shaping many of our beliefs, decisions and actions. Through logic we can decide crucial properties like the validity or invalidity of arguments and the consistency or inconsistency of sets of beliefs. In this talk, I consider some obstacles and opportunities that emerge in the context of automating semantic and syntactic methods of doing logic for pedagogical purposes. I first outline some historical developments in these areas. I then turn to my own work which includes a free multi-platform app called ‘The Logic Calculator’ and an automated theorem prover (work-in-progress) that seeks to make human-readable proofs easier to produce. The talk concludes with some open-ended questions that I hope the audience will help answer. (Presented at the CSTA Silicon Valley Chapter, Palo Alto (CA), USA, December 9, 2022).
102. 'Towards Impartial Assessments of Theories' – If we are to believe the Latours of this world, we must treat as pure fantasy the claim that empirical judgments empower scientists to make impartial assessments of rival theories or models. Instead of truthfully representing their target systems, the accusation goes, empirical judgments are merely the result of an (elaborate) social negotiation between scientists. Otherwise put, on this view, empirical judgments are nothing more than social constructs. In this talk, I make two claims: (1) We must steer clear of such extremely pessimistic views. (2) We must nonetheless chart a course that acknowledges the existence of significant obstacles on the way to fully impartial assessments of rival theories or models. To motivate the first claim, I argue that the view that empirical judgments are mere social constructs is at best unfounded and at worst internally incoherent. To motivate the second claim, I argue that a significant obstacle to fully impartial assessments of rival theories or models is the coarseness of empirical variables. As an illustration, scientific models produced via machine-learning (ML) are discussed. It is suggested that beyond the familiar problems relating to noisy data, model selection, bias-variance trade-off and hyperparameter setting, the accuracy and even explainability of ML-produced models can be substantially impaired by the coarseness of the deployed features. (Presented at the SFSU Philosophy Seminar, San Francisco (CA), USA, December 6, 2022).
101. 'Modelling Analogical Reasoning: One-Size-Fits-All?' – One of the most common forms of reasoning in science is reasoning by analogy. Roughly speaking, such reasoning involves the transposition of solutions that work well in one domain to another on the basis of analogous features between the two domains. Sometimes such reasoning works (e.g. artificial and natural selection) and sometimes it doesn’t (e.g. Vulcan and Neptune). Two general reactions to the problem of modelling the logic of analogical reasoning have emerged as a result: There are those who attempt to construct increasingly complex but still universal models of analogical reasoning in order to better discriminate between cases where it works and cases where it doesn’t (Bartha 2010; Hesse 1966). And there are those who give up on the universal model approach and argue in favour of localised models (Norton 2021). In this talk, we assess the merits of each approach in the context of the Wittgensteinian family resemblance conception of scientific categories. Moreover, we assess the impact of computational attempts to articulate and operationalise analogical reasoning, particularly in the field of Artificial Intelligence (Prade and Richard 2014), on the debate between universalists and localists. (Presented at the Wittgenstein and AI Conference, London, UK, July 29-31, 2022).
100. 'Theory Change through a Logical Lens' – A familiar pattern can be seen in the history of science. Not long after a theory becomes established, the seeds of its demise are sown. In time, those seeds germinate into a full-blown rival theory, which supplants the earlier theory and resets the whole process. How long this pattern continues is unknown as the dynamics of theory change are somewhat opaque. In this talk, I endeavour to throw some light on those dynamics by placing some aspects of the formation, alteration and elimination of theories under a logical lens. Taking the scientific realism debate as a blueprint, I identify some important lessons concerning theory change and offer a number of history of science cases in support. I then put forth two quasi-logical notions, content weakening and content strengthening, which can aid the explication of the dynamics of theory change, particularly in relation to cases of (dis)confirmation. (Presented at the PSF2022 conference, Leusden, Netherlands, June 27-28, 2022).
99. 'The Study of Reasoning in Philosophy, Psychology and AI: In Search of Synergies' – The theoretical study of reasoning and its application to solve problems is at the heart of Philosophy and has been since ancient times. Although philosophers continue to make contributions to this day, e.g. through the development of logics and the analysis of logical concepts, other fields have emerged that have revitalised the study of reasoning and provided valuable insight. Two notable fields are Psychology and Computer Science. The former, particularly the branch that deals with reasoning, has thrown empirical light on the vagaries and limitations of human thought. The latter, particularly the Argumentation AI branch, has advanced the frontiers of the study of reasoning by, among other things, implementing and testing diverse models of reasoning in silico. In this talk, I go in search of potential synergies between these three fields. I start by identifying some crucial terminological differences. I then proceed to highlight some key features of philosophical accounts of reasoning and of the related concepts of justification and explanation. Particular attention is paid to the underlying motivation for these features. I finally attempt to draw connections between these features and some features deemed important in the Psychology of Reasoning and in Argumentation AI (especially in the context of XAI applications). The hoped-for outcome of the talk is to induce a cross-pollination of lessons and methods across the three fields. (Presented at the Explainable AI Seminars, CLArg Group, Imperial College London, January 2022).
98. 'Conditional Reasoning and Propositional Logic: Some Empirical Results' – It is hard to overestimate the importance of logic in human affairs. Logic, whether implicitly or explicitly, underwrites many of the decisions we make in our daily lives, and is an integral part of the scientific method. It is also at the heart of various academic subjects, including computer science, mathematics and philosophy. Despite its ubiquity, there is surprisingly little empirical work to support its pedagogy. Most existing work investigates the effectiveness of logic tuition in increasing grades or performance in standardised tasks, but there is a lack of comparative analysis. We conducted an interactive online logic learning experiment to compare teaching strategies. The focus of our study is on conditional reasoning with propositional logic. We classify our teaching strategies across two dimensions: semantic-centric vs. syntactic-centric and visually-aided vs. non-visually aided. We present some initial results and discuss their broader significance. (Presented at the Logic Learning: Theoretical and Empirical Perspectives Conference, NCH, London, November 2021).
97. 'Physical Computation: A Tale of Two Notions' – Under what conditions does a physical system compute? Typical answers to this question pull in opposite directions. On the one hand, computation looks like the kind of thing virtually any physical system can do. After all, physical laws ensure that some states are followed by others in a rule-like manner. On the other hand, computation looks like the kind of thing that only a select few physical systems can do. After all, computing devices only emerged in recent human history. This paper aims to resolve the apparent tension between these answers by putting forth two complementary notions of physical computation. (Contributed talk, to be presented at the Philosophy of Science Association 2020 Biennial meeting, Baltimore, Maryland, November 2021 [postponed because of COVID]).
96. 'Machine-Made Jabberwocky' – The question of whether machines can be truly creative has been with us at least since the advent of modern computers. Although the tide seems to be turning, the naysayers still represent a sizeable share of the voices out there. On their view, machines are ultimately incapable of the deeply transformational creativity that human beings exhibit, especially that found in the most outstanding examples of us, e.g. a Mozart, a Dali or an Einstein. In this talk, I subject the kinds of reasons they offer to a careful examination. My discussion focuses on cases of scientific creativity, but I also make some relevant remarks about cases in the arts and the humanities. By and large, I find the kinds of reasons offered by the naysayers wanting and argue that machines should be able to exhibit the same, and indeed even superior, levels of creativity to humans. It is high time that we broaden our horizons and abandon such antiquated notions as the view that humanity sits at the apex of meaningful existence. (Invited talk, presented at the Philosophy and AI Workshop: An Interdisciplinary Dialogue, School of Philosophy, Religion and History of Science, University of Leeds, February 2021).
95. 'Structural Realism: A Reappraisal' – Perhaps the most influential realist view in recent years, structural realism’s appeal can be found in the ease with which it seems to explain away certain difficulties that afflict other, more traditional, versions of realism. Roughly, and somewhat generically formulated, it is the view that our epistemic and perhaps even our ontic commitments must be reduced to the structural features that successful scientific theories ascribe to the unobservable world – see, for example, Votsis (2018). In this talk, I reappraise some of that appeal in light of recent challenges raised by a number of scholars, e.g. Vickers (2019) and Wray (2018), on both sides of the scientific realism debate. Particular emphasis is placed on historical considerations and I consider whether an update to my argument from structural continuity – see, for example, Votsis (2011a; 2011b) – is necessary. (Invited talk, presented at the Seoul National University Distinguished Lecture Series, Seoul, Korea, December 2020).
94. 'Learning (and Teaching) Logic: Some Theories, Evidence and Blind Spots' – This talk is based on research I'm conducting for a project I'm a co-PI on titled 'Learning Critical Reasoning: A Machine-Learning Perspective'. The project focuses on learning skills relating to conditionals in logic, a notoriously difficult area to learn (and teach). As the project is still in its early stages, the talk is restricted to a partial survey of existing theories and empirical evidence in the relevant areas which include Philosophy, Psychology, Computer Science and Education. I conclude with a consideration and brief discussion of some blind spots in the literature. For example, the literature seems to be missing experimental interventions on different methods taught in logic courses. That’s something we aim to rectify with our project as we plan to compare the efficacy of such methods. (Presented at the Cognitive Science Research Group (NCH), London, October 2020).
93. 'The Parallel Lives of Concepts and Theories' – Concepts, like theories, come in various shapes and sizes. Some are narrow, others broad. Some are rigorous, others irreparably tethered to intuition. Some embody ideals of simplicity and unity, others exhibit intricate and tangled parts. Concepts can also be said to perform their epistemic duties more or less adequately and tend to succeed one another in history. In this paper, I explore the parallel lives of scientific concepts and theories with a view to an improved understanding of the structure and dynamics that underlie their formation, alteration and elimination. Using the scientific realism debate as a template, I offer some practical suggestions as to how we ought to make decisions about concept choice. The general picture I draw is one of science that can learn from its past mistakes by utilising formal tools (particularly logic) to diagnose and remove defective elements. (Invited talk, presented at the Philosophy Department, University of Cyprus, February 2020).
92. 'Observational Judgment Convergence and Veridicality' – Observation reports are used throughout science to test theories. But for tests to carry real weight, the observation reports must be veridical. There are those who deny that observation reports are veridical. One major motivation for this approach is the theory-ladenness thesis. Put simply, theories distort the content of observation reports and hence such content cannot truthfully represent things about the world. In this talk, I examine the relationship between observational judgment convergence and veridicality. Although veridicality implies observational judgment convergence — i.e. two individuals who both correctly adjudge the same observational situation could not be in genuine disagreement — the same is not true the other way around. In a nutshell, convergence in observational judgments is necessary but not sufficient for their veridicality. Even so, I put forth some general arguments for the view that the most likely explanation for observational judgment convergence is the veridicality of the corresponding observation reports. Alternative explanations in the form of constructivist views are considered and dismissed as inadequate. (Invited talk, presented at the Philosophy Department, Northeastern University, November 2019).
91. 'Will the Real Pragmatism Please Stand up' – This talk has two aims: (1) To single out a couple of ways the pragmatism project is not going to work. (2) To identify a realistically-inclined pragmatism with classical roots and to (cursorily) compare it to modern-day realism. (Invited talk, presented at the Reviving Instrumentalism in the Philosophy of Science conference, University of Durham, July 2019).
90. 'The Logic of the Scientific Realism Debate' – To be a scientific realist or indeed a scientific anti-realist is to make certain commitments. These commitments possess various characteristics. Chief among them is logical structure. More precisely, the commitments impose a web of (partly) inter-connected logical constraints on what can and cannot be legitimately asserted in their name. Alas, such constraints are not always heeded by the advocates of those commitments. There is thus a need to throw light on this web of logical constraints. In this talk, I attempt to do just that by identifying various parts of the logical structure of commitments in the debate over scientific realism. I focus on those that are broadly shared by scientific realists and even by soft anti-realists - e.g. the constructive empiricists. To give a few examples of the kinds of commitments I discuss, they include the following: that there is a mind-independent world, that scientific claims have truth values, that our best scientific theories have some success, that success can be measured in terms of truth content, that our best scientific theories are not true simpliciter and that to guarantee an increase in truth content, success must be preserved across theory change. I hope to show that, with only classical logic (supplemented with relevance constraints) as our guide, all sorts of interesting consequences emerge from these and other commitments. (Invited talk, presented at the Theoretical Philosophy Colloquium, Heinrich Heine Universitaet Duesseldorf, July 2019).
89. 'Concept Defectiveness and Amelioration' – Concepts, like theories, come in various shapes and sizes. Some are narrow, others broad. Some are rigorous, others irreparably tethered to intuition. Some embody ideals of simplicity and unity, others exhibit intricate and tangled parts. Concepts can also be said to perform their epistemic duties more or less adequately and tend to succeed one another in history. In this talk, I explore the parallel lives of scientific concepts and theories with a view to an improved understanding of the structure and dynamics that underlie their formation, proliferation and elimination. To be more precise, I take a closer look at what happens when scientific concepts rival each other and offer some practical suggestions as to how we might go about picking winners. Among the various cases under consideration, I include those that concern ceteris paribus clauses, reasoning by analogy and debates that are at an impasse. The general picture I draw is one of science that can learn from its past mistakes by utilising formal tools (particularly logic) to diagnose and remove defective elements, the ultimate aim being that of providing more refined concepts and, by extension, a better understanding of the world. (Keynote talk, presented at the Understanding Defectiveness in Science Conference, National Autonomous University of Mexico, June 2019).
88. 'Informed Voting' – One consequence of the rise of large-scale societies has been the division of labour. Such a division allows people to specialise in certain areas by training and developing skills and ideas over long periods. This in turn improves efficiency. Those who specialise at doing something do it faster, better and/or with less energy. They also make more informed decisions about their respective domains. Indeed, we expect such people to be well-informed. We can draw on this expectation to formulate the following norm, aptly named the 'informed-ness norm': To increase the chances of effectively discharging domain-specific duties, one ought to be (as) well-informed in that domain (as practically possible). Obviously, this norm is not, and arguably should not be, adhered to with respect to every domain or decision. In this talk, I ask the question whether the informed-ness norm should be adhered to in the domain of political voting. I briefly make the case that it should and consider two voting systems, John Stuart Mill's plural voting and my own, that attempt to incorporate adherence to the norm. I then proceed to evaluate each system's pros and cons. I conclude the talk with some remarks about what needs to be the case before we adopt such voting systems. (Invited talk, presented at the Centre for Humanities Engaging Science and Society, University of Durham, May 2019).
87. 'Theory-ladenness: Testing the untestable?' – In this talk, I propose a way to experimentally test the thesis that observation is theory-laden. My proposal seeks to create conditions that compel test subjects with diverse theoretical backgrounds to resort to bare (or at least as bare as possible) observational judgments. Thus, if judgments made under those conditions are convergent across test subjects, the said convergence would lend credibility to the view that theory-neutral observations are feasible. This still leaves the question of why any such convergence exists unanswered. Towards the end of the talk, it is argued that the best explanation for observational judgment convergence is the veridicality of those judgments. (Invited talk, presented at the Center for Philosophy of Social Science TINT, University of Helsinki, February 2019).
86. 'Putting Theory-Ladenness to the Test' – This poster explores two experiment designs that seek to determine the extent to which, if at all, observation can be free from theory. The two designs are compared and found to be similar in certain ways. One particular feature critical to both is that they seek to create conditions that compel test subjects with diverse theoretical backgrounds to resort to bare observational skills. If judgments made on the basis of these skills converge, such convergence would provide support for the view that theory-neutral observations can be had. (Poster session, presented at CogSci 2018 conference, University of Wisconsin, Madison, July 2018).
85. 'Taking up Space: The Case of the Ether' – Philosophy has a tendency to remain aloof in relation to practical matters. But it need not! I am a great believer in the potential help the philosophy (and history) of science can offer science. In this talk, I try to obtain some lessons for theory choice and construction from the philosophy and history of science. I do so by considering realist reactions to historical challenges like the pessimistic meta-induction. I focus on ether because it has a long history and is associated with some pretty spectacular theorists and theories. (Invited talk, presented at the From Space to Spacetime Conference, University of Oxford, June 2018).
84. 'On the Brink: When Object-Level Debates Fail' – In this talk, I attempt to do three things. First, I offer a rough characterisation of the various kinds of opponents to object-level debates. Second, I consider (and defend against) two charges that have been directed at such a debate. Finally, I evaluate whether anything general can be said about the conditions under which object-level debates are worth having. (Invited talk, presented at the Debating Debates Workshop, New College of the Humanities, November 2017).
83. 'Computation: A Tale of Two Notions' – What is computation? At the heart of this question appears to lie a paradox. On the one hand, computation looks like the kind of thing virtually any physical system does. After all, physics ensures that some states are followed by other states in a rule-like manner. This view has come to be known as ‘pancomputationalism’. On the other hand, computation looks like the kind of thing that only emerged in recent human history. On this view, very few physical systems compute, namely those that were technologically designed to do so. We may call this ‘oligocomputationalism’. This talk aims to resolve the apparent paradox by putting forward two non-rivalling notions of computation: one that underwrites pancomputationalism and another that underwrites oligocomputationalism. It is argued that each notion is legitimate because it captures different uses of the term ‘computation’. (Contributed talk, presented at the Philosophy and Theory of Artificial Intelligence conference, University of Leeds, November 2017).
82. 'The Physics of Intrinsic Properties' – In this talk, I address the question whether natural science and in particular physics presupposes intrinsic properties. An argument that seems to favour an affirmative answer is put forth based on experimental practice considerations. Various complications are then pondered over and proposed solutions are evaluated. (Invited talk, presented at the Department of Philosophy, University of Mainz, August 2017).
81. 'Philosophical Debates about Models and Representations: What are they Good for?' – Numerous object-level questions have arisen in the context of discussing models and representations. For example: What kind of things are models? What is the nature of scientific representation? How can we learn from models? What kind of things are theories? How do models relate to theories? To each such object-level debate corresponds at least one meta-level one: Is arguing about what kinds of things models are worthwhile? Does it really matter (e.g. epistemically) what the relata of a scientific representation are? We can also formulate a very general one: Will any of the answers to the object-level questions throw light on the epistemology, metaphysics & methods of science? Callender and Cohen (2006) raise serious concerns about a number of discussions over scientific representations: "... some of the debates in the literature are concerned with non-issues" (67). In particular, they attempt to deflate 'the constitution problem' of scientific representation, namely the problem of what constitutes the representation relation between a model and the world. Although I agree with the general, meta-philosophical, tenor of Callender and Cohen's paper, I diverge on the things I complain about. In this talk, I throw doubt on some points of contention that appear in the literature on models & scientific representation. To be exact, I raise concerns having to do with the futility and superficiality of some arguments and distinctions. (Invited talk, presented at the Models and Explanations in Economics Workshop, University of Rostock, July 2017).
80. 'Will Tomorrow be Another Day?' – There are clear existential threats that must not be ignored. Perhaps the greatest of these relate to our invention of humanity-ending technologies. In this talk, I argue that the best way to tackle these threats is to add safeguards to the political process. To be precise, I propose changes at all three levels of the political process, namely at the level of elected officials, political parties and voters. It may be objected, in reply, that such changes are radical and that we are better off maintaining a 'if it ain't broke, don't fix it' attitude. Against this objection, I argue that some changes need making before it's too late as the kind of cases we are considering are humanity-ending. Or, to put it another way, unless we act now, tomorrow may not be another day. (TEDx talk, hosted by SouthBank International London, June 2017).
79. 'Artificial Intelligence and Philosophy: Some Themes' – In this talk, I explore some loosely connected themes from the philosophy of artificial intelligence. These include the relation between thinking and intelligence, the question of whether thinking machines would need to be given rights and the kind of rights that would be appropriate, the issue of whether or not we should severely restrict patents on AI, the amazing potential that such technologies have to accelerate scientific progress and, finally, the difficulties surrounding our understanding and detection of intelligent behaviour in both artificial and biological agents. (Invited talk, presented at the Real Time Club, London, May 2017).
78. 'Testing for Theory-Ladenness: The Stimulus Exchange Procedure (and Ostensive Learnability)' – Observation plays a central role in our everyday and scientific lives. Safeguarding its objectivity is therefore of paramount importance. Let us call ‘veridicalism’ the view that observational reports are largely truthful and that there exists a great deal of inter-subjective agreement concerning their content. Perhaps the biggest threat to this view is the so-called ‘theory-ladenness’ of perception and/or observation, an idea that has long been studied by both philosophers and psychologists. Roughly speaking, this is the idea that theoretical factors, broadly construed, influence the content of perceptual beliefs and observational reports. Such factors, it has been suggested, are most obviously seen at work in the divergence that emerges when we compare the observational reports of experts to those of laypersons. This talk proposes the design of some experiments whose aim is to determine whether differences in the content of expert vs. layperson observational reports, where these do indeed exist, can be removed under controlled conditions. Clearly, if such differences could be removed at least sometimes, theory-ladenness of this sort would pose less of a threat to inter-subjective agreement on, and ultimately to the objectivity of, observational reports. It is conjectured that such differences are indeed within our ability to expunge. What is more, it is argued that the content of the resulting observational reports preserves at least some of its evidential relevance. Finally, the approach is compared to Gerhard Schurz’s ostensive learnability criterion for theory-neutral observation concepts. It is argued that there is considerable common ground between the two approaches. (Invited talk, presented at the Celebratory Colloquium in Honor of Gerhard Schurz, University of Duesseldorf, December 2016).
77. 'A General Case for Scientific Realism... and for Anti-Realism' – A view has emerged in the last few years that has shaken the foundations of the scientific realism debate. According to this view, which has rapidly been gaining ground and which we here brand ‘particularism’, general arguments for or against scientific realism or anti-realism are doomed to fail. The war will be won or lost instead on the many battlefields where particular arguments and considerations reign supreme. Without doubt, this view has a lot going for it. In what follows, we explore the pros and cons of both particularism and generalism. More polemically, we make the case that despite their various faults, general arguments still offer much promise in the epic war between realists and anti-realists. To be precise, we argue for a type of generalism that heeds some of the lessons emerging from particularism without conceding the claim that details are the alpha and the omega in establishing the correct epistemic attitude towards hypotheses and their posits. (Contributed talk [with Jamie Rumbelow], presented at the Philosophy of Science Association 2016 Biennial meeting, Atlanta, November 2016).
76. 'Why is it Sensible to Trust the Senses?' – Empiricism is, without doubt, a venerable philosophical view. According to this view, and roughly speaking, we should only trust sensorially obtainable beliefs. Two broad challenges have emerged against it. The first is quite general and questions the very idea of putting trust in any belief, i.e. whether sensorially obtainable or not. The second embraces that idea wholeheartedly but questions its circumscription to the merely sensorially obtainable. This talk develops a line of reasoning that follows the latter of the two paths to challenging empiricism. But it does so with a twist. Instead of attempting to demonstrate the trustworthiness of (certain types of) non-sensorially obtainable beliefs, it (first) seeks to demonstrate the trustworthiness of the sensorially obtainable. This is an important and non-trivial task, as the grounds for this trustworthiness remain, at least partly, elusive. Appearances to the contrary, the majority of empiricist accounts either take the existence of those grounds for granted or offer a perfunctory nod about them. It turns out, or so it will be argued, that it is the satisfaction of certain wide-ranging principles that grounds the trustworthiness of the sensorially obtainable. Moreover, since satisfaction of these principles is not restricted to the sensorially obtainable, empiricism, as it is traditionally conceived, appears to be in dire straits. The talk ends on a more positive note with a suggested emendation to empiricism that aims to embody the said principles. (Invited talk, presented at the Departmental Colloquium, University of Durham, November 2016).
75. 'Is the Scientific Realism Debate Irredeemably Mired?' – Several philosophers have questioned the value of the scientific realism debate. Although the accusations are varied in content, they have been trickling in at a constant rate. The aim of this talk is to take part in the debate over whether the scientific realism debate is worth having. I begin with a short introduction to the scientific realism debate, distinguishing between broad and narrow construals as well as outlining the main positions, arguments and players. I then proceed to canvass the various accusations that have been launched against it, focusing on three in particular. I argue that although all three, and indeed the whole meta-debate, should be taken seriously, their proponents are rushing in their attempt to seal the debate’s fate. (Invited talk, presented at the Realism, Progress, and Cognitive Values in Science and Philosophy Workshop, University of Trieste, September 2016).
74. 'Materiality does not Equal Lack of Generality' – Norton (2003) develops a material theory of induction that urges us to go local. Why? Because inductive inferences in science are, according to him, “grounded in matters of fact that hold only in particular domains” (p. 647). This theory has been put to work by Saatsi (2009) who uses it to prop up the content-driven or local view of arguments for scientific realism. On this view, which has rapidly been gaining ground, general arguments for or against realism like the no miracles argument and the argument from the pessimistic meta-induction are doomed to fail. The war will be won or lost instead on the many battlefields where specific arguments, the kinds that cite material postulates, reign supreme. In this talk, I counter Saatsi’s anti-generalist tendencies while at the same time, and prima facie paradoxically, supporting the central message behind the material theory of induction. (Contributed talk, presented at the 8th Quadrennial International Fellows Conference, University of Lund, July 2016).
73. 'Truly Undesirable Hypotheses' – Hypotheses may be undesirable for a number of reasons. Some hypotheses are just too slippery to be subjected to tests. They are what Popper has called 'unfalsifiable'. Others are just plain false; in Popperian terminology, these are hypotheses that have been falsified. Yet others suffer from ad hoc-ness. The focus of this talk is ad hoc hypotheses. I begin with a brief examination of some notable conceptions of ad hoc-ness in the literature. It is pointed out that there is a general problem afflicting most such conceptions, namely that the intuitive judgments that are supposed to motivate them are not universally shared. Instead of getting bogged down in what ad hoc-ness exactly means, I shift the focus of the analysis to one undesirable feature often present in alleged cases of ad hoc-ness. I call this feature the ‘monstrousness’ of a hypothesis. A fully articulated formal account of this feature is presented by specifying what it is about the internal constitution of a hypothesis that makes it monstrous. (Invited talk, presented at the Department of Economics, University of Rostock, July 2016).
72. 'Measurements and Standards: The Case of the Meter' – The take-off point for this talk is an objection that originates in Wittgenstein, namely that we cannot determine the length of the standard meter as that very meter determines length. It is argued that such concerns vanish in the face of the latest conceptions of standard units like the meter. That’s because (almost all) standards nowadays are not particular pieces of matter but definitions that incorporate references to fundamental physical constants. The main part of the talk involves a discussion of several presumed advantages and disadvantages of this conceptual shift. It is argued that the advantages of the shift far outweigh any disadvantages and that the last vestiges of materiality will soon make way for a purely definitional approach to all standards. Let’s travel back to Wittgenstein’s time. How would one find out whether something was a meter long? By laying it against some sample meter like a ruler. And what ensured that these sample meters were a meter long? Calibration against other samples, themselves subjected to calibration in a hierarchy of sample meters whose apex was the standard meter in Paris. But what about the standard meter itself? In a well-known passage, Wittgenstein points out that the standard meter cannot be laid against itself: “There is one thing of which one can state neither that it is 1 metre long, nor that it is not 1 metre long, and that is the standard metre in Paris” (2009, p. 29e) [original emphasis]. The standard meter, he reasons, is a linguistic ‘instrument’. As such, it provides a means through which length can be represented, though it is not itself representable. It is thus illegitimate, he claims, to ask whether the standard meter is a meter long. Fast forward to today. As already mentioned, we nowadays rely not on material samples but on definitions that utilise fundamental physical constants. The standard meter is defined as follows: “The meter is the length of the path travelled by light in vacuum during a time interval of 1/299 792 458 of a second” (NIST). Thus, the issue of comparing a sample to itself that so puzzled Wittgenstein doesn’t even arise. The definitional approach has several presumed advantages and disadvantages. The main advantage is that it provides stability. By anchoring a standard to something invariant like a fundamental physical constant, e.g. the speed of light in a vacuum, the stability of the standard itself is bolstered. The price for such stability is that our new standards run the risk of being radically divorced from the world. Perfect vacuums, for example, are considered impossible. To address this problem, it is argued that even though some reference objects or conditions in definitions are ideal, it is still legitimate and non-trivial to use them on the basis of extrapolations from real objects and conditions. For example, we know how light slows down when passing through media whose refractive indices are progressively higher and from this knowledge we can reasonably extrapolate what the speed of light would be like if nothing impeded its path. The aforementioned and other such presumed advantages and disadvantages are discussed and the case is made that, on balance, the definitional approach offers the best way forward in matters of standardisation. (Contributed talk, presented at the PSF2016 conference, Doorn, May 2016).
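The arithmetic behind the definitional approach is worth making explicit: once the second is fixed independently and the speed of light in vacuum is stipulated to be exactly 299 792 458 m/s, the meter falls out as a derived quantity. A minimal sketch (the variable names are mine, not NIST's), using exact rationals to avoid rounding:

```python
from fractions import Fraction

# The speed of light in vacuum is exact by stipulation: 299 792 458 m/s.
C = 299_792_458

# The NIST definition: one meter is the length of the path travelled by
# light in vacuum during a time interval of 1/299 792 458 of a second.
t = Fraction(1, 299_792_458)   # seconds
one_meter = C * t              # metres
print(one_meter)               # 1, exactly, by construction

# Any measured light travel time then yields a defined length: a pulse
# timed at 10 nanoseconds covers roughly 3 metres.
print(C * Fraction(1, 100_000_000))  # 149896229/50000000 m, about 2.998 m
```

Note that the "comparison" Wittgenstein worried about never occurs: the length is computed from a constant and a duration, not read off a material sample.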
71. 'How to Bring Forward a Desirable Future' – In this talk, I venture into the contested issue of patent regulation as this applies to AI-related technologies. I begin by pointing out that from our present point of view we can consider the future as being open to many possibilities. How do we steer ourselves toward a desirable future state of the world where there is universal mental and physical health, peace, justice, happiness, education, longevity, etc? Moreover, what is the fastest and safest route there? In other words, what do we need to change now to bring that future forward? Although innovation in AI holds amazing promise in bringing forward such a future, such innovation is impeded by patents. The only solution, I argue, is to abolish or at least drastically curtail patent rights in the domain of AI. (TEDx talk, hosted by the University of Nicosia, November 2015).
70. 'Measuring Unification' – Scientists tend to opt for simpler and more unified hypotheses. Such considerations are often viewed as at best pragmatic in matters of theory choice. In this talk, I put forth a novel conception and an associated measure of unification, both of which are demonstrably more than just pragmatic considerations. The discussion commences with a brief survey of some failed attempts to conceptualise unification. It then proceeds to an analysis of the notions of confirmational connectedness and disconnectedness, which are essential ingredients in the proposed conception of unification. Roughly speaking, the notions attempt to capture the way support flows / fails to flow between the content parts of a hypothesis. The more the content of a hypothesis is confirmationally connected, the more that content is unified. Since the confirmational connectedness of two content parts is determined by purely objective matters of fact, the proposed notion and measure of unification are themselves objective. (Contributed talk, presented at the European Philosophy of Science Association 2015 conference, Duesseldorf, September 2015).
69. 'Why Immaterial Standards Matter' – In a well-known passage in the Investigations, Wittgenstein makes the following claim: “There is one thing of which one can state neither that it is 1 metre long, nor that it is not 1 metre long, and that is the standard metre in Paris.” (2009, p. 29e) [original emphasis]. The standard meter, Wittgenstein reasons, is an ‘instrument’ of our language. Qua an instrument, it provides a means through which length can be represented, though it is not itself representable. It is thus illegitimate, he claims, to ask whether the standard meter is a meter long. I begin this talk by showing how Wittgenstein’s concerns become immaterial in the face of modern measurement theory. That’s because standards nowadays are set by definitions, not samples. I then proceed to explore several advantages of the definitional approach, focusing, among other things, on the stability it offers over the old sample-centric approach. (Contributed talk, presented at The Making of Measurement Conference, Centre for Research in the Arts, Social Sciences and Humanities, University of Cambridge, July 2015).
68. 'Can Theory-Laden Effects be Removed?' – Observation plays a central role in our everyday and scientific lives. Safeguarding its objectivity is therefore of paramount importance. Let us call ‘veridicalism’ the view that observational reports are largely truthful and that there exists a great deal of inter-subjective agreement concerning their content. Perhaps the biggest threat to this view is the so-called ‘theory-ladenness’ of perception and/or observation, an idea that has long been studied by both philosophers and psychologists. Roughly speaking, this is the idea that theoretical factors, broadly construed, influence the content of perceptual beliefs and observational reports. Such factors, it has been suggested, are most obviously at work where the observational reports of experts diverge from those of laypersons. This talk proposes the design of a type of experiment whose aim is to determine whether differences in the content of expert vs. layperson observational reports, where these do indeed exist, can be removed under controlled conditions. Clearly, if such differences could be removed at least sometimes, theory-ladenness of this sort would pose less of a threat to inter-subjective agreement on, and ultimately to the objectivity of, observational reports. It is conjectured that such differences are indeed within our ability to expunge. What is more, it is argued that the content of the resulting observational reports preserves at least some of its evidential relevance. The hope is that through discussing these issues with fellow philosophers and psychologists the design of the proposed experiment will be refined prior to actually carrying it out. (Contributed talk, presented at the 23rd Annual Meeting of the European Society for Philosophy and Psychology conference, University of Tartu, July 2015).
67. 'How to Make a Long Theory Short' – Scientists tend to opt for simpler and more unified theories. In this talk, I put forth a novel conception of unification as well as an associated formal measure. I begin the discussion with a brief survey of some failed attempts to conceptualise unification. I then proceed to offer an analysis of the notions of confirmational connectedness and disconnectedness. These are essential to the proposed conception of unification. Roughly speaking, the notions attempt to capture the way support flows or fails to flow between the content parts of a theory. The more the content of a theory is confirmationally connected, the more that content is unified. Theories that make more strides toward unification, and, hence, are more economical in the way they capture the same phenomena, are thus to be preferred, for purely confirmational reasons, to those that make fewer strides. (Contributed talk, presented at the British Society for the Philosophy of Science Annual Conference, University of Manchester, July 2015).
66. 'Do you See what I See?' – As the title suggests, in this talk I explore the difficulties surrounding the question of whether or not different individuals perceive the same things in roughly the same ways. I argue that despite all sorts of differences in the contents of perceptions, we nonetheless do perceive the same things out there. The motivation for this view builds on John Locke's inverted spectrum argument. (Invited talk, presented at the IAI Academy, How the Light Gets in 2015 (Music and Philosophy Festival), Hay-on-Wye, May 2015).
65. 'Is the World a Massive Simulation?' – What if the world as we know it (potentially including ourselves) is a sophisticated computer simulation designed by an alien race? Perhaps they're running a scientific experiment. In this talk, I raise some issues concerning this and other radical sceptical arguments like it. These range from the specific notion of logical possibility in use to questions about the computational feasibility as well as the structural limits of such scenarios. (Invited talk, presented at the IAI Academy, How the Light Gets in 2015 (Music and Philosophy Festival), Hay-on-Wye, May 2015).
64. 'How to Really Win an Argument' – In this talk I offer the public some practical advice on how to win arguments honestly. This includes tips on how to: avoid being led on tangents, articulate one's suppositions, find common ground and use the method of indirect proof. (Invited talk, presented at the IAI Academy, How the Light Gets in 2015 (Music and Philosophy Festival), Hay-on-Wye, May 2015).
63. 'What Makes a Hypothesis Ad Hoc?' – Natural and social scientists alike, no matter whether they are experimenters or theoreticians, can hardly carry out research without having to think about ad hoc-ness. Given the concept’s ubiquity, one would imagine that it is rather well understood. Not quite. Though there is certainly agreement on what count as clear-cut cases of ad hoc hypotheses, e.g. the much-maligned Ptolemaic systems of astronomy, confusion abounds regarding what exactly makes a hypothesis or manoeuvre ad hoc. In this talk I attempt to wade through this confusion and offer some lucidity. I begin with a brief examination of some notable conceptions of ad hoc-ness. I then point out that there is a general problem afflicting these conceptions, namely that the intuitive judgments that are supposed to motivate them are not always consistent. Instead of getting bogged down in an attempt to give a full-fledged analysis of the concept, which may not even be possible given the aforementioned tensions, I shift the focus to one undesirable feature, which I label ‘monstrousness’, often present in alleged cases of ad hoc-ness. A formal account of this feature is put forth by specifying what it is about the internal constitution of a hypothesis that makes it monstrous. The talk concludes with a discussion of some examples. (Invited talk, presented at the Philosophy Department, University of Montreal, November 2014).
62. 'Empiricism Unchained: Debunking the Instrument Conspiracy' – Observations made through instruments that cannot also be made with our unaided sensory organs lack epistemic credibility, claim the constructive empiricists. One well-known challenge to this view draws attention to the fact that distinct types of instruments have been known to yield the same or at least highly similar observational outputs. The implication, of course, is that the convergence of output is evidence of the ability of those instruments to detect real features of the world. To meet this challenge, the constructive empiricist attempts to argue that the convergence is an artefact of the practice of calibration. In this talk, I argue that this is a desperate, conspiratorial attempt to rule out the veridicality of the output of instruments. My inquiry is framed around a broader discussion of what makes unaided sensory organs epistemically credible. Surprisingly, constructive empiricists say nothing on this matter. Against this background, I put forth a proposal for what lends unaided sensory organs epistemic credibility and, unsurprisingly, argue that the same credibility is extended to several types of instruments. (Invited talk, presented at the Rotman Institute of Philosophy, University of Western Ontario, November 2014).
61. 'Methods and Universality' – Over the years several attempts have been made to put forth scientific methods with universal applicability. These attempts have been met with incredulity. Any such attempt, it is argued, is likely to fail given the substantial ontological differences between scientific disciplines as well as within a given scientific discipline across history. As a consequence, widespread pessimism has ensued over the existence of such methods. In this talk I endeavour to stem the pessimistic tide by arguing that we are already in possession of some universal methods and, moreover, that we are converging towards others, giving various examples along the way. (Contributed talk, presented at the Symposium on: 'The Scientific Method – Revisited', Philosophy of Science Association 2014 Biennial meeting, Chicago, November 2014).
60. 'Intelligence as Portability in Problem-Solving' – What is intelligence? Is it something that we measure when we conduct so-called IQ tests? Is it something that, no matter how it gets measured, is uniquely human? Does some form of the Turing test, an alleged indicator of the presence of machine intelligence, provide some help in answering the original question? Is there such a thing called ‘emotional intelligence’? If so, how is it related to traditional, i.e. non-emotional, intelligence? Much disagreement surrounds these and other related questions. In this talk, I address the first and most central of these questions by focusing on two traits that, as I argue, are ubiquitous in behaviour that we intuitively deem as intelligent, namely success in problem-solving and portability. I argue for a specific articulation of these traits and conclude that a conception of intelligence with this articulation at its foundations makes some headway in understanding the phenomenon under study better. (Contributed talk, presented at the International Association for Computing and Philosophy 2014 conference, Thessaloniki, July 2014).
59. 'Veridical Perception and Observation' – Philosophical debates have numerous departure points. I am interested in a rather rich departure point that takes not only the world of mental states for granted but also the existence of a mind-independent world populated with distinct things, some of which are embodied humans with brains and sensory organs. This departure point still leaves open the question whether our mental states about the mind-independent world are truthful. Let us call ‘veridicalism’ the view that perceptual beliefs and observational reports are largely truthful. In this talk, I argue for veridicalism by, among other things, examining in detail and ultimately deflating in import what many consider to be the view’s greatest threat, the so-called ‘theory-ladenness’ of perception and/or observation. More specifically, I argue that to the extent that theoretical factors influence the formation of perceptual beliefs and observational reports, as theory-ladenness demands, that influence is typically not detrimental to their veridicality or at least not irreversibly so. (Keynote talk, presented at the Experience and Reality Conference, Catholic University in Ružomberok, Slovakia, June 2014).
58. 'The Metaphysical Status of Logical Principles' – This talk mounts a defence of the view that logic can, and in actual fact does, univocally and definitively answer questions about the validity of at least some inferences. This is tantamount to saying that some rules (and potentially axioms) are the right ones. More controversially, I argue that their rightness is determined by the physical world itself. Indeed, I argue that the right logic, but obviously not our conception of it, is itself a structural feature of the world. (Invited talk, presented at the Aspects and Prospects of Realism in the Philosophy of Science and Mathematics Conference, University of Athens, March 2014).
57. 'Ad hoc-ness and Monstrousness' – The aim of this talk is to throw some light on the notion of ad hoc-ness and its value to scientific methodology. In discussing the notion, I focus on and attempt to explicate one particular undesirable characteristic associated with it, namely what I dub ‘monstrousness’. Roughly speaking, monstrousness reflects the degree to which parts of a hypothesis are unnaturally joined together. (Talk presented at the Unification and Coherence workshop, University of Duesseldorf, January 16).
56. '…Inferentialist Account of Confirmation' – The aim of this talk is to defend the inferentialist view from a challenge that originates in predictivism. It is argued that predictivism and its challenge fail because the non-inferential elements it introduces invariably lead to the issuing of contradictory confirmational judgments. (Talk to be presented at the Workshop on Inferentialism in Epistemology and Philosophy of Science, UNED Madrid, November).
55. 'Science with Artificially Intelligent Agents: The Case of Gerrymandered Hypotheses' – Barring some civilisation-ending man-made catastrophe, future scientists will likely incorporate fully fledged artificially intelligent agents in their ranks. Their tasks will include the conjecturing, extending and testing of hypotheses. If we are to hand over at least some of the aforementioned tasks to artificially intelligent agents, we need to find ways to make explicit and ultimately formal, not to mention computable, the more obscure of the methods that scientists currently employ with some measure of success in their inquiries. This talk puts forward a fully articulated formal solution to the problem of how to conjecture new hypotheses or extend existing ones such that they do not save phenomena in gerrymandered or ad hoc ways. (To be presented at the 2nd Conference on the Philosophy and Theory of Artificial Intelligence (PT-AI 2013), University of Oxford, September 21-22 2013).
54. 'Logic as Ultra-Physics' – The number of rival logical systems is growing without an end in sight. This has proved to be a mixed blessing. On the one hand, we have a rich set of formal tools that allows us to model inferences in a variety of ways. On the other hand, the existence of rival logical systems threatens to undermine logic’s role as a univocal and definitive arbiter of disagreements over the validity of inferences. If, for any given inference, one can always find a logical system that sanctions its validity and another that forbids it, then it seems that the aforementioned role no longer befits logic. The most that we can hope for are intra-system evaluations of the validity of inferences. The consequences for rational debate are dire. Disputes in philosophy, science and beyond run the risk of turning into trivial squabbles, as anybody who finds themselves in a logical pickle may be able to slip away to a more agreeable logical system. The aim of this talk is to mount a defence of the view that logic can, and in actual fact does, univocally and definitively answer questions about the validity of at least some inferences. This is tantamount to saying that some rules (and potentially axioms) are the right ones. If you like, they are the ones that would fill the pages of a book on the one ‘true’ logic. More controversially, I argue that their rightness is determined by the physical world itself. Indeed, I argue that the right logic, but obviously not our conception of it, is itself a structural feature of the world. For obvious reasons, I call the emerging view ‘logic as ultra-physics’. As a case study of this ultra-physics, I utilise the principle of non-contradiction. (Invited talk, presented at the Departmental Colloquium, California State University Los Angeles, October 10 2013).
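The talk's starting worry, that any inference can be sanctioned by one system and forbidden by another, is easy to exhibit concretely. A small sketch of my own (not from the talk): the inference 'from p and not-p, infer q' (explosion) is valid on classical two-valued semantics but invalid in the paraconsistent logic LP, which adds a third, designated truth value.

```python
from itertools import product

# Truth values, ordered F < B < T. Classical logic uses {T, F} with only T
# designated; the paraconsistent logic LP adds B ('both true and false'),
# which is also designated.
F, B, T = 0, 1, 2
NEG = {T: F, B: B, F: T}
conj = min  # (strong Kleene) conjunction: the minimum in the order F < B < T

def explosion_valid(values, designated):
    # Is 'p, not-p, therefore q' valid? Valid iff in every valuation where
    # p AND not-p is designated, q is designated too.
    for p, q in product(values, repeat=2):
        premise = conj(p, NEG[p])
        if premise in designated and q not in designated:
            return False  # counterexample: premises hold, conclusion fails
    return True

print(explosion_valid({T, F}, {T}))        # True: classically valid
print(explosion_valid({T, B, F}, {T, B}))  # False: LP counterexample (p=B, q=F)
```

Since each semantics returns its own verdict, validity talk threatens to become system-relative, which is exactly the challenge the ultra-physics view then tries to defuse.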
53. 'Positivism in the 21st Century' – In this talk I consider how a version of positivism can survive and even thrive in the 21st century. To be precise, I propose and defend a liberalised conception of observability and an associated, and accordingly liberalised, conception of empiricism. ‘Universal observability’ and ‘universal empiricism’ unchain themselves from the burdens of traditional conceptions of experience while remaining firmly tethered to what, it is argued, is the true source of epistemic credibility in the senses. (Invited talk, presented at Aldo Antonelli's graduate seminar, University of California Davis, October 8).
52. 'Empiricism Unchained' – Empiricism has a long and venerable history. Aristotle, the Epicureans, Sextus Empiricus, Francis Bacon, Locke, Hume, Mill, Mach and the Logical Empiricists, among others, represent a long line of historically influential empiricists who, one way or another, placed an emphasis on knowledge gained through the senses. In recent times, the most highly articulated and influential edition of empiricism is undoubtedly Bas van Fraassen’s constructive empiricism. Science, according to this view, aims at empirically adequate theories, i.e. theories that save all and only the observable phenomena. Roughly put, something is observable in van Fraassen’s view if a member of the human epistemic community can detect it with their unaided senses. Critics have contested this notion, citing, among other reasons, that most of what counts as knowledge in natural science concerns things that are detectable only with instruments, i.e. things that are unobservable and hence unknowable by van Fraassen’s lights. Beg-the-question accusations fly back and forth. As a consequence, a stalemate has ensued. In this talk, I put forth a liberalised conception of observability and an associated, and accordingly liberalised, conception of empiricism. ‘Universal observability’ and ‘universal empiricism’, as I call them, unchain themselves from traditional conceptions of experience while remaining firmly tethered to what, I argue, is the true source of epistemic merit in the senses. (Invited talk, presented at the Bay Area Philosophy of Science seminar, San Francisco State University, October 03 2013).
51. '…Scientific Method' – In this talk, I argue, contrary to popular belief, that there is such a thing as the scientific method and that we already possess some of its principles, or at least approximate versions of them. The popularity of the opposite view can be traced back to the fact that most attempts to identify the scientific method involve an overly strong conception and are therefore bound to fail. I propose a weaker conception, one that maintains that there is a core methodology shared across all domains of inquiry while at the same time allowing for variation on the periphery. (Presented at the British Society for the Philosophy of Science Annual Conference, University of Exeter, July 4-5 2013).
50. '…in Confirmation' – The study of confirmation is the study of the conditions under which a piece of evidence supports, or ought to support, a hypothesis, as well as of the level of that support. There are two major kinds of confirmation theories, objective and subjective. Objective theories hold that confirmation questions are settled via purely objective considerations. Subjective ones hold that at least some non-objective considerations come into play. With some exceptions (see, for example, Williamson 2010), most confirmation theorists nowadays opt for subjective theories. The pessimism over objective theories is most probably due to the fact that it has proved very hard, some may even say impossible, to find reasonable principles that decide every question about confirmation in purely objective terms. The aim of this talk is to reverse some of that pessimism by putting in place some cornerstones in the foundations for an objective theory of confirmation. This is achieved by considering lessons not from the failures of subjective theories, of which, no doubt, there are many, but rather from the failures of predictivism, a mini theory of confirmation that is typically conceived of as objective. (Presented at the Philosophy of Science in a Forest (PSF2013) Triennial Conference, Leusden, Netherlands, May 23-25 2013).
49. 'Post-hoc Monsters and the Frankenstein Theory of Confirmation' – This talk concerns the highly vexing issue of how a confirmation theory ought to handle post-hoc monsters, that is, post-hocly constructed or modified hypotheses like Velikovsky's theory or Ptolemaic astronomy. One approach to this issue has been to demonise post-hoc-ness itself, arguing that no hypothesis earns support from evidence that has been used in its construction or modification. Another approach has been to attempt to segregate the monstrous from the non-monstrous post-hoc hypotheses and to argue that only the latter earn support from accommodated evidence. In this talk, I put forth a more subtle approach, which I call the 'Frankenstein' theory of confirmation. According to this approach, even post-hoc monsters earn confirmation from accommodated evidence, but the confirmation earned does not spread evenly throughout the content of such hypotheses. (Invited talk, presented at the Logos Colloquium, Logic, Language and Cognition Research Group, University of Barcelona, April 18 2013).
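The claim that confirmation can accrue to a conjunction without spreading evenly through it admits a toy Bayesian illustration. The setup and numbers below are my own, not the talk's: a two-part hypothesis h1 & h2 whose accommodated evidence e bears on h1 alone.

```python
from itertools import product

def prob(h1, h2, e):
    # Joint over (h1, h2, e): P(h1) = P(h2) = 0.5, independent;
    # e tracks h1 only: P(e|h1) = 0.9, P(e|not-h1) = 0.1.
    p = 0.5 * 0.5
    p *= (0.9 if h1 else 0.1) if e else (0.1 if h1 else 0.9)
    return p

def P(pred):
    return sum(prob(*w) for w in product([True, False], repeat=3) if pred(*w))

p_e = P(lambda h1, h2, e: e)

# The conjunction as a whole is confirmed by e: P(h1 & h2 | e) = 0.45 > 0.25 ...
print(P(lambda h1, h2, e: h1 and h2 and e) / p_e)
# ... but the boost lands entirely on the h1 conjunct: P(h1 | e) = 0.9 > 0.5 ...
print(P(lambda h1, h2, e: h1 and e) / p_e)
# ... while the h2 conjunct stays exactly where it started: P(h2 | e) = 0.5.
print(P(lambda h1, h2, e: h2 and e) / p_e)
```

On a Frankenstein-style picture, the unconfirmed h2 is the grafted-on part: it rides along inside the confirmed conjunction while receiving none of the evidential support itself.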
48. 'The Houdini Argument for Intrinsic Properties' – The aim of this talk is two-fold. First, to motivate some desiderata for an adequate conception of the intrinsic vs. extrinsic property distinction. Second, to try to answer the question whether any scientific properties qualify as intrinsic (in a sense that satisfies the above desiderata) through a series of related thought-experiments. The thought-experiments centre around the idea of shielding objects to prevent them from entering into causal interactions with other objects and seeing what, if anything, remains invariant and is therefore a good candidate for being intrinsic. (Invited talk, presented at the Metaphysics of Scientific Realism Workshop, Department of Philosophy and History of Science, University of Athens, March 1-2 2013).
47. 'The Language of Science' – Scientists, much like their non-scientific brethren, record and communicate their theories and results through the medium of language. A pertinent question that arises in this context is whether the choice of language and hence the choice of ontology can be guided by more than merely pragmatic considerations. Carnap (1950) categorically denies that it can. His central worry is that there is no external or independent standpoint, i.e. no extra-linguistic framework, from which one may adjudicate such a question. An affirmative answer to the question meets several other hurdles. For example, the fact must be confronted that up till now the language of science has been rather fragmentary as each scientific discipline’s language is to some extent autonomous from the others and there are at least some conceptual and ontological variations even within given scientific disciplines. Beyond these factual concerns there are also more principled ones. Two prominent concerns that emerge in philosophy of language discussions are the gavagai case and the grue paradox. In this talk I build on previously published work (see Votsis 2012) in an attempt to address all of these concerns and hence to motivate an affirmative answer to the above question. (Invited talk, presented at the Theoretical Philosophy Colloquium, University of Duesseldorf, January 2013).
46. 'Re-evaluating the Scope of Determining Factors' – In this talk I try to avoid the above impasse by proposing that whatever confers epistemic value on observation statements also confers epistemic value on a much wider range of determining factors than constructive empiricists would allow. The upshot is that constructive empiricism is undermined from within, i.e. without realists having to argue that the determination of the truth of theoretical content concerning unobservable entities can come either (1) holistically from true observation statements or (2) from theoretical virtues, such as simplicity. (Contributed talk, presented at the 2nd Panhellenic Conference of Philosophy of Science, University of Athens, November 2012).
Empiricism' - In this talk, I consider and reject van
Fraassen’s conception of observability and corresponding brand of
empiricism. I put forth an alternative conception that seeks to allay
the realist’s concerns about the validity of instrument-based
observation in science yet preserves vital empiricist sensitivities.
Along with the new conception of observability I lay the foundations
for a new form of empiricism. Universal empiricism, as I call it,
divorces itself from traditional conceptions of experience while
remaining wedded to what is epistemically meritorious about empiricism,
namely the idea that truth-conducive contact with the environment is
the ultimate judge of knowledge. (Presented at the Philosophy
of Science Association Twenty-Third Biennial Meeting, San Diego [read
in my absence by Otavio Bueno], November 15 2012).
44. 'Novel Predictions and the Theory of Confirmation' – The aim of this paper is to assess whether novel predictions, in contrast to non-novel ones, should provide more, or even unique, confirmation/support for the hypotheses that issue them. I begin with a general motivation for the idea of novel predictions. This takes the form of a challenge: specifically, what is needed, over and above a suitable kind of inferential relation between hypothesis and evidence, for a hypothesis to earn some (or more than it otherwise would) confirmation/support? Two families of novel predictions are examined and rejected. This leads to a discussion of the correct form of a theory of confirmation. I propose three (necessary but not sufficient) desiderata that an objective theory must satisfy. Time permitting, I will explore what consequences these results have for the realism debate. (Invited talk, presented at the Seminar of the ΦΘΕΤ Division, National and Kapodistrian University of Athens, October 2012).
43. 'A Frame-Theoretic Primer on Reduction in Science' – The concept of reduction is central to the philosophy of science and to science itself. Intuitively speaking, a branch of science or a scientific theory is said to reduce to another branch of science or scientific theory when the ideas, ontology, laws and predictions of one are accounted for in some way by the other. In this talk, we provide a frame-theoretic account of the concept of reduction. We first reconstruct the classical conception of reduction in frame-theoretic terms. This renders reduction as a sort of mapping relation between frames. We then consider a number of objections raised against the classical conception as well as proposed solutions. It is argued that although some progress has been made with neo-classical conceptions of reduction, various problems remain unsolved. We end the paper by proposing further modifications to the neo-classical accounts, modeling such modifications in frame-theoretic terms. (Contributed talk [with Gerhard Schurz], presented at the Concept Types and Frames in Language, Cognition, and Science (CTF12), Duesseldorf, August 2012).
Structuralist Theory of Reference' - This talk is divided
into three parts. The first part concerns the clash between existing
conceptions of reference. It is argued that, although they are in
conflict, there is a sense in which each of these conceptions is
legitimate in different contexts. Even so, some contexts are more
demanding than others and, as
a consequence, put constraints on the appropriateness of the concept of
reference. In the context of the scientific realism debate, one
important constraint is the ability to provide an adequate account of
the phenomena surrounding the reference of scientific terms in cases of
theory change or of full-blown scientific revolution. The second part
reflects on what happens to concepts of reference when specific
versions of realism and anti-realism are endorsed. The emphasis here is
on the most promising such versions of late, namely structural realism
and empiricist structuralism. In spite of their differences, both of
these views put forth a structuralist epistemology that, as it turns
out, forces our conceptions of reference to take into account the
relations that the objects we wish to refer to stand in with respect to
other objects. Finally, the third part considers the ways in which our
attempts to refer to things in the world appear to fall short or indeed
do so. The focus here is on puzzles relating to the indeterminacy of
reference. Two such puzzles are discussed and dismissed. At the end of
the talk it is conceded that reference is in a sense indeterminate but
that this indeterminacy springs from structuralist limitations on
knowledge and is not to be feared. (Invited talk presented at the Reference
and Scientific Realism Symposium, Wuhan University, August 17
Feeble, Few and Far Between Coincidences Argument for Realism'
- The no coincidences argument for realism holds that we can infer the
truth or approximate truth of
a theory when it reaches a certain level of success. It would be a
cosmic coincidence, the realists
claim, if a theory were to enjoy such success and yet not be true. In
this talk, I focus on the most
prominent realist conception of the level of success required to
license inferences to truth or
approximate truth. The conception I refer to involves the demand that
theories generate novel
predictions. I argue that, as it stands, this conception is too weak to
deflect a rather simple
challenge. I then propose ways to strengthen the realist conception in
the hope that it is better able
to overcome whatever challenges we throw at it. (Presented at the
British Society for the Philosophy of Science Annual Conference,
University of Stirling,
July 5-6 2012).
for Scientific Realism: Some Lessons from Confirmation Theory'
- The most commonly cited argument for scientific realism is the
so-called ‘no miracles’ argument. According to this argument, it is
highly implausible to claim that the predictive success enjoyed by some
scientific theories is the product of a long series of lucky
coincidences. A more plausible, indeed some argue the only plausible,
claim is that the corresponding theories are true, or, at the very
least, contain some non-negligible truth content. The majority of
realists deem novel predictive success, roughly the ability of a theory
to predict hitherto unknown types of phenomena, to be particularly
telling in favour of the second claim. In this talk, I argue against
the superiority of novel as opposed to non-novel predictive success. I
do so by pointing out that objective standards in confirmation theory
can only be had if confirmational assessments remain invariant under
anything other than the evidence and the hypothesis under
consideration, something that is not true in accounts of novel
predictive success. After laying the foundations of what I take to be
the correct conception of confirmation relations, I argue that support
from evidence to different parts of a theory does not spread as broadly
as has been popularly maintained. Among other things, this conception
of confirmation relations has crucial consequences for the defence of
scientific realism, consequences that I plan to explore in some depth
during the last part of my talk. (Invited talk presented at
Mind, Language, Knowledge: The Perspective of Philosophy, University of
June 29 2012).
Care about the Scientific Realism Debate?'
- In this talk, I try to provide motivation for why one ought to take
the scientific realism debate seriously, paying particular attention to
two groups: philosophers of science and scientists. Among other things,
it is argued that various debates in the philosophy of science as well
as in science turn out to involve, sometimes even inadvertently,
substantial epistemic or metaphysical claims of the kind being debated
in the scientific realism debate. (Plenary talk presented at the
7th Quadrennial International Pittsburgh Fellows Conference, University
June 12-14 2012).
- Scientific realists often assert that our best scientific theories
and models provide true or approximately true descriptions of facts
about nature and that they cut nature at its joints. The latter
assertion presupposes, among other things, that the physical domains
investigated by such theories and models are structured in a unique
way. More metaphorically put, that nature has joints! Let us call this
the ‘uniqueness assumption’. As is customary in philosophy no
assumption is safe from scrutiny. The idea has been floated that nature
has no joints. Frigg (2006), for example, suggests that “the physical
world does not come sliced up” (p. 56). Let us call this the
‘non-uniqueness assumption’. In this talk I attempt to articulate a
view that submits to the non-uniqueness assumption and yet is able to
maintain realist credentials. (Presented at the
Perspectivalism Workshop, University of Ghent,
January 19-20 2012).
NB: I have now turned this
talk into a paper ‘Putting
Realism in Perspectivism’, 2012, Philosophica,
vol. 84: 85–122.
as a Guide to Falsity?' - Participants in the debate about
whether simplicity is a guide to truth or merely pragmatically useful
typically wrangle over two problems: (1) how to weigh simplicity
against other virtues like strength and fitness and (2) whether there
is a unique measure for simplicity that straps it to truth. I would
like to put forth a third problem: (3) Even if problems (1) and (2)
were solved, it is far from clear
whether the simplest theory out of an available class of competitors
would always be the one closest
to the truth. (To be presented
at the 14th Congress
of Logic, Methodology and Philosophy of Science, Nancy, July 2011).
Realism and Causation: An Unhappy Marriage?' - It has
recently been objected that structural realism, in its various guises,
is unable to adequately account for causal phenomena (see, for example,
Psillos 2006). In this talk, I consider whether structural realism has
the resources to address this objection. (To be presented at the British
Society for the Philosophy of Science Annual Conference,
Sussex, July 7-8 2011).
Models' - Among the main aims of science are to predict and
explain the world. In order to pursue those aims, scientists employ
theories, models, equations and the like to represent features of the
world. How are we to understand this representation relation?
Supporters of the semantic view of theories typically construe the
representation relation in one of two ways: (i) in terms of some notion
of morphism or (ii) in terms of some notion of similarity. In this
talk, I take a closer look at a number of objections mounted against
(i) and (ii). I argue that on the whole such objections are misguided
for they demand representation in science to meet loose standards that
the critics conceive of as appertaining to representation in art.
Indeed, I argue that if we were to take such a demand seriously it
would lead to runaway models of scientific representation that are of
no clear benefit to the debate over what makes a scientific theory,
model or equation represent its target domain informatively and
adequately. (Invited talk presented
at the Seminario
di Logica e Filosofia della Scienza, Università di Firenze,
May 5 2011).
the Methods of Science' - In this talk, I examine when and
why we should trust scientific theories. I start off by considering a
number of methods for deciding when to trust beliefs in the context of
everyday life. I then compare these methods to those utilised in the
context of science. It turns out that despite some differences there
are plenty of common practices towards good believing in the two
contexts. Indeed in various cases it can be argued that the practices
of science are more stringent versions of those we employ in everyday
life. At least with respect to these cases then one cannot endorse
(either explicitly or implicitly) the validity of everyday life
practices but reject the analogous ones in science. (Invited talk
presented at the Dipartimento
di Filosofia, Università di Pisa, May 4 2011).
Realism meets the Social Sciences' - Structural realism is
arguably one of the most influential movements to have emerged in
philosophy of science in the last decade or so. Advocates of this
movement attempt to answer epistemological and/or ontological questions
concerning science by arguing that the key to all such questions is the
mathematical formalism of a theory. This is so, according to structural
realists, because the mathematical formalism encodes all and only what
is important about a theory’s target domain, namely its structure.
Almost without exception, discussions of structural realism centre on
the natural sciences and in particular on modern physics. Given that a
number of other sciences are less – indeed in some cases much less –
mathematised than modern physics, does structural realism have anything
informative to say about them? In this talk, I take up the task of
articulating what structural realists ought to say about the social
sciences if they are to consider themselves as offering a coherent
philosophy for the whole of science. (Invited talk presented at the Economics
and Institutional Change Research Seminar, Institute for
Advanced Studies (IMT) Lucca, May 3 2011).
and Science: Past, Present and Future' - Philosophy and
science have a rather intricate relationship. In this talk I will make
some tentative steps towards answering a number of questions that are
pertinent to this relationship in the hope of throwing some further
light on it. The questions are as follows: (1) In what respects, if
any, are the subject matter, aims, methods and achievements of the two
endeavours similar? (2) How, if at all, has the development of the one
influenced that of the other? (3) To what extent are they currently
interacting? (4) What does the future hold for science and philosophy?
(Invited talk presented at the Metaphysical
Society's Annual Symposium, Trinity College Dublin, April 13 2011).
and Ontic Commitments: In Perfect Alignment?' - The epistemic
form of structural realism asserts that our knowledge of the world is
restricted to its structural features. Several proponents of this view
assume that the world possesses non-structural features; features
which, according to their view, cannot be known. In other words, they
assume that there is, or, there ought to be (on the basis of normative
arguments in epistemology), always a gap between our epistemological
and ontological commitments. The ontic form of structural realism
denies that this is, or ought to be, the case. Proponents of this view
argue that the perfect alignment of epistemological and ontological
commitments is a highly desirable meta-theoretical feature. They argue
this on the basis of the prima facie
sensible principle that our ontological commitments ought never to
overreach our epistemic ones. Naturally the issue of alignment
transcends the debate between the epistemic and the ontic structural
realists. Is it in principle impossible for there to be circumstances
under which we ought to subscribe to the misalignment of
epistemological and ontological commitments? What do the different
answers to this question entail for ontic structural realism?
(Invited talk presented at the Structuralism
Workshop, John J. Reilly Center for Science, Technology, and Values,
University of Notre Dame, November 17-20 2010).
Prospective Stance in Realism' - Scientific realists
endeavour to secure inferences from empirical success to approximate
truth by arguing that despite the demise of empirically successful
theories the parts of those theories responsible for their success do
in fact survive theory change. If, as some anti-realists have recently
suggested, those parts of theories that are responsible for their
success are only identifiable in retrospect, namely as those that have
survived, then the realist approach is trivialised for now success and
survival are guaranteed to coincide. The aim of this talk is to
counter this argument by identifying successful theory parts
independently from their survival. (Presented
at the Philosophy
of Science Association 2010 Biennial Conference, Montreal,
November 4-6 2010).
Logic of Crucial Experiments' - Although Duhem’s thesis that
in physics crucial experiments are impossible contains some grains of
truth, its effects have been greatly exaggerated. In this talk I
argue against this and other associated theses by pointing out the
various ways in which these theses can be curtailed. In the process of
doing so, I examine a few recent attempts to overcome the problems
posed by these theses and identify their strengths and weaknesses. (Presented
at the Philosophy
of Scientific Experimentation:
A Challenge to Philosophy of Science, University of Pittsburgh,
October 15-17 2010).
Realism: From an Epistemological Point of View' - Structural
realism is a rather popular view in philosophy of science. As with many
popular views, sprouting is never far behind. No sprout has had as much
grip on the view’s image as ontic structural realism. Indeed its
supporters have such a stranglehold that ‘structural realism’ has
almost become a byword for their views. In this talk, I want to redress
this imbalance by returning to structural realism’s humble epistemic
beginnings to examine exactly what made the view so attractive in the
first place. To this effect, I will reconstruct several arguments –
some of which little known – proposed in the early part of the
twentieth century in support of the epistemic version of structural
realism. Not wanting to dwell too much on the past, I will then switch
to more recent arguments both for and against the position. A careful
evaluation of these arguments will hopefully provide useful information
as to what form, if any, epistemic structural realism must take in
order to be a viable alternative to its direct competitors, namely
standard scientific realism and constructive empiricism. (Invited talk
at the Lunchtime
Colloquium, Center for Philosophy of Science, University of Pittsburgh,
September 28 2010).
27. 'Heat in
Inter-Theory Relations' - If the realists are right, not only
did certain theoretical parts of the caloric theory survive into our
modern conception of heat but these parts are in fact solely
responsible for the success the caloric theory enjoyed. I test this
claim against two of the caloric theory’s successes, namely the
explanations (i) that matter expands by heating and contracts by
cooling and (ii) that a special kind of heat (i.e. latent heat) is
involved in changes of state. Take (i) as an illustration. The caloric
explanation of this phenomenon has the same structure as the kinetic
one. As the quantity of heat – caloric in the one case, kinetic energy
in the other – is increased/decreased the force generated – repulsive
in the caloric case, pressure in the kinetic case – increases/decreases
and that in turn leads to an increase/decrease in the volume needed.
Thus the caloric explanation was successful because it had managed to
get the structure of such processes right, even though the specifics of
the ontology were wrong, i.e. the existence of caloric and its
repulsive force. This result tallies well with a special kind of
realism, namely structural realism. (Presented
at the British
Society for the Philosophy of Science Annual Conference, University
College Dublin, July 8-9 2010).
Representation and Perspective' - Critics of the semantic
view of theories have, among other things,
demurred that isomorphic specification is not sufficient for the
representation of at least some physical systems. The same physical
system will often, if not always, be amenable to representation via
different non-isomorphic models. Thus a construal of theories as sets
of structures does not seem sufficient to uniquely identify all target
systems. Structural realists face the same objection. Their endorsement
of the view that physical objects may only be specified up to
isomorphism means that they are as susceptible to this objection as
semantic theorists. In this talk I aim to rescue semantic theorists and
structural realists from this and other closely related objections by
endorsing a perspectivalist approach towards scientific representation.
(Invited talk at the
Research Colloquium, University of Bochum, June 17 2010).
about Scientific Understanding and Explanation as a Structural Realist'
- Structural Realism is a viewpoint in the scientific realism debate.
In its epistemological guise it holds
that our knowledge of the physical world is at best structural. More
precisely, we can only know the
physical world up to isomorphism. In its ontological guise it explains
this structural limitation to our
knowledge by appeal to an ontology which is itself in some sense or
other wholly structural.
Although research into structural realism is booming, little has been
said about what its implications
are for scientific understanding and explanation. In this talk I
explore these implications and argue
that at least when it comes to the natural sciences what counts as
understanding and explanation
has taken a highly abstract and mathematical turn that is very much in
line with the structural realist
pronouncements. (Presented at the Understanding
and the Aims of Science, Young Scholars Section, Lorentz
Center, University of Leiden, May 31 - June 4 2010).
Pessimistic Meta-Inductivist: A Sheep in Wolf’s Clothing?'
- Realists assert that when a successful theory is abandoned, not all
of its components are
discarded but only those that are inessential or idle for the theory’s
success. So long as the essential components survive into the new
theory there is
no cause for alarm. More precisely, an outdated theory T which enjoyed
some measure of success must, according to the realist, be: (i)
partially true precisely because some of its theoretical claims are
responsible for its success and (ii) superseded by a (strictly) more
approximately true theory T* which, of course, preserves T’s successful
theoretical claims. In this talk I test this requirement of realism
against the background of the outdated caloric theory of heat and its
successor the kinetic theory. (Presented at the Philosophy
of Science in a Forest (PSF2010), Dutch Association for the Philosophy
of Science, May 14-15 2010).
Double Life of Evidence: From the Streets to the Labs'
- An integral part of the schooling of scientists, especially
experimental ones, is the cultivation of the significance and role of
scientific evidence. Naturally this schooling is not conducted in
vacuo. Budding scientists already have experiences of, and intuitions
about, the use of evidence in everyday life. In this
talk I examine the extent to which everyday life evidential practices
are continuous with scientific ones. I begin by offering a tentative
formulation of the continuity hypothesis: Most, if not all, good (i.e.
practically successful) evidential practices in everyday life have
better performing or at least equally-well performing analogues in
science AND most, if not all, good evidential practices in science have
at best equally-well performing analogues in everyday life. I then
proceed to illustrate some cases of continuity, where good evidential
practices in science (e.g. calibration) have everyday life analogues.
(Part of a Symposium on Evidence I co-organised with Giora Hon,
Maarten van Dyck, Dave Lagnado and Jan Willem Romeijn for the European Philosophy
of Science Association Biennial Conference 2009, Free University of
Amsterdam, Oct 21-24 2009).
Realism: Invariance through Theory Change'
- Structural realists of nearly all stripes endorse the structural
continuity claim. Roughly, this is the idea that the structure of
successful scientific theories survives theory change because it has
latched on to the structure of the world. In this talk I elaborate,
elucidate and modify the structural continuity claim and its associated
argument. I do so without presupposing a particular conception of
structure that favours this or that kind of structural realism but
instead by concentrating on neutrally formulated historical facts. The
result, I hope, throws light on what a structural realist must do to
evidentially benefit from the historical record of science. (Presented
at the Congrès
triennal de la SOPHA 2009, University of Geneva, Sept 2-5 2009).
Caloric Under a Frame-Theoretic Spotlight'
- In this joint work with Gerhard Schurz we conduct a frame-theoretic
investigation of the respects in which the central concept of the
caloric theory of heat has survived into modern accounts of
thermodynamics despite the theory’s demise in the latter half of the
nineteenth century. We first present a brief account of the development of the
caloric theory as well as that of its competitor, the motion theory of
heat. We then compare the two theories’ explanatory and predictive
successes, paying particular attention to the role their central
concepts played in facilitating those successes. The comparison will be
performed to evaluate whether or not (i) some parts of the caloric
theory are in some sense approximately true and (ii) the term ‘caloric’
can be said to refer to a modern counterpart posit. Our conjecture is
that to the
extent that the caloric theory enjoyed genuine success, the structural
parts responsible for that success have been incorporated into the
kinetic theory of heat. (Presented
at the Second
Conference on Concept, Types and Frames in Language, Cognition and
Science, University of Duesseldorf,
Aug 24-26 2009).
Realism: Historical Continuity and its Limits' - According to the
epistemic variety of structural realism, at best we can have knowledge
of the structure of the world. Roughly speaking, on this claim the
structure of successful scientific theories survives through
scientific revolutions because it has latched onto the real structure
of the world. In other words, structure is preserved through theory
change because it is true or at least approximately true – henceforth
I express this disjunctive phrase as ‘(approximately) true’. Proponents
of the structural continuity claim often give tacit approval to the
converse claim, i.e. that the preservation of the structure of
successful scientific theories implies their (approximate) truth. In
this talk I aim to clarify, make corrective modifications to, and
extend the structural continuity claim and its associated argument.
(Presented at the 5th Pan-Hellenic Conference in the History,
Philosophy and Teaching of Natural Sciences, University of Cyprus,
June 11-14 2009).
Ruminations on Theoretical Term Reference' - In this talk I
examine the concepts of referential success and referential continuity
as they are used to assert or deny claims about theoretical term
reference. In particular, I examine the intuitions that motivate
different theoretical accounts of such concepts. In contrast to
existing approaches, I argue that even when such intuitions are
conflicting they play an evidential role in lending credence to
distinct referential concepts. What is more, I argue that some of these
concepts are useful in making sense of the historical record of science
and in evaluating scientific realist claims. (Invited talk presented at
the Seminar in Epistemology and Philosophy of Science, University of Tilburg,
March 3 2009).
Scope of Fiction:
Comments on Tim Button’s ‘Where Fiction Ends and Reality Begins’
' - Suppose you want to distance yourself from fiction, i.e.
suppose you want no commitment to the literal truth of a fictional
sentence φ. Suppose further that you want to be able to treat all sorts
of discourses as fiction, i.e. not just literary fiction but also
ethics, mathematics, science, parts thereof, etc. Tim Button considers
and rejects a number of fictionalist views that could be applicable to
any of these discourses, namely the paraphrastic approach, the extended
fiction approach, the pretence fiction approach and the spotty scope
approach. Although I agree with quite a few of the conclusions that
Button draws, I find some of his motivation and arguments problematic.
(Presented at a conference at Logos (Logic, Language and Cognition Research Group),
University of Barcelona, June 19-21 2008).
Theories: Up Close and Personal' - In this talk I extend my
critique of Bogen and Woodward's claim that we do not (and perhaps
cannot) use theories to infer, predict or explain observations. I do so
by demonstrating that paradigmatic cases of novel prediction could not
have been made unless the relationship between data and theories is
more direct than Bogen and Woodward would have us believe. (Presented
at the conference Data
- Phenomena - Theories: What's the notion of a scientific phenomenon
good for?, University of Heidelberg, September 11-13 2008).
Loss: A Dilemma' - In this talk, I present anti-realist
advocates of Kuhn loss with an unattractive dilemma: Either Kuhn loss
has historical instantiations but is innocuous to the epistemic
commitments of the scientific realist or it is a real threat to those
commitments but has no historical instantiations. (Presented at the
European Congress of Analytic Philosophy, Krakow, Aug 21-26 2008).
Empiricism' - In this talk, I put forth a broader conception
of observability that seeks to allay the realist’s concerns about
knowledge in natural science yet panders to vital empiricist
sensitivities. Along with the new conception of observability I propose
a new form of empiricism. Ecumenical empiricism, as I call it, divorces
itself from traditional conceptions of experience while remaining
wedded to the idea that reliable detection of our surroundings has
precedence over all other forms of knowledge. (Presented at the Joint Session
of the Aristotelian Society and Mind Association, University of Aberdeen,
July 11-14 2008).
Problem of Unconceived Alternatives?' - Kyle Stanford (2006)
puts forth a new challenge to scientific realism, the problem of
unconceived alternatives (PUA). He claims that it is a much more
powerful challenge than traditional arguments from underdetermination
because it is well supported by historical evidence. Contra Stanford, I
argue that the abundant evidence comes at great expense, for in order
to obtain it he turns PUA into an ineffectual challenge. (Presented at
the British Society for the Philosophy of Science, University of St. Andrews,
July 10-11 2008).
the Intuitions: Polylithic Reference' - Different theories of
reference aspire to satisfy conflicting intuitions. Assuming that
intuitions play a crucial role in pinning down the concept of
reference, two options become available: Either establish a consistent
set of intuitions by rejecting at least some of them or find a radical
way to accommodate all of them. The former option has been the primary
focus of research up to now. I will explore the latter option, arguing
that reference might not be a monolithic notion. With this aim in mind,
I sketch a hierarchy of concepts of reference, each of which satisfying
different intuitions and standards of successful reference. (Presented
at the Theoretical
Frameworks and Empirical Underdetermination Workshop, University of
Duesseldorf, April 10-12 2008).
12. 'Making Contact
with Observations' - Following Bogen and Woodward’s
influential ‘Saving the Phenomena’, many philosophers claim that
theories do not (and perhaps cannot) entail, predict or explain
observations. Utilising various case studies, I argue that observation
statements can often be derived straight from the theory because the
right auxiliaries are in place. (Presented at the First
Conference of the European Philosophy of Science Association,
Complutense University Madrid, November 15-17 2007).
11. '…Observation-Ladenness of Theory' - This talk contests the
purity of theories assumed in discussions of theory-ladenness, arguing
instead that theories and theoretical terms can be afflicted by
observation-ladenness. (Presented at the Joint
Session of the Aristotelian Society and Mind Association, University of
Bristol in July 2007).
10. '…Realism 2.0' - In this talk, I explore new sources of
support for Epistemic Structural Realism, as well as suggest various
adjustments, tackle certain threats, discuss neglected issues, and,
last but not least, try to put things in perspective. (Presented at the
Philosophy of Physics Research Seminar, University of Oxford on Nov. 9).
09. '…Continuity and its Limits' - This talk explores some of the
limits faced by structural realism in its claims of structural
continuity through scientific theory change. (Presented at the
Institute for the History and Foundations of Science (IHFS), Department
of Physics & Astronomy, Utrecht University in June).
08. '…the Same Things' - This talk motivates a positive answer to
the question of whether different people experience the same public
things. (Presented at the Erasmus Institute
for Philosophy and Economics, University of Rotterdam in May).
07. '…Evidence from Observation' - In this talk I contest the claim
that theories even when accompanied by suitable theoretical auxiliaries
cannot be directly tested via observations. (Presented at the
History and Philosophy of Science Seminar Series, University of Leeds
in March 2006).
06. '…Scientific Explanation, or How to Make the Realist Raft Float'
- This talk re-evaluates the role intuitions play in the notions of
scientific explanation and explanatory power. (Presented at the
Perspectives on Scientific Understanding, Free University of Amsterdam
in August 2005).
05. '…Equivalence' - This talk explores the limits and
consequences of the underdetermination and empirical equivalence
theses. (Part of it was presented at the
British Society for the Philosophy of Science Annual Conference,
University of Manchester in July 2005).
04. 'The Upward
Path to Structural Realism' - My aim here is threefold: (1)
to evaluate part of Psillos' offensive against the Russellian version of
epistemic structural realism (ESR), (2) to elaborate more fully what
Russellian ESR involves and (3) to suggest improvements where it is
indeed failing. (Presented at the Philosophy
of Science Association Nineteenth Biennial Conference, University of
Texas - Austin in November 2004).
03. '…Centre or Offstage' - In this talk, I criticise Psillos'
strategy against the pessimistic meta-induction and in particular his
conception of what makes theoretical terms (in)dispensable for their
respective theories. (Accepted for presentation at the 8th Summer
Symposium on the Philosophy of Chemistry and Biochemistry, University
of Durham in August 2004).
02. '…' - This talk sketches a correspondence principle that (a) accords
well with some central episodes in the history of science and (b) can
fend off accusations of triviality. (Accepted for presentation at the
British Society for the Philosophy of Science Annual Conference,
University of Kent in July 2004).
01. '…the History of Science Cannot Teach Us' - This talk
criticises the view that the preservation of a theoretical component is
a necessary and/or sufficient condition of its (approximate) truth.
(Presented at the 12th
International Congress of Logic, Methodology and Philosophy of Science,
Oviedo in August 2003).