Popper: two philosophical analogies (Part I: Kant, Popper and Certainty)


Continuing my series of posts on Popper’s ‘Logic of Scientific Discovery’, I thought some reflections were due. I’ve technically finished the monograph, but I then realised that I have another 200 pages of appendices and other such suffix-type notes that Popper wished to add to what he seemingly perceived to be his masterpiece. In them Popper sought to reflect further on his system of science, in some ways to hold the fort on certain issues, and even to think differently on some topics (though not so far as to change his mind significantly).

Throughout the book, Popper elaborates his views through elongated footnotes; such footnotes are pages long at times, and are comparable to the kind of footnotes that the theologian Karl Barth left in his works. It almost looks sloppy at first sight, but addressing tangential issues is important not only for clarifying Popper’s views for scholarly or exegetical purposes, but also for anticipating his critics (or perhaps, it may seem, responding to them). There is an interesting footnote, for instance, pertaining to a discussion of truth, where he starts with something like ‘since this publication, my friend Alfred Tarski has informed me of his work on truth…’. With sentences like that, I am exceptionally frightened by the more highly technical aspects of this work.

I’d like to offer some remarks on the conclusion of the monograph, as well as on one of the appendices, which consists of a piece on Popper’s thought on falsification in relation to verification theory. I will frame my considerations by way of analogies: between Popper’s thought and Kant’s, and between Popper and Einstein.

Popper and Kant: Fallibility and Apodictic certainties

When we speak of ‘knowledge’, we can mean a whole range of things. I take it for granted that the English word ‘knowledge’ in the philosophical tradition relates to the historical correlate terms ‘episteme’ and ‘Erkenntnis’. Epistemology, in the traditions of Hume or Kant, usually pertains to a specific kind of thing: facts, propositions, and usually unchanging things. It is not the facts that change; it is whether we are right or wrong about them. This leaves aside the important question of what other things we would normally consider knowable are excluded: for instance, social knowledge, how facts are mediated by hierarchies, or introspective notions of self-knowledge (how is loving someone a form of knowing?). These are valid issues, but they belong to a context some time after Popper.

Scientific knowledge and knowledge simpliciter

One question pertinent to the Vienna philosophers, or perhaps even to Early Modern philosophy, is whether epistemology pertains to knowledge simpliciter. If we talk about knowledge simpliciter, we include everything that we intuitively construe as knowing. So ‘Lois Lane loves Clark Kent’, or Batman’s moral conviction for vigilante justice, count as forms of knowledge. It is more than plausible to suggest that when the modern philosophers, and especially the Vienna philosophers, were doing epistemology, they were not so much thinking of knowledge simpliciter, but of science as the paradigm case of what is knowable.

I pose an analogy with Kant and Popper because the latter takes this seemingly for granted. Epistemology in the ‘Logik’ is scientific knowledge. Popper’s system is a system of science; if Popper were thinking about wider forms of knowledge, he’d probably want a different account than applying his mechanics of falsification and probability axioms to propositions or thoughts such as ‘I’m hungry’. Kant, on the other hand, seems to consider knowledge simpliciter. In Kant’s own period, his epistemology resembles something of what we might now consider (albeit anachronistically) a cognitive science, or foundations of cognition (without the neuroscience). Kant’s philosophy was part of a system, yes, but a system for understanding the human in a fundamentally holistic way. For Kant, epistemology was an important part of the way a being can know the world, but this cognitive process in his transcendental Idealism also fed into his moral and ‘aesthetic’ theory. Popper has no such ambitions. In this way Popper and Kant are disanalogous.

To constantly make comparisons between Kant and Popper (which I have done) would seem to do an injustice to Kant’s larger systematic concerns. However, an analogy can be made on a certain reading of Kant. If we consider Kant as having in mind (as well as ‘epistemology as knowledge simpliciter’) natural science as the paradigm case of knowledge, or of what good agents should accept as knowledge without reasonable doubts (as compared to Cartesian or Humean doubt), then a good interpretative case can be made to frame Kant as a philosopher concerned with scientific epistemology. There are a few distinct reasons to adopt this view:

  • Kant’s explicit and implicit references to the success of the Newtonian ‘philosophy’. Kant admires its success and sees it as something his own philosophy should aspire to. Newton’s philosophy is a mix of rationalism, in his use of formalisations and mathematical generalisations of reality which are not perceptually derived, and empiricism, in that these generalisations pertain to the empirical and are scrutinised by the empirical.
  • An extra note: despite Kant’s interest in the success of Newton’s natural philosophy, he registers a certain disagreement with Newtonian method as he sees it. This is the primary concern of the ‘Metaphysical Foundations of Natural Science’.
  • Kant’s familiarity with Lavoisier’s emerging theory of oxygen against phlogiston is influential in the sections which relate to the ‘systematicity’ thesis, or in more Kantian terms, what he construes as the positive role of reason, which is described towards the end of the First Critique.
  • Kant’s systematicity thesis is also described in the ‘Metaphysical Foundations of Natural Science’. The fact that Kant would expand his First Critique to wider concerns about natural science strongly suggests that Kant himself valued a connection between his conception of epistemology and ‘natural science as knowledge’.
  • Let’s say we don’t accept that it is the historical Kant’s view that epistemology should take scientific knowledge as its paradigm case: there is still an historical connection between Kant’s writings on natural science (in the Critical period) and the later work of the so-called ‘Neo-Kantian’ movement of the 19thC, such as Hermann Cohen, Ernst Cassirer, and even Rudolf Carnap and Hans Reichenbach (despite the fact that both Carnap and Reichenbach made much effort to distance themselves from the more ‘metaphysical’ reputation of Kant).
  • Popper was reacting to a ‘form’ of the Neo-Kantian influence of Reichenbach and Carnap, and in doing so shows their influence on him. Popper himself admits that the falsification and demarcation issue arises not from Hume (as verificationists claim of Hume), but more from Kant – though this refers to different passages of Kant than I am concerned with in this post.

(but I digress)

At the end of the Logik, Popper makes a case for a distinct way of looking at scientific knowledge. Forget truth; science is about degrees and corroboration. Science is a matter of probabilistic models telling us whether a claim has more credence or less. Popper fleshes out this account by starting off with his initial conditions of demarcation and falsification, and then introduces a model of probability where the more instances or regularities of a formalised phenomenon we find, the greater our credence in it. Truth and falsity are out of the picture. Science does not look like the Enlightenment certainties of old, but then again maybe it never did.

Many of the modern philosophers believed in truth as a bivalent affair: something was either true or false. Further, scientific knowledge was often linked to necessary knowledge, and special kinds of scientific knowledge (such as mathematics and theoretical physics) were of the highest certainty; once we get them correct, they relate to the a priori necessary truths of reality. Popper will have none of this, however, and this is where perhaps Kant and Popper part ways permanently. For Kant, apodictic certainties were an important part of true knowledge; Kant’s icon of a good knowledge claim would have been how Lavoisier’s oxygen trumped phlogiston, or the success of Newtonian science. Ironically, it is in Popper (and, as we shall later see, in Einstein) that the limit of the applicability of scientific notions is the exact reason why falsification should be adhered to. In short, for Kant, certainty was the paradigm bliss of knowledge, with Newton’s physics as an example of it. Popper, by contrast, wrote for a post-Newtonian world where it was expected that our best theories today would probably not be our best theories tomorrow; Popper even goes into some detail about his ‘reservations’ over how Quantum Mechanics is interpreted.

I’ve often tried to argue that any good theory of knowledge (that takes scientific knowledge as its paradigm case) must take into account the fact of theory change. I’ve often read Kant’s notion of systematicity, together with what he calls ‘reflective judgment’ in the Third Critique, as an openness to accept that our fundamental organisation of a priori concepts can and will change. But to also accept the world of science as a series of certainties puts this a priori openness into disrepute. People such as Reichenbach have often criticised Kant on the grounds that his adherence to the ‘certainty’ of Newton’s physics as a basis for the transcendental aesthetic (the claim that space and time are assumed before experience) commits him to metaphysical claims about space that are no longer applicable (namely, Euclidean geometry, which Newton himself presumes). Kant’s systematicity thesis has a lot to arm itself with, but Popper shows that an 18thC theory finds problems holding to certainty in a 20thC age. Kant was a rationalist philosopher in the age of Newton. Post-Newton, Popper was a rationalist in the age of Einstein.

(Next post: Einstein and Popper)

Lies, Damned lies, and…Karl Popper’s logic of science?

Continuing my series of posts following my reading of Popper’s ‘Logic of Scientific Discovery’, I think I have just finished the more difficult part of the book. Popper wrote a large section on his unique theory of probability; many of its nuances, I have to admit, are lost on me owing to my lack of background reading. I think a fair few things are important to say:

1. There has been a large consensus among many philosophers I’ve known that a logical theory of probability is lacking in various ways compared to more conventional mathematical treatments of statistical functions; interestingly, among the mathematicians I’ve known this conviction is not held. For this reason I might consider that advocates of this probability approach, at least among Popper’s contemporaries, were not in the majority. How such an approach would hold up now, however, would require a much more detailed answer. Formal approaches are the fashion in many areas of contemporary philosophy.

2. Popper should be read in context with, as points of comparison and contrast, the probability accounts of Carnap, von Mises and Keynes (as in J.M. Keynes, the legend of economics). Each of these thinkers had a particular aim for his thought on probability. Carnap integrates probability in a logic-of-science approach, while the wider contexts of von Mises and Keynes are as theoreticians and practitioners of an applied numerical science: physics in the one case and economics in the other. Construing probability in such a wide light, and before an audience concerned with philosophical methodology, shows the real eclecticism and interdisciplinarity of the time.

3. Popper moves away from talk about truth. Starting off the book with a discussion concerning the limits of science, namely through falsification and demarcation, Popper then moves on to try and consider how to make positive claims of science. Often a reply to a discussion of falsification is a claim to the effect of: if our method concerns what shouldn’t be admissible as a scientific claim, what can we say is an admissible claim?

Here is where the probability account comes in. Instead of a distinct bivalent set of values for whether a claim is true or false, we are led to a notion of degrees of credence in probability. Perhaps we can never talk of appropriately ‘verifying’ a theory, but we can talk in terms of falsification and a positive notion of what he calls ‘corroboration’. A claim is suitably corroborated in terms of its instances and a calculus that applies a certain set of purely statistical assumptions. The game of science, then, is moved away from talk of truth to talk of corroborated estimates of what we may deem to be factual. It is this notion of corroboration which answers the negation of verification.
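The flavour of such a calculus can be sketched. In the appendices Popper proposes a formal measure of corroboration; the snippet below is a rough rendering in its spirit (the function name and the toy probability values are my own, and the formula should be treated as an approximation of Popper's proposal rather than a faithful transcription):

```python
def corroboration(p_e_given_h: float, p_e: float, p_h_and_e: float) -> float:
    """A corroboration-style measure in the spirit of Popper's appendices:

        C(h, e) = (p(e|h) - p(e)) / (p(e|h) - p(h & e) + p(e))

    It is positive when evidence e is more likely under hypothesis h than
    on its own, zero when h and e are probabilistically independent, and
    negative when e tells against h.
    """
    return (p_e_given_h - p_e) / (p_e_given_h - p_h_and_e + p_e)

# Evidence that h makes likely corroborates h...
print(corroboration(0.9, 0.5, 0.36))   # positive
# ...independent evidence is neutral...
print(corroboration(0.5, 0.5, 0.2))    # 0.0
# ...and evidence that h forbids counts maximally against it.
print(corroboration(0.0, 0.5, 0.0))    # -1.0
```

The point of such a measure, unlike a straight probability, is that it rewards a hypothesis for surviving tests it might easily have failed, rather than for being a priori likely.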

4. Contemporary science, it should be said, relies on a great deal of statistical work. Everything from sociology and economics to chemistry and climate science involves the gathering of statistics to establish predictions and models. Any good theory of science worth its salt needs to acknowledge the common contemporaneous practice of science, and the 20thC turn to statistics in the methodological literature is very much sensitive to this. I would go further and say that it is a desideratum of any such theory to acknowledge that the practice of science now uses these machinations.

5. One thought I advance: to what extent are the standards of rigour for corroboration internal to a discourse and its practitioners, and to what extent are they sufficiently generic to account for all discourses? I suspect that the notion of statistical accuracy and range involves a lot of pragmatism, depending on what is being measured or predicted. This discussion is partly addressed when Popper relates his notion of probability to the (then emerging) Quantum Mechanics.


A pile in the swamp

I’ve been reading Karl Popper’s ‘The Logic of Scientific Discovery’ very slowly over the past few months. I feel like I’ve made some headway, so I’ll make some notes. First I’ll consider how people normally think of Popper, and then how I’ve read him. I find a certain disconnection, or incompatibility, between the two.

The putative Popper

Popper is often understood as advocating a thesis of ‘falsification’. Falsificationism is often seen as a tack to mention after talking about ‘Verificationism’. Verificationism itself is a philosophical bastardry: Verificationism (as ‘championed’ by Ayer) is the illegitimate son of what the Vienna Circle became associated with under the so-called label ‘Logical Positivism’. I think a lot of damage is done by upper-school and degree lecturers in ‘textbook’ depictions of so-called ‘Logical Positivism’, which mischaracterise it firstly as the thesis that the only meaningful statements are analytic or synthetic propositions. This not only ignores the fact that the analytic/synthetic distinction is suspect, but also implicitly and falsely puts forward that it is strictly defined. Analyticity, and the classification of non-empirical synthetic statements, became a thorny issue towards the late 20thC.

Characterising Verificationism as coming from the Vienna group of philosophers undermines their complexity and the great achievements they made. There is a lot of good historical scholarship working past this misunderstanding, and with the birth of movements such as Experimental Philosophy (advocated by the likes of Knobe et al.) or Formal Philosophy (epistemology, philosophy of action, probability), advocated by people such as Hendricks, we can see the true wealth of ideas and projects that the Vienna group really offers. In this light, Popper seems an afterthought.

Falsification is seen as a thesis which is in some way a reversal of verification. In a very superficial way this makes sense. Verificationism is an empiricist thesis which asserts that if a claim cannot be verified in some way, then it is meaningless and not a proper object of (scientific) enquiry. Taken as a universal thesis beyond the domain of science, it renders metaphysics, ethics, religious language and the supernatural nonsensical domains of thought. Admittedly, however, the onus is then to explain ethics and religion despite the fact that they don’t refer to anything, which leads to views such as expressivism in meta-ethics – an interesting discussion in itself.

Falsification is almost seen as a fork to the knife of verification. Putatively, falsification is seen in verificationist terms: something is meaningless if it cannot be falsified. Verification has its own problems, so if one turns it into a negative criterion, one could seemingly avoid the paradoxes and issues that the former entails. Showing that a claim is so vague that it cannot possibly be falsified makes it meaningless, because it does not pertain to anything relevant enough. While this explanation seems true enough, it does not really go into the very complex character of Popper’s ‘Logic of Science’. Popper is apparently a philosopher who is read by actual scientists and is praised outside of philosophy. I wonder, however, how much these non-philosophers really understand the so-called ‘falsificationist’ philosophy of Popper, at least as it comes from ‘The Logic of Scientific Discovery’. A further read beyond the simple falsification dictum shows a deeply metaphysical and rationalist character more akin to Carnap and, dare I even say, Kant’s philosophy of science [a blog named Noumenal Realm should expect every post to be on Kant].

Popper’s Falsification beyond ‘falsification’

Popper’s ‘Logic of Scientific Discovery’ (henceforth Logik) is a work that I must admit I hardly understand; there are historical issues pertaining to the scientists and philosophers that Popper refers to that are in themselves topics for wider discussion. It should be said, though, that his agenda sits in a wider context that defined a large part of the Viennese philosophers and the philosophically oriented scientists of the day (if there is a distinction between the two). Popper’s Logik pushes my grasp of formal logic to the limit, and a lot of his work involves formal niceties that deserve good analysis.

Most people (rightly) point out that the initial step in Popper’s process is to distinguish between science and non-science. Demarcation is essential in characterising what is the proper object of analysis. Popper holds that stratifying claims in terms of higher and lower dimensionality, or extensions of reference, is a means to separate the specificity and generality of claims. This is a highly important part of his system, as it is a way of establishing falsification.

An example Popper gives of this ‘stratification’ is a set of claims about orbits. We can start off with pre-Kepler observations about heavenly bodies going around the sun, and then make more specific (and ‘weaker’) claims about the nature of the orbit. When we make claims of greater specificity, we go a dimension higher than the initial vague terminologies of pre-Kepler or prescientific/folk observations, so that they become more refined. There is an epistemic weighting to this ordering system. In order to falsify a claim (x), the falsifying thesis (y) must pertain to the dimensionality appropriate to refer to (x) in this aforementioned schematisation. The more specific a claim is, the more difficult it is to falsify; on the other hand, there are different levels at which a claim may be relevant. As a proposition gets ‘higher’ in the dimensionality so described, its domain of reference changes. Initially we may speak of how we see planets orbit, but as claims become more advanced, we may talk about the nature of the orbit and, say, go into the level of the geometry of conic sections as we say that an orbit is elliptical rather than circular.

Perhaps in our contemporary context this is played out in how specified areas of physics have become. The likes of particle physics, for instance, has gone to a domain where observation is difficult and speculation takes place on a level of mathematical entities. This is not to say that it is not possible to experiment or observe empirically, but it takes a much more sophisticated level in terms of the equipment and options. As a friend once said in a different context: science these days has gone far beyond boiling one’s own piss.

Popper intricately describes the formal relations of the higher and lower levels. A lower level can disprove a higher level (e.g. an elliptical orbit (specific) falsifies a proposed ‘circular’ or ‘round’ orbit (general/vague description)). Likewise, it takes a proposition of the same level or of higher specificity to disprove a claim. Perhaps the story of physics from the late 19thC to the 20thC is a story of how a higher-level explanation trumps the past theory. Often this may take place by way of an experiment that shows the limits of how far a particular dimensionality can explain, or shows a level where it falls apart.

It should be said that this notion of ‘dimensionality’, where propositions are levelled into a hierarchy of specificity, is much like Kant’s systematicity proposal, where ‘concepts’ are diagnosed in terms of propositions of greater and lesser generality. There are differences, however. I am not entirely clear whether the dimensionality in Popper’s thesis is a distinction between ‘better’ and ‘worse’ explanations (where general means an undefined, vague or broad reference to concepts like a ‘round orbit’, as opposed to a precisely specified ellipse), or whether there is a ‘final theory’ where propositions range from observations, which are specific instances, to higher, generalised claims which account for a family of propositions. As propositions become more generalised they also become more formal and mathematical. So, in Kant’s dimensionality we presume a ‘final theory’ (to use Hawking’s terminology) where specificity refers to empirical instances and generality refers to the mathematical/formal world. In Popper’s dimensionality, generality refers to the empirical level, which is correct or incorrect in varying degrees. Specificity accounts for the increasingly formal nature of a theoretical proposition, as well as the ground on which it can be falsified and the probability of its truth.

Something should be said about probability at this point. I found probability a hard topic to understand, partly because I’m aware Carnap has a hand in Popper’s thought here. Popper advocates, as did Carnap, a notion of logical probability, where we allocate a probability in terms of a credence of belief. The calculi by which such a probability is constructed I’ve not quite yet reached in the book (I’m not sure if it will be addressed, or if I’d understand it). Many people have shied away from the notion of logical probability for various reasons: seeming too close to Logicism would be a start, and a discussion on the foundations of mathematics would be another way into this issue. Those issues are beyond my comprehension, but I will say that Popper addresses logical probability in terms of falsification and the dimensionality thesis. The degree of probability depends on the degree of falsification potential of a thesis; in addition, the dimensionality of a proposition also bears on its logical probability. In this way Popper has a certain richness to compare with Kant’s philosophy of science.
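The inverse relation between falsifiability and logical probability can be illustrated with a toy model (the construction below is mine, not Popper's own formalism): treat a hypothesis as the set of outcomes it permits out of a finite outcome space, so that the outcomes it forbids are its potential falsifiers.

```python
from fractions import Fraction

OUTCOMES = frozenset(range(12))  # a 12-point toy outcome space

def logical_probability(permitted: set) -> Fraction:
    # The share of the outcome space the hypothesis permits.
    return Fraction(len(permitted), len(OUTCOMES))

def potential_falsifiers(permitted: set) -> frozenset:
    # Everything the hypothesis forbids could refute it.
    return OUTCOMES - permitted

vague = set(range(9))    # "the orbit is round-ish": permits most outcomes
precise = set(range(2))  # "the orbit is this exact ellipse": permits few

# The more specific claim forbids more, so it is easier to falsify
# and has lower logical probability: the inverse relation Popper draws
# between content and probability.
assert logical_probability(precise) < logical_probability(vague)
assert len(potential_falsifiers(precise)) > len(potential_falsifiers(vague))
```

On this picture a highly falsifiable theory is, before testing, highly improbable, which is why Popper can value bold, specific claims over safe, vague ones.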

Introducing logical probability makes Popper’s notion of dimensionality (or what I might call systematicity in a sense) pertain to how we decide on a theory’s rightness or wrongness, how we distinguish between science and non-science, and how we determine the falsification potential of a thesis. For Kant, dimensionality does not include logical probability but the ‘as if’ presumption of a final theory. Popper has no such pretension. The idealism of a final theory is truly gone in the age of Popper, but that is not to say that presuming a final theory is to assert that ours is the final theory. Comparing Kant and Popper has much potential in this regard.