Links: ‘Valuing the Humanities’ discussion, and Munk debate with Blair/Hitchens

In answer to a query made in a previous comment, I submit two links pertinent to the Munk Debate on religion. The motion advanced was: religion is a force for good in the world. Arguing for the motion was Tony Blair, former UK prime minister; arguing against was Christopher Hitchens. Here is the official Munk foundation webpage and here is a podcast from CBC with what I think is an editorial. The latter is open access and the former asks for a fee to download or stream the full debate.

Another link to put up here. Michael made a post last week concerning the ‘Valuing the Humanities’ panel discussion. The British Philosophical Association has uploaded a downloadable mp3 here with the full discussion.


Valuing the Humanities: a panel discussion

Yesterday I went to see a panel discussion at the London School of Economics under the auspices of the Forum for European Philosophy. The panel was fairly varied. Martin Rees is a man of a great many titles: Astronomer Royal, President of the Royal Society, Baron Rees of Ludlow, Master of Trinity College Cambridge and, of course, professor of astrophysics. An unusual choice was Richard Smith, a one-time editor of the British Medical Journal who, by his own testimony, holds a passionate appreciation of literature, philosophy and poetry. Also among the ‘humanist’ panellists was Prof. Martha Nussbaum, well known for her work on international social issues such as gender relations in India; her recent work seems to involve some familiarity with south-east Asia. Last, and by no means least, was Professor James Ladyman, a philosopher of science who has been known in the broadsheet media as a stern critic of the managerial style of academia since well before the announcement of the Browne Report.

Some interesting points were made, several of which supply the historical and cultural context so often forgotten by the knee-jerk, short-termist politics and journalism of the present day.

Democracy and the humanities

Nussbaum made the point that the humanities have a vital role in well-functioning democracies, and an important role even outside democratic countries. She gave the examples of China, Singapore and India, which have invested much in technical and vocational institutions providing professional qualifications. These countries, according to Nussbaum, have realised the worth of humanism and of the skills that come from studying the humanities, and have introduced such courses as a crucial component of vocational and professional training. These skills matter to the corporate world: they aid the understanding of other people and of how one might deal with different personalities or castes.

How to make an appeal

A case should be made for the intrinsic value of the humanities, but in the discourse of public reason it is important to appeal to a terminology and a set of factors that politicians and the public find persuasive, which means an instrumental form of reasoning. As such, many academics have to submit to the logic of capital in how funding is allocated. Conceding to these funding mechanisms, however, hinders certain key things, such as freedom of the pen, and cedes authority to the management-style administration of the university.

Professor Ladyman remarked on the absurdity of this managerial reasoning. In one meeting, he recalled, the university authorities proposed that academics had to do something to ‘invigorate’ the economy, or stressed how important philosophy is in fighting extremism, for instance in tackling movements like the intelligent design lobby. Ladyman pointed out the absurdity in this: while he, as an academic, is being asked how to tackle extremism, Bristol City Council has cut a scheme to teach English to Somali immigrants, one of the most vulnerable groups in the community. Ladyman made the important point that the agenda of ‘impact’ imposed by HEFCE as a criterion of ‘good research’ is absurd. The standards of excellent scholarship are internal to the subject of study, and the ‘impact’ of an intellectual development is hardly ever known immediately. Ladyman cited a great many examples:

  • G.H. Hardy, author of ‘A Course of Pure Mathematics’, once said that quantum mechanics and relativity had no practical relevance in his own early-twentieth-century context. Now think of superconductivity, or GPS technology, or the potential impacts of quantum computing. Impact isn’t the implicit reason why good research comes about; sometimes it is just about knowing more about a specific subject.
  • Bertrand Russell’s theory of descriptions and his work on logic and the foundations of mathematics became staples of later mathematical logic, and spurred on developments in artificial languages and, essentially, foundational issues in computer science.
  • Enlightenment philosophers’ talk of the fundamental political values of ‘life, liberty and the pursuit of happiness’ fed the intellectual founding of the United States, as well as liberal democracy across Europe. Consider the ‘impact’ of a certain Second Treatise of Government.

The case of the sciences

Lord Rees came from a completely different background to talk about the humanities, but, unusually for a panel discussion (especially one involving philosophers), there was widespread agreement about their value. Practically speaking, though, it took until the motion to increase tuition fees was passed before any reaction materialised.

Consider by contrast the ‘Science is Vital’ campaign, which launched well before the Browne Report came out. That campaign anticipated the threat of cuts, and many figures rallied to the cause, such as the Guardian’s ‘Bad Science’ columnist Ben Goldacre, one-time ‘Belle de Jour’ blogger Brooke Magnanti, and Richard Dawkins. There was no such co-ordinated effort for the humanities. An open question was asked: why is this? The suggestion from a few audience members was that the humanities are not a unified body. There are philosophers who define themselves as ‘analytic’ and deride ‘continental’ philosophers (I’m guilty of engaging in this line of thought), and social-theory-leaning intellectuals in the arts who are all talk and no action. The lack of action and of a co-ordinated political effort is a testament to the challenges the humanities face in creating a decent campaign.

It is up to the unusual suspects to speak up for the humanities: a certain Astronomer Royal, for instance, or medical scientists like Richard Smith who emphasise the importance of humanism in health. Smith spoke of how inappropriate it is to train people to be doctors from the age of 18, when they have little life experience or humanistic education. In the US, by comparison, medics must complete a liberal arts education before technical training.

Context over the short term

The issue of the government deficit is in a sense a distraction. The humanities were at risk long before the Lehman Brothers crash; they were under threat from within and without. The decision to increase tuition fees will be implemented in a few years, by which time there may or may not still be an economic crisis. The decision rests on short-termist reasoning, but its impact is long term. Nor is the blame exclusively the present coalition government’s: the state of the education system reflects challenges and decisions made by the Conservative and Labour governments of the past three decades. Students, aged 18 to 21, are the least likely to have a grasp of this historical context. It’s important to know of the past struggles and campaigns for higher education over recent decades.

The panel discussion ended with an amusing quip from the chair, Mark Lawson (famous for his broadcasting on Radio 4), who said: ‘I have to go and interview Ronnie Corbett now.’ An interesting juxtaposition with which to end a discussion titled ‘Valuing the Humanities’.


More on Popper

As I trawl through more of Popper’s ‘Logic of Scientific Discovery’, I’ve come to learn more about this work than its putative reputation suggests. As previously stated in my post on Popper, the work is far more than simply the ‘falsification thesis’ as normally construed. Falsification is realised in a number of innovative ways (such as the ‘dimensionality’ of scientific sentences described previously), and it is also important in considering revisability.


Popper comes forward with a notion of simplicity which goes something like this: the more general a claim is, and the more it can capture with a shorter expression, the better. This might sound obvious, but the contrasting position of conventionalism poses a situation where a thesis which begins as a generalised formulation may find itself contradicted and undermined, and in order for the body of theory to survive such empirical challenges it must introduce ‘auxiliary statements’ to restore consistency.

Comparing theories

If one is to compare one theoretical system with another, the factors for deciding between them concern which accounts for more truth. A simpler theory is better than one with too many auxiliary hypotheses because of its greater explanatory power: it uses less to explain just as much, or more. Auxiliary hypotheses are also increasingly suspect because the caveats they introduce exist precisely to avoid contradiction by the real world. That is not to say that a real-life theory may not need auxiliary hypotheses; after all, a physical theory needs to explain. These comparative factors apply merely between idealised cases.

Probability considerations

I must admit this is the part I least understand. Popper introduces a set of probabilistic concerns that would establish the credence of a theory. Some of the constraints are fairly non-controversial: a proposition should not conflict with other true statements, and a claim that contradicts a true one should have a lower probability. Probability is thus introduced as subject to logical constraints. In this section Popper addresses the work of Richard von Mises and Maynard Keynes on probability. My initial suspicion was that logical constraints on probability were more about what counts as a factor in discounting the likelihood of an event than a positive thesis on how to construct probabilities. I perhaps retract this initial thought in view of the two formal axioms of probability he introduces, which seem more logical than mathematical. I must make a note to connect why people hold so much disdain for logical approaches to probability (and for Carnap, a correlated philosopher on this issue) with the more contemporary treatments of probability in philosophy. Popper seems to introduce a distinction between ‘objective’ and ‘subjective’ probabilities, and seems to say that while objective approaches are better (as he construes them), subjective approaches can be useful as well in certain cases.
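The kind of logical constraint at issue can be sketched in modern notation. To be clear, this is my own gloss using the standard monotonicity property of probability over entailment, not Popper’s own axioms:

```latex
% If p entails q, then p can be no more probable than q:
%   p \vDash q \implies P(p) \le P(q).
% Now suppose p contradicts a true statement q, i.e. p \vDash \neg q. Then
\[
  p \vDash \neg q \;\Longrightarrow\; P(p) \,\le\, P(\neg q) \,=\, 1 - P(q),
\]
% so the better supported q is (the higher P(q)), the lower the probability
% we are logically permitted to assign to p.
```

This captures, in a small way, the intuition above that a claim contradicting a true statement should have a lower probability: the constraint is logical (it follows from entailment relations) rather than empirical.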

I need to read more on probability. I am quite confounded about probability in general, but I find it interesting that Popper introduces probability in an almost systematic way into his logic of science. Perhaps probability (or a calculus of belief credence) is an essential part of a system of science. We’ve come a long way from Kansas, Hume, on this issue.

A pile in the swamp

I’ve been reading Karl Popper’s ‘The Logic of Scientific Discovery’ very slowly over the past few months. I feel like I’ve made some headway, so I’ll make some notes. First I’ll consider how people normally think of Popper, and then how I’ve read him. I find a certain disconnection, even incompatibility, between the two.

The putative Popper

Popper is often understood as advocating a thesis of ‘falsification’. Falsificationism is often treated as a footnote to mention after talking about ‘verificationism’. Verificationism itself is a philosophical bastardry: verificationism (as ‘championed’ by Ayer) is the illegitimate son of what the Vienna Circle became associated with under the so-called label ‘Logical Positivism’. I think a lot of damage is done by upper-school and degree lecturers in ‘textbook’ depictions of so-called ‘Logical Positivism’, which is mischaracterised, firstly, as the thesis that the only meaningful statements are analytic propositions or empirically verifiable synthetic ones. This not only ignores the fact that the analytic/synthetic distinction is suspect, but also implicitly and falsely puts forward that it is strictly defined. Analyticity, and the classification of non-empirical synthetic statements, became a thorny issue toward the late twentieth century.

Characterising verificationism as coming from the Vienna group of philosophers undermines their complexity and the great achievements they made. There is a lot of good historical scholarship to work past this misunderstanding, and with the birth of movements such as experimental philosophy (advocated by the likes of Knobe et al.) and formal philosophy (epistemology, philosophy of action, probability) advocated by people such as Hendricks, we can see the true wealth of ideas and projects that the Vienna group really contained. In this light, Popper seems an afterthought.

Falsification is seen as a thesis which is in some sense a reversal of verification. In a very superficial way this makes sense. Verificationism is an empiricist thesis asserting that if a claim cannot be verified in some way then it is meaningless and not a proper object of (scientific) enquiry. Taken as a universal thesis beyond the domain of science, it renders metaphysics, ethics, religious language and the supernatural nonsensical domains of thought. Admittedly, however, the onus is then to explain ethics and religion despite the fact that they don’t refer to anything, which leads to views such as expressivism in meta-ethics, an interesting discussion in itself.

Falsification is almost seen as a fork to the knife of verification. Putatively, falsification is cast in verificationist terms: something is meaningless if it cannot be falsified. Verification has its own problems, so by making it a negative one could seemingly avoid the paradoxes and issues the former entails. Showing that a claim is so vague that it cannot possibly be falsified makes it meaningless because it does not pertain to anything relevant enough. While this explanation seems true enough, it does not really capture the very complex character of Popper’s logic of science. Popper is apparently a philosopher who is read by actual scientists and praised outside philosophy. I wonder, however, how much these non-philosophers really understand the so-called ‘falsificationist’ philosophy of Popper, at least as it comes from ‘The Logic of Scientific Discovery’. A further read beyond the simple falsification dictum reveals a deeply metaphysical and rationalist character more akin to Carnap and, dare I say, Kant’s philosophy of science [a blog named Noumenal Realm should expect every post to be on Kant].

Popper’s Falsification beyond ‘falsification’

Popper’s ‘Logic of Scientific Discovery’ (henceforth Logik) is a work that I must admit I hardly understand; there are historical issues pertaining to the scientists and philosophers Popper refers to that are themselves topics for wider discussion. It should be said, though, that his agenda sits in the wider context that defined a large part of the Viennese philosophers and the philosophically oriented scientists of the day (if there is a distinction between the two). Popper’s Logik pushes my grasp of formal logic to the limit, and a lot of his work involves formal niceties that deserve careful analysis.

Most people (rightly) point out that the initial step in Popper’s process is to distinguish between science and non-science. Demarcation is essential in characterising what is the proper object of analysis. Popper holds that stratifying claims in terms of higher and lower dimensionality, or extensions of reference, is a means to separate the specificity and generality of claims. This is a highly important part of his system, as it is a way of establishing falsification.

An example Popper gives of this ‘stratification’ is a set of claims about orbits. We can start with pre-Kepler observations about heavenly bodies going around the sun, and then make more specific (and ‘weaker’) claims about the nature of the orbit. When we make claims of greater specificity, we go a dimension higher than the initial vague terminology of pre-Keplerian or prescientific/folk observation, so that the claims become more refined. There is an epistemic weighting to this ordering system. In order to falsify a claim (x), the falsifying thesis (y) must pertain to the dimensionality appropriate to refer to (x) in this schematisation. The more specific a claim is, the more difficult it is to falsify; on the other hand, there are different levels at which a claim may be relevant. As a proposition gets ‘higher’ in the dimensionality so described, its domain of reference changes. Initially we may speak of how we see planets orbit, but as claims become more advanced we may talk about the nature of the orbit and, say, go to the level of geometry as we say that an orbit is elliptical rather than circular.

Perhaps in our contemporary context this is played out in how specialised areas of physics have become. Particle physics, for instance, has reached a domain where observation is difficult and speculation takes place at the level of mathematical entities. This is not to say that it is impossible to experiment or observe empirically, but doing so requires a far more sophisticated level of equipment and options. As a friend once said in a different context: science these days has gone far beyond boiling one’s own piss.

Popper intricately describes the formal relations between the higher and lower levels: a lower level can disprove a higher level (e.g. the specific claim of an elliptical orbit falsifies the proposal of a ‘circular’ or ‘round’ orbit, a general or vague description), and it takes a proposition of the same level or higher specificity to disprove a claim. Perhaps the story of physics from the late nineteenth century to the twentieth is a story of how a higher-level explanation trumps the past theory. Often this takes place through an experiment that shows the limits of how far a particular dimensionality can explain, or exposes a level where it falls apart.

It should be said that this notion of ‘dimensionality’, where propositions are levelled into a hierarchy of specificity, is much like Kant’s systematicity proposal, where ‘concepts’ are diagnosed in terms of propositions of greater and lesser generality. There are differences, however. I am not entirely clear whether the dimensionality in Popper’s thesis marks a distinction between ‘better’ and ‘worse’ explanations (where ‘general’ means undefined, vague or broad reference, as with a ‘round orbit’ as opposed to a precisely specified ellipse), or whether there is a ‘final theory’ in which propositions range from observations, which are specific instances, to higher, generalised claims which account for a family of propositions. As propositions become more generalised they also become more formal and mathematical. So in Kant’s dimensionality we presume a ‘final theory’ (to use Hawking’s terminology) where specificity refers to empirical instances and generality refers to the mathematical/formal world. In Popper’s dimensionality, generality refers to the empirical level, which is correct or incorrect in varying degrees, while specificity accounts for the increasingly formal nature of a theoretical proposition, as well as the ground on which it can be falsified and the probability of its truth.

Something should be said about probability at this point. I found probability a hard topic to understand, partly because I’m aware Carnap has a hand in Popper’s thought here. Popper advocates, as did Carnap, a notion of logical probability, where we allocate a probability in terms of a credence of belief. The calculus by which such a probability is constructed I’ve not yet reached in the book (I’m not sure if it will be addressed, or if I’d understand it). Many people have shied away from the notion of logical probability for various reasons; seeming too close to logicism would be a start, and a discussion of the foundations of mathematics would be another way into the issue. Those issues are beyond my comprehension, but I will say that Popper addresses logical probability in terms of falsification and the dimensionality thesis. The degree of probability depends on the degree of falsification potential of a thesis; in addition, the dimensionality of a proposition also bears on its logical probability. In this way Popper has a certain richness that compares with Kant’s philosophy of science.

Introducing logical probability makes Popper’s notion of dimensionality (or what I might call systematicity, in a sense) pertain to how we decide on a theory’s rightness or wrongness, how we distinguish between science and non-science, and how we determine the falsification potential of a thesis. For Kant, dimensionality does not include logical probability but rather the ‘as if’ presumption of a final theory. Popper has no such pretension. The idealism of a final theory is truly gone in the age of Popper, though presuming a final theory is not the same as asserting that ours is the final theory. Comparing Kant and Popper has much potential in this regard.


a news digest of yesterday

Dear Readers,

I thought that I’d share two stories, and one blog post. The blog post is from the lovely Chris Bateman of ‘Only a Game’ and ‘International Hobo’ (a nickname apt for Michael were it not already taken). Bateman, a regular commentator and probably the only non-bot reader of this blog, has written a post on the introduction of new gadgets to extend the longevity of the present generation of games consoles: the Nintendo Wii, Xbox 360 and Playstation 3. We were originally planning a post on the consumer nature of console gaming and its extended longevity. After reading his post, we’ve decided to stay silent on technology and the role of the consumer, as his treatment is far more eloquent.

Two other stories I thought worth commenting on. Johnny Marr, one-time member of The Smiths, has said that British PM Cameron is ‘not allowed’ to be a fan of the band. What an utterly pretentious move! But it is also one which strives to preserve the sense of authenticity a band once exuded, before it is utterly chewed up and consumed by (guess who?) the consumer into their own semiological set of meanings.

Similarly, there’s a story about the Manic Street Preachers being tipped to play on that pinnacle of demagoguery, the BBC’s pseudo-‘reality’ television show Strictly Come Dancing. These kinds of stories are the basis from which some theorising can be done on the notion of authenticity. Authenticity, an issue mentioned in a previous post, seems to take on so many meanings and foci. Is it the fan who confers authenticity? How does one maintain authenticity against the threat of ‘selling out’? Are commercial success and popularity a ‘critical’ or ‘aesthetic’ flaw? If so, what absurd hypocrisies do fans hold about their favourite band? This confuses me, as I increasingly consider the notion of ‘ironic distance’.


P.S. Many congratulations to Bateman and his partner for the oncoming ‘family level up’ mentioned in his recent post. As probably the only non-bot reader, we all wish him well.

Wikileaks as the hostile ‘other’

I have resisted writing a post on the Wikileaks phenomenon for quite some time, partly because I’ve not made up my mind whether they are liberators, a stronger social critic through the evidence they release alone than any ‘theorist’ or countercultural stirrer, or, as the official dominant discourses say, a threat to international security on a variety of fronts. I’ll leave that topic for another day, and more evaluation.

One thing that can be said is that the latest leak of the diplomatic cables, together with the promise of even more data from banks and energy companies that would change our perspective on world affairs permanently, makes for a certainly interesting and unique situation. It is striking how virtually all nations (except noble Ecuador) have called for the proverbial and literal head of Wikileaks’ head, Julian Assange. If one is to buy the mainstream media story, Wikileaks is some universal threat which would, in an ironic way (ironic in that the documents reveal many diplomatic tensions), unite everyone against a common enemy. I am reminded of two insights, one philosophical and one literary.

1. The case of aliens. The last chapter of Paul Churchland’s ‘Matter and Consciousness’ concerns somewhat eccentric or ‘new’ philosophical issues about consciousness and the mind. If we are introduced to a consciousness which is somehow entirely unlike us (e.g. an artificial, nonhuman or ‘post-human’ life form), all differences between human beings are diminished, as the ‘other’ which is largely unlike us highlights the similarities human beings share. That can be a good thing, but it can also undermine the subtleties of difference that make individuals unique in a positive way. I am interested in how this ‘other’ of Wikileaks will fare as it emerges as a political actor in the global world. In a sense it is like terrorism, or multi-nation coalitions, in that it is a non-geographical actor.

2. I am also reminded of the character Adrian Veidt from the Watchmen comic, whose strategy (spoiler alert) is to dissolve the impending doom of the cold war by posing the threat of an ‘other’ for the world’s nations to unify against, and in so doing pursue a course for peace. Veidt’s notion of heroism was of a dark, almost Pax Romana kind: in order to save the world, he must create something so big that everyone feels threatened enough to forget their disputes.

This looks like an interesting turn in historical events. I just hope Assange doesn’t consider himself an Ozymandias figure.