David Hilbert on Unification

At the end of David Hilbert’s ‘Mathematical Problems’, Hilbert details his motivations for what we may call a unity of science thesis. These reasons are, in my view, as poignant today as they were in his own time. His motivations could be summarised thusly:

 

  • The fracturing of mathematics into subdisciplines will mean that specialised areas no longer engage with areas outside their specialism

  • The most important innovations are driven by simplicity, more refined tools and less complication.

 

The first thesis concerns the problem of overspecialisation and the genrefication of any kind of academic research: becoming so niche that one is essentially writing for a peer group that is too specific and too few. Perhaps this is inevitable in a world of industrial research and constant innovation. But if we are to believe that subdisciplines and specialisation are a necessity, then we cannot make sense of Hilbert’s second thesis, that of parsimony. Granted, more would need to be elaborated if such a unification thesis were to work. Unification has its own problems, but there is a bonus to clarity, and it is a matter of fact that many great scientific innovations are of the sort that unify and simplify seemingly unrelated areas (Maxwell’s equations or relativity, for example).

 The conclusion of Hilbert’s lecture is as follows:

The problems mentioned are merely samples of problems, yet they will suffice to show how rich, how manifold and how extensive the mathematical science of today is, and the question is urged upon us whether mathematics is doomed to the fate of those other sciences that have split up into separate branches, whose representatives scarcely understand one another and whose connection becomes ever more loose. I do not believe this nor wish it. Mathematical science is in my opinion an indivisible whole, an organism whose vitality is conditioned upon the connection of its parts. For with all the variety of mathematical knowledge, we are still clearly conscious of the similarity of the logical devices, the relationship of the ideas in mathematics as a whole and the numerous analogies in its different departments. We also notice that, the farther a mathematical theory is developed, the more harmoniously and uniformly does its construction proceed, and unsuspected relations are disclosed between hitherto separate branches of the science. So it happens that, with the extension of mathematics, its organic character is not lost but only manifests itself the more clearly.

But, we ask, with the extension of mathematical knowledge will it not finally become impossible for the single investigator to embrace all departments of this knowledge? In answer let me point out how thoroughly it is ingrained in mathematical science that every real advance goes hand in hand with the invention of sharper tools and simpler methods which at the same time assist in understanding earlier theories and cast aside older more complicated developments. It is therefore possible for the individual investigator, when he makes these sharper tools and simpler methods his own, to find his way more easily in the various branches of mathematics than is possible in any other science.

The organic unity of mathematics is inherent in the nature of this science, for mathematics is the foundation of all exact knowledge of natural phenomena. That it may completely fulfil this high mission, may the new century bring it gifted masters and many zealous and enthusiastic disciples! [David Hilbert, 1900]

 



Popper: two philosophical analogies (Part I: Kant, Popper and Certainty)

Introduction

Continuing my series of posts on Popper’s ‘Logic of Scientific Discovery’, I thought some reflections were due. I’ve technically finished the monograph, but I then realised that I have another 200 pages of appendices and other such suffix-type notes that Popper wished to add to what he seemingly perceived to be his masterpiece. Popper sought to reflect further on his system of science, and in some ways to hold the fort on certain issues, or even to think differently on some topics (but not so far as to change his mind significantly).

Throughout the book, Popper elaborates his views through elongated footnotes; such footnotes are pages long at times, and are comparable to the kind of footnotes that the theologian Karl Barth left in his works. It almost looks sloppy at first sight, but addressing tangential issues is important not only for clarifying Popper’s views for scholarly or exegetical purposes, but also for anticipating his critics (or perhaps, it may seem, responding to them). There is an interesting footnote, for instance, pertaining to a discussion of truth, where he starts with something like ‘since this publication, my friend Alfred Tarski has informed me of his work on truth…’. With sentences like that, I am exceptionally frightened by the more highly technical aspects of this work.

I’d like to offer some remarks on the conclusion of the monograph, as well as on one of the appendices, which consists of a piece on Popper’s thought about falsification in relation to verification theory. I will frame my considerations by way of analogies: between Popper’s thought and Kant’s, and between Popper and Einstein.

Popper and Kant: Fallibility and Apodictic certainties

When we speak of ‘Knowledge’, we can mean a whole range of things. I take it for granted that the English word ‘knowledge’ in the philosophical tradition relates to the historical correlate terms ‘episteme’ and ‘Erkenntnis’. Epistemology, in the traditions of Hume or Kant, usually pertains to a specific kind of thing: facts, propositions, and usually unchanging things. It is not the facts that change, it is whether we are right or wrong about them. This leaves aside the important question of what other forms of things we would normally consider knowable are excluded: for instance, social knowledge, how facts are mediated by hierarchies, or introspective notions of self-knowledge (how is it to love someone as a form of knowing?). These are valid issues, but they arise in a context some time after Popper.

Scientific knowledge and knowledge simpliciter

One question pertinent to the Vienna philosophers, or perhaps even to Early Modern philosophy, is whether epistemology pertains to knowledge simpliciter. If we talk about knowledge simpliciter, we include everything that we intuitively construe as knowing. So, ‘Lois Lane loves Clark Kent’ or Batman’s moral conviction for vigilante justice count as forms of knowledge. It is more than plausible to consider that when the modern philosophers, and especially the Vienna philosophers, were doing epistemology, they were not so much thinking of knowledge simpliciter, but of science as the paradigm case of what is knowable.

I pose an analogy with Kant and Popper because the latter seemingly takes this for granted. Epistemology in the ‘Logik’ is scientific knowledge. Popper’s system is a system of science; if Popper were thinking about wider forms of knowledge, he would probably want a different account than applying his mechanics of falsification and probability axioms to propositions or thoughts such as ‘I’m hungry’. Kant, on the other hand, seems to consider knowledge simpliciter. In Kant’s own period, his epistemology resembles something of what we might now consider (albeit anachronistically) a cognitive science, or foundations of cognition (without the neuroscience). Kant’s philosophy was part of a system, yes, but a system for understanding the human in a fundamentally holistic way. For Kant, epistemology was an important part of the way a being can know the world, but this cognitive process in his transcendental Idealism also fed into his moral and ‘aesthetic’ theory. Popper has no such ambitions. In this way Popper and Kant are disanalogous.

To constantly make comparisons between Kant and Popper (which I have done) would seem to be disanalogous to Kant’s larger systematic concerns. However, an analogy can be made on a certain reading of Kant. If we are to consider Kant as having in mind (as well as ‘epistemology as knowledge simpliciter’) natural science as the paradigm case of knowledge, or what good agents should accept as knowledge without reasonable doubts (as opposed to Cartesian or Humean doubt), then a good interpretative case can be made to frame Kant as a philosopher concerned with scientific epistemology. There are a few distinct reasons to adopt this view:

  • Kant’s explicit and implicit references to the success of the Newtonian ‘philosophy’. Kant admires its success and sees it as something his own philosophy should aspire to. Newton’s philosophy is a mix of rationalism, in his use of formalisations and mathematical generalisations of reality which are not perceptually derived, and empiricism, in that these observations pertain to the empirical and are scrutinised by the empirical.
  • An extra note: despite Kant’s interest in the success of Newton’s natural philosophy, he takes a certain disagreement with Newtonian method as he sees it. This is the primary concern of the ‘Metaphysical Foundations of Natural Science’.
  • Kant’s familiarity with Lavoisier’s emerging theory of oxygen against phlogiston is influential in the sections which relate to the ‘systematicity’ thesis, or in more Kantian terms, what he construes as the positive role of reason, which is described towards the end of the First Critique.
  • Kant’s systematicity thesis is also described in the Metaphysical Foundations of Natural Science; the fact that Kant would expand his First Critique to wider concerns about natural science strongly suggests that Kant himself valued a connection between his conception of epistemology and ‘natural science as knowledge’.
  • Let’s say we don’t accept that it is the historical Kant’s view that epistemology should take scientific knowledge as its paradigm case: there is still an historical connection between Kant’s writings on natural science (in the Critical period) and the later work of the so-called ‘Neo-Kantian’ movement of the 19thC, such as Hermann Cohen, Ernst Cassirer, and even Rudolf Carnap and Hans Reichenbach (despite the fact that both Carnap and Reichenbach made much effort to distance themselves from the more ‘metaphysical’ reputation of Kant).
  • Popper was reacting to a ‘form’ of the neo-Kantian influence through Reichenbach and Carnap, and in doing so shows his own influences. Popper himself admits that the falsification and demarcation issue arises not from Hume (as the verificationists claim of Hume), but more from Kant – though this refers to different passages of Kant than I am concerned with in this post.

(but I digress)

At the end of the Logik, Popper makes a case for a distinct way of looking at scientific knowledge. Forget truth: science is about degrees and corroboration. Science is a matter of probabilistic models telling us whether a claim has more credence or less credence. Popper fleshes out this account by starting off with his initial conditions of demarcation and falsification, and then introduces a model of probability where the accumulation of instances or regularities of a formalised phenomenon strengthens our degree of belief in it. Truth and falsity are out of the picture. Science does not look like the Enlightenment certainties of old, but then again maybe it never did.

Many of the modern philosophers believed in truth as a bivalent affair: something was either true or false. Further to this, scientific knowledge was often linked to necessary knowledge, and special kinds of scientific knowledge (such as mathematics and theoretical physics) were of the highest certainty and, once we get them correct, relate to the a priori necessary truths of reality. Popper will have none of this, however, and this is where perhaps Kant and Popper part ways permanently. For Kant, apodictic certainties were an important part of true knowledge; Kant’s icon of a good knowledge claim would have been how Lavoisier’s oxygen trumped phlogiston, or the success of Newtonian science. Ironically, it is in Popper (and, as we shall later see, in Einstein) that the limit of the applicability of scientific notions is the exact reason why falsification should be adhered to. In short, for Kant, certainty was the paradigm bliss of knowledge, with Newton’s physics as an example of it. Popper, by contrast, wrote for a post-Newtonian world in which it was expected that our best theories today would probably not be our best theories tomorrow; Popper even goes into some detail about his ‘reservations’ about how Quantum Mechanics is interpreted.

I’ve often tried to pose that any good theory of knowledge (that takes scientific knowledge as its paradigm case) must take into account the fact of theory change. I’ve often posed Kant’s notion of systematicity, together with what he calls ‘reflective judgment’ in the Third Critique, as an openness to accept that our fundamental organisation of a priori concepts can and will change. But to also accept the world of science as a series of certainties puts this a priori openness into disrepute. People such as Reichenbach have often criticised Kant on the grounds that basing his transcendental aesthetic (the claim that space and time are assumed prior to experience) on the ‘certainty’ of Newton’s physics commits him to metaphysical claims about space that are no longer applicable (namely, Euclidean geometry, which Newton himself presumes). Kant’s systematicity thesis has a lot to arm itself with, but Popper shows that an 18thC theory finds problems with holding to certainty in a 20thC age. Kant was a rationalist philosopher in the age of Newton. Post-Newton, Popper was a rationalist in the age of Einstein.

(Next post: Einstein and Popper)

Lies, Damned lies, and…Karl Popper’s logic of science?

Continuing my series of posts following my reading of Popper’s ‘Logic of Scientific Discovery’, I think that I have just finished the more difficult part of the book. Popper wrote a large section on his unique theory of probability; many of the nuances, I have to admit, are lost on me due to my lack of reading. I think a fair few things are important to say:

1. There has been a large consensus among many philosophers I’ve known that a logical theory of probability is lacking in various ways compared to more conventional mathematical approaches to understanding statistical functions; interestingly, among the mathematicians that I’ve known this conviction is not held. For this reason I might consider that this probability approach, at least among the contemporaries of Popper’s time, was not in the majority. How such an approach would hold up now, however, is a much more detailed question. Formal approaches are the fashion in many areas of contemporary philosophy.

2. Popper should be read in context with, as a point of comparison and contrast, the probability accounts of Carnap, von Mises and Keynes (as in, J.M. Keynes the legend of economics). Each of these thinkers had a particular aim for their thought around probability. Carnap integrates probability into a logic-of-science approach, while the wider context of von Mises and Keynes is as theoreticians and practitioners of an applied numerical science: physics in one case and economics in the other. Construing probability in such a wide light, and before an audience of philosophical methodology, shows the real eclecticism and interdisciplinarity of the time.

3. Popper moves away from talk about truth. Starting off the book with a discussion concerning the limits of science, namely through falsification and demarcation, Popper then moves on to try and consider how to make positive claims of science. Often a reply to a discussion of falsification is a claim to the effect of: if our method concerns what shouldn’t be admissible as a scientific claim, what can we say is an admissible claim?

Here is where the probability account comes in. Instead of a distinct bivalent set of values of whether a claim is true or false, we are led to a notion of degrees of credence in probability. Perhaps we can never talk of appropriately ‘verifying’ a theory, but we can talk in terms of falsification and a positive notion of what he calls ‘corroboration’. Claims are suitably corroborated in terms of their instances and a calculus that applies a certain set of purely statistical assumptions. The game of science, then, is moved away from talk of truth to talk of corroborated estimates of what we may deem to be factual. It is this notion of corroboration which answers for the rejection of verification (I sketch the basic idea after these numbered remarks).

4. Contemporary science, it should be said, relies on a great deal of statistical work. Everything from sociology and economics to chemistry and climate science involves the gathering of statistics to establish predictions and models. Any good theory of science worth its salt needs to acknowledge the common contemporaneous practice of science, and the 20thC methodological literature is very much sensitive to this turn to statistics. I would go further and say that it is a desideratum to acknowledge that the practice of science now uses these machinations.

5. One thought I advance: to what extent are the standards of rigour for corroboration internal to a discourse and its practitioners, and to what extent are they sufficiently generic to account for all discourses? I suspect that the notion of statistical accuracy and range involves a lot of pragmatism depending on what is being measured or predicted. This discussion is slightly addressed when Popper relates his notion of probability to the (then emerging) Quantum Mechanics.
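Returning to the notion of corroboration in point 3: the following is a minimal schematic of the idea, in my own notation rather than Popper’s (his appendices give a more elaborate, normalised measure). Evidence e corroborates a hypothesis h to the extent that e is more to be expected given h than it is on its own:

\[
\mathrm{support}(h, e) \;=\; p(e \mid h) \;-\; p(e)
\]

If the quantity is positive, e corroborates h; if it is zero, e is irrelevant to h; if it is negative, e undermines h. The point of contrast with verificationism is that this is a matter of degree, not a bivalent verdict of true or false.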

Michael

More on Popper

As I’m trawling through more of Popper’s ‘Logic of Scientific Discovery’, I’ve come to learn more about this work than I had putatively understood about it. As previously stated in my post on Popper, this work is far more than simply the ‘falsification thesis’ as normally construed. Falsification is also realised in a number of innovative ways (such as the ‘dimensionality’ of scientific sentences described previously). Falsification is also important in considering revisability.

Simplicity

Popper comes forward with a notion of simplicity which goes something like this: the more general a claim is, and the more it can capture with a shorter expression, the better. This basically captures what simplicity is. It might sound obvious, but the contrasting position of conventionalism poses a situation where a thesis which starts as a generalised formulation may find ways of being contradicted and undermined, and in order for a body of theory to survive such empirical challenges it must introduce ‘auxiliary statements’ to restore consistency.

Comparing theories

If one is to compare one theoretical system with another, the factors for deciding one over the other concern which accounts for more truth and which does not. A simpler theory would be better than one with too many auxiliary hypotheses because of the greater explanatory power of a simpler theory using less to explain just as much, or more. Auxiliary hypotheses are also increasingly suspect because the growing number of caveats is introduced exactly to avoid being contradicted by the real world. That is not to say that a real-life theory may not need auxiliary hypotheses; after all, a physical theory needs to explain. These comparative factors are merely between idealised cases.
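A toy way to see why extra adjustable parameters make a theory harder to refute (my own illustration, not Popper’s exact example): count how many observations it would take before a hypothesis about a planet’s path could even be contradicted. A circle in the plane,

\[
(x-a)^2 + (y-b)^2 = r^2 ,
\]

has three free parameters (a, b, r), so three generic data points fix it and a fourth point can already refute the claim ‘the path is a circle’. A general conic has five free parameters, so it can be fitted through any five generic points and only a sixth could refute it. Each added parameter, like each added auxiliary hypothesis, buys the theory another way of accommodating recalcitrant data.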

Probability considerations

I must admit this is the part I least understand. Popper introduces a set of probabilistic concerns that would establish a credence for a theory. Some of the constraints are fairly non-controversial: a proposition should not conflict with other true statements, and a claim that contradicts a true one should have a lower probability. Probability is introduced as having logical constraints. In this section Popper addresses the work of Richard von Mises and Maynard Keynes on probability. My initial suspicion was that logical constraints on probability were more about what counts as a factor in discounting the likelihood of an event, rather than a positive thesis on how to construct probabilities. I perhaps retract this initial thought in light of the introduction of two formal axioms of probability, which seem more logical than mathematical. I must make a note of connecting why people hold so much disdain for logical approaches to probability (and I add Carnap as a correlated philosopher on this issue) to the more contemporary methods of probability in philosophy. Popper seems to introduce a distinction between ‘objective and subjective’ probabilities and seems to say that while objective approaches are better (as he construes it), subjective approaches can be useful as well in certain cases.
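To illustrate what ‘logical constraints’ on probability amount to (this is my own gloss in standard notation, not Popper’s axiom system), two consequences of any ordinary probability calculus already capture the constraints mentioned above:

\[
\text{if } h \vDash h' \text{ then } p(h) \le p(h'), \qquad
\text{if } h \vDash \lnot e \text{ and } p(e) = 1 \text{ then } p(h) = 0 .
\]

The first says a claim can never be more probable than anything it entails; the second says a claim that contradicts a statement held with certainty drops to probability zero. Constraints of this shape are ‘logical’ in that they follow from the relations of entailment and contradiction between propositions, prior to any statistical data.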

I need to read more on probability. I am quite confounded in general about probability, but I see it as interesting that Popper introduces probability in an almost systematic way in his logic of science. Perhaps probability (or belief credence calculus) is an essential part to a system of science. We’ve come a long way from Kansas Hume in this issue.

A pile in the swamp

I’ve been reading Karl Popper’s ‘The Logic of Scientific Discovery’ very slowly over the past few months. I feel like I’ve made some headway, so I’ll make some notes. First I’ll consider how people normally think of Popper, and then how I’ve read him. I find a certain disconnection or incompatibility between the two.

The putative Popper

Popper is often understood as advocating a thesis of ‘falsification’. Falsificationism is often seen as a tack to mention after talking about ‘Verificationism’. Verificationism itself is a philosophical bastardry. Verificationism (as ‘championed’ by Ayer) is the illegitimate son of what the Vienna Circle became associated with under the so-called label ‘Logical Positivism’. I think there is a lot of damage done by upper-school and degree lecturers in the ‘textbook’ depictions of so-called ‘Logical Positivism’, which is mischaracterised as being, firstly, the thesis that the only meaningful statements are analytic propositions or verifiable synthetic propositions. This not only ignores the fact that the analytic/synthetic distinction is suspect, but also falsely implies that it is strictly defined. Analyticity, and the classification of non-empirical synthetic statements, became a thorny issue toward the late 20thC.

Characterising Verificationism as coming from the Vienna group of philosophers undermines their complexity and the great achievements they made. There is a lot of good historical scholarship that works past this historical misunderstanding, and with the birth of movements such as Experimental Philosophy (advocated by the likes of Knobe et al) or Formal Philosophy (epistemology, philosophy of action, probability) advocated by people such as Hendricks, we can see the true wealth that there really is in the ideas and projects of the Vienna group. In this light, Popper seems an afterthought.

Falsification is seen as a thesis which is in some way a reversal of verification. In a very superficial way this makes sense. Verificationism is an empiricist thesis which goes the way of asserting that if a claim cannot be verified in some way then it is meaningless and not a proper object of (scientific) enquiry. Taken as a universal thesis beyond the domain of science, it renders metaphysics, ethics, religious language and the supernatural nonsensical domains of thought. Admittedly, however, the onus is then to explain ethics and religion despite the fact that they don’t refer to anything, which leads to views such as expressivism in meta-ethics – an interesting discussion in itself.

Falsification is almost seen as a fork to the knife of verification. Putatively, falsification is read in verificationist terms: something is meaningless if it cannot be falsified. Verification has its own problems, so if one turns it into a negative criterion, one could seemingly avoid the paradoxes and issues that the former entails. Showing that something is so vague that it cannot possibly be falsified makes it meaningless because it does not pertain to anything relevant enough. While this explanation seems true enough, it does not really go into the very complex character of Popper’s ‘Logic of Science’. Popper is apparently a philosopher who is read by actual scientists and is praised outside of philosophy. I wonder, however, how much these non-philosophers really understand the so-called ‘falsificationist’ philosophy of Popper, at least as it comes from ‘The Logic of Scientific Discovery’. A further read beyond the simple falsification dictum shows a deeply metaphysical and rationalist character more akin to Carnap and, dare I even say, Kant’s philosophy of science [a blog named Noumenal Realm should expect every post to be on Kant].

Popper’s Falsification beyond ‘falsification’

Popper’s ‘Logic of Scientific Discovery’ (henceforth the Logik) is a work that I must admit I hardly understand; there are historical issues pertaining to the scientists and philosophers Popper refers to that are in themselves topics for wider discussion. It should be said, though, that his agenda sits in a wider context that defined a large part of the Viennese philosophers and philosophically oriented scientists of the day (if there is a distinction between the two). Popper’s Logik pushes my apprehension of formal logic to the limit, and a lot of his work does involve formal niceties that deserve good analysis.

Most people (rightly) point out that the initial step in Popper’s process is to distinguish between science and non-science. Demarcation is essential in characterising what is the proper object of analysis. Popper holds that stratifying claims in terms of higher and lower dimensionality, or extensions of reference, is a means to separate the specificity and generality of claims. This is a highly important part of his system, as it is a way of establishing falsification.

An example Popper gives of this ‘stratification’ is a set of claims about orbits. We can start off with pre-Kepler observations about heavenly bodies going around the sun, and then make more specific (and ‘weaker’) claims about the nature of the orbit. When we make claims of greater specificity, we go a dimension higher than the initial vague terminologies of pre-Kepler or prescientific/folk observations, so that they become more refined. There is an epistemic weighting to this ordering system. In order to falsify a claim (x), the falsifying thesis (y) must pertain to the dimensionality appropriate to refer to (x) in this aforementioned schematisation. The more specific a claim is, the more difficult it is to falsify it; on the other hand, there are different levels at which a claim may be relevant. As a proposition gets ‘higher’ in the dimensionality so described, its domain of reference changes. Initially we may speak of how we see planets orbit, but as claims become more advanced we may talk about the nature of the orbit itself, and, say, go to the level of the geometry of conic sections as we say that an orbit is elliptical rather than circular.

Perhaps in our contemporary context this is played out in how specified areas of physics have become. The likes of particle physics, for instance, has gone to a domain where observation is difficult and speculation takes place on a level of mathematical entities. This is not to say that it is not possible to experiment or observe empirically, but it takes a much more sophisticated level in terms of the equipment and options. As a friend once said in a different context: science these days has gone far beyond boiling one’s own piss.

Popper intricately describes the formal relations of the higher and lower levels. A lower level can disprove a higher level (e.g. an elliptical orbit (specific) falsifies the proposal of a ‘circular’ or ‘round’ orbit (a general/vague description)). As well, it takes a proposition of the same level or higher specificity for a claim to be disproven. Perhaps the story of physics from the late 19thC to the 20thC is a story of how a higher-level explanation trumps the past theory. Often this may take place through an experiment that shows the limits of how far a particular dimensionality can explain, or shows a level where it falls apart.

It should be said that this notion of ‘dimensionality’, where propositions are levelled into a hierarchy of specificity, is much like Kant’s systematicity proposal, where ‘concepts’ are diagnosed in terms of propositions of greater and lesser generality. There are differences, however. I am not entirely clear whether the dimensionality in Popper’s thesis is a distinction between ‘better’ and ‘worse’ explanations (where general means ‘undefined or vague or broad reference’, as in a ‘round’ orbit as opposed to an elliptical one), or whether there is a ‘final theory’ where propositions range from observations, which are specific instances, to higher, generalised claims which account for a family of propositions. As propositions become more generalised they also become more formal and mathematical. So, in Kant’s dimensionality we presume a ‘final theory’ (to use Hawking’s terminology) where specificity refers to empirical instances and generality refers to the mathematical/formal world. In Popper’s dimensionality, generality refers to the empirical level, which is, in varying degrees, correct or incorrect. Specificity accounts for the increasingly formal nature of a theoretical proposition, as well as the ground on which it can be falsified and the probability of its truth.

Something should be said about probability at this point. I found probability a hard topic to understand, partly because I’m aware Carnap has a hand in Popper’s thought here. Popper advocates, as did Carnap, a notion of logical probability, where we allocate a probability in terms of a credence of belief. The calculus by which such a probability is constructed I’ve not yet addressed in the book (I’m not sure if it will be addressed, or if I’d understand it). Many people have shied away from the notion of logical probability for various reasons; seeming too close to Logicism would be a start, and a discussion of the foundations of mathematics would be another way into this issue. Those issues are beyond my comprehension, but I will say that Popper addresses logical probability in terms of falsification and the dimensionality thesis. The degree of probability depends on the degree of falsification potential of a thesis; in addition, the dimensionality of a proposition also has a bearing on the logical probability of that thesis. In this way Popper has a certain richness to compare with Kant’s philosophy of science.
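As I read Popper, the dependence runs inversely, and a rough gloss (my summary, not a quotation of his formalism) is that the empirical content of a statement can be measured by what it rules out:

\[
\mathrm{Ct}(h) \;=\; 1 - p(h) ,
\]

so the bolder and more falsifiable a hypothesis is, the higher its content and the lower its logical probability. This is one reason Popper resists treating high probability as the aim of science: the most informative theories are precisely the logically improbable ones that survive severe tests.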

Introducing logical probability makes Popper’s notion of dimensionality (or what I might call systematicity, in a sense) pertain to how to decide on a theory’s rightness or wrongness, how to distinguish between science and non-science, and how to determine the falsification potential of a thesis. For Kant, dimensionality does not involve logical probability but rather the ‘as if’ presumption of a final theory. Popper has no such pretension. The idealism of a final theory is truly gone in the age of Popper, though that is not to say that presuming a final theory is to assert that ours is the final theory. Comparing Kant and Popper has much potential in this regard.

Michael

“Philosophy of science is as useful to scientists as ornithology is to birds.”

Philosophy of science is as useful to scientists as ornithology is to birds. So says Richard Feynman, apparently. Alan Sokal, in a recent interview with Julian Baggini, suggested that this analogy points to the lack of epistemic merit that philosophy has for the structuring and adding of new knowledge to physics. Analogies like this are apt for the contribution that philosophy makes to physics, granted; but I have found the analogy wanting in other cases.

A musician who was largely an autodidact once said to me that he did not care very much for music theory, as it did not fit with his performance skills and apprehension as a musician. I fell silent, not bothering to tell him that he was playing music predominantly in a mixolydian mode, while utilising tritones, ostinati, parallel 5ths, 8ths, dominant sevenths, suspensions, passing notes, arpeggiations, and so on…

I can appreciate the view that being steeped in a particular style limits a musician and the options they have. I have recently started to play the guitar, and I like playing on blues scales. This is largely to impress my friends with my ability to naturally create riffs and hooks, but there is another sense in which I communicate my utter disdain for a style by its ease: there is a sense of comfort and familiarity when I play a ragtime. I’m not very good at sight-reading Bach, even less so if I attempted Beethoven or Chopin. Joplin and Lamb, by contrast, are a joy to practise at sight. This is partly because of my own insecurity as a piano player, but there is also a joy in seeing the immediate fruit of one’s labour through my immediate apprehension of the musical style and its playing ease. There is not as much ease, by contrast, in the heavier Romantic styles.

In short, sometimes knowing the rules of the game enhances our performance as players. This is certainly true for Olympic or professional athletes, who, while being introduced to a professional level normally at university or younger, sometimes furnish their career with a doctoral thesis relating either to their performance or their training as an athlete. Our inspiration may come from other things too; engineers and technologists can sometimes draw their innovations from the observation of nature.

Coming back to the philosophy example, a later point was made that physics is just as successful and is unhindered by philosophy. Scientists like Feynman and Wolpert are distinctly anti-philosophical, in contrast to the likes of Einstein, or, if one really wants to go back, Newton. Newton, after all, had written about his empiricist leanings and the nature of his methodology. Kant reacts critically to Newton’s ‘empiricist’ methodology, but not to the results. This kind of philosophical engagement by a physicist was, by the standards of the day, by no means amateur, and it is taken seriously by philosophers today.

The so-called philosophically oriented physicists of the 20thC, by contrast, are not terribly interesting in terms of our contemporary philosophical tools. Einstein’s ‘Spinozism’ has been talked about by the likes of Dawkins and Hitchens, as a caveat so that it is not interpreted in religious terms. Having an understanding of Spinoza’s metaphysics, by contrast, is not even addressed. Spinoza’s approach to life was one of emotional calm against the overwhelming and sometimes uncontrollable temperaments that we suffer in life. One of the enjoyments we can have in life is an apprehension of the unity of nature: that is, in his metaphysics, how our inner consciousness is subsumed in no small part under the larger reality as a whole, as well as the underlying propositional order that supports both. This may sound mystical, but really it is a form of naturalism. The two prejudices of Spinoza’s philosophy were admitting that his metaphysics was fundamentally correct, and putting scientific development and knowledge on a pedestal. None of this is really addressed in the ‘Einsteinian’ view so bastardised by the atheist popularisers.

Stephen Hawking’s own popular books try to establish a so-called philosophically interested reading of M-theory, string theory and general relativity. There are moments where his reading is somewhat patchy. But perhaps the thing that really matters, and that Hawking succeeds in, is making the current understanding of science understandable to a general public. This is what I would consider the most socially important thing that physicists can do outside of their work. Sokal’s perspective, by contrast, is one where physicists do their science from Monday to Saturday and then their speculation on a Sunday. What succeeds about Hawking’s presentation is that the physics is presented in a manner that has religious and humanistic dimensions, rather than one of technical ‘philosophical’ merit. Does the universe have a beginning? Does the universe have an end? What is our place in the grand order of things? Is there life beyond earth? Physics goes on well without philosophy’s involvement; however, the fact that there are fewer physicists interested in philosophy should be attributed to the death of the polymath. The rise of a continental philosophy that fails to engage the work in physics with any real expertise is also a reason why physicists may dislike philosophy as a whole; that is the whole point of the Sokal hoax, in a sense.

Perhaps the most interesting and important thing that physicists can do for the public is to be understood. Conspiracies such as the moon landing being fake, or the belief that miniature black holes will destroy the universe, are harmful to science, harmful to reason, and pander to a mindset that hurts rationalism and rationality.

Michael

Pessimistic metainduction

The argument against scientific realism originating from Laudan goes something like the following:

i. All past scientific theories have eventually been deemed false
ii. If all past scientific theories have been deemed false, then our current best theory T should be expected to be superseded by a superior T’
iii. Therefore, we have no reason to believe in the entities posited by our current theory T
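The argument leans on an inductive step and an implicit bridge premise; spelling these out (my reconstruction, not Laudan’s own wording):

\[
\begin{array}{ll}
\text{P1:} & T_1, \dots, T_n \text{ (all past theories) were eventually deemed false.}\\
\text{P2 (induction):} & \text{So our current theory } T \text{ will likely be superseded and deemed false too.}\\
\text{P3 (bridge):} & \text{If } T \text{ will likely be deemed false, belief in the entities } T \text{ posits is unwarranted.}\\
\text{C:} & \text{Belief in the entities posited by } T \text{ is unwarranted.}
\end{array}
\]

Put this way, the realist’s usual replies target P2 (the induction is weak if newer theories retain the successful parts of older ones) or P3 (the approximate truth of T may survive its eventual replacement).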

This seems to me true but trivial; I would agree to some extent, but we may still have a scientific realism.

What if, however, we found by an inductive inference that we do not believe in a whole gamut of religions (let’s say the set of all religions minus one or two)? If we have reasons (albeit unique to those religions themselves) to be convinced of the falsity of each religion or spiritual philosophy, can we judge it rational to dismiss religion in general from the inductive inference that all the other religions are false?

Dawkins often puts it in an interesting way: we hardly believe in the deities Thor or Zeus, and most Westerners would hardly believe in the Hindu gods; we might confine our belief in religion to matters of culture, and as a corollary we may say that a religion’s cultural appeal gives us less reason to believe in its truth and is more a testament to factors such as ethnic and cultural identity. Could a pessimistic metainduction be made into an argument against religious belief? It seems almost as convincing as it is as an argument against scientific realism…

Sinistre