Tuesday, 10 August 2010
This mini e-book has now been largely superseded by the text copy of my book:
Not even trying: the corruption of real science
which was published by the University of Buckingham Press, 2012.
The story of real science: its rise, decline and death
Bruce G Charlton
Professor of Theoretical Medicine, University of Buckingham, UK
The following mini-e-booklet of about 20,000 words is derived from a daily blog I wrote in June-August 2010, which can be found at: http://charltonteaching.blogspot.com .
Readers are welcome to print-out this booklet for their own use.
Charlton BG. (2010) The story of real science: its rise, decline and death. http://thestoryofscience.blogspot.com [plus access date]
The Owl of Minerva flies only at dusk, and we did not know what science was or how it worked (in a historical, philosophical and sociological sense) until real science was already well-advanced towards destruction.
For me, real science is the core of the modern world. It is the creator and driving force of ‘growth’ and social change; intellectually it is the crowning glory of modernity; but indirectly science is responsible for most of the distinctive horrors of the past couple of centuries.
My (very basic, summary) understanding of the rise of real science was that it came from Pagan Greece (epitomized by Aristotle), then through the early Christian theologians. Then the Roman Catholic West professionalized philosophy (scholasticism), distinct from theology – so, the Great Schism marked the true beginning of modernity.
Then natural science separated from philosophy around the time of Galileo and moved to Northern Europe where it first became large, visible and noticeably distinct from about the 17th century.
There were agrarian and industrial revolutions in Britain; and from around 1800 a new world was apparent, characterized by growth in science, technology, the economy, and human capability: modernity.
Since then real science has, with each generation, broken-up into smaller and smaller specializations and become more and more professionalized until – at some point over the past several decades – science stopped being real and evolved into its current state of merely a variant of generic bureaucracy. This was increasingly clear from the 1960s.
Science always was present, but modernity happened because real scientific breakthroughs came so thick-and-fast that increasing efficiency out-ran increasing population – and humanity escaped the Malthusian Trap (c.f. A Farewell to Alms, by Gregory Clark).
At first real science grew fast – especially in the populations of Northern Europe (perhaps for genetic reasons?) by recruiting from the increasing pool of ‘creative geniuses’ motivated to do science (who knows why?).
But science out-grew the supply of such people, yet continued to grow by professionalizing; recruiting ever less-talented and ever more weakly-motivated personnel, until these became a large majority and rendered 'science' hostile to highly motivated creative geniuses.
In other words, there was a period of centuries over which science gradually separated from the intellectual ruling class priesthood – first from religion, then from philosophy – to stand briefly as an autonomous social system, from around the nineteenth to the early twentieth century.
And from the mid-twentieth century and accelerating up until now, ‘science’ is being ever-more-rapidly re-absorbed into the secular ruling intellectual class bureaucracy.
There are no hard dividing lines, and the overlapping of generations further blurs the transitions – but it is now approximately correct to say that real science is dead. Or, at least, real science is so swamped, diluted and overwhelmed by bureaucratic pseudo-science – with the trends all in that direction – that this looks like a good time to write its epitaph and analyze what has happened.
First Edition 10 August 2010
Second edition 13 August 2010
The decline and death of science
Looking back on 25 years – I wasn’t actually doing science
I began actually *doing* science in 1984 - or, at least, that's what I thought I was doing.
I have worked in and across a variety of fields: neuroendocrinology in relation to psychiatry, the adrenal gland (especially from 1989), epidemiology (from about 1991), evolutionary psychology (from 1994), systems theory (from about 2001)...
In all of these areas and others I found serious problems with the existing scientific literature: errors, inconsistencies, wrong framing of problems. And after all, this is what science is supposedly about, most of the time - providing the negative feedback to correct the wrong stuff. (Providing new lines of work is what we would prefer to do, but most people never do or did achieve this.)
My assumption was that - as the years rolled by - I would have the satisfaction of seeing the wrong things tested and discredited, and replaced with more-correct information. So that overall, and in the long term, science would progress. That is what was supposed to happen.
Well, it hasn't happened.
When I look at these areas of science now, and compare them with how they were when I entered them, I see no progress whatsoever: the problems, errors, misunderstandings and stupidity which were there years ago are still there – in many cases amplified and elaborated. In many cases things are in a much worse state than when I first entered the field: error and falsehood have not been suppressed or corrected, but have instead thriven.
There is no evidence of progress.
So, I must conclude that although I believed I was participating in something called science, something that I believed I understood from the writings of Bronowski, Popper, Hull - I wasn't really doing science at all.
I was 'going through the motions' of doing science, but the machinery was broken, and the work I was trying to do, and the work of those whom I respected, was a free-spinning-cog: maybe it was published, maybe it was cited, maybe it was funded, maybe people made careers from doing it - but in the end it was just a private hobby. It did not make any difference. We were deluded.
We could perhaps console ourselves by hoping that we have left a residue of analysis or data in 'the scientific literature'; but with 25,000-plus journals, it really is a needle in a haystack. And isolated fragments of analysis and data cannot really be understood if they do happen to be rediscovered; cut off from the discourse which generated them, bits and pieces of science don't make sense to an outsider.
Now it might be argued that although science was not really going-on in the bits I knew, it *was* going on elsewhere; and that is true. To some extent, probably a very small, shockingly small, extent. But it really is scary to contemplate how whole areas of science (and I have known several) can chug-along for decades, with their participants busily doing... *stuff*; all of them apparently thinking, believing that they are doing science, when they are in fact doing no such thing.
What *are* they doing? - What were we doing, in those branches of science in which I participated? Glass Bead Games spring to mind (from the novel by Hermann Hesse) - in the sense of intellectual exercises wholly detached from reality; but really that is far too elevated and elite a concept for the industrial-scale drudgery of Big Science. It is reasonable to consider something like top-level string theory, or analytic philosophy, or even postmodern literary theory as true Glass Bead Games – but not the millions of drones in medical research or physics.
Mainstream Big Science is most reminiscent of a Soviet-era organization – such as the grossly unprofitable Polish glass factory I saw being inspected by John Harvey-Jones in his TV series Troubleshooter. It was producing vast quantities of defective drinking glasses which nobody wanted to buy or to use – and was simply piling them up in gigantic stacks around the building – wasting everybody's time and taking-up all the useful space.
When Harvey-Jones was asked what to do – how to make the business profitable – he said the first essential step was to STOP MAKING THE GLASSES.
Stop *now*: this very minute, he said. Switch-off the production line, send all the factory workers home (on full pay) for a few weeks, and *then* begin sorting it out.
But so long as the workers were coming in-and-out and beavering away; the paperwork was being completed (in triplicate); and the masses of defective glasses were being churned-out, piling-up and blocking the aisles and preventing anything useful being done – there was no hope.
This is the problem of science today – it has been bloated by decades of exponential growth into a bureaucratically dominated, Soviet-style heavy-industry factory characterized by vastly inefficient mass production of shoddy goods. And it is trundling along, hour by hour, day by day; masses of people going to work, doing things, saying things, writing things…
Science is hopelessly and utterly un-reformable while it continues to be so big, continues to grow-and-grow, and continues uselessly to churn out ever-more of its sub-standard and unwanted goods.
Switch it off: stop making the defective glasses: now...
Was scientific progress a result of genius?
Scientific progress is talked about in three main ways, depending on the numbers/proportion of the population involved in generating this progress:
1. Genius - 10s to 100s of people per generation – a fraction of 1 percent of the population.
Science is the product of a relatively small number of geniuses - without whom there would be no significant progress.
Therefore an age of scientific progress can be boiled down to the activity of tens or hundreds of geniuses; and the history of science is a list of great men.
2. Elite - 1000s to 10,000s of people per generation – a few percent of the population.
Science is the product of an elite of highly educated and trained people, usually found in a relatively small number of elite and research-orientated institutions, linked in an intensely intercommunicating network. Without this elite, and these elite institutions, there would be no significant progress.
The history of science is a history of institutions.
3. Mass - 100,000s to millions of people per generation – ideally, a large percentage of the population.
Science is the product of a 'critical mass' of scientifically orientated and educated people spread across a nation or culture; and whose attitudes and various skills add or synergize to generate scientific progress. If society is not sufficiently 'scientific' in this sense, then there will not be significant progress.
The history of science is a history of gradual transformation of populations - mainly by educational reform.
A (common) twist on this is the idea that humans have vast untapped potential - and that this potential might somehow be activated - e.g. by the right kind of education; leading to an elite of geniuses, or a mass-elite, or something...
Perhaps the mainstream idea nowadays is a mushy kind of belief/ aspiration that science is essentially elite but that the elite can be expanded indefinitely by education and increased professionalization.
Another variant is that scientific progress began as based on genius, then became elite-driven, and nowadays is a mass ('democratic') movement: however, this is merely a non-historical description of what has actually happened (more or less) - underpinned by the assumption that scientific progress has indeed been maintained.
But I do not accept that assumption of continued progress (given the vastly increased level and pervasiveness of hype and dishonesty in science).
Certainly there seem to be historical examples of scientific progress without need for a prior scientific mass of the population, or even a pre-existing elite gathered in elite institutions.
Of course, nowadays there are no geniuses in science, so admitting that genius is necessary to significant scientific progress entails admitting that we are not making progress.
Nonetheless, my reading of the history of science is that a sufficient supply of genius is necessary to significant scientific progress (although history has not always recorded the identities of the presumed geniuses) – at any rate, science has often made significant progress without elites in the modern sense, and elites often fail to make progress.
Human capability peaked decades ago, and has since declined.
I suspect that human capability reached its peak or plateau around 1965-75 – at the time of the Apollo moon landings – and has been declining ever since.
This may sound bizarre or just plain false, but the argument is simple. The landing of men on the moon, and bringing them back alive, was the supreme achievement of human capability, the most difficult problem ever solved by humans. 40 years ago we could do it – repeatedly – but since then we have *not* been to the moon, and I suggest the real reason we have not been to the moon since 1972 is that we cannot any longer do it. Humans have lost the capability.
Of course, the standard line is that humans stopped going to the moon only because we no longer *wanted* to go to the moon, or could not afford to, or something…– but I am suggesting that all this is BS, merely excuses for not doing something which we *cannot* do.
It is as if an eighty-year-old ex-professional cyclist were to claim that the reason he had stopped competing in the Tour de France was that he had now found better ways to spend his time and money. It may be true; but it does not disguise the fact that an 80 year old could not compete in international cycling races even if he wanted to.
Human capability partly depends on technology. A big task requires a variety of appropriate and interlocking technologies – the absence of any one vital technology would prevent attainment. I presume that technology has continued to improve since 1975 – so technological decline is not likely to be the reason for failure of capability.
But, however well planned, human capability in complex tasks also depends on ‘on-the-job’ problem-solving – the ability to combine expertise and creativity to deal with unforeseen situations.
On the job problem-solving means having the best people doing the most important jobs. For example, if it had not been Neil Armstrong at the controls of the first Apollo 11 lunar lander but had instead been somebody of lesser ability, decisiveness, courage and creativity – the mission would either have failed or aborted. If both the astronauts and NASA ground staff had been anything less than superb, then the Apollo 13 mission would have led to loss of life.
But since the 1970s there has been a decline in the quality of people in the key jobs in NASA, and elsewhere – because organizations no longer seek to find and use the best people as their ideal but instead try to be ‘diverse’ in various ways (age, sex, race, nationality etc). And also the people in the key jobs are no longer able to decide and command, due to the expansion of committees and the erosion of individual responsibility and autonomy.
By 1986, and the Challenger space shuttle disaster, it was clear that humans had declined in capability – since the disaster was fundamentally caused by managers and committees being in control of NASA rather than individual experts.
It was around the 1970s that the human spirit began to be overwhelmed by bureaucracy (although the trend had been growing for many decades).
Since the mid-1970s the rate of progress has declined in physics, biology and the medical sciences – and some of these have arguably gone into reverse, so that the practice of science in some areas has overall gone backwards: valid knowledge has been lost and replaced with phony fashionable triviality and dishonest hype. Some of the biggest areas of science – medical research, molecular biology, neuroscience, epidemiology, climate research – are almost wholly trivial or bogus. This is not compensated by a few islands of progress, e.g. in computerization and the invention of the internet. Capability must cover all the bases, and depends not on a single advanced area but on all-round advancement.
The fact is that humans no longer do - *can* no longer do - many things we used to be able to do: land on the moon, swiftly win wars against weak opposition and then control the defeated nation, secure national borders, discover 'breakthrough' medical treatments, prevent crime, design and build to a tight deadline, educate people so they are ready to work before the age of 22, block an undersea oil leak...
50 years ago we would have the smartest, best trained, most experienced and most creative people we could find (given human imperfections) in position to take responsibility, make decisions and act upon them in pursuit of a positive goal.
Now we have dull and docile committee members chosen partly with an eye to affirmative action and to generate positive media coverage, whose major priority is not to do the job but to avoid personal responsibility and prevent side-effects; pestered at every turn by an irresponsible and aggressive media and grandstanding politicians out to score popularity points; all of whom are hemmed-about by regulations such that – whatever they do do, or do not do – they will be in breach of some rule or another.
So we should be honest about the fact that humans do not fly to the moon anymore because humans cannot fly to the moon anymore. Humans have failed to block the leaking oil pipe in the Gulf of Mexico because we nowadays cannot do it (although humans would surely have solved the problem 40 years ago – in ways we can no longer imagine, since the experts then were both smarter and more creative than we are now, and those experts would have been in a position to do the needful).
There has been a significant decline in human capability. And there is no sign yet of reversal in this decline, although reversal and recovery is indeed possible.
But do not believe any excuses for failure to do something. Doing something is the only proof that something can indeed be done.
Only when regular and successful lunar flights resume can we legitimately claim to have achieved approximately equal capability to that which humans possessed 40 years ago.
Measuring human capability: Moonshot versus 'Texas Sharpshooter'
The reason that the Moonshot was a valid measure of human capability is that the problem was not chosen but imposed.
The objective of landing men on the moon (and bringing them safely back) was not chosen by scientists and engineers as being something already within their capability – but was a problem imposed on them by politicians.
The desirability of the Moonshot is irrelevant to this point. I used to be strongly in favour of space exploration, now I have probably turned against it – but my own views are not relevant to the use of the Moonshot as the ultimate evidence of human capability.
Other examples of imposed problems include the Manhattan project for devising an atomic bomb – although in this instance the project was embarked upon precisely because senior scientists judged that the problem could possibly, maybe probably, be solved; and therefore that the US ought to solve it first before Germany did so. But, either way, the problem of building an atomic bomb was also successfully solved.
Again, the desirability of atomic bombs is not the point here – the point is that it was a measure of human capability in solving imposed problems.
Since the Moonshot, there have been several major problems imposed by politicians on scientists that have *not* been solved: finding a ‘cure for cancer’ and ‘understanding the brain’ being two problems at which vastly more monetary and manpower resources (although vastly less talent and creativity) have been thrown than was the case for either the Moonshot or Manhattan Project.
The Gulf of Mexico oil leak is another imposed problem. And, so far, this has not been solved.
But modern technological advances are *not* imposed problems; they are instead examples of the Texas Sharpshooter fallacy.
The joke of the Texas Sharpshooter is that he fires his gun many times into a barn door, then draws a target over the bullet holes, with the bulls-eye over the closest cluster of bullet holes.
In other words the Texas Sharpshooter makes it look as if he had been aiming at the bulls-eye and had hit it, when in fact he drew the bulls-eye only after he took the shots.
Modern science and engineering is like that. People do research and development, and then proclaim triumphantly that they have achieved whatever-happens-to-come-out-of-R&D, and then they spin, hype and market whatever-happens-to-come-out-of-R&D as if it were a major breakthrough.
In other words, modern R&D triumphantly solves a retrospectively designated problem, the problem being generated to validate whatever-happens-to-come-out-of-R&D.
The Human Genome Project was an example of Texas Sharpshooting masquerading as human capability. Sequencing the human genome was not solving an imposed problem, nor any other kind of real world problem, but was merely doing a bit faster what was already happening.
Personally, I am no fan of Big Science; indeed I regard the success of the Manhattan Project as the beginning of the end for real science.
BUT those who are keen that humanity solve big problems, and who boast about our ability to do so, need to acknowledge that humanity has apparently become much *worse*, not better, at solving big problems over the past 40 years – so long as we judge success only in terms of solving imposed problems which we do not already know how to solve, and so long as we ignore the trickery of the many Texas Sharpshooters among modern scientists and engineers.
The Texas Sharpshooter society of secular modernity
The Texas Sharpshooter fallacy is a joke: the Sharpshooter fires his gun many times into a barn door, then draws a target over the bullet holes – with the bulls-eye over the closest cluster – to make it look as if he had been aiming at the bulls-eye and had hit it, when in fact he drew the bulls-eye only after he took the shots.
But in fact the Sharpshooter fallacy is unavoidable and everywhere: it characterizes secular modern society throughout, because secular modern society has no aim but instead idealizes process and retrofits aim to outcome.
Secular moderns - in public discourse - 'believe in' things like freedom, or democracy, or equality, or progress - but these are processes, not aims. Aims are retrospectively ascribed to whatever emerges from process.
It happens all the time: the liberation of slaves emerged from the American Civil War, therefore people retrospectively ascribe liberation as its purpose. The liberation of the concentration camps emerged from the Second World War, so the rescue of the Jews is ascribed as its purpose.
Libertarians 'believe in' freedom not as a means to some end, but as a process which by definition leads to the best ends; so that they 'believe in' whatever comes out of the process. People are made free, stuff happens as a result, then that stuff is retrospectively defined as good - because it is the output of a free system. In other words, the Texas Sharpshooter fallacy.
In democracy, people believe in a system: there are elections and so on, someone is elected, stuff happens – much of which appears to be disastrous; but, good or bad, the outcome is retrospectively defined as the best outcome because it is the result of democracy (or as a bad outcome because the system was not real democracy, or not full democracy, or because democracy had been subverted or corrupted; and if it is not a democracy at all then it is bad, because it lacks the process). In other words, the Texas Sharpshooter fallacy.
The example I gave was science. The modern attitude is that the best thing is for science to be free and to do what science does, and whatever comes out of the process of science is retrospectively defined as 'truth'. In practice, science is defined as what scientists do, and what scientists do is defined as generating truth. In other words, the Texas Sharpshooter fallacy.
Or law. Law is a process, and justice is defined as that which results from the process of law. Lacking a transcendental concept of justice, nothing more can be said.
Or education. What is education? The answer is ‘what happens at school and college’. And whatever happens at school and college is what counts as education. Since what happens at school and college changes, then the meaning of education changes. But since education is not aiming at anything in particular, these changes cannot be evaluated. Whatever happens is retrospectively defined as what needed to happen.
Or economics. Economic ‘growth’ is pursued as the good, and whatever comes out of economics is defined as prosperity. What people 'want' is known only by what they get - their wants are retrospectively ascribed. If what is measured and counted grows, then this counts as growing prosperity. So the economy fifty years ago wanted more A, B and C but the modern economy instead provides X, Y and Z – however, economists retrospectively re-draw the target around X, Y and Z and proclaim the triumph of economics. In other words, the Texas Sharpshooter fallacy.
Certainly this was the kind of view I held and argued for in the past, indeed not very long ago. It seemed ‘paradoxical’ even then, but it is not just paradoxical - it is nonsense.
The primacy of process is simple nonsense - it is simply trying to do without aims because all aims point to the necessity for underpinning transcendental justifications for those aims. (Or else aims are arbitrary and subjective statements.)
Secular modernity is fundamentally based on the Texas Sharpshooter fallacy, and the fallacy is simple and obvious. However, since the fallacy is intrinsic and pervasive, it must be concealed, and it is concealed.
If it happened to Classics, it could happen to science
I find that people simply cannot take seriously the idea that Science could collapse down to a small fraction of its current (vast, bloated) size.
Yet there is a recent precedent for the collapse of the dominant intellectual culture: Classics.
The study of Greek and Roman culture - language, history, literature, philosophy - was the dominant intellectual activity in the West for a couple of millennia. It was the mark of A Gentleman, especially a Scholar - the most high status form of knowledge, the main subject taught at the best schools and universities.
In England, when it was the top country and culture, Classics pretty much monopolized the curriculum in the Public Schools, Grammar Schools and Oxford University (Cambridge focused on mathematics - but had plenty of Classicists too). New subjects like Science had to fight for space in the curriculum.
Right up into the mid 20th century, the most prestigious degree in England was an Oxford four year Classics degree - the premier 'qualification' for elite ruling class professions. This was detectable even as recently as thirty years ago, and the 'two cultures' debate of the late 1950s and early 1960s marked the tipping point when Science began to dominate Classics in general cultural discourse.
Classics have now dwindled to the status of a hobby, taught in few schools and never given much prominence. Most UK universities have all-but abandoned the subject – only a handful of courses at a few places can find undergraduates with any background or competence in Latin (even fewer in Greek); so most modern 'Classics' degrees are built from zero, with no foundations, in three years.
Advocates of Classics find it ever harder to justify their subject as worthy of study - certainly there is no automatic deference towards it, no assumption of its superiority.
So, in the space of about 250 years, from the time of Samuel Johnson - when he was apologetic about writing in English rather than Latin and focusing his dictionary on the vernacular - until now, Classics have dwindled to insignificance in general culture.
Classics had been quietly dwindling in importance for a few hundred years (at least since Shakespeare outstripped all rivals using the vernacular), and this became ever more apparent from the mid 19th century; yet at least as recently as the time of the great English Classics professor (and poet) Housman (1859-1936) it looked as if the subject was on the verge of a breakthrough (using 'modern' scholarship). And of course classical scholarship has continued throughout all this, pouring out research books and scholarly articles for a dwindling audience of other scholars.
My point is that if it seems unimaginable that Science could dwindle in a few decades from dominance into insignificance then think about what happened to Classics. The signs are already there for those who look behind the hype.
Of course a scientist feels that the real importance of Classics was trivial compared with Science - the modern world depends on Science. Quite true, but then the ancient world depended on Classics, and the collapse of Classics was linked with the collapse of traditional society.
The collapse of Science is linked with the collapse of modernity - both as cause and consequence.
Scientific knowledge is mostly hidden in plain sight
A couple of years ago I published an editorial which, amongst other things, noted that the history of IQ research showed how the subject had been 'hidden in plain sight' since about the mid-1960s:
I said: "It seems that even in modern times, and in a liberal democratic society such as the UK where information is freely and easily accessible, scientific knowledge can apparently be ‘disappeared’ when it comes into conflict with the dominant socio-political agenda: can become, as it were, ‘hidden in plain sight’."
My conclusion was:
"Since this area of science [IQ research] has so been comprehensively ‘disappeared’ from public consciousness in the face of socio-political pressure, it seems probable that other similarly solid and vital domains of scientific knowledge may also be hidden in plain sight."
Taking on board this lesson has been a slow process for me. But the implications of what I know happened to IQ cut at the very root of the pretensions of liberal democracy.
It really is inconceivable that IQ is a unique exception; rather, I would now regard the IQ story as typical of the relation between 'science' and general public knowledge and public policy.
The IQ story shows that no amount of relevant evidence is ever going to be enough to change people's minds when they do not want their minds to be changed; and this resistance to evidence holds even for some of the wisest and most intelligent of people – given that some very decent and smart people are able to write-off reams of IQ research without a blink, and believe what they want instead.
The IQ story is doubly important since the IQ literature largely conforms to traditional wisdom, common observation and spontaneous belief. So it ought to be pushing at an open door.
Yet it looks very much as if since at least the mid-1960s our society's much vaunted scientific basis has been a sham. In other words, the more modern, rational and scientific we believed ourselves to be in The West - the less true this really was.
'Influential' science, science that is linked to policy and supposedly drives policy (e.g. 'climate science', or 'evidence-based medicine'), is now - and long has been - constructed by policy: the tail of politics is wagging the dog of science.
Yet even influential science is only apparently influential, since it is wholly driven by policy needs, and if one person does not do it then another will - or it will be conjured up from pre-existing material, or something or somewhere... The climate science story demonstrates that it is now facile to construct a truly vast and all-pervasive yet utterly fake-science from dullards, errors, lies and rumors to rationalize political demands.
Influential scientists are servants, albeit well paid servants; they are not masters. This can be seen by the fact that they write and speak only what is acceptable to their masters (or else they stop being influential). The mismatch between what everybody knows and what can be published or even mentioned gets larger with each passing year.
Real science, truth seeking science, where it exists, has been since about 1965 (probably earlier) a free-spinning cog, a group hobby - albeit perhaps a well-funded hobby - when it conflicts with the needs of policy.
Just think – whole research units, dotted round the world, headed-up by professors and assisted by armies of technicians, well-funded, publishing and discussing what is solid knowledge, disseminated in the media – yet utterly ineffectual, beyond the pale of policy.
There but not there.
Whole lives of delusion.
Occupational therapy for intellectuals.
We are living in an age where politics controls science just as much as it did in the remote ‘medieval’ past. The autonomy of science - such as it is - is a sham, in the sense that autonomy comes at the price of disarticulation from the rest of life.
If the IQ story is typical rather than exceptional, does this imply that the scientific discourse and literature is basically worthless, a fraud in its relation to human belief and behaviour? It does begin to look that way, as a generalization.
But that seems too much even to bear contemplating.
Chargaff on the loss of human pace and scale in science
Referring to his first twelve years at Columbia University, USA:
“The more than sixty regular papers published during that period dealt with a very wide field of biochemistry, as it was then understood; and a few of them may even have contributed a little to the advance of science, which, at that time, was still slow, i.e., it had human proportions.
“Nevertheless, when I look back on what I did during those twelve years, there come to mind the words ascribed to St. Thomas Aquinas: Omnia quae scripsi paleae mihi videntur. All he had written seemed to him as chaff.
“When I was young, I was required – and it was easy – to go back to the origins of our science. The bibliographies of chemical and biological papers often included reference to work done forty or fifty years earlier. One felt oneself part of a gently growing tradition, growing at a rate that the human mind could encompass, vanishing at a rate it could apprehend.
“Now, however, in our miserable scientific mass society, nearly all discoveries are born dead; papers are tokens in a power game, evanescent reflections on the screen of a spectator sport, news items that do not outlive the day on which they appeared.
“Our sciences have become forcing houses for a market that in reality does not exist, creating, with the concomitant complete break in tradition, a truly Babylonian confusion of mind and language.
“Nowadays, scientific tradition hardly reaches back for more than three or four years. The proscenium looks the same as before, but the scenery keeps on changing as in a fever dream; no sooner is one backdrop in place than it is replaced by an entirely different one.
“The only thing that experience can now teach is that it has become worthless.
“One could ask whether a fund of knowledge, such as a scientific discipline, can exist without a living tradition.
“In any event, in many areas of science which I am able to survey, this tradition has disappeared. It is, hence, no exaggeration and no coquettish humility if I conclude that the work we did thirty or forty years ago – with all the engagement that honest effort could provide – is dead and gone.”
Erwin Chargaff (1905-2002) – Heraclitean Fire, 1978.
Read, mark, learn and inwardly digest.
Note: “the advance of science (…) was still slow, i.e., it had human proportions. (…) One felt oneself part of a gently growing tradition, growing at a rate that the human mind could encompass, vanishing at a rate it could apprehend.”
*That* is the pace of real science.
“…in our miserable scientific mass society, nearly all discoveries are born dead; papers are tokens in a power game, evanescent reflections on the screen of a spectator sport, news items that do not outlive the day on which they appeared…”
But “our miserable scientific mass society” does not operate at the pace of real science, but at the pace of management – and what is more, a management suffering from ADHD. Six monthly appraisals, yearly job plans, three yearly grants and so on. (All evaluations being determined by committee and bureaucracy, rather than by individuals.)
Note: “Our sciences have become forcing houses for a market that in reality does not exist…”
Nobody really *wants* what modern science provides, there is no real market for it; which is why modern science is dishonest – from top to bottom: modern science must engage in public relations, hype, spin – lies – in order to persuade the ‘market’ that it really wants whatever stuff the ‘forcing houses’ of modern science are relentlessly churning-out.
Max Delbruck on the moral qualities of science
Max Delbruck - 1906-1981. Nobel Prize 1969.
Question (1971): "Does scientific research by itself foster high moral qualities in men?"
Delbruck's answer: "Scientific research by itself fosters one high moral quality: that you should be reasonably honest.
"This quality is in fact displayed to a remarkable extent.
"Although many of the things that you read in scientific journals are wrong, one does assume automatically that the author at least believed he was right."
(Quoted p282 in Thinking about Science: Max Delbruck and the origins of molecular biology. EP Fischer & C Lipson. 1988)
Comment: that was written in 1971, by a man who was one of the most well-connected of twentieth century scientists, a kind of godfather to molecular biology, and a person of great personal integrity.
So Delbruck was in a position to know what he was talking about.
And, in 1971, he was able to state that scientific research by itself fosters the high moral quality that you should be reasonably honest. And that this quality is *in fact* displayed to a remarkable extent. And that when reading journals scientists could and did assume that the authors were telling the truth as they saw it.
Only 40 years ago Delbruck could state that scientists were in fact, in reality, in practice - honest...
Bronowski on the habit of truth
Jacob Bronowski (1908-1974) invented the term 'the habit of truth' to describe the fundamental and distinctive ethic of science: the main foundation upon which was built the success of science, providing the means (knowledge) for mankind to shape the natural world.
Bronowski emphasized this, since it was (and is) often imagined that science is a morally neutral activity. But although scientific knowledge is indeed morally neutral (and can be used for good or evil) the practice of science (including being a scientist) is indeed a moral activity - based on the habit of truth.
He argued that for science to be truthful as a whole it is not sufficient to aim at truth as an ultimate outcome, scientists must also be habitually truthful in the ‘minute particulars’ of their scientific lives. The end does not justify the means, instead the means are indivisible from the end: scientific work is ‘of a piece, in the large and in detail; so that if we silence one scruple about our means, we infect ourselves and our ends together’.
The idea is that, to be successful in terms of the test of shaping the natural world, scientists – scientific communications – must speak the truth as it is understood. Indeed, I think it likely that the social structure of science is at root a group of people who seek truth and speak truth habitually (and if or when they cannot be truthful, they are silent).
Bronowski perceived that societies which abandoned, indeed persecuted, the habit of truth – such as, in his time, the USSR and Nazi Germany – paid the price in terms of losing their ability to perceive or generate the underlying knowledge of reality which forms the basis of shaping the natural world. (Note – these were societies which had the habit of truth in science, and then lost it.)
This declining ability to shape the natural world was concealed with propaganda, but such concealment could only be temporary since the cause of the decline was strengthened by every attempt to deny it.
Having grown up under the influence of Bronowski (for good and for ill) and also this distinctive morality of science, I have witnessed at first hand the rapid loss of the habit of truth from science: at first an encapsulated loss whereby scientists continued to be truthful with each other (that is, truthful in the sense of speaking the truth as they see it) while lying to outsiders (especially in order to get grants, promote their research, and to deflect criticism); the situation degenerating swiftly to the final surrender whereby scientists are no longer truthful even with themselves.
At the same time I have seen hype (i.e. propaganda) expand from being merely a superficial sheen added to real science in order to make it more interesting to the general public, to the present situation where hype defines reality for scientists (as well as everyone else) – where propaganda is so pervasive that nobody can know what – if anything – lies beneath it (there is, indeed, no ‘beneath’ since by now hype goes all the way through science from top to bottom).
At the end of his life, Bronowski saw this coming, in its early stages, and wrote an essay entitled The Disestablishment of Science about the need for science to be separated from the state. This was necessary, he argued, because the morality of government and the morality of science were so different.
As I understand it, Bronowski’s major distinction is between government’s focus on ‘expediency’ or direct short-term capability – which is substantially power over human behaviour by propaganda and coercion plus already-available and useable technology – and science’s indirect generation of long-term capability – which is substantially the result of greater knowledge leading to greater efficiency (more power per person).
“the hidden spring of power is knowledge; and more than this, power over our environment grows from discovery.”
Bronowski assumed that enlightened self-interest (i.e. long-termism) would be a strong force to maintain the independence and honesty of science against its erosion by short-termist government expediency.
This assumption was indeed crucial to Bronowski’s philosophy – which was atheist and utilitarian. He needed to believe that humanity needed to be and *would be* rational, sensible and far-sighted in its self-management; that humanity sought capability as a primary aim (not as an accidental by-product); and he also needed to believe in the ‘democracy of intellect’: that humanity was intrinsically unified in terms of its motivation and capability, so that science was basically comprehensible to all (or the mass of) humankind and that the primacy of the habit of truth was also a universal aspiration.
The decades have convinced me that Bronowski was factually wrong in several of his key assumptions, and this explains why the kind of rational ‘humanism’ Bronowski espoused has proven powerless to arrest the decline in the habit of truth and has indeed been a major collaborator in the erosion (the apparatus of hype and propaganda is staffed mostly by rational humanists, and justifies itself and its activities using rational humanist reasoning).
At root, as I understand it, Bronowski’s validation of science was power: the increased power it gave humanity, which was undeniable in terms of the vast and cumulative reshaping of the world which could be seen from the industrial revolution onwards.
Bronowski hoped that this power would be disciplined and moralized by the discipline and morality which itself generated the power: that is, by science. So his vision was of a society based on science becoming organized according to the morality of science, and thereby sustaining that science upon which it depended.
For Bronowski, science was therefore validated by the power it created, and power as an aim was validated by (long term) domination (i.e. in the long term the most scientific society would also be the strongest).
As an auxiliary justification of this seeking after power, Bronowski brought in an ethic that mankind’s deepest desire and ultimate destiny was the perpetual expansion of power; he claimed to see this in the shape of history (the ‘ascent of man’).
This was indeed a moral principle for Bronowski – but in order to avoid the obvious problems of tyranny and aristocracy, he also needed to believe that the conditions for generating science (and power) were intrinsically ‘democratic’ – that in the long term the diffusion of power, the perpetuation of freedom, were two sides of the same coin of society becoming scientific in its mass.
From Science as a Humanistic Discipline:
“…science as a system of knowledge could not have grown up if men did not set a value on truth, on trust, on human equality, and respect, on personal freedom to think and to challenge, [… these are the] prerequisites to the evolution of knowledge.”
My perspective is that ‘men’ did not (and do not) value these things, but scientists used to do so (although not any more). And I agree that they are indeed prerequisites to the evolution of knowledge – i.e. to real science.
Mass scientific competence and the dispersion of political power among citizenry were assumed to be linked phenomena – and mass education in science (including the morality of science) was therefore the basis of both power and freedom.
It now seems to me that Bronowski was wrong about the wellsprings of human motivation, and was engaging in wishful thinking concerning the basis of viable human societies. He grossly underestimated the intrinsically human oriented, short termist, selfish, nepotistic character of human nature; and failed to see the rarity of mathematical and scientific modes of thinking.
Far from being universal, the scientific way of thinking and the habit of truth is so rare in human history and across human societies as to look like a local and perhaps temporary ‘sport’ rather than a fundamental property of mankind.
Bronowski was also wrong about the hoped-for tendency for the desire for power intrinsically to regulate itself in a long-termist fashion, and I regard his installation of power-seeking as a primary virtue as an instance of Nietzschean moral inversion – rather than an insight.
After all, the secular scientist (or humanist), for all his virtues, is very often a prideful egotist with an insatiable lust for status; and when he subscribes to an ethic of power he will often tend to justify himself as an instrument for the betterment of the human condition.
But the past decades have certainly confirmed that Bronowski was correct about the consequences of abandoning the habit of truth. Bronowski would have been utterly appalled at the pervasive, daily, habitual dishonesty of researchers (especially the leading researchers) in the largest and most dominant world science: medical science.
And as for the Lysenkoism of Climate Science… he might have been darkly amused at the defense of pervasive, deliberate, fundamental collusion and lying on the grounds (perfectly accurate!) that this was statistically *normal* behavior in modern science.
But the world did not heed Bronowski’s warnings in The Disestablishment of Science, and the outcome of science becoming dependent on government funding has been wholly in line with his direct predictions.
As he wrote in Science as a Humanistic Discipline: “… science is cut off from its roots, and becomes a bag of tricks for the service of governments.”
“A bag of tricks for the service of governments” – what a perfect description of a major, mainstream modern ‘science’!
Micro-specialization and the infinite perpetuation of error
Scientific specialization is generally supposed to benefit the precision and validity of knowledge within specializations, but at the cost of these specializations becoming narrower, and loss of integration between specializations.
In other words, as specialization proceeds, people supposedly know more and more about less and less - the benefit being presumed to be more knowledge in each domain, the cost that nobody has a general understanding.
However, I think the supposed benefit is actually not true. People do not really know more – often they know nothing at all or everything they know is wrong because undercut by fundamental errors.
Probably the benefits of specialization really do apply to the early stages of gross specialization such as the increase of scientific career differentiation in the early 20th century - the era when there was a division of university science degrees into Physics, Chemistry and Biology - then later a further modest subdivision of each of these into two or three.
But since the 1960s scientific specialization has now gone far beyond this point, and the process is now almost wholly disadvantageous. We are now in an era of micro-specialization, with dozens of subdivisions within sciences.
Part of this is simply the low average and peak level of ability, motivation and honesty in most branches of modern science. The number of scientists has increased by more than an order of magnitude – clearly this has an effect. Scientific training and conditions have become prolonged and dull and collectivist – deterring creative and self-motivated people. And these have happened in an era when the smartest kids tended not to gravitate to science, as they did in the early 20th century, but instead to professions such as medicine and law.
However there is a more basic and insoluble problem about micro-specialization. This is that micro-specialization is about micro-validation – which can neither detect nor correct gross errors in its basic suppositions.
In my experience, this is the case for many scientific specialties:
1. Epidemiologists are fixated on statistical issues and cannot detect major errors in their presuppositions because they do not regard individual patient data as valid nor do they regard sciences such as physiology and pharmacology as relevant. Hence they do not understand why statistical knowledge cannot replace biological and medical knowledge, nor why the average of 20 000 crudely measured randomized trial patients is not a substitute for the knowledgeable and careful study of individual patients. Since epidemiology emerged as a separate specialty, it has made no significant contribution to medicine but has led to many errors and false emphases. (All this is compounded by the dominant left-wing political agenda of almost all epidemiologists.)
2. Climate change scientists are fixated on fitting computer models to retrospective data sets, and cannot recognize that retrofitted models have zero intrinsic predictive validity. The validity of a model comes from the prediction of future events, from consistency with other sciences relevant to the components of the model, and from consistency with independent data not included in the retrofitting. Mainstream climate change scientists fail to notice that complex computer modelling has been of very little predictive or analytic value in other areas of science (macroeconomics, for instance). They don't even have a coherent understanding of the key concept of global temperature – if they did have a coherent concept of global temperature, they would realize that it is a *straightforward* matter to detect changes in global temperature – since with proper controls every point on the globe would experience such changes. If the proper controls are not known, however, then global temperature simply cannot be measured; in which case climate scientists should either work out the necessary controls, or else shut-up.
3. Functional brain imaging involves the truly bizarre practice of averaging synaptic events: the temporal resolution of functional imaging methods typically averages tens to hundreds of action potentials, and the spatial resolution averages tens to hundreds of millions of synapses. There may also be multiple averaging and subtraction of repeated tasks. What this all means at the end of some billions of averaged instances is anybody's guess - almost certainly it is un-interpretable (just consider what it would mean to average *any* biological activity in this kind of fashion!). Yet this stuff is the basis for the branch of neuroscience which for three decades has been the major non-genetic branch of biological/ medical science - at the cost of who knows how many billions of pounds and man-hours. And at the end of the day, the contribution of functional brain imaging to biological science and medicine has been - roughly - none-at-all.
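The point about retrofitted models (item 2 above) can be made concrete with a deliberately crude sketch – all numbers invented for illustration, and nothing to do with any actual climate model. A curve tuned to match past data exactly can still have zero predictive validity:

```python
# Toy demonstration that a perfect retrospective fit implies nothing
# about prediction. The "observations" are invented noise around a
# flat trend (the true value is ~0 throughout).

def lagrange_fit(xs, ys):
    """Return a function passing exactly through every (x, y) point."""
    def f(x):
        total = 0.0
        for i, (xi, yi) in enumerate(zip(xs, ys)):
            term = yi
            for j, xj in enumerate(xs):
                if j != i:
                    term *= (x - xj) / (xi - xj)
            total += term
        return total
    return f

past_x = [0, 1, 2, 3, 4, 5]
past_y = [0.0, 0.9, -0.8, 1.1, -1.0, 0.2]  # noise around a true value of 0

model = lagrange_fit(past_x, past_y)

# The retrofit is perfect on the very data it was tuned to...
assert all(abs(model(x) - y) < 1e-9 for x, y in zip(past_x, past_y))

# ...yet enormously wrong one step into the "future", where the truth is ~0:
print(round(model(6), 1))  # roughly 55.6
```

The model reproduces every past point, yet its first 'prediction' is some fifty times larger than any value it was fitted to; only success on future events, or consistency with independent knowledge, would tell us anything about validity.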
In other words, in the world of micro-specialization each specialist’s attention is focused on technical minutiae and the application of conventional proxy measures and operational definitions. These agreed practices are used in micro-specialities for no better reason than that 'everybody else' does the same and (lacking any real validity to their activities) there must be some kind of arbitrary ‘standard’ against which people are judged. ('Everybody else' here means the Big Science researchers who dominate peer review – appointments, promotions, grants, publications etc. – in that micro-speciality.)
Micro-specialists cannot even understand what has happened when there are fatal objections and comprehensive refutations of their standard paradigms which originate from adjacent areas of science.
In a nutshell, micro-specialization allows a situation to develop where the whole of a vast area of science is bogus; and for this reality to be intrinsically and permanently invisible and incomprehensible to the participants in that science.
If we then combine this fact with the notion that only micro-specialists are competent to evaluate the domain of their micro-speciality - then we have a situation of intractable error.
Which situation is precisely what we do have. Vast scientific enterprises have consumed vast resources without yielding any substantive progress, and the phenomenon continues for time-spans of several human generations, and there is no end in sight (short of the collapse of science-as-a-whole).
According to the analysts of classical science, science was supposed to be uniquely self-correcting - in practice, now, thanks in part to micro-specialization, it is not self-correcting at all. Either what we call science nowadays is not 'real science' or else real science has mutated into something which is a mechanism for the infinite perpetuation of error.
Reflections on Human Accomplishment and genius
Charles Murray's book-length quantitative analysis, Human Accomplishment, made a big impact on me. I 'brooded' over it for quite a while, especially the summaries and speculations concerned with the possible cultural causes of 'genius'.
I have always been interested in 'genius', and have read many biographies of geniuses in a range of endeavors.
On the whole, I subscribe to the Great Man theory by which specific individuals shape the course of history - some of these individuals do exceptional damage, others are exceptionally creative, while of course some do both.
And therefore I regard the ability of a society to produce potential Great Men and embody the conditions they need to make a difference, as a major influence on it - and this ability has been very unevenly distributed between societies across space and time.
For example, the modern world - the kind of society characterized by growth in scientific knowledge, technological capability and economic production, which took off in Great Britain and became apparent in the late seventeenth and early eighteenth centuries - and spread from there; this kind of society I believe was probably driven by the work of numerous Great Men (or 'geniuses') who produced qualitative advances (or 'breakthroughs') across a wide range of human activities.
I believe that these numerous breakthroughs required Great Men (i.e. an adequate supply of GM was necessary but not sufficient), but once made could be exploited by ordinary men.
Most histories of society took the form of a series of mini-biographies. For example, Jacob Bronowski's Ascent of Man TV series was, for most of its length, focused on a series of specific individuals. And the implication was that this was not just a convenience for the purposes of teaching and entertainment, but an account of how things really, necessarily happened.
Up to the 1950s it was obvious for Britons to focus on Great Men, since they were living among us, and each new generation brought forth a fresh supply - so many, indeed, that only a sample could become household names.
Then, from the mid 1960s into the second half of the century, people began to notice that the supply of GM seemed to be drying up. This went along with various fashions for denying the importance of GM in human history, and attributing change to other forces (such as class). And for human affairs increasingly to be organized bureaucratically, in ways that implicitly denied the need for GM, and indeed sought to replace human creativity and genius with explicit and predictable procedure.
By the mid 1980s I noticed that the last real English poets were dying (Larkin and Betjeman; perhaps also Graves and Bunting – take your choice) and that there was nobody to replace them. For the first time in several centuries, there was not one single living English poet of real stature. Amazing.
Looking around, the same situation was looming in science - and by now there are just a few elderly remnants of previous generations who might be regarded as geniuses. Medical breakthroughs also began drying up at about this time (although there have not been many major medical geniuses, according to Murray’s lists).
So apparently the age of genius is over for Britain, which probably means the age of progress via multiple breakthroughs is over; and the same situation seems to prevail everywhere else - so far as I can tell. If genius was the driver of the modern world, this means that the modern world is also over (unless you believe that genius has now effectively been replaced by bureaucracy – Ha!).
Whatever it was that created the supply of geniuses and the conditions in which they could make breakthroughs has changed. I do not know whether there are still *potential* geniuses being born, but the whole motivational structure of society is hostile to genius and it is likely that individuals who would have grown to be potential Great Men in an earlier phase of society, nowadays have their motivation and sense of self poisoned. Instead of trying to achieve great things, such people would now probably pursue a great career, or would simply find themselves fish out of water.
I find myself ambivalent about this. Of course I vastly prefer a society conducive to genius to one being destroyed by bureaucracy. And if human history is conceptualized – as Bronowski conceptualized it – in terms of a story of progressively increasing power to shape nature (by increasing understanding of its underlying structure) then the prospect of a massive decline in human power is dismaying. It is also dismaying from the perspective of mass human happiness – the prospect of mass violence, displacement, starvation, disease etc.
Yet, realistically, modernity was not planned, nor were the modernizing societies (such as late Medieval and early Modern England) in any sense designed to nurture, or provide opportunities for, genius. The whole thing was an unplanned by-product and the age of genius was accidental.
Indeed, it was transitional, never stable, containing the seeds of its own destruction – like so many things. The geniuses were usually transitional figures who – over the course of their own lives – rejected the religious and traditional societies into which they were born. In their own lives they sometimes combined the strengths of the traditional society of the past and the progressive society which was emerging (as a result, partly, of their own work).
Yet of course the transitional phase is necessarily temporary, evanescent, cannot be sustained – and the generations of fully modern people, who are born into the world created by the geniuses – are one-eyed, feeble, and lack the source of strength of traditionalism. They (we) are post modern hedonists, for the most part – consumers, not creators.
I now tend to regard modernity as a temporary aberration from the course of human history. It arose from an accidental conjunction of genetics and society; and the effect of genius was to destroy the very genetics and society which had produced it. Whether it would have been good to sustain modernity (and reform it – because its vices are intolerable and have grown exponentially) I don’t know for sure.
But I do know that we have not even tried to do this: modernity has not even tried to sustain itself, but has instead parasitically exploited the heritage of genius; so the question is now unanswerable empirically.
I now see human choice, or at least our destiny, as lying among traditional societies – a choice between the kinds of human societies which existed roughly between AD 500 and AD 1500.
The expectation of growth in scientific knowledge
We have become used to growth in scientific knowledge, and expect growth in scientific knowledge. This expectation at first shaped reality, then became reality, and eventually displaced reality. The link between expectation and actuality was broken and the world of assumptions took over.
The expectation that scientific knowledge will grow almost inevitably (given adequate 'inputs' of personnel and funding) is epitomized by the professionalization of scientific research (making scientific research a job) and the expectation of regular and frequent scientific publication – the expectation of regular and frequent publication would only make sense if it was assumed that scientific knowledge was accumulating in a predictable fashion.
We nowadays expect a growth in the number of scientific publications over time, and a growth in the totality of citations – these are fuelled by increases in the numbers of professional scientists and of journals for publishing science. We assume that there is an infinite amount of useful and new science waiting to be discovered and an infinite pool of people capable of making discoveries.
The economist Paul Romer – and many others – have built this into theories of the modern economy: they argue that continued growth in science and technology fuels continual improvement in productivity (economic output per person) and therefore growth in the economy. And this is kept going by increasing the investment in science and technology. The idea is that we are continually getting better at scientific discovery, and investing more in scientific discovery, therefore modern society can continue to grow. (Yes, I know it doesn’t make sense, but...)
But how would we really know whether science was growing? I mean, who is really in a position to know this?
Who could evaluate whether change in science and increased amounts of self-styled scientific *stuff* actually correspond to more and better science?
When – as now – scientific growth is expected, and when society acts-upon the expectation, we have an overwhelming *assumption* of growth in science, an assumption that science *is* growing – but that says nothing about whether there really is growth.
Because when people assume science is growing and when they think they perceive that science is growing, this creates vast possibilities for dishonesty, hype and spin. Because people expect science to grow, for there to be regular breakthroughs, they will believe it when regular breakthroughs are claimed (whether or not breakthroughs have actually happened).
But how if there is really no growth in scientific knowledge? Or, how if the real growth is less than the assumed growth? How if there is actual decline in real scientific knowledge – how would we know?
Science – as a social system – resembles the economy. In the credit crunch of 2008 it was realized that the economy had not really been growing, but what we were seeing was actually some mixture of increasing inflation, increasing borrowing, and rampant dishonesty from many directions. (It is the rampant dishonesty that has prevented this from being understood – and this tactical dishonesty is itself no accident.)
So we discovered that we were not really getting richer, but we were living off ever more credit, and the value of money was less than we thought; and we (or at least I) discovered that we could not trust anybody to tell us anything about what was going on or why it had happened. They were not even trying to discover the truth, they were trying to build their careers (politicians, economists, journalists – all careerists). (To be fair, most of them are explicitly nihilists who do not believe in the truth – so why should we expect them to tell it?)
Truth about the credit crunch was something we amateurs needed to work out for ourselves, as best we could.
I believe that science is in the same bad state as the economy, but probably even worse.
In science, what masquerades as growth in knowledge (to an extent which is unknown, and indeed unknowable except in retrospect) is not growth in knowledge but merely an expansion of *stuff*, changes in the methods of counting, and so on.
Almost nobody in science is now trying to discover the truth, and most are embarrassed even to talk about the subject. Not surprising that they are embarrassed!
For instance, virtually every influential scientific article is now hyped to a variable but vast extent (the honest ones are buried and ignored).
Multiple counting is rife: progress is claimed when a grant is applied for, again when the grant is awarded, and even while the work is still happening – since scientific progress is assumed to be predictable, a mere function of resources, capital and manpower. Credit for a scientific publication is counted for all of its (many) authors, for all the many forms in which the same stuff is published and republished, for the department and also for the university where it was done, for the granting agency which provided the funds, and for the journal where it was published – everyone grabs a slice of the ‘glory’.
Credit is given for the mere act of a ‘peer reviewed’ publication regardless of whether the stuff is true and useful – or false and harmful.
Thus the signal of real science is swamped utterly by the noise of hype.
Let us suppose that doing science is actually much *harder* than people assume; much harder and much less predictable.
Suppose that most competent and hardworking real scientists actually make no indispensable contribution to science – but merely *incremental* improvements or refinements in methods, in the precision of measurements and in the expression of theories. And if they personally had not done this work, its absence would have slightly slowed, but would not have prevented, progress – or somebody else would have done it.
If science is really *hard*, then this fact is incompatible with the professionalization of science – with the idea of scientific research as a career. Since real scientific progress is irregular and infrequent, science could only be done in an amateur way; maybe as a sideline from some other profession like teaching, practicing medicine, or being a priest.
Professional science would then be intrinsically phoney, and the phoniness would increase as the professionalization of science advanced and became more precisely measured, and as the profession of science expanded – until it reached a situation where the visible products of science (the *stuff*) bore no relationship to the reality of science.
Professional scientists would produce stuff (like scientific publications) regularly and frequently, but this stuff would have nothing to do with real science.
Or, more exactly: the amount of stuff produced by the growing numbers of professional science careerists (whose use of hype would also be growing) would be so much greater than the amount of real science that the real science would be obscured utterly.
This is precisely what we have.
The observation of growth in scientific knowledge became an expectation of growth in science and finally an assumption of growth in science.
And when it was assumed that science was growing, it did not really need to grow, because the assumption framed the reality.
But if science is as hard as I believe it is; then scientific progress cannot be taken for granted, cannot be expected or assumed.
Our society depends on scientific progress – when scientific progress stops, our society will collapse. Yet so great is our societal arrogance that we do not regard science as something real. Instead science is the subject of wishful thinking and propaganda.
Science is a way of getting at certain kinds of truth, but the way that science works is dependent on honesty and integrity. Our societal arrogance is such that we believe that we can have the advantages of real science but at the same time subvert the honesty and integrity of science whenever that happens to be expedient.
Our societal arrogance is that we are in control of this dishonesty – that the amount of hype and spin we apply is under our control and can be reversed at will, or we can separate the signal from the noise, and re-calculate the reality of science. This is equivalent to the Weimar Republic assuming that inflation was under control when prices and wages were rising unpredictably by the hour.
But we cannot do this for the economy and we cannot do it for science. In fact we have no idea of the real situation in either science or the economy, except that in a universe tending towards entropy we must assume that the noise will tend to grow and swamp the signal. The Western economy was apparently growing but in reality it was increased inflation and borrowing and deception; science has appeared to be growing but the reality is increasing hype, spin and dishonesty. The link between stuff and substance has disappeared.
When the signals of economics and science (money and ‘publications’ and other communications) lose their meaning, when the meaning is detached from underlying reality, then there is no limit to the mismatch.
The economy was collapsing while the economic indicators improved; and science can be collapsing while professional science is booming.
But if science is very difficult and unpredictable, and if the amount of science cannot be indefinitely expanded by increasing the input of personnel and funding, then perhaps the amount of real science has not increased *at all* and the vast expansion of scientific-stuff is not science.
If so, then the amount of real science (intermittent, infrequent, unpredictable) has surely not even stayed constant but will have actually declined, due to the hostile environment. At the very least, real science will be de facto un-findable, since the signal is drowned by ever-increasing levels of noise.
So, the economy was a bubble, and science is a bubble, and bubbles always burst; and the longer the burst is delayed, the bigger the bubble will become (the more air, the less substance), and the bigger will be the collapse.
When the economic bubble burst, the economy was much smaller than previously realized - but of course the economy was still enormous. In effect, the economy was set back several years.
But when the scientific bubble bursts, what will be left over after the explosion? Maybe only the old science will prove valuable – science from an era when most scientists were at least honest and trying to discover the truth about the natural world.
And, in an era of mindless technical specialization, will there be enough scientists even to understand what was left over?
At the very least, science would be set back by several decades and not just by a few years. But it could be even worse than that.
Real science was built on a search for truth that was cooperative and competitive at the same time. Popper emphasized the mixture of hypothesis and testing, conjecture and refutation, testing for consistency and predictive ability, and the discarding of error.
Bronowski emphasized the need for tolerance of honest error, and that contributors to the scientific process should be respected even after their views have been refuted. Otherwise, scientists would not risk being accused/ convicted of being wrong and so would never challenge consensus; and consensus would never yield to refutation. (Which is precisely descriptive of the situation in mainstream medical research.)
So we still respect Isaac Newton despite him having been superseded by Einstein; and Newton is not usually mocked or derided for having not been correct for all time.
But this balance has been difficult for many scientists, and even more difficult for those outside of science. Lamarck ranks all-time third in importance as a biologist according to Charles Murray's method in Human Accomplishment - behind Darwin and Aristotle - but Lamarck’s views on evolution are often (it seems to me) treated as a joke.
Of course, ignorant disrespect is part of growing-up. But although it has to be tolerated in teenagers, it is not an admirable trait; being a result of pride and aggression fuelled by insecurity.
Adolescents love to hate, and there are an awful lot of adolescents interested in science and working in and around science, in journalism and as pundits - many of them adolescents of advanced middle age.
Adolescents also form gangs, and gangs assert their status by seeking and beating-up victims (the victims of course ‘deserve’ this – for being who they are).
There is an awful lot of ignorant disrespect in science nowadays, and an awful lot of gangsterism. Real science used to be all about individuals – it really did! – but now science is all about gangs.
The reason for so much ignorant disrespect in science is mostly that there is so much ignorance, due to the abundance of low-quality people and their micro-specialized perspective. Such people have no concept of excellence higher than the standard, prevailing technical practices of their micro-discipline; anyone who does not adhere to these prevailing practices is regarded as either incompetent or wicked - hence despicable, hence deserving of punishment. They deserve ‘what is coming to them’ – in gang parlance.
There is always disagreement in science, but the basis of real science was that scientific disagreement was conducted by scientific means. What is *not* acceptable to real science is that scientific disputes should be settled by non-scientific methods.
Scientists must be allowed to make mistakes, to be wrong, or science cannot function.
This is necessary because in the first place they may not really have made a mistake and they may be right (or partly right) – but this may not be apparent for a while. Mainstream science may be in error, but this situation may be recoverable if dissent is tolerated.
However, in a system of real science, mistakes are tolerated only when they are *honest* mistakes – lying and deceptions are absolutely forbidden in real science; and will lead to exclusion from the community of real scientists. And incompetent errors are simply a waste of everybody’s time. So dishonesty and incompetence are rightly sanctioned by leading to a scientist’s work being disregarded by others in the field as probably unreliable or unsound.
This is why the dishonest thugs of modern pseudo-science always try to portray dissent and disagreement as a result of incompetence or dishonesty.
The gangsters of pseudo-science cannot acknowledge even the *possibility* of an honest and competent scientist reaching a different conclusion from the one they themselves support. This is because the gangsters are transparently looking for an excuse to attack and to coerce; after all, gangsters need to make public displays of their power, or else they would soon lose it.
Gang-leaders need to beat-up dissenters, and they need people to know that this is happening, and they need these dissenters to be portrayed as deserving victims of attack.
Consequently the whole concept of honest and competent disagreement has been banished from modern bureaucratic pseudo-science.
In the world of bureaucratic pseudo-science there are only two kinds of view – the correct view which is defined and enforced by the peer review cartel; and wrong views which are held by those either too stupid to understand, or those corrupted by evil.
Lysenko was a scientific gangster in the Soviet Union – Stalin’s henchman - http://en.wikipedia.org/wiki/Trofim_Lysenko. His scientific sin was to suppress scientific opposition using non-scientific means; up to and including jail and death for his opponents. The justification for Lysenko’s use of coercion to suppress and silence dissent was that the opponents’ opposition was harmful to people, caused millions of deaths, etc.
Modern science is just a couple of small steps away from full-blown Lysenkoism. Scientific opposition is suppressed using non-scientific means ranging from defunding, exclusion of publications and other blocks on dissemination, public humiliation, sacking and other legal threats. In many areas of science gangsterism is rife, with intimidation and threats and the administration of media ‘beatings’.
What does it mean? Many would regard the situation as regrettable – but it is much worse than regrettable. It is conclusive evidence of non-science.
A field in which the use of non-scientific modes of argument is rife is simply *not a science*. Not a science at all. It does not work. Gangsterism is incompatible with science.
For example, ‘climate science’ is not a science *at all*; as a field it does not function as a real science, it uses intimidation and coercion as a matter of routine. Therefore nothing in it can be trusted, the good and bad cannot be discriminated.
To clarify - because in general terms climate science does not settle disputes using scientific methods, but by using extra-scientific methods, therefore it is not a real science, but actually is whatever the main influence on its content happens to be: politics, mostly.
The main innovation of ‘climate science’ has been to legitimate the mass use of the hate-term ‘denialism’ to signal who ‘deserves’ a punishment-beating from the gang.
Let us call the phenomenon of labelling and beating up ‘denialists’ by the name of ‘anti-denialism’.
Anti-denialism is no accident, nor is it eradicable without massive reform because anti-denialism is functionally necessary for the cohesion of modern bureaucratic pseudo-science. Without victims to gang-up on, the gangs would fall apart into warring sects. They would fight each other because these gangs are composed of ignorant, egotistical, power-worshipping adolescents. What best holds such people together is pure hatred, and pure hatred needs victims.
With the phenomenon of anti-denialism rife in mainstream discourse, we are just a couple of small steps away from full-blown Lysenkoism. We already have systematic intimidation of scientific opposition at every level short of the physical. But I have seen demands from the gangsters of science that the sanctions against denialists be escalated. Destroying livelihoods is no longer enough. Soon, perhaps very soon, unless the tide turns, we may be seeing scientists jailed for their views.
Since honest and competent dissent is not recognized, anyone who disagrees with the peer review cartel is either labeled as too stupid to understand or as competent but wicked. It is the competent dissenters who are most at risk under Lysenkoism, since disagreement with the mainstream coming from a competent scientist marks them out as evil and deserving of punishment.
Anti-denialism needs high profile victims. Lysenkoism needed to punish top scientists like Vavilov, who died following three years in prison http://en.wikipedia.org/wiki/Nikolai_Vavilov.
On present trends we may expect to see prominent denialists and dissenters jailed for being ‘wrong’ (as judged by peer review), jailed for the public good, jailed ‘to save ‘millions of lives’ – but in reality jailed for opposition to the ruling gangsters of bureaucratic pseudo-science, and because anti-denialists absolutely require a continuous supply of victims to beat-up on.
A future for science?
What will happen to science?
When I think about future civilizations I often find myself spontaneously worrying about the future of science - will it survive, can it be revived?
Yet my worry is irrational - because science is not and never has been a part of the human condition: it is merely a localized and time-bound phenomenon - at least, 'science' in the sense of a separate and recognizable system or institutional structure has been historically and geographically unusual.
Indeed, I suspect that science is in reality much, much more limited in its reality than in appearance. Most science is, after all, a kind of Laboratoire Garnier simulacrum of science: I mean like those TV perfume adverts of bespectacled, serious-looking people with white coats and clipboards, who wander through a white room of test tubes and retorts, ticking boxes...
It is also possible that science is actually the upper end of, and is a very unusual combination of, a distribution of general intelligence, creativity and motivation which differs widely between different societies of different sizes; such that a critical mass of such rare people has never been likely, and in fact has near-zero probability in most places at most times throughout history.
If so, real science is very seldom going to be common - even under ideal conditions, which are unlikely to emerge and even less likely to be sustained; and we should be careful not to confuse this with first a professionalization then later routinization of the external (but not core) features of real science.
A future society either will, or will not, have science as a recognizable activity - but there is not much (apparently) that we could (or should?) do about this: real science (when it really existed) was essentially a by-product, not a planned-product.
Doing science after the death of real science
Science is now, basically, dead. (My direct experience of science is inevitably partial - but the same mechanisms seem to have been at work everywhere; even outside of medicine, in the humanities, some of which I know reasonably well - and the social sciences were essentially corrupt from the word go.)
What we think of as science is now merely a branch of the bureaucracy. It would, indeed it does, function perfectly well without doing any useful and valid science at all.
Indeed, modern professional science functions perfectly well while, in fact, *destroying* useful and valid science and replacing it with either rubbish or actively harmful stuff (this is very clear in psychiatry).
I find that I now cannot trust the medical research literature _at all_. I trust a few individuals, but I do not trust journals, not fields, not funding agencies, not scholarly societies (like the Royal Society, universities, or the NAS), not citations, not prizes (Nobel etc) - in my opinion, none of these are trustworthy indices of scientific validity - not even 'on average'.
The system is *so* corrupt that finding useful and valid science (and, of course, there is some) is like finding a needle in a haystack.
The vast bulk of published work is either hyped triviality (which is time wasting at best), or dishonest in a range of ways from actual invention of data down to deliberately selective publication, or else incompetent in the worst sense - the sense that the researchers lack knowledge, experience and even interest in the problems which they are pretending to solve.
So, what should a person do who wants to do real science in an area? - if (as I think is probably the case) they need to _ignore_ the mainstream published literature as doing more harm than good.
Essentially it is a matter of going back to pre-professional science, and trying to recreate trust-based interpersonal networks ('invisible colleges') of truthful, dedicated amateurs; and accepting that the pace of science will be *slow*.
I've been reading Erwin Chargaff lately, and he made clear that the pace of science really is slow. I mean with significant increments coming at gaps of several years - something like one step a decade or so, if you are lucky. And if 'science' seems fast, then that is because it is not science!
This is why science is destroyed by professionalism and its vast expansion - there are too few steps of progress, and too few people ever make these steps. Most 'scientists' (nowadays in excess of 99 percent of them) - if judged correctly - are complete and utter failures, or indeed saboteurs!
So science inevitably ought to be done as a serious hobby/ pastime, paid for by some other economic activity (which has usually been teaching, but was often medicine up to the early 20th century, and before that the priesthood).
Why should anyone take any notice of these putative small and self-selected groups of hobby-scientists? Well, presumably if they produce useful results (useful as judged by common sense criteria - like relieving pain or reversing the predictable natural history of a disease), and if the members of the group are honest and trustworthy. But whether this will happen depends on much else - their work may be swamped by public relations.
So, groups of practitioners are best able to function as amateur scientists, since they can implement their own findings, with a chance that their effectiveness might be noticed. And in the past groups of practicing physicians would function as the scientists for their area of interest.
This seems the best model I can think of for those wanting to do science. But science is intrinsically a social activity, not an individual activity. So if you cannot find or create a group that you can trust (and whose competence you trust) - then you cannot do real science.
Master and ‘prentice
Probably the most important bit of work I did as a scientist was the malaise theory of depression - http://www.hedweb.com/bgcharlton/depression.html .
I worked on this intermittently for nearly 20 years from about 1980 when I first began to study psychiatry. My motivation was trying to understand how a mood state could apparently be cured by medication.
From what I was being told, it seemed as if 'antidepressants' were supposed to normalize mood while leaving the rest of the mind unchanged. Of course, this isn't really true, but that was what I was trying to understand initially. Or, to put it another way, I was trying to understand what was 'an antidepressant' - since none of the standard explanations made any sense at all.
So, how did I 'solve' this problem (solve at least to my own satisfaction, that is!). Part of it was 'phenomenology' (i.e. mental self observation) - especially to observe my own mood states in response to various illnesses (such as colds and flu) and in response to various medications (including some which I was taking for migraine).
But the best answer is that I did not really solve it myself, but only when I had subordinated my investigations to the work of two great scientists - two 'Masters': the Irish psychiatrist David Healy and the Portuguese neuroscientist Antonio R Damasio.
This apprenticeship was almost entirely via the written word - and involved reading, thinking-about and re-reading (and thinking about and re-re-reading some more) key passages from key works of these two scientists. That is, accepting these men as my mentors and discerning guides to the vast (and mostly wrong) literature of Psychiatry and Neuroscience.
The lesson that I draw from my experience is that real science (which is both rare and slow) is done and passed-on by small social groups comprising a handful of great scientists and a still-small but somewhat larger number of 'disciples' who learn and apply their insights and serve to amplify their impact.
But even the great scientists have themselves mostly served as apprentices to other great scientists (as has often been documented - e.g. by Harriet Zuckerman in Scientific Elite).
So, when thinking about the social structure of real science, it would seem that real scientific work is done (slowly, over a time frame of a few decades) by small groups that are driven by Masters who make the breakthroughs; plus a larger number of 'prentices who learn discernment from the Masters ('discernment' - i.e. the correct making of evaluations - being probably more important than techniques or specific information).
But disciples by themselves are not capable of making breakthroughs - only of incremental extensions or combinations of the Masters' work.
And it is best if the Master can be primarily responsible for training the next generation Master/s to carry on the baton of real science. Disciples can - at best - only train-up more 'prentices with the humility to seek and serve a Master.
Medieval science re-discovered
Although I was 'brought up' on the religion of science, and retain great respect for philosophers such as Jacob Bronowski and Karl Popper, and for sociologists such as Robert K. Merton, David L Hull and John Ziman; I have come to believe that this 'classic' science (the kind which prevailed from the mid-19th - mid-20th century in the UK and Western Europe) - in other words that activity which these authors described and analyzed - was 'merely' a transitional state.
In short: classic science was highly successful, but contained the seeds of its own destruction because the very processes that led to classic science would, when continued (and they could not be stopped) also destroy it.
(As so often, that which is beneficial in the short term is fatal in the longer term.)
Specifically, this transitional state of classic science was an early phase of professional science, which came between what might be called medieval science and modern science (which is not real science at all - but merely a generic bureaucratic organization which happened to have evolved from classic science).
But classic Mertonian/ Popperian science was never stable, never a steady state, and never reproduced itself: each generation of scientists had a distinctly different experience from the generation before, due to progressively increasing growth, specialization and professionalization/ bureaucratization.
And indeed classic science was not the kind of science which led to the industrial revolution and the ‘modern world’; the modern world was a consequence of causes which came before modernity - it is a consequence of medieval science. So, the pre-modern forms of science were real science, and had real consequences.
What I mean is that medieval science was an activity which was so diffuse and disorganized that we do not even recognize it as science – yet it was this kind of science which led to the process of societal transformation that is only recognized by historians as becoming visible from the 17th century (e.g. the founding of the Royal Society in 1660). But the emergence of classic science was merely the point at which change became so visible that it could not be ignored.
It is therefore possible that, since the beginning of modernity, science has been unravelling even as it expanded – i.e. that the processes of growth, specialization and professionalization/ bureaucratization were also subverting science, until the point (which we have now passed) where the damage due to these processes outstripped the benefit.
This is good news, I think.
Much of the elaborate and expensive paraphernalia of science - which we mistakenly perceive to be vital – may in fact be mostly a late and parasitic development. Effective science can be, has been, much simpler and cheaper.
When we consider science as including the medieval era prior to classic science, then it becomes clear that there is no distinctive methodology of science. Looking across the span of centuries it looks like the process of doing science cannot really be defined more specifically than saying that it is a social and multigenerational activity characterized by truth-seeking.
I would further suggest that science is usually attempting to solve a problem – or to find a better solution to a problem than the existing one (problem solving per se is not science, but science is a kind of problem solving).
The main physical (rather than political) constraint on science in the past was probably the slowness and unreliability of communication and records. This made science extremely slow to advance. Nonetheless, these advances were significant and led to the modern world.
The encouraging interpretation is therefore that even when modern professional ‘science’ collapses a new version of medieval science should easily be able to replace it, because of the already-available improvements in the speed, accuracy and durability of communications.
In other words, a re-animated ‘medieval science’ (amateur, unspecialized, individualistic, self-organized) plus modern communications probably equals something pretty good, by world historical standards - probably not so good as the brilliant but unstable phase of 'classic science', but better than 'modern science'.
Science is about coherence not testing
Until recently I usually described science as being mostly a matter of devising theories which had implications, and these implications should be tested by observation or experiment.
In other words, science was about making and testing predictions.
Of course there is more which needs to be said: the predictions must derive from theory, and the predicted state should be sufficiently complex, so as to be unlikely to happen by chance.
But it is now clear that this sequence doesn’t happen much nowadays, if it ever did. And that there are weaknesses about the conceptualization of science as mostly a matter of testing predictions.
The main problem is that when science becomes big, as now, the social processes of science (i.e. peer review) come to control all aspects of science, including defining what counts as a test of a prediction.
This is most obvious in medical research involving drugs. A loosely-defined multi-symptom syndrome is created and a drug or other intervention is tested. The prediction is that the drug/ intervention ‘works’ or works better than another rival, and the test of prediction involves multiple measures of symptoms and signs. Within a couple of years the loosely defined syndrome is being diagnosed everywhere.
Yet the problem is not at the level of testing, since really there is nothing to test – most ‘diagnoses’ are such loose bundles that their definition makes no strong predictions. The problem is at the level of coherence.
Science is a daughter of philosophy, and like philosophy, the basic ‘test’ of science is coherence. Statements in science ought to cohere with other statements in science, and this ought to be checked. Testing ‘predictions’ by observation and experiment is actually merely one type of checking for coherence, since ‘predictions’ are (properly) not to do with time but with logic.
Testing in science ought *not* to focus on predictions such as ‘I predict now that x will happen under y circumstances in the future’ – but instead the focus should be – much more simply – on checking that the statements of science cohere in a logical fashion.
It is an axiom that all true scientific statements are consistent with all other true scientific statements. True statements should not contradict one another; they should cohere.
When there is no coherence between two scientific propositions (theories, 'facts' or whatever), and the reasoning is sound, then one or both propositions are wrong.
Scientific progress is the process of making and learning about propositions. A new proposition that is not coherent with a bunch of existing propositions may be true, and all or some of the existing propositions may be false – indeed that is the meaning of a paradigm shift or revolutionary science: when new incoherent propositions succeed in overturning a bunch of old propositions, and establishing a new network of coherent propositions.
(This is always a work in progress, and at any moment there is considerable incoherence in science which is being sorted-out. The fatal flaw in modern science is that there is no sorting-out. Incoherence is ignored, propositions are merely piled loosely together; or incoherence is avoided rather than sorted-out, and leads to micro-specialization and the creation of isolated little worlds in which there is no incoherence.)
Using this very basic requirement, it is obvious that much of modern science is incoherent, in the sense that there is no coherence between the specialties of science – specialties of science are not checked against each other. Indeed, there is a big literature in the philosophy of science which purports to prove that different types of science are incommensurable, incomparable, and independent.
If this were true, then science as a whole would not add-up – and all the different micro-specialties would not be contributing to anything greater than themselves.
Of course this is true of modern medical science and biology. For example ‘neuroscience’ does not add up to anything like ‘understanding’ – it is merely a collection of hundreds of autonomous micro-specialties about nervous tissue. This, then that, then something else.
These micro-specialties were not checked for consistency with each other and as a consequence they are not consistent with each other. Neuroscience was not conducted with an aim of creating a coherent body of knowledge, and as a result it is not a coherent body of knowledge.
‘Neuroscience’, as a concept (although it is not even a concept) is merely an excuse for irrelevance.
It is not a matter of whether the micro-specialties in modern science are correct observations (in fact they are nowadays quite likely to be dishonest); the point is that isolated observations – even if honest – are worthless. Isolated specialties are worthless.
It is only when observations and specialties are linked with others (using theories) that consistency can be checked, that understanding might arise - and then ‘predictions’ can potentially emerge.
Checking science for its coherence includes testing predictions, and maximizes both the usefulness and testability of science; but a science based purely on testing predictions (and ignoring coherence) will become both incoherent and trivial.
Wanting to know the truth does not mean that you will find it; but if scientists are not even *trying* to discover the truth – then the truth will not be discovered.
That is the current situation.
'Truth' can be defined as 'underlying reality’. Science is not the only way of discovering truth (for example, philosophy is also about discovering truth - science being in its origin a sub-class of philosophy) - but unless an activity is trying to discover underlying reality, then it certainly cannot be science.
But what motivates someone to want to discover the truth about something?
The great scientists are all very strongly motivated to ‘want to know’ – and this drives them to put in great efforts, and keeps them at their task for decades, in many instances. Why they should be interested in one thing rather than another remains a mystery – but what is clear is that this interest cannot be dictated but arises from within.
Crick commented that you should research that which you gossip about, Watson commented that you should avoid subjects which bore you - http://medicalhypotheses.blogspot.com/2007/12/gossip-test-boredom-principle.html - their point being that science is so difficult, that when motivation is deficient then problems will not get solved. Motivation needs all the help it can get.
In a recent article Seth Roberts makes the important point that one motivation to discover something useful in medicine is when you yourself suffer from a problem -
Seth does self-experimentation on problems which he suffers - such as early morning awakening, or putting on too much weight (he is most famous for the Shangri-La diet). He has made several probable breakthroughs working alone and over a relatively short period; and one of the reasons is probably that he really wanted answers, and was not satisfied with answers unless they really made a significant difference.
By contrast, 95 percent (at least!) of professional scientists are not interested in the truth but are doing science for quite other reasons to do with 'career' - things like money, status, security, sociability, lifestyle, fame, to attract women or whatever.
The assumption in modern science is that professional researchers should properly be motivated by career incentives such as appointments, pay and promotion – and not by their intrinsic interest in a problem – certainly not by having a personal stake in finding an answer – such as being a sufferer. Indeed, such factors are portrayed as introducing bias/ damaging impartiality. The modern scientist is supposed to be a docile and obedient bureaucrat – switching ‘interests’ and tasks as required by the changing (or unchanging) imperatives of funding, the fashions of research and the orders of his master.
What determines a modern scientist’s choice of topic, of problem? Essentially it is peer review – the modern scientist is supposed to do whatever work the cartel of peer-review-dominating scientists decides he should do.
This will almost certainly involve working as a team member for one or more of the peer review cartel scientists; doing some kind of allocated micro-specialized task of no meaning or intrinsic interest – but one which contributes to the overall project being managed by the peer review cartel member. Of course the funders and grant awarders have the major role in what science gets done, but nowadays the allocation of funding has long since been captured by the peer review cartel.
Most importantly, the peer review cartel has captured the ability to define success in solving scientific problems: they simply agree that the problem has been solved! Since peer review is now regarded as the gold standard of science, if the peer review cartel announces that a problem has been solved, then that problem has been solved.
(This is euphemistically termed hype or spin.)
To what does the modern scientist aspire? He aspires to become a member of the peer review cartel. In other words, he aspires to become a bureaucrat, a manager, a ‘politician’.
Is the peer review cartel member a scientist as well? Sometimes (not always) he used-to be – but the answer is essentially: no. Because being a modern high level bureaucrat, manager or politician is incompatible with truthfulness, and dishonesty is incompatible with science.
The good news is that when real science is restored (let’s be optimistic!) then its practitioners will again be motivated to discover true and useful things – because science will no longer be a career, it will be colonized almost-exclusively by those with a genuine interest in finding real world answers.
A scientist needs to want to understand reality - this entails believing in reality, and that one ought to be truthful about it.
The belief in reality is a necessary metaphysical belief, which cannot be denied without contradiction – nonetheless, in modern elite culture it is frequently denied (this is called nihilism), which is why modern elite culture is irrational, self-contradictory (and self-destroying).
But obviously, a real scientist cannot be a nihilist - whatever cynical or trendy things he might say or do in public, in his heart he must have a transcendental belief in reality.
Science also involves a metaphysical belief (i.e. a necessary assumption, not itself part of science) in the understandability of nature and the human capacity to understand. Without this belief, science becomes an absurd and impossible attempt to find the one truth among an infinite number of erroneous possibilities.
Nonetheless, in modern elite culture, a belief in the understandability of nature and human capacity is routinely denied - another aspect of nihilism. Among many other consequences, this denial destroys the science which makes possible modern elite culture.
Explaining reality is a second step which may follow understanding, but explaining needs to be preceded by the desire to understand - again because there are an infinite number of possible explanations, none of which can be decisively refuted.
Modern science is undercut by many things – one is the difficulty for modern scientists of living by the proper motivations and beliefs of a real scientist. The transcendental beliefs are difficult to hold in isolation; it is difficult to refrain from asking *why* it is that humans should have these beliefs and motivations. Difficult to avoid the idea that they are arbitrary or delusional beliefs.
Committed scientists in recent decades have often justified themselves by emphasizing that science is enormous 'fun' - but this is a foolish and desperate line of defence. Many things are 'fun' for the people who happen to like them, but science was supposed to be about reality.
Hitler and Stalin seemingly enjoyed being dictators, perhaps found it ‘fun’ – but does that justify them?
Of course the ‘science is fun’ line is mostly trying to avoid the ‘science is useful’ trap. Because the usefulness of science is something intrinsically accidental and unpredictable. And of course science might well turn out to be harmful – fatal; so usefulness cannot be guaranteed. If you try to get usefulness directly, you won’t get science – aims such as usefulness need to be set aside when a scientist is actually trying to understand reality.
Likewise explanations, predictions and so on – these are second order, contingent aspects of scientific discovery. Understanding must come first.
There never will be many people who are genuinely motivated by a desire to understand, and successful science also requires ability and luck.
Not just ability and luck: faith. Doing real science is an act of faith that if the scientist approaches his problem in the proper way and with sufficient effort, he will be rewarded by understanding.
(Rewarded not necessarily with the understanding he expected, but something just as good, or better.)
This is a religious kind of concept; a concept of just reward for proper devotion.
So real science is, at heart, a spiritual vocation – although this may be expressed in a variety of languages, with different levels of insight, and often indirectly.
Obviously it would be best if scientists did *not* talk about their spiritual vocations too much, especially in public. However, if they *are* going to speak honestly about the motivations of real science, then this is the kind of language they would need to use. This is the kind of language they did, in fact, use until about 50 years ago.
But when, as now, the language of spiritual vocation is ruled-out from public discourse (as foolish or fanatical) then scientists will inevitably be dishonest and misleading in public on the subject of science – blathering-on about usefulness when asking politicians and bureaucrats for money, and emphasizing fun when entertaining undergraduates.
In the end, by excluding all mentions of transcendentals or metaphysics, scientists end-up being untruthful with themselves – which is of course fatal to science. Bad motivations will yield bad consequences. The just reward of understanding reality, of understanding the truth, is not given to those whose devotions are dishonest.
How to become an amateur scientist
For anyone wanting to do science when the social structure of science is so corrupt, the obvious question is whether they can ‘go it alone’ – whether it makes any kind of sense to do science solo.
At the extreme this would simply mean that a person studied a problem but did not communicate their findings - either study for its own intrinsic value, or perhaps implementing the findings in their own life - for example a doctor doing research and using the findings in their own clinical practice.
Implementing findings in personal practice is, at some level, universal - it is simply termed learning from experience.
But what about doing science for its intrinsic value? This is termed philosophy - or perhaps natural philosophy.
I don't believe there is any line dividing philosophy from real science - although the activities differ considerably at their extremes. Nowadays both philosophy and science are essentially corrupt - or perhaps one could say that the names philosophy and science have been stolen and applied to generic, large scale bureaucratic activities.
However, if philosophy is seen in its essential role - aside from being a career - then that is exactly what a solo scientist would be doing; as indeed was the case for someone like Aristotle who has been rated as both the greatest (i.e. most influential) philosopher and scientist.
Of course Aristotle was a professional, not an amateur, and also he applied the fruits of his scholarship in practice. Indeed, it is hard for humans not to want to communicate their work – not least there is the motivation to get status for one's scholarship.
So, while it is not impossible, I do find it hard to imagine a satisfying life as a solo scientist; and I think that being part of a similarly-motivated group of people is probably a pre-requisite. However, such a group might be relatively small and local - as was the case in the 18th century in England, when science was carried forward by the Lunar Society in Birmingham and similar Literary and Philosophical Societies in other cities.
The basic and best method to learn and practice science is via apprenticeship - attach yourself (somehow) to a Master: someone who can do it already.
Help the Master with his work (without pay!), and in return he may teach you, advise you, or you may pick up an understanding of how to do what he does.
Read into the subject. Talk or write about what you read and try to get some feedback.
Valuable feedback from a competent 'Master' is very, very rare, however - it may come seldom and in little scraps, and the apprentice must be alert so as not to miss it.
Don't be too impatient to find a specific problem to work on – allow the problem to find you. Francis Crick proposed the 'gossip test' – that which you gossip about spontaneously probably contains a possible problem to work on.
When you are interested in a *problem*, you can usually find some aspect to work-on which you personally can do with your resources of time and effort, and without lavish material resources or manpower.
Publication is a matter of informing people who are genuinely interested in the same problem. This might be done by letter, as in the 17th Century. The internet has solved the problem of making work accessible to those who are interested.
If you are honest/ can earn trust, produce useful work or provide some valuable function, you will be admitted to the 'invisible college' of self-selected people working on a problem.
If you are not trustworthy, lack competence, or are unproductive, then you will not be allowed into the invisible college - because an invisible college is a synergistic group sustained by mutual benefit. If you don't provide benefits to the group, and show no prospect of providing any in the future, then you are merely a parasite and need to be excluded.
The respect of an invisible college is the currency of science - it is the invisible college which evaluates work, and develops and sustains understanding through time.
Charlton BG. Boom or bubble? Is medical research thriving or about to crash? Medical Hypotheses 2006; 66: 1-2
A recent issue of JAMA (2005; vol. 294) presented a portrait of medical research as a booming enterprise. By contrast I have suggested that medical research is a speculative bubble due to burst. How can two such different predictions be compatible? From inside the expanding world of medical research everything seems fine and getting better. But to people outside the system, it seems like there is an awful lot of money going in, and not much coming out. Professional criteria of success (publications, impact factors, citations, grant income, large teams, etc.) are not the same as the outsider’s view of success. Outsiders want the medical research system to generate therapeutic progress as efficiently as possible: the most progress for the least resources. Medical research is not the only good way of spending money and is in competition with other social systems. As funding increases, diminishing returns will set-in, opportunity costs will begin to bite, and there will be more and more social benefit to be gained from spending the extra research money on something else. Therefore, future cuts in medical research will happen because of pressure from outside the system – specifically pressure from other powerful social systems which will press their alternative claims for funding. In the short term, there will be a quantitative decline of research production. But in the longer term the medical research system will re-grow in a more efficient form. After a ‘golden age’ of therapeutic progress in the mid-20th century, recent decades have seen a ‘silver age’ of scholasticism which is due to end soon. Perhaps a renaissance of medical research lies not too many years in the future.
Charlton BG. Zombie science: A sinister consequence of evaluating scientific theories purely on the basis of enlightened self-interest. Medical Hypotheses. 2008; Volume 71: 327-329
Although the classical ideal is that scientific theories are evaluated by a careful teasing-out of their internal logic and external implications, and checking whether these deductions and predictions are in-line-with old and new observations; the fact that so many vague, dumb or incoherent scientific theories are apparently believed by so many scientists for so many years is suggestive that this ideal does not necessarily reflect real world practice. In the real world it looks more like most scientists are quite willing to pursue wrong ideas for so long as they are rewarded with a better chance of achieving more grants, publications and status. The classic account has it that bogus theories should readily be demolished by sceptical (or jealous) competitor scientists. However, in practice even the most conclusive ‘hatchet jobs’ may fail to kill, or even weaken, phoney hypotheses when they are backed-up with sufficient economic muscle in the form of lavish and sustained funding. And when a branch of science based on phoney theories serves a useful but non-scientific purpose, it may be kept-going indefinitely by continuous transfusions of cash from those whose interests it serves. If this happens, real science expires and a ‘zombie science’ evolves. Zombie science is science that is dead but will not lie down. It keeps twitching and lumbering around so that (from a distance, and with your eyes half-closed) zombie science looks much like the real thing. But in fact the zombie has no life of its own; it is animated and moved only by the incessant pumping of funds. If zombie science is not scientifically-useable – what is its function? In a nutshell, zombie science is supported because it is useful propaganda to be deployed in arenas such as political rhetoric, public administration, management, public relations, marketing and the mass media generally. It persuades, it constructs taboos, it buttresses some kind of rhetorical attempt to shape mass opinion. 
Indeed, zombie science often comes across in the mass media as being more plausible than real science; and it is precisely the superficial face-plausibility which is the sole and sufficient purpose of zombie science.
Charlton BG. Figureheads, ghost-writers and pseudonymous quant bloggers: The recent evolution of authorship in science publishing. Medical Hypotheses. 2008; 71: 475-480
Traditionally, science has been published only under the proper names and postal addresses of the scientists who did the work. This is no longer the case, and over recent decades science authorship has fundamentally changed its character. At one extreme, prestigious scientists writing from high status institutions are used as mere figureheads to publish research that has been performed, analyzed and ‘ghost-written’ by commercial organizations. At the other extreme ‘quant bloggers’ are publishing real science with their personal identity shielded by pseudonyms and writing from internet addresses that give no indication of their location or professional affiliation. Yet the paradox is that while named high status scientists from famous institutions are operating with suspect integrity (e.g. covertly acting as figureheads) and minimal accountability (i.e. failing to respond to substantive criticism); pseudonymous bloggers – of mostly unknown identity, unknown education or training, and unknown address – are publishing interesting work and interacting with their critics on the internet. And at the same time as ‘official’ and professional science is increasingly timid, careerist and dull; the self-organized, amateur realm of science blogs displays curiosity, scientific motivation, accountability, responsibility – and often considerable flair and skill. Quant bloggers and other internet scientists are, however, usually dependent on professional scientists to generate databases. But professional science has become highly constrained by non-scientific influences: increasingly sluggish, rigid, bureaucratic, managerial, and enmeshed with issues of pseudo-ethics, political correctness, public relations, politics and marketing. So it seems that professional science needs the quant bloggers.
One possible scenario is that professional scientists may in future continue to be paid to do the plodding business of generating raw data (dull work that no one would do unless they were paid); but these same professional scientists (functioning essentially as either project managers or technicians) may be found to lack the boldness, flair, sheer ‘smarts’ or genuine interest in the subject to make sense of what they have discovered. Some branches of future science may then come to depend on a swarm of gifted ‘amateurs’ somewhat like the current quant bloggers; for analysis and integration of their data, for understanding its implications, and for speculating freely about the potential applications.
Charlton BG. Why are modern scientists so dull? How science selects for perseverance and sociability at the expense of intelligence and creativity. Medical Hypotheses. Volume 72, Issue 3, Pages 237-243
Question: why are so many leading modern scientists so dull and lacking in scientific ambition? Answer: because the science selection process ruthlessly weeds-out interesting and imaginative people. At each level in education, training and career progression there is a tendency to exclude smart and creative people by preferring Conscientious and Agreeable people. The progressive lengthening of scientific training and the reduced independence of career scientists have tended to deter vocational ‘revolutionary’ scientists in favour of industrious and socially adept individuals better suited to incremental ‘normal’ science. High general intelligence (IQ) is required for revolutionary science. But educational attainment depends on a combination of intelligence and the personality trait of Conscientiousness; and these attributes do not correlate closely. Therefore elite scientific institutions seeking potential revolutionary scientists need to use IQ tests as well as examination results to pick-out high IQ ‘under-achievers’. As well as high IQ, revolutionary science requires high creativity. Creativity is probably associated with moderately high levels of Eysenck’s personality trait of ‘Psychoticism’. Psychoticism combines qualities such as selfishness, independence from group norms, impulsivity and sensation-seeking; with a style of cognition that involves fluent, associative and rapid production of many ideas. But modern science selects for high Conscientiousness and high Agreeableness; therefore it enforces low Psychoticism and low creativity. Yet my counter-proposal to select elite revolutionary scientists on the basis of high IQ and moderately high Psychoticism may sound like a recipe for disaster, since it resembles a formula for choosing gifted charlatans and confidence tricksters. A further vital ingredient is therefore necessary: devotion to the transcendental value of Truth.
Elite revolutionary science should therefore be a place that welcomes brilliant, impulsive, inspired, antisocial oddballs – so long as they are also dedicated truth-seekers.
Charlton BG. The vital role of transcendental truth in science. Medical Hypotheses. Volume 72, Issue 4, April 2009, Pages 373-376
I have come to believe that science depends for its long-term success on an explicit and pervasive pursuit of the ideal of transcendental truth. ‘Transcendental’ implies that a value is ideal and ultimate – it is aimed-at but can only imperfectly be known, achieved or measured. So, transcendental truth is located outside of science; beyond scientific methods, processes and peer consensus. Although the ultimate scientific authority of a transcendental value of truth was a view held almost universally by the greatest scientists throughout recorded history, modern science has all-but banished references to truth from professional scientific discourse – these being regarded as wishful, mystical and embarrassing at best, and hypocritical or manipulative at worst. With truth excluded, the highest remaining evaluation mechanism is ‘professional consensus’ or peer review – beyond which there is no higher court of appeal. Yet in Human accomplishment, Murray argues that cultures which foster great achievement need transcendental values (truth, beauty and virtue) to be a live presence in the culture; such that great artists and thinkers compete to come closer to the ideal. So a scientific system including truth as a live presence apparently performs better than a system which excludes truth. Transcendental truth therefore seems to be real in the pragmatic sense that it makes a difference. To restore the primacy of truth to science a necessary step would be to ensure that only truth-seekers were recruited to the key scientific positions, and to exclude from leadership those who are untruthful or exhibit insufficient devotion to the pursuit of truth. In sum, to remain anchored in its proper role, science should through ‘truth talk’ frequently be referencing normal professional practice to transcendental truth values. Ultimately, science should be conducted at every level, from top to bottom, on the basis of what Bronowski termed the ’habit of truth’. 
Such a situation currently seems remote and fanciful. But within living memory, routine truthfulness and truth-seeking were simply facts of scientific life – taken for granted among real scientists.
Charlton BG. Are you an honest scientist? Truthfulness in science should be an iron law, not a vague aspiration. Medical Hypotheses. 2009; Volume 73: 633-635
Anyone who has been a scientist for more than a couple of decades will realize that there has been a progressive and pervasive decline in the honesty of scientific communications. Yet real science simply must be an arena where truth is the rule; or else the activity simply stops being science and becomes something else: Zombie science. Although all humans ought to be truthful at all times; science is the one area of social functioning in which truth is the primary value, and truthfulness the core evaluation. Truth-telling and truth-seeking should not, therefore, be regarded as unattainable aspirations for scientists, but as iron laws, continually and universally operative. Yet such is the endemic state of corruption that an insistence on truthfulness in science seems perverse, aggressive, dangerous, or simply utopian. Not so: truthfulness in science is not utopian and was indeed taken for granted (albeit subject to normal human imperfections) just a few decades ago. Furthermore, as Jacob Bronowski argued, humans cannot be honest only in important matters while being expedient in minor matters: truth is all of a piece. There are always so many incentives to lie that truthfulness is either a habit or else it declines. This means that in order to be truthful in the face of opposition, scientists need to find a philosophical basis which will sustain a life of habitual truth and support them through the pressure to be expedient (or agreeable) rather than honest. The best hope of saving science from a progressive descent into Zombiedom seems to be a moral Great Awakening: an ethical revolution focused on re-establishing the primary purpose of science: which is the pursuit of truth. Such an Awakening would necessarily begin with individual commitment, but to have any impact it would need to progress rapidly to institutional forms. 
The most realistic prospect is that some sub-specialties of science might self-identify as being engaged primarily in the pursuit of truth, might form invisible colleges, and (supported by strong ethical systems to which their participants subscribe) impose on their members a stricter and more honest standard of behaviour. From such seeds of truth, real science might again re-grow. However, at present, I can detect no sign of any such thing as a principled adherence to perfect truthfulness among our complacent, arrogant and ever-more-powerful scientific leadership – and that is the group of which a Great Awakening would need to take-hold even if the movement were originated elsewhere.
Charlton BG. Clever sillies: Why high IQ people tend to be deficient in common sense. Medical Hypotheses. 2009; 73: 867-870.
In previous editorials I have written about the absent-minded and socially-inept ‘nutty professor’ stereotype in science, and the phenomenon of ‘psychological neoteny’ whereby intelligent modern people (including scientists) decline to grow-up and instead remain in a state of perpetual novelty-seeking adolescence. These can be seen as specific examples of the general phenomenon of ‘clever sillies’ whereby intelligent people with high levels of technical ability are seen (by the majority of the rest of the population) as having foolish ideas and behaviours outside the realm of their professional expertise. In short, it has often been observed that high IQ types are lacking in ‘common sense’ – and especially when it comes to dealing with other human beings. General intelligence is not just a cognitive ability; it is also a cognitive disposition. So, the greater cognitive abilities of higher IQ tend also to be accompanied by a distinctive high IQ personality type including the trait of ‘Openness to experience’, ‘enlightened’ or progressive left-wing political values, and atheism. Drawing on the ideas of Kanazawa, my suggested explanation for this association between intelligence and personality is that an increasing relative level of IQ brings with it a tendency differentially to over-use general intelligence in problem-solving, and to over-ride those instinctive and spontaneous forms of evolved behaviour which could be termed common sense. Preferential use of abstract analysis is often useful when dealing with the many evolutionary novelties to be found in modernizing societies; but is not usually useful for dealing with social and psychological problems for which humans have evolved ‘domain-specific’ adaptive behaviours. 
And since evolved common sense usually produces the right answers in the social domain, this implies that, when it comes to solving social problems, the most intelligent people are more likely than those of average intelligence to have novel but silly ideas, and therefore to believe and behave maladaptively. I further suggest that this random silliness of the most intelligent people may be amplified into systematic wrongness when intellectuals are, in addition, ‘advertising’ their own high intelligence in the evolutionarily novel context of a modern IQ meritocracy. The cognitively-stratified context of communicating almost exclusively with others of similar intelligence generates opinions and behaviours among the highest-IQ people which are not just lacking in common sense but perversely wrong. Hence the phenomenon of ‘political correctness’ (PC), whereby false and foolish ideas have come to dominate, and to be moralistically enforced upon, the ruling elites of whole nations.
Charlton BG. After science: Has the tradition been broken? Medical Hypotheses. 2010; 74: 623-625.
The majority of professional scientists make use of the artefacts of science but lack understanding of what these mean; raising the question: has the tradition of science been broken? Explicit knowledge is only a selective summary, but practical capability derives from implicit, traditional or ‘tacit’ knowledge that is handed on within and across generations by slow, assimilative processes requiring extended human contact through a wide range of situations. This was achieved mainly by prolonged apprenticeship to a Master. Such methods recognize the gulf between being able to do something and knowing how you have done it, and the further gap between knowing how you have done something and being able to teach it by explicit instructions. Yet the ‘Master–apprentice’ model of education has been almost discarded from science over recent decades and replaced with bureaucratic regulation. The main reason is probably that scientific manpower has expanded so rapidly and over such a long period as to overwhelm the slow, sure and thorough traditional methods. In their innocence of scientific culture, the younger generation of scientists are like children who have been raised by wolves; they do not talk science but spout bureaucratic procedures. It has now become accepted among the mass of professional ‘scientists’ that the decisions which matter most in science are those imposed upon science by outside forces: for example by employers, funders, publishers, regulators, and the law courts. It is these bureaucratic mechanisms that now constitute the ‘bottom line’ for scientific practice. Most of modern science is therefore apparently in the post-holocaust situation described in A Canticle for Leibowitz and After Virtue, but the catastrophe was bureaucratic, rather than violent. So, the tradition has indeed been broken.
However, so long as it is known that the tradition has been broken, and representatives of the tradition are still alive and active, there remains a remote possibility that it could be revived.
Charlton BG. The cancer of bureaucracy: how it will destroy science, medicine, education; and eventually everything else. Medical Hypotheses. 2010; 74: 961-965.
Everyone living in modernizing ‘Western’ societies will have noticed the long-term, progressive growth and spread of bureaucracy infiltrating all forms of social organization: nobody loves it, many loathe it, yet it keeps expanding. Such unrelenting growth implies that bureaucracy is parasitic and its growth uncontrollable – in other words it is a cancer that eludes the host immune system. Old-fashioned functional, ‘rational’ bureaucracy that incorporated individual decision-making is now all-but extinct, rendered obsolete by computerization. But modern bureaucracy evolved from it, the key ‘parasitic’ mutation being the introduction of committees for major decision-making or decision-ratification. The committee is a fundamentally irrational, incoherent and unpredictable decision-making procedure, which has the twin advantages that it cannot be formalized and replaced by computerization, and that it generates random variation or ‘noise’ which provides the basis for natural selection processes. Modern bureaucracies have simultaneously grown and spread in a positive-feedback cycle; such that interlinking bureaucracies now constitute the major environmental feature of human society affecting organizational survival and reproduction. Individual bureaucracies must become useless parasites which ignore the ‘real world’ in order to adapt to rapidly-changing ‘bureaucratic reality’. Within science, the major manifestation of bureaucracy is peer review, which – cancer-like – has expanded to obliterate individual authority and autonomy. There has been local elaboration of peer review and metastatic spread of peer review to include all major functions such as admissions, appointments, promotions, grant review, project management, research evaluation, journal and book refereeing, and the award of prizes.
Peer review eludes the immune system of science since it has now been accepted by other bureaucracies as intrinsically valid, such that any residual individual decision-making (no matter how effective in real-world terms) is regarded as intrinsically unreliable (self-interested and corrupt). Thus the endemic failures of peer review merely trigger demands for ever-more elaborate and widespread peer review. Just as peer review is killing science with its inefficiency and ineffectiveness, so parasitic bureaucracy is an un-containable phenomenon; dangerous to the extent that it cannot be allowed to exist unmolested, but must be utterly extirpated – or else modernizing societies will themselves be destroyed by sclerosis, resource misallocation, incorrigibly-wrong decisions and the distortions of ‘bureaucratic reality’. Unfortunately, however, social collapse is the more probable outcome, since parasites can evolve more rapidly than host immune systems.