De omnibus dubitandum

Archive for the ‘science’ Category

Our Actions Define Us

So the human race has invented 3D printing.

This is possibly the greatest scientific advancement of the last few centuries. It'll allow us to create a genuinely egalitarian, post-scarcity society where the means of production are in the hands of us, the people, and where we can quite literally build almost anything we'll ever need.

And what do some people want to do with this astonishing, liberating, levelling-the-capitalist-playing-field invention?

They want to use it to make guns.

Seriously, I often wonder if the human species deserves to exist.

Demolishing Popular Neuroscience Writing


I’ve been an avid fan of recent popular science books that delve into the workings of our brains. Eagerly I’ve been devouring books like ‘Het Slimme Onderbewuste’ and ‘Wij Zijn Ons Brein’, as well as English-language counterparts such as Malcolm Gladwell’s ‘Blink’.

So reading this rather excellent demolition of the entire popular neuroscience genre was more than a little uncomfortable, though probably very necessary:

“So, instead, here is a recipe for writing a hit popular brain book. You start each chapter with a pat anecdote about an individual’s professional or entrepreneurial success, or narrow escape from peril. You then mine the neuroscientific research for an apparently relevant specific result and narrate the experiment, perhaps interviewing the scientist involved and describing his hair. You then climax in a fit of premature extrapolation, inferring from the scientific result a calming bromide about what it is to function optimally as a modern human being. Voilà, a laboratory-sanctioned Big Idea in digestible narrative form.”

While I do believe that neuroscientific endeavours will, eventually, provide us with meaningful (if, for some, unpalatable) answers about the nature of thought and consciousness, it’s good to remind ourselves that this is a science in its infancy, and that we shouldn’t let ourselves be carried along by overly optimistic, commercially incentivised book writers.

  • Filed under: books, science
  • Deconstructing Raymond Tallis

    Raymond Tallis is, at first glance, a bit of a contradiction.

A bit of background first. Tallis is a philosopher and a prominent mover in British Humanist circles. A staunch atheist, he is a regular contributor to New Humanist magazine, in whose pages I first learned of him.

    In addition to being a fierce critic of religion, Raymond Tallis is also, as it turns out, a fierce critic of science. Or, specifically, of certain aspects of neuroscience. Which is ironic, as he used to be a neuroscientist.

    Tallis rejects the reductionist perspective – increasingly embraced by neuroscientists – that the brain and human consciousness are the same thing, that consciousness is inextricably linked to neural activity in the brain. He even devoted an entire book to this argument, titled Aping Mankind, in which he attacks ‘neuromania’ and ‘Darwinitis’.

    What Tallis actually attacks, however, is the simplified portrayal of neuroscientific discoveries in the media, and the sweeping statements journalists like to make in eye-catching headlines. But Tallis fails to make that distinction, instead preferring to use this convenient straw man to criticise all of neuroscience and what he perceives as its reductionism with regards to consciousness and free will (or lack thereof).

Prolific as he is in his criticisms, Tallis nevertheless fails to offer any opposing theory as to what consciousness and free will really are, if not material properties of the brain.

Now I’m not one to denounce a critic for failing to provide an alternative view (that would be rather hypocritical of me), nor for the viciousness of his criticisms (again: pot, kettle & black), but what strikes me most is that Tallis’s criticisms fly in the face of an ever-growing mountain of scientific evidence.

    Tallis is essentially rejecting empirical scientific evidence without providing any counter-evidence. That, to me, seems a rather untenable position, especially for someone who once declared science to be “the greatest achievement of that community of minds called the human race”. He is intent on maintaining the specialness of human consciousness, without any supporting evidence.

Considering that Tallis is a ridiculously intelligent man and, in general, a defender of scientific rationalism, all of this combines into a rather contradictory picture.

    Or does it?

Apparently Tallis, at the age of 15, suffered from great depths of personal despair – not particularly uncommon in adolescence. He overcame this depression when he discovered philosophy, which provided him with “a sense of overwhelming joy at the complexity of the world.”

    Aha. Suddenly it all makes sense.

Philosophy, Tallis’s intellectual soulmate, is at its core an embrace of the idea that fundamental truths of the universe can be discovered by thought alone. Philosophy has put all its eggs into the basket of conscious thought, as that is the source from which its knowledge springs.

And that was perfectly fine, right up until science – and neuroscience in particular – started shooting galaxy-sized holes in the presumed superiority of consciousness and free will.

Neuroscience has not (yet) disproved the existence of free will, nor has it managed to explain what consciousness is. But science has made massive strides towards finding answers to those pivotal questions, and the direction of this progress points towards an absence of free will and a rather peripheral role for consciousness in the bigger picture of our mental faculties.

    That, I believe, is the true axe Raymond Tallis has to grind. Like so many philosophers he has placed conscious thought on an artificial pedestal, and like many he is seeing that pedestal undermined by the continued progress of biological science.

    So in an effort to preserve his adolescent and enduring love affair with philosophy, he rejects any attack on the sanctity of conscious thought. He opposes the materialist notion that we are our brains, regardless of the scientific evidence.

    You’d almost feel sorry for the man. So keen to cling to his deeply entrenched cognitive biases, he cannot face the possibility that his beloved philosophy is perhaps nothing but a deeply flawed emanation from our imperfect, materialist brains. And he will go to any length of shrieking irrationality to preserve this personal delusion.

    I suppose it just proves once more that even the brightest among us are not the perfect superhumans we’d like them to be.

  • 1 Comment
  • Filed under: philosophy, science
  • A Worrying Conviction

    Today two people were sentenced to life in prison for murdering a young woman. And I believe this is a very worrying conviction.

    Iftikhar and Farzana Ahmed are believed to have killed their daughter Shafilea in 2003 as an ‘honour killing’. The problem is that, from what the media has reported, this conviction is based entirely on a single eyewitness testimony.

    One of the couple’s other children told authorities, seven years after Shafilea’s death, that she witnessed her parents kill her sibling. Whether or not that is true, we will never know.

    Memories are fickle and unreliable things. They are created in an instant, and they are nearly always inaccurate. People believe that what they remember is what actually happened, while in fact what we remember is often a different version of the events as they transpired.

    Memories are malleable, ephemeral, and eternally revised. Every time you remember something, you are actually re-creating the memory – and often change it in the process.

    Memories not only change over time, they’re also inaccurate the moment they’re created. We don’t remember actual events – we remember our own biased, coloured versions of events. And our perceptions, much like our memories, are incredibly unreliable.

    Memories don’t even need to be based on actual events. They can be created without the person ever experiencing anything even remotely close to it. A story you heard, a dream you had, a TV show you watched – all of these things can lead to memories that you will swear are true, while in fact they’re nothing but pure fabrications of your mind.

So I’m pretty sure that what Shafilea’s sister told the court happened, and what actually happened on the night Shafilea disappeared, are two vastly different things.

No doubt the jury was entirely unaware of the fallibility of eyewitness testimony, and I suspect there was no small degree of bias present in many jurors – a result of the ceaseless barrage from a tabloid media that distrusts anything that even remotely reeks of immigrants and Islam.

    I don’t know if Shafilea’s parents killed her or not. That’s not the issue I’m addressing. The issue is that the jury should have recognised they don’t know either.

    We have a justice system that allows evidence of incredibly flimsy substance to serve as the pivotal aspect of a prosecution’s case, and ignorant juries to uncritically accept it, with life-altering repercussions for all involved.

    And that, I believe, can only be a bad thing.

  • Filed under: life, science
  • Philosophy of the Gaps

I’ve hinted at my views on philosophy before. In summary: I think modern philosophers are arrogant, according themselves a level of prestige in intellectual circles that they don’t actually deserve.

But it’s only recently that I think I’ve figured out why this is. When you view philosophy in its proper historical context, philosophers’ increasingly loud screeching – especially on matters where science is making progress in leaps and bounds – is revealed as the desperate pleading of an intellectual pursuit rapidly being made obsolete.

    As was pointed out to me on Twitter by Sander Tamaëla, the early philosophers were the scientists of their days. They tried to understand how things worked, and why they worked the way they did. They were restricted to the tools of their age, which meant they had little to rely on except their own minds.

As a result of this limited toolset, philosophers put conscious thought at the centre of their discipline. It was all they could rely on at the time, and ever since it has been the axle around which the entire philosophical discourse of the past few millennia has turned.

    So philosophers have been building pedestals to their champion, the conscious mind, for thousands of years. And now, thanks to the advances being made in neuroscience and other disciplines, they’re finding that this proclaimed champion is actually a bit of a dud.

    Our conscious mind is not in charge. Free will is pretty much proven to be mostly – if not entirely – illusory. We are not enlightened creatures.

    And philosophy, as the herald of consciousness’ greatness, is struggling to accept it. Which is why philosophers are spending an awful lot of energy trying to discredit the scientific advances that are hinting at philosophy’s obsolescence.

    From the rather untenable – and frankly ridiculous – posturing of philosophy as the purest of all scientific endeavours (evidenced in this Infinite Monkey Cage podcast) to their shrieking rebuttals (peppered with logical fallacies) in the neuroscience debate, philosophy is obviously in distress.

It’s reminiscent of the desperation with which religion has grasped at as-of-yet-unexplained phenomena as evidence for the existence of God. This is called the ‘God of the gaps’ argument, in which religion retreats into the increasingly small areas where science has not yet been able to provide enlightenment. Philosophy is doing the same, wedging itself into the ever-narrowing gaps of knowledge that science is rapidly breaking open and exploring.

While I believe there is a role for philosophy to play in scientific discourse, it’s not nearly as big a role as philosophers think it ought to be. It’s time they realised that their best days are behind them, and that they should stop trying to artificially inject themselves into every discovery that further gnaws at their crumbling foundations.

    This is the age of science, and philosophy would do well to keep pace.

  • Filed under: philosophy, science
  • The Epic Awesomeness of Marie Curie

If you don’t know who Marie Curie was, you probably shouldn’t be reading this blog in the first place. I’ll assume you’re all at least passingly familiar with this historic figure.

What you may not be aware of is exactly how epically awesome Marie Curie really was. This is a woman who, at the turn of the 20th century, when feminism was pretty much non-existent and most women around the world did not even have the right to vote, managed to become a widely renowned and respected scientist.

Science, in those days, was considered a strictly male endeavour, and I can only imagine the depths of bigotry and sexism Marie Curie had to overcome on her journey to becoming a scientist who was taken seriously.

Then, in 1903, she wins a Nobel Prize, the first woman to do so. Her research into radioactivity earns her the Nobel Prize in Physics. Remember, this is at the start of the 20th century, when women were on the whole not taken particularly seriously as scientists.

But it gets better. In 1911, she wins a second Nobel Prize. This time it’s the Nobel Prize in Chemistry, awarded for her discovery of the elements radium and polonium. That made her the first person – not just the first woman, but the first person in the history of mankind – to have won two Nobel Prizes in different disciplines. Only one other person has since matched that feat.

Marie Curie died in 1934 from the effects of prolonged radiation exposure. She quite literally gave her life in service of her craft. When her remains were transferred to the Panthéon in Paris in 1995, she became the first – and so far only – woman to be entombed in the Panthéon on her own merits.

Regardless of her gender, Marie Curie was one of the greatest scientists who ever lived. And when you do take her gender into account – the fact that she became such a great scientist in a day and age when sexism was the norm – we can only conclude that she was without doubt one of the greatest human beings ever to have lived.

    Marie Curie was truly, epically, awesome.

  • Filed under: life, science
  • The other day my eye caught an AdWords ad for a book called “The Final Theory” by Mark McCutcheon, an author previously unknown to me. This book allegedly solves all of the existing scientific conundrums and supposedly introduces ‘a new scientific perspective’ that ‘radically re-thinks’ all we know about how the universe works today.

Now, as you may know, I’m a bit of a science geek. I’m also a sceptic. De Omnibus Dubitandum, and all that. The description of this book in the ad and on its website set off all kinds of bullshit alarms in my head. The book’s marketing material focused purely on how this new final theory would overturn all established science and revolutionise our understanding of the laws of physics, casting into doubt centuries’ worth of scientific advancements.

I’ve seen similar tones struck in many different promotional materials, usually those published by creationists, homeopaths, energy healers, and other similarly delusional quacks. So I did what any physics geek of sound mind would do: I went to Amazon and looked at the book’s reviews.

    Amazon tends to be a place where works of atrocious quality are skilfully eviscerated by a horde of merciless reviewers who will destroy a work if it lacks merit. At least, that’s what I thought.

As it turns out, the vast majority of reviews for this book on Amazon are overwhelmingly positive, with no fewer than 71 five-star reviews at last count. According to the Amazon reviewers this book is at least on a par with Stephen Hawking’s “A Brief History of Time”.

That, too, set off further bullshit alarms. I’d never heard of this Mark McCutcheon fellow before, and I try to keep myself at least moderately informed of what’s going on in the world of science. As this book was originally published in 2003, if it truly had the amazing scientific merit claimed by these countless Amazon reviewers, there should by all accounts have been quite a shockwave going through the scientific establishment. And there most certainly was not.

So I dug deeper. Wikipedia was, mysteriously, devoid of any mention of the book and its author. In fact, Wikipedia was so diligent in not mentioning Mark McCutcheon and his Final Theory that I suspected a deliberate deletion. That turned out to be the case, as is evident from this administrators’ discussion page (search for ‘McCutcheon’ on that page to find the relevant passages).

There are also various sceptical forum threads and blog posts dedicated to the book – specifically to how negative reviews on Amazon are mysteriously and inexplicably deleted, leaving only a vast bulk of four- and five-star positive reviews. These positive reviews are themselves rather suspect: they appear to be posted by new Amazon accounts without any significant review history, and many use very similar phrasing and writing styles.

    The last damning piece of evidence comes from a forum thread on a physics community site where the book’s ‘Final Theory’ is thoroughly slaughtered for the nonsensical quackery that it so obviously is.

    What is most disturbing about this whole episode is Amazon’s complicity in the whole affair. There is, for all intents and purposes, deliberate censorship at work here in an effort to promote a book that espouses such an obviously farcical concept. Genuine criticism is being silenced in favour of a commercial message, trying to get you to buy a book that contains patent falsehoods, distortions, and lies.

    I suppose when there is money to be made, truth is entirely optional.



    Adamus is the online identity of Barry Adams. A Dutchman living in Northern Ireland, Barry / Adamus is an internet fanatic, skeptic, technophile, gamer, and geek.

    On this personal blog he provides his unpolished view of the world and its insanities.
