A dangerous trick of the mind…

May 22, 2007

The Medical Protection Society’s Casebook publication has a fascinating article today on Involuntary Automaticity (IA). This is what happens when the involuntary side of the brain takes over something that you do in the same way repeatedly, like driving. It may also account for why some medical errors are made: when we have fixed protocols for everything, IA takes over from the conscious checking mechanism. It is also more likely to happen (to both parties) when two people are following a protocol which requires them both to check it, as each involuntarily assumes that the other has done whatever it is they are supposed to check. So, in a rather bizarre twist, changing a medical protocol from something which an individual has to make a conscious effort to think about and do into one with pathways, guidelines and built-in checks may actually lead to more errors being made. This, rather worryingly, is what pilots do on landing and take-off.
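
To see the force of this, here is a toy probability sketch of my own (not from the Casebook article, and with invented numbers purely for illustration): if two checkers lapse independently, their miss rates multiply and the double-check is powerful; but if IA makes their lapses coincide, the second check adds almost nothing.

```python
# Toy model (illustrative, invented numbers): how correlation between
# two checkers erodes the value of a double-check.
p_miss = 0.10  # assumed chance that a single checker overlooks an error

# Independent checkers: the error slips through only if BOTH miss it.
independent_miss = p_miss * p_miss   # 0.01 -> 1 error in 100 survives

# Fully "protocolised" checkers: each assumes the other has checked,
# so their lapses coincide and the second check adds nothing.
correlated_miss = p_miss             # 0.10 -> 1 error in 10 survives

print(f"independent double-check miss rate: {independent_miss:.0%}")
print(f"correlated double-check miss rate:  {correlated_miss:.0%}")
```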

It is also a particular issue in oncology, where the drive is towards ever more tightly controlled policies and guidelines, and it is thought to have been a factor in the recent tragic case in Glasgow of a 16-year-old girl who received a huge overdose of radiotherapy for a brain tumour last year. I can certainly think of occasions when I could have sworn I had checked something, but when I double-checked it, it was clear that I could not have done.

Maybe it’s time to get rid of all the protocols and guidelines and go back to properly trained doctors taking responsibility for what they do, rather than expecting half-trained sub-consultants to get by through rigid adherence to defined procedures…

UPDATE

By a bizarre twist of fate, at our clinical governance meeting we have just discussed a case where radiotherapy was given to the wrong area because a single mistake was made in annotation at the beginning of a patient’s journey. Despite (or because of) rigid adherence to protocol and an inbuilt check system, the mistake was not detected and ultimately led to this error, which thankfully should have no long-term sequelae. In retrospect this is quite a clear case of IA. I was able to sound surprisingly knowledgeable about systems theory. The Casebook article suggests the following remedies for IA:

  • Teaching doctors about systems theory
  • Adapting protocols to generate tactile and oral responses
  • Creating effective relationships between managers and clinicians
  • Using independent checkers
  • Developing different checklists to keep clinicians alert
  • Involving patients in their consultations more effectively
  • Minimising stress levels
  • Reducing distractions

Although I believe there is no substitute for good training and individual (rather than collective) responsibility, I think most of these sound very sensible, especially the last three. The patient in question knew the correct information which would have prevented him from getting the wrong treatment. If only he had been asked…

    9 comments

    1. Thank you for your kind comments about the article on involuntary automaticity in the latest edition of Casebook. Your readers may be interested to know that this article is now available on our website.

      With best wishes

      Jonathan Haslam, Casebook Editor


    2. Thank you. I have now provided the link in the text.


    3. “Maybe it’s time to get rid of all the protocols and guidelines and go back to properly trained doctors taking responsibility for what they do, rather than expecting half-trained sub-consultants to get by through rigid adherence to defined procedures…”

      “By a bizarre twist of fate, at our clinical governance meeting we have just discussed a case where radiotherapy was given to the wrong area because a single mistake was made in annotation at the beginning of a patient’s journey.”

      Mens Sana, all joking aside, I knew we were on the same wavelength on serious matters.


    4. I’d like to think so, Q9.


    5. Thank you, Quasar, for leaving … I shall enjoy reading this.
      Thank you for the journey over to me.
      :o)


    6. Hmm. I have just been and read the original article on the WHO website, and I was not very impressed with the level of scholarship. I think some experimental evidence of the proposed mechanism would be nice. I’m not convinced that it amounts to much more than “if you do stuff for a long time you get sloppy”. In particular, there is no evidence that this sort of double-check makes things WORSE; i.e. it still picks up some errors, just not all of them.

      Also, I very much doubt that this (potential) issue was relevant to the Glasgow case (and I read every single word of the report on the latter).

      I would quite like to know what Brian Toft’s PhD was about. He certainly seems to be well in with the risk management industry, which is relatively new and which contains a lot of management-speak gobbledygook that can be difficult to separate out from the genuinely useful.


    7. Hi P

      I take your point. As with many things psychological, the evidence is thin. I believe it is a bit more than repetition leading to sloppiness, though. I do believe that one reaches a level where the system, rather than the individual, is trusted to iron out mistakes, and that this leads to dangerous inattention to detail. Furthermore, I would not suggest (sorry for the implication to the contrary) that the practice of double checking leads to more errors, just that the fact that both checkers are “protocolised” may mean the check has less value at detecting error.

      How would you account for the Glasgow error? There was clearly a serious failure both of process and of safety checks.


    8. Oh, I agree there’s probably something there to be understood – I just don’t think the right technical/academic expertise has been applied to it yet (or not that I know of). And proper psychological/neurological investigation of such features often shows the true picture to be much more complicated and subtle than the somewhat glib-sounding label “IA” suggests.

      I would have to go back and read the Glasgow report again, but my memory was that (a) the main person doing calculations (appointed by the supervisor) was insufficiently trained; that (b) there was an official signing-off process which should have identified this but which had been ignored by the supervisor whose responsibility it was (or, more charitably, was not up to date); that (c) one checker might have picked up the error serendipitously but was actually officially checking for something else; and that (d) the main checker (the supervisor) was specifically disallowed by the procedure from being the checker for this particular thing, because he/she had been previously involved (in what capacity I forget), but ignored that bit of the procedure. All compounded by the fact that (e) the IT had been changed without a rigorous enough job being done on changing the procedures so that the IT and the procedures surrounding it made a seamless whole.

      (b) and (d) were definitely the supervisor’s fault, and, as a person with a lot of experience running operations, I would say that the ultimate responsibility for (e) also rested with the supervisor, even though it was also some IT project team’s responsibility. The book got thrown at the supervisor, so presumably the person doing the report agreed.

      There would be management responsibility too in various circumstances: if the department was extremely short-staffed, or the supervisor was inexperienced or ill or otherwise incapable of doing their job properly, or there weren’t any procedures at all, or even if it was consistently tacitly agreed that procedures could be ignored. The report made no suggestion that any of this pertained, apart from the fact that it was a bit busy over the Christmas period.

      You might say that the supervisor didn’t pick up the error on their (unauthorised) check because of IA. But the supervisor knew that the original person doing the work, whom they had chosen, was inexperienced, and should therefore have been checking with the real expectation of picking up an error.

      I reiterate that this analysis is based on memory!


    9. An interesting article.

      I showed this to Mrs (Dr) Weasel, who asked me to email this to her at work, since she agreed with me that it was relevant as she does a lot of this sort of thing.


