The contribution of Idries Shah to the understanding of cults and cult thinking

Until Idries Shah began writing on the subject, the word ‘cult’ tended to refer to a religious sect that was usually eccentric and possibly dangerous. It could include any quasi-religious or deeply political grouping in which the leader was idolised – hence the ‘cult of personality’ witnessed in former communist and other authoritarian countries. Cults, then, were definitely ‘other’ – we Westerners sitting by a cosy hearth were thankfully immune from such corruptions. Shah suggested otherwise.

In his book Knowing How to Know Shah wrote, “Man is a cult-making creature. That is to say, he formalises what could be constructive procedures and uses them for recreational purposes. This is done in the same way that beavers build dams whether they need a dam or not. Superficialists, ignoring this, applaud beavers unreservedly. In the human being this recreational tendency is bad, because instead of taking what diversion he needs and also working constructively towards self-fulfilment on a higher level, he may think that when he is playing he is fulfilling a ‘higher’ function.”

Shah realised that cultish thinking is a human default setting. It can happen anywhere a group starts to form, or it can happen in a lone individual’s mind. A prominent student and associate of his, Dr Arthur Deikman, took Shah’s insight into the prevalence of cultish thinking and developed it into the definitive work on cults, The Wrong Way Home[1]. In it, Deikman identified the human need to be looked after (characterised memorably in the book as the warm feeling of being driven home at night in the back of your parents’ car) as the start of cult thinking. It is an aberration of the childish requirement for protection and education carried beyond its functional period to become what Shah calls ‘recreation’ – that is, playing at being something you are not. In this case, playing at being a child who needs protective parents when you don’t.

It can start with simple mental laziness: the desire to construct a system, rigidify something true into a cosy generic answer to everything. Panacea thinking. The system has the answer, so you can relax in the mental backseat of the car. The problem is, it doesn’t stop there. Cults develop common features that are injurious to long-term well-being; they halt real growth of the individual and can lead to unwise decision-making: Deikman shows that the disastrous Bay of Pigs invasion of Cuba in 1961 was the result of cult thinking at the highest level of the US government.

How do you know cult thinking is happening? Certain factors always crop up. These include:

  • compliance with the group
  • dependence on the leader
  • devaluing the outsider
  • avoiding dissent
  • lacking interest in the real views of outsiders
  • not being critical of one’s own position
  • disapproving of those who leave the group
  • feeling self-righteous

Whenever you have one or two of the above, the others seem to follow. Those who routinely adopt an us-and-them mentality will welcome greater group compliance and a greater tendency to depend on the ‘leader figures’ in their lives. When one admires someone and becomes dependent on this leader figure, as if by magic a certain defensiveness, a feeling of being misunderstood by others, will develop. It’s the way we are wired: cult thinking is a hypertrophy of the ‘family instinct’ needed for survival and education at a certain point, but, like weaning from breast milk, a stage we don’t continually need to revisit.

Anything which involves teachers and study groups – like Sufism itself – is going to be subject to cult thinking on some level. All religious groups are particularly susceptible to it because the subject matter is ‘serious’ and humour deemed inappropriate. Yet, as Shah continually remarked, it is often humour that reveals the existence of cult thinking. In a later blog we will look more specifically at the role of humour in his work, but suffice it to say here, Shah was the first writer on Sufism in the West to lay such large emphasis on humour. As he wrote, “If you cannot laugh regularly and often you have no soul.” The Nasrudin corpus of jokes works like an anti-cult-thinking handbook, humour being used to deflect us from the rigid lines of thought and wrong destinations our minds can often take us to. Any reflection on cult behaviour must include acknowledging the absence of a certain kind of humour. There may be cynical, knowing in-jokes in the cult, but certain subjects – the leader, the main articles of faith, the position of outsiders – can never be treated humorously. Shah was fond of reminding people that the absence of real humour was often the first warning bell that you were in a cult, or engaged in cult thinking.

When Shah started writing in the 1960s his desire was that the findings of the last fifty years of psychological research be made known to ordinary non-academic people. Not least among these was the research into ‘brainwashing’ and conditioning, the method by which cult thinking is transmitted to others. In his BBC documentary One Pair of Eyes, Shah talks to the psychologist William Sargant, author of Battle for the Mind. Sargant remarks, “I have no doubt at all that suppose Hitler had conquered England and Hitler had then run all the public schools and all the secondary education, that perhaps 70% of the new generation in England could have been brought up with Hitlerite viewpoints.”

Research such as the infamous Milgram experiment[2] shows that our desire to follow an authority figure, a leader, our desire not to rock the boat, to conform, can be used against us. A moral form of education would alert us to this and try to outmanoeuvre cult thinking – which is precisely what Shah has done in his corpus of work. But when our natural tendency to belittle outsiders, conform, and over-venerate a leader is encouraged by a political party, corporation or other organisation, we should be wary of joining any such group. They may look ‘official’ and legitimate but they are actually using the same kind of manipulation as any unbalanced and marginal cult that springs to mind.

It is a childish dream to hope that someone is at the controls, driving that car or flying that plane. When journalist Craig Karpel managed to infiltrate a meeting of the infamous Bilderberg Group – a quasi-secret summit attended by the power elite of the West – he found that there was no conspiracy. The leaders – including Henry Kissinger, Helmut Kohl and David Rockefeller – were interested in stability and self-preservation, yet without the slightest idea of what to do beyond that. There is no master plan. When we realise that, we are less likely to seek a leader or teacher to live our lives for us.

Shah writes[3]: “The human tendency is to attach oneself to people and objects. The esotericist’s object is to help people find a ‘self’ which can attach itself to something more refined.” When we engage in cult thinking we ‘use up’ our mental capacity for improvement, settling for lower things – recreation and perhaps therapy, things that are much better sought directly.

The greater atomisation of the populace in current times does not mean that cult thinking is diminished. This is an important point: you can feel totally alone but still be engaged in cult thinking. It could take any form on a surface level – from an interest in architecture to one in zoology. If you venerate a leader figure (however distant), won’t hear a word said against the group (which you only belong to in your head), and take a strong anti position to any thinkers from outside; if you block or minimise dissent in your own mind about the subject and find certain things about it ‘just not funny’ – then you’re engaged in cult thinking.

As long as you know what’s going on you can outwit it. But, as Shah observed, just because you know you’re a moving target doesn’t stop you from being one.

Cult thinking is common to people the world over, but the current dominant Western culture brings its own pressure to bear, pressure that encourages cult thinking in order to promote harder work or to sell more products. It is common for new workers in law firms or management consultancies to work long hours in order to be rewarded with pizza and pep talks. These are classic conditioning techniques used by cults the world over.

Shah wrote[4] “Western people, whatever their ideology, spend much time in the engineering of belief, conviction, commitment – they keep changing the name, but the process is the same… I call it descriptive to mention this, and to offer a remedy: in the saying: ‘When you are most convinced: that is the time to use caution about your certainty.’”


[1] Arthur J. Deikman. The Wrong Way Home. Beacon Press, 1990.

[2] In 1963 Dr Stanley Milgram showed that 65% of people will keep administering electric shocks in an authority situation even after the subject stops shouting and is presumably unconscious, possibly dead.

[3] Idries Shah. Knowing How to Know. Octagon, 1998.

[4] Idries Shah. Learning How to Learn. Octagon, 1978.