What is Groupthink?
Groupthink, according to the International Encyclopedia of the Social Sciences (2008), is a term coined by Irving Janis (1972, 1982) to describe a set of beliefs and behaviors found in decision-making groups whose members are motivated to maintain internal consensus rather than to appraise information rationally.
From: http://www.psysr.org/about/pubs_resources/groupthink%20overview.htm
Jones used the 'foot in the door' technique to manipulate his congregation. Followers would first be asked to donate a small amount of their income to the Temple, but over time the amount required would rise until they had given all of their property and savings to Jones. The same applied to acts of devotion. When they first joined the church, members were asked to spend just a few hours each week working for the community. As time passed, these few hours expanded little by little until members were attending long services, helping to attract others into the organization, and writing letters to politicians and the media. By ratcheting up his requests slowly, Jones prepared his followers to make the ultimate sacrifice. But this technique is only successful if people do not draw a line in the sand and speak out against the increased demands (emphasis mine). The second psychological technique employed by Jones was designed to quell this potential rebellion.
From: http://jonestown.sdsu.edu/?page_id=61771
Trueblackanonymous YouTube channel
@BLKAnonymous
Symptoms of Groupthink
- Illusion of invulnerability -- Creates excessive optimism that encourages taking extreme risks.
- Collective rationalization -- Members discount warnings and do not reconsider their assumptions.
- Belief in inherent morality -- Members believe in the rightness of their cause and therefore ignore the ethical or moral consequences of their decisions.
- Stereotyped views of out-groups -- Negative views of the "enemy" make effective responses to conflict seem unnecessary.
- Direct pressure on dissenters -- Members are under pressure not to express arguments against any of the group's views.
- Self-censorship -- Doubts and deviations from the perceived group consensus are not expressed.
- Self-appointed 'mindguards' -- Members protect the group and the leader from information that is problematic or contradictory to the group's cohesiveness, views, and/or decisions.
From: http://www.psysr.org/about/pubs_resources/groupthink%20overview.htm
There are situations in which groupthink can be detrimental and even life-threatening, and the example that comes immediately to mind is the People's Temple. Jim Jones, the charismatic leader of the People's Temple, led a group of about 900 people to their deaths in Guyana, South America.
Now that sounds absolutely insane, right? Indeed it does. So the question is: how did Jim Jones manage to convince so many people to "drink the Kool-Aid"?
Getting a Foot in the Door
In a now-classic study carried out by Jonathan Freedman and Scott Fraser of Stanford University, researchers posed as volunteer workers and went from door to door, explaining that there was a high level of traffic accidents in the area and asking people if they would mind placing a sign saying 'DRIVE CAREFULLY' in their gardens. This was a significant request because the sign was very big and so would ruin the appearance of the person's house and garden. Perhaps not surprisingly, few residents agreed to display it. In the next stage of the experiment, the researchers approached a second set of residents and asked them to place a sign saying 'BE A SAFE DRIVER' in their garden. This time the sign was just three inches square, and almost everyone accepted. Two weeks later, the researchers returned and asked the second set of residents to display the much larger sign. Amazingly, over three quarters of them agreed to place the big, ugly placard. This approach, known as the 'foot in the door' technique, involves getting people to agree to a large request by first getting them to agree to a far more modest one.
All Together Now
In the 1950s, American psychologist Solomon Asch conducted a series of experiments into the power of conformity. Participants arrived at Asch's laboratory one at a time and were introduced to about six other volunteers. Unbeknownst to each participant, all of these other volunteers were actually stooges working for Asch. The group, made up of the participant and the stooges, was seated around a table and told that they were about to take part in a 'vision test'. They were then shown two cards. The first card had a single line on it, while the second card contained three lines of very different lengths, one of which was the same length as the line on the first card. The group was asked to say which of the three lines on the second card matched the line on the first card.

The group had been seated in such a way as to ensure that the genuine participant answered last. Everyone was asked to voice their answers, and each of the 'volunteers' always gave the same one. For the first two trials, all of the stooges gave the correct response when comparing the lines, while on the third trial the stooges all gave an incorrect answer. Asch wanted to discover what percentage of participants would conform to peer pressure and give an obviously incorrect answer in order to go along with the group. Amazingly, 75 percent of the people conformed. In a slight variation on the procedure, Asch had just one of the stooges break with the group and give a different answer. This one dissenting voice reduced the amount of conformity to around 20 percent.
People's Temple was a huge experiment in the psychology of conformity. Jones was aware that any dissent would encourage others to speak out and so tolerated no criticism (emphasis mine). To help enforce this regime, Jones had informers befriend those thought to be harboring doubts about the Temple, with any evidence of dissent resulting in brutal beatings or public humiliation. He also split up any groups that were likely to share their concerns with each other. Families were separated, with children first being seated away from their parents during services and later placed into the full-time care of other church members. Spouses were encouraged to participate in extramarital sexual relationships to loosen marital bonds. Similarly, the dense jungle around Jonestown ensured that the community was completely cut off from the outside world and had no way of hearing any dissenting voices from those not involved. The powerful and terrible effects of this intolerance of dissent emerged during the mass suicide. An audiotape of the tragedy revealed that at one point a woman openly declared that the babies deserved to live. Jones acted quickly to quell the criticism, stating that babies are even more deserving of peace and that 'the best testimony we can give is to leave this goddamn world'. The crowd applauded Jones, with one man shouting 'It's over, sister... We've made a beautiful day', and another adding, 'If you tell us we have to give our lives now, we're ready.'
But Jones was not just concerned with getting his foot in the door and quashing any dissent. He also employed a third psychological weapon to help control the minds of his followers -- he appeared to have a hotline to God and to be able to perform miracles.
Wonder of Wonders, Miracles of Miracles
Many people followed Jones because he appeared to be able to perform miracles. During services Jones would ask those suffering from any illnesses to make their way to the front of the church. Reaching into their mouths, he would dramatically pull out a horrid mass of 'cancerous' tissue and announce that they were now cured. Sometimes the lame would apparently be instantly healed, with Jones telling them to throw away their walking aids and dance back up the aisle. He also claimed to hear the voice of God, calling out to people in the congregation and accurately revealing information about their lives. On one occasion more people than expected turned up for a service, and Jones announced that he would feed the multitude by magically producing more food. A few minutes later, the door swung open and in walked a church member carrying two large trays filled with fried chicken.

It was all a sham. The 'cancers' were actually rancid chicken gizzards that Jones concealed in his hand prior to 'pulling' them from people's mouths. The curing of the 'lame' was staged by a small inner circle of highly devoted followers pretending that they couldn't walk. The information about the congregation was not God-given, but instead obtained by members of Jones' 'inner circle' sifting through people's rubbish bins for letters and other useful documentation. These individuals later described how they willingly assisted Jones because they believed he was conserving his genuine supernatural powers for more important matters. And the miracle of deep-fried chicken? One member of the congregation later described how he saw the bearer of the trays arrive at the church a few moments before the miracle, armed with several buckets of food from Kentucky Fried Chicken. When Jones found out about the comment he put a mild poison in a piece of cake, gave it to the dissenting church member, and announced that God would punish his lies by giving him vomiting and diarrhea.
So was Jones' mind control just about getting his foot in the door, creating conformity, and performing miracles? In fact, there was also the important issue of self-justification.
On Behaviour and Belief
In 1959 Stanford University psychologist Elliot Aronson conducted a revealing study into the relationship between belief and behaviour. Let's turn back the hands of time and imagine that you are a volunteer in that experiment.

When you arrive at Aronson's laboratory, a researcher asks you whether you would mind participating in a group discussion about the psychology of sex. Drooling, you say that you are open to the idea. The researcher then explains that some people have become very self-conscious during the discussion, and so now all potential volunteers have to pass an 'embarrassment' test. You are handed a long list of highly evocative words (including many containing four letters) and two passages containing vivid descriptions of sexual activity. The researcher asks you to read both the list and the passages out loud, while he rates the degree to which you are blushing. After much sanctioned cursing, the researcher says that the good news is that you have passed the test and so can now take part in the group discussion. However, the bad news is that the 'embarrassment' test has taken longer than anticipated, so the discussion has already started and you will just have to listen to the group this time around. The researcher shows you into a small cubicle, explains that all of the group members sit in separate rooms to ensure anonymity, and asks you to wear some headphones. You don the headphones and are rather disappointed to discover that, after all you have been through, the group is having a rather dull discussion about a book called Sexual Behavior in Animals. Finally, the researcher returns and asks you to rate the degree to which you want to join the group.
Like many psychology experiments, Aronson's study involved a considerable amount of deception. In reality, the entire experiment was not about the psychology of sex, but the psychology of belief. When participants arrived at the laboratory they were randomly assigned to one of two groups. Half of them went through the procedure described above, and were asked to read out highly evocative word lists and graphic passages. Those in the other group were asked to read out far less emotionally charged words (think 'prostitute' and 'virgin'). Everyone then heard the same recorded group discussion and was asked to rate the degree to which they valued being a member of the group. Most psychologists in Aronson's day would have predicted that those who underwent the more embarrassing procedure would end up liking the group less because they would associate it with a highly negative experience. However, Aronson's work into the psychology of self-justification had led him to expect a quite different set of results. Aronson speculated that those who had read out the more evocative sexual material would justify their increased embarrassment by convincing themselves that the group was worth joining, and end up thinking more highly of it. Aronson's predictions proved correct. Even though everyone had heard the same recording of the group discussion, those who underwent the more extreme embarrassment test rated joining the group as far more desirable than those in the 'prostitute and virgin' group.
Aronson's findings help explain why many groups demand that potential members undergo painful and humiliating initiation rituals. American college fraternities make freshmen eat unpleasant substances or strip naked, the military puts new recruits through extreme training, and medical interns are expected to work night and day before becoming fully fledged doctors. Jones used the same tactics to encourage people to feel committed to People's Temple. Members of the congregation had to endure long meetings, write self-incriminating letters, give their property to the Temple, and allow their children to be raised by other families. If Jones suspected someone of behaving in a way that was not in the interests of the Temple, he would ask other members of the congregation to punish them. Common sense would predict that these acts would drive people away from both Jones and People's Temple. In reality, the psychology of self-justification ensured that they actually moved people closer to the cause.
The mind control exhibited by the likes of Jim Jones does not involve hypnotic trances or prey only on the suggestible. Instead, it uses four key principles. The first involves a slow ratcheting up of involvement. Once a cult leader has his foot in the door, he asks for ever greater levels of involvement until suddenly followers find themselves fully immersed in the movement. Second, any dissenting voices are removed from the group. Skeptics are driven away and the group is increasingly isolated from the outside world. Then there are the miracles. By appearing to perform the impossible, cult leaders often convince their followers that they have direct access to God and therefore should not be questioned. Finally, there is self-justification. You might imagine that asking someone to carry out a bizarre or painful ritual would encourage them to dislike the group. In reality, the opposite is true. By taking part in these rituals, followers justify their suffering by adopting more positive attitudes toward the group.
Of course, it would be nice to think that if the group had not been so isolated from society, it might have been possible to undo the effects of these techniques, explain to members the madness of their ways, and avert a major tragedy. However, our final sojourn into the world of cults suggests that this is a naïve view of those who have fallen under the spell of a charismatic leader.
How to Avoid Being Brainwashed
- Do you feel as if the 'foot in the door' technique might be at work? Did the organization or person start by asking you to carry out small acts of commitment or devotion, then slowly increase their requirements? If so, do you really want to go along with their requests, or are you being manipulated?
- Be wary of any organization that attempts to distance you from dissenting points of view. Are they trying to cut you off from friends and family? Within the organization, are dissent and open discussion quashed? If the answer to either of these questions is 'yes', think carefully about any involvement.
- Does the leader of the organization claim to be able to achieve paranormal miracles? Perhaps healing or acts of prophecy? However impressive, these are likely to be the result of self-delusion or deception. Don't be swayed by supernatural phenomena until you have investigated them for yourself.
- Does the organization require any painful, difficult, or humiliating initiation rituals? Remember that these may well be designed to manufacture an increased sense of group allegiance. Ask yourself whether any suffering is really needed.
From: http://jonestown.sdsu.edu/?page_id=61771
Trueblackanonymous YouTube channel