<<

The Palmer Raids (the raids of the 1920s). United States history. Written by: Gregory Dehler. Alternative title: Palmer Red Raids

Palmer Raids, also called Palmer Red Raids, raids conducted by the U.S. Department of Justice in 1919 and 1920 in an attempt to arrest foreign anarchists, communists, and radical leftists, many of whom were subsequently deported. The raids, fueled by social unrest following World War I, were led by Attorney General A. Mitchell Palmer and are viewed as the climax of that era’s so-called Red Scare.

A. Mitchell Palmer. Harris and Ewing Collection/Library of Congress, Washington, D.C. (digital file no. LC-DIG-hec-16294)

The emotional pitch of World War I did not abate with the armistice, and rampant inflation, unemployment, massive and violent strikes, and brutal race riots in the United States contributed to a sense of fear and foreboding in 1919. A mail bomb plot, consisting of 36 explosive packages designed to go off on May Day 1919, triggered a grave fear that a Bolshevik conspiracy sought the overthrow of the United States. On June 2, 1919, a second series of bombings took place, destroying Palmer’s home and leading to increased public pressure for action against the radical agitators.

Palmer was a latecomer to the anticommunist cause and had a history of supporting civil liberties. However, he was ambitious to obtain the Democratic nomination for the presidency in 1920 and believed that he could establish himself as the law-and-order candidate. Together with J. Edgar Hoover, Palmer created the General Intelligence Division in the Federal Bureau of Investigation and secured an increase in funds from Congress to devote to anticommunist activities by the Justice Department.

On November 7, 1919 (the second anniversary of the Bolshevik takeover of Russia), U.S. federal and local authorities raided the headquarters of the Union of Russian Workers in New York City and arrested more than 200 individuals. On November 25 a second raid on the Union of Russian Workers headquarters unveiled a false wall and a bomb factory, confirming suspicions that the union harboured revolutionary intentions. Palmer believed that the way to deal with the radicals was to deport the immigrants. On December 21, 249 radicals, including the anarchist Emma Goldman, were packed aboard the USS Buford, which the press dubbed the Soviet Ark, and deported to Russia.

On January 2, 1920, the most spectacular of the Palmer Raids took place, when thousands of individuals (estimates vary between 3,000 and 10,000) were arrested in more than 30 cities. The following day, federal, state, and local agents conducted further raids. In all the Palmer Raids, arrests greatly exceeded the number of warrants that had been obtained from the courts, and many of those arrested were guilty of nothing more than having a foreign accent. Palmer declared the raids a success but announced that the work was far from done. He claimed that there were still more than 300,000 dangerous communists inside the United States.

Local authorities lacked the facilities to hold the arrestees from the January raids, and Palmer sent a large number of suspected radicals to the Bureau of Immigration for deportation. Acting Secretary of Labor Louis Post, however, did not share Palmer’s fear of radical aliens and reversed more than 70 percent of the 1,600 deportation warrants. Meanwhile, American public opinion shifted under Palmer’s feet. As news of the brutality of the raids became public and the constitutionality of the actions was brought into question, many, including the National Civil Liberties Bureau, publicly challenged Palmer’s actions. Palmer’s unfulfilled dire predictions of a May Day 1920 uprising destroyed his credibility with the public, diminishing the Red Scare and ending the Palmer Raids.

Fearing a communist revolutionary takeover of the U.S., Attorney General A. Mitchell Palmer set in motion a hunt for conspirators, called the Palmer Raids.

 The Red Scare

Though the U.S. and the Allies won World War I in 1918, there was still fear in the air. Influential journalist Walter Lippmann wrote, 'We seem to be the most frightened victors the world ever saw.' So what were Americans so scared of?

In 1917 the Bolsheviks (nicknamed the 'Reds') took control of Russia and created the world's first communist state. They declared their goal of fomenting a worldwide communist revolution. Many Americans were sympathetic to the Bolsheviks' anti-capitalist views: socialists were elected as mayors and council members in many U.S. cities, and the socialist candidate for president, Eugene V. Debs, languishing in federal prison for criticizing America's role in World War I, received almost 1 million votes in the 1920 election. Radical labor unions challenged their employers with a series of strikes and violent confrontations. And anarchists detonated bombs in courthouses and police stations, killing dozens. This set off the Red Scare: a fear of revolutionary violence in the United States.

Many Americans feared revolutionary violence during the Red Scare

When a bomb destroyed the front of the U.S. Attorney General's house in June 1919, he vowed to stop the pending anti-capitalist revolution in America. The Attorney General, a man named A. Mitchell Palmer, proclaimed that a 'blaze of revolution' was 'sweeping over every American institution of law and order.'

The aftermath of the bombing of the house of Attorney General A. Mitchell Palmer

 The Palmer Raids

Beginning in 1919 and continuing through 1920, Palmer spearheaded a nationwide hunt for potential revolutionaries. He argued that anyone who might harbor ideas that could lead to violence should be detained and deported, even if they hadn't committed an actual crime. Palmer set up a Radical Division within the Justice Department, and the new division's sole aim was to seek out revolutionaries.

 The Palmer Raids targeted radical labor unions such as the Industrial Workers of the World and the Union of Russian Workers. Federal officials also roughed up and arrested immigrants, socialists, and communists.

The largest of the Palmer Raids occurred in January 1920, when Palmer's henchmen broke down doors in over 30 cities. Almost 6,000 people were arrested, many taken from their homes without arrest warrants. Though Palmer claimed to have captured a variety of conspirators and subversives, the truth was something different. Most of the 500 people who were deported as a result of the raids were nothing more than intellectual radicals who had committed no crime and participated in nothing illegal. But Americans were scared, so they allowed Palmer to act with impunity.

Fear-mongers like A. Mitchell Palmer usually go too far, and Palmer did just that when he announced that he had uncovered a revolutionary conspiracy set to be unleashed on May 1, 1920. Palmer predicted an orgy of violence and destruction. Heeding his warning, states sent out militias, bomb squads, and police en masse.

A Byte Out of History: The Palmer Raids (12/28/07)

The bomb hit home, both literally and figuratively. On June 2, 1919, a militant anarchist named Carlo Valdinoci blew up the front of newly appointed Attorney General A. Mitchell Palmer’s home in Washington, D.C.—and blew himself up in the process when the bomb exploded too early. A young Franklin and Eleanor Roosevelt, who lived across the street, were also shaken by the blast. The bombing was just one in a series of coordinated attacks that day on judges, politicians, law enforcement officials, and others in eight cities nationwide. About a month earlier, radicals had also mailed bombs to the mayor of Seattle and a U.S. Senator, blowing the hands off the senator’s domestic worker. The next day, a postal worker in New York City intercepted 16 more packages addressed to political and business leaders, including John D. Rockefeller.

The destruction caused by the bombing of Attorney General Palmer’s home.

It was already a time of high anxiety in America—driven by a deadly wave of the pandemic flu, the Bolshevik revolution in Russia and the ensuing over-hyped “Red Scare,” and sometimes violent labor strikes across the country. The public demanded a response to the bombings, and the Attorney General—who had his eye on the White House in 1920—was ready to oblige. He created a small division to gather intelligence on the radical threat and placed a young Justice Department lawyer named J. Edgar Hoover in charge. Hoover collected and organized every scrap of intelligence gathered by the Bureau of Investigation (the FBI’s predecessor) and by other agencies to identify anarchists most likely involved in violent activity. The young Bureau, meanwhile, continued to investigate those responsible for the bombings.

Later that fall, the Department of Justice began arresting, under recently passed laws like the Sedition Act, suspected radicals and foreigners identified by Hoover’s group, including the well-known leaders Emma Goldman and Alexander Berkman. In December, with much public fanfare, a number of radicals were put on a ship dubbed the “Red Ark” or “Soviet Ark” by the press and deported to Russia.

At this point, though, politics, inexperience, and overreaction got the better of Attorney General Palmer and his department. Hoover—with the encouragement of Palmer and the help of the Department of Labor—started planning a massive roundup of radicals. By early January 1920, the plans were ready. The department organized simultaneous raids in major cities, with local police called on to arrest thousands of suspected anarchists. But the ensuing “Palmer Raids” turned into a nightmare, marked by poor communications, planning, and intelligence about who should be targeted and how many arrest warrants would be needed. The constitutionality of the entire operation was questioned, and Palmer and Hoover were roundly criticized for the plan and for their overzealous domestic security efforts.

The “Palmer Raids” were certainly not a bright spot for the young Bureau. But it did gain valuable experience in terrorism investigations and intelligence work and learn important lessons about the need to protect civil liberties and constitutional rights. Today, as Director Mueller has said, we realize that the FBI will be judged not just on how well it protects the nation, but also on how well it protects our nation’s constitutional freedoms along the way. We are committed to doing both.

Rwandan Genocide Over the past fifty years, conflict between the Hutu and Tutsi in postcolonial Rwanda has resulted in over one million deaths and a series of horrific genocides. Between the years of 1959 and 1994, "the idea of genocide, although never officially recognized, became a part of life" (Melvern, 9). The early political violence escalated into the horrific massacre of 800,000 people in 1994. Present day Rwanda still struggles to come to terms with the aftermath of the conflict.

The terms "Hutu" and "Tutsi" were first given significance by the early white explorers. As both Hutu and Tutsi spoke the same language, practiced the same religion, and participated in the same government, ethnographers claim that the two groups "cannot be called distinct ethnic groups" (Gourevitch, 48). Rather, the two terms referred to a caste system in which the Hutu were primarily farmers and the Tutsi were primarily herdsmen. In the Kigali Genocide Memorial, historians even claim that this status was not always permanent; by owning ten or more cattle a Hutu could become a Tutsi and vice versa. However, British explorer John Hanning Speke interpreted these terms in a different way. In what is now known as the Hamitic Myth, Speke believed that the Hutu were a typical specimen of a "primitive race," "the true curly- headed, flab- nosed, pouched- mouthed Negro" while the Tutsi were "descended from the best blood of Abyssinia" and therefore far superior (Speke, Journal of the Discovery of the Source of the Nile).

Soon after, Rwanda became a German colony. After the German defeat in World War I, Rwanda-Burundi was given to Belgium by the League of Nations. When Europeans first arrived in Rwanda, they found a complex and well organized semi-feudal society with a strong monarchy (Melvern, 5). In order to effectively control Rwanda, the Belgian colonialists issued identity cards in 1933, "arbitrarily [classifying] the whole population as Hutu, Tutsi or Twa" by measuring qualities such as height, length of nose and eye shape. During Belgian rule, Tutsis were favored for all administrative positions and Hutus were actively discriminated against.

In 1957, the Hutu Manifesto was published, calling for majority rule and blaming Rwandan problems on Tutsi superiority. At this point, public opinion changed and many Belgians started to support the Hutu majority. In 1959, the Rwandan king died under mysterious circumstances while being treated by a Belgian doctor. Tutsi feared that this was part of a Hutu plot to gain power and began trying to destroy emerging Hutu leaders. After a young Tutsi attacked a Hutu leader, widespread murders of Tutsi began. Thousands of Tutsi were killed and thousands more families had to flee the country. Many petitions sent to the UN reported "burning and killing being done in the daylight sometimes in the presence of the so-called police" (excerpt from Melvern, 7), indicating that killings were planned and organized. When the UN sent forces to investigate the reports, officials reported seeing "racism that 'bordered on Nazism against Tutsi minorities'" (Melvern, 7). Under these circumstances, Rwanda became independent and Gregoire Kayibanda, a Hutu teacher, became president. Kayibanda encouraged violence against Tutsi and spread frequent rumors of Tutsi plots. During his reign, there were several genocides killing an estimated 10,000 to 14,000 Tutsi. A former Hutu military officer, Juvenal Habyarimana, ousted Kayibanda in 1973. Habyarimana was initially welcomed by both Tutsi and Hutu but quickly asserted total control over the country, enforcing a one-party system, restricting movements, and encouraging divisions between Tutsi and Hutu. Violence, discrimination, and corruption quickly became government policy. When the RPF (Rwandan Patriotic Front), a small army comprised of Rwandan refugees, attacked the Rwandan border in 1990, Habyarimana reacted with panic. The government staged attacks on the capital to rouse Hutu citizens to action against Tutsi citizens. Relations between the Tutsi and Hutu deteriorated even more, leading to the massive genocide of 1994.

On April 6, 1994, President Habyarimana's plane was shot down, killing the presidents of Rwanda and Burundi. Within hours of the plane crash, Tutsi massacres began. In Kigali, the international peacekeeping forces were badly organized and not united. Belgian soldiers guarding the house of the man set to become the next president were shot and killed. The Presidential Guard, loyal to Habyarimana, surrounded the future president's house and killed him and his family, claiming they were Tutsi supporters. After the former government had been disabled, the Hutu Power party took control of the city and country and began to urge people to "do their work" and kill the Tutsi men, women, and children. By this point, it was clear that the UN and international forces were powerless or unwilling to stop the events unfolding. Over the next hundred days, between 800,000 and one million Rwandans would be killed by their neighbors, friends, families, teachers, and priests.

While at first glance this conflict is often dismissed as an ethnic issue, involving only the Hutu and Tutsi, there were clearly many more parties involved. The primary parties in this conflict were the Hutu Power party and their supporters, and Rwandan Tutsi and other Rwandan citizens of the time. Many secondary parties further fueled the conflict. To the south, Burundi was made up of both Hutu and Tutsi, groups that would sympathize with their counterparts in Rwanda. West of Rwanda, Zaire, now the Democratic Republic of the Congo, was ruled by the dictator Mobutu, a close personal friend of Habyarimana, who blatantly sympathized with the Hutu. Mobutu, infamous for pocketing European and American aid, would encourage the conflict as a means of bringing in money for refugee camps and programs. Mobutu was also closely allied with France and, with French help, would support Hutu forces during the genocide and take a strong position against the RPF and Tutsi after the genocide. The Anglophone and Francophone conflict throughout Africa would play an important role in Rwanda, as Francophone countries and supporters would provide help and support for the Hutu. France's stance against the current Rwandan president, Paul Kagame, has only served to further delay reconciliation. Tertiary parties, intended as international monitors and peacekeepers, had an important, and not always positive, role in the outcome of the conflict. The United Nations, despite admirable efforts by General Dallaire, did not have the necessary support, troops, and supplies to stop the genocide. In addition, NGOs and the UN funded refugee camps in Zaire, allowing thousands of genocidaires to escape justice and regroup for attacks. The belated intervention and peacekeeping attempts escalated the conflict and increased the death toll.

Despite some of the ulterior motives of the secondary and tertiary parties, the Rwandan conflict is mostly seen through identity frames. The conflict was dismissed by the majority of the world as an ethnic issue and, as a tribal issue, one that outside parties did not have any responsibility to mediate. Secondary parties, such as France and Francophone colonies, viewed the conflict through identity frames and power frames. The close personal relationships between the presidents of France and Zaire and Habyarimana in Rwanda led to a close alliance between these French-speaking nations. These countries feared that any change of government in Rwanda would bring in a group of people that were not willing to maintain the cultural and financial ties with France. Although there was nothing to suggest that another government would have strengthened Anglophone ties, France and Francophone countries were more concerned with maintaining the status quo and limiting the potential for change in Rwanda. These countries framed the conflict in terms of power frames as well, deciding that the best way to maintain their own positions and power was to support the "old boys' network" in Africa. Finally, the primary groups involved also saw this as an identity conflict, using characterization and identity frames to view the conflict. In a perversion of the original Hamitic myth, Hutu Power and similar groups believed the Tutsi were immigrants to Rwanda and the Hutu were the original inhabitants. Tutsi were largely stereotyped, described as "cockroaches" and other vermin that required extermination. The Hutu Power movement strongly stereotyped the Tutsi and felt they threatened their own position.

These stereotypes were reinforced by the media portrayal of the groups and, after the President's plane was shot down, Hutu Power-controlled media easily persuaded people to take up their weapons and kill Rwandan Tutsi. The massacre lasted only 100 days before the RPF army invaded and stopped it, but close to 1 million Tutsi were killed. Today, the government is headed by Paul Kagame, the leader of the RPF force which entered the country to abruptly stop the genocide. Still, the majority of the people in Rwanda during 1994 were at least indirectly either involved in genocide or knew victims of genocide. A UNICEF survey estimated that 5 out of 6 children in Rwanda at that time had witnessed bloodshed (Gourevitch, 224). People left in Rwanda were deeply scarred by their experiences. As one survivor described it, "people come to Rwanda and talk of reconciliation... imagine talking to the Jews of reconciliation in 1946" (Gourevitch, 240). Understandably, Rwanda remains in a stage of post-conflict peace-building. Under normal circumstances, anyone who committed murder would be tried and punished. But, despite the large numbers of genocidaires in prison, the majority of the genocidaires are free. President Kagame has focused on trying and punishing mostly the masterminds of the genocide, instead of a huge percentage of the population. For the rest of the population involved, Rwanda has adopted the policy of trying to teach and redeem people who committed war crimes. Even today, villages have weekly trials or meetings to try to promote reconciliation and deal with past crimes of village members.

Although Rwandans face a daunting task, an amazing amount of progress has been made rebuilding the country. There is a long history of violence and prejudice to overcome, but Rwanda's future looks promising.

THE RWANDAN GENOCIDE From April to July 1994, members of the Hutu ethnic majority in the east-central African nation of Rwanda murdered as many as 800,000 people, mostly of the Tutsi minority. Begun by extreme Hutu nationalists in the capital of Kigali, the genocide spread throughout the country with staggering speed and brutality, as ordinary citizens were incited by local officials and the Hutu Power government to take up arms against their neighbors. By the time the Tutsi-led Rwandese Patriotic Front gained control of the country through a military offensive in early July, hundreds of thousands of Rwandans were dead and many more displaced from their homes. The RPF victory created 2 million more refugees (mainly Hutus) from Rwanda, exacerbating what had already become a full-blown humanitarian crisis.

BACKGROUND: ETHNIC TENSIONS IN RWANDA

By the early 1990s, Rwanda, a small country with an overwhelmingly agricultural economy, had one of the highest population densities in Africa. About 85 percent of its population is Hutu; the rest is Tutsi, along with a small number of Twa, a Pygmy group who were the original inhabitants of Rwanda. Part of German East Africa from 1894 to 1918, Rwanda came under the League of Nations mandate of Belgium after World War I, along with neighboring Burundi. Rwanda’s colonial period, during which the ruling Belgians favored the minority Tutsis over the Hutus, exacerbated the tendency of the few to oppress the many, creating a legacy of tension that exploded into violence even before Rwanda gained its independence. A Hutu revolution in 1959 forced as many as 300,000 Tutsis to flee the country, making them an even smaller minority. By early 1961, victorious Hutus had forced Rwanda’s Tutsi monarch into exile and declared the country a republic. After a U.N. referendum that same year, Belgium officially granted independence to Rwanda in July 1962.

DID YOU KNOW? In September 1998, the International Criminal Tribunal for Rwanda (ICTR) issued the first conviction for genocide after a trial, declaring Jean-Paul Akayesu guilty for acts he engaged in and oversaw as mayor of the Rwandan town of Taba.

Ethnically motivated violence continued in the years following independence. In 1973, a military group installed Major General Juvenal Habyarimana, a moderate Hutu, in power. The sole leader of Rwandan government for the next two decades, Habyarimana founded a new political party, the National Revolutionary Movement for Development (NRMD). He was elected president under a new constitution ratified in 1978 and reelected in 1983 and 1988, when he was the sole candidate. In 1990, forces of the Rwandese Patriotic Front (RPF), consisting mostly of Tutsi refugees, invaded Rwanda from Uganda. A ceasefire in these hostilities led to negotiations between the government and the RPF in 1992. In August 1993, Habyarimana signed an agreement at Arusha, Tanzania, calling for the creation of a transition government that would include the RPF. This power-sharing agreement angered Hutu extremists, who would soon take swift and horrible action to prevent it.

GENOCIDE

On April 6, 1994, a plane carrying Habyarimana and Burundi’s president Cyprien Ntaryamira was shot down over Kigali, leaving no survivors. (It has never been conclusively determined who the culprits were. Some have blamed Hutu extremists, while others blamed leaders of the RPF.) Within an hour of the plane crash, the Presidential Guard together with members of the Rwandan armed forces (FAR) and Hutu militia groups known as the Interahamwe (“Those Who Attack Together”) and Impuzamugambi (“Those Who Have the Same Goal”) set up roadblocks and barricades and began slaughtering Tutsis and moderate Hutus with impunity. Among the first victims of the genocide were the moderate Hutu Prime Minister Agathe Uwilingiyimana and her 10 Belgian bodyguards, killed on April 7. This violence created a political vacuum, into which an interim government of extremist Hutu Power leaders from the military high command stepped on April 9.

The mass killings in Rwanda quickly spread from Kigali to the rest of the country, with some 800,000 people slaughtered over the next three months. During this period, local officials and government-sponsored radio stations called on ordinary Rwandan civilians to murder their neighbors. Meanwhile, the RPF resumed fighting, and civil war raged alongside the genocide. By early July, RPF forces had gained control over most of the country, including Kigali. In response, more than 2 million people, nearly all Hutus, fled Rwanda, crowding into refugee camps in the Congo (then called Zaire) and other neighboring countries.

After its victory, the RPF established a coalition government similar to that agreed upon at Arusha, with Pasteur Bizimungu, a Hutu, as president and Paul Kagame, a Tutsi, as vice president and defense minister. Habyarimana’s NRMD party, which had played a key role in organizing the genocide, was outlawed, and a new constitution adopted in 2003 eliminated reference to ethnicity. The new constitution was followed by Kagame’s election to a 10-year term as Rwanda’s president and the country’s first-ever legislative elections.

INTERNATIONAL RESPONSE

As in the case of atrocities committed in the former Yugoslavia around the same time, the international community largely remained on the sidelines during the Rwandan genocide. A U.N. Security Council vote in April 1994 led to the withdrawal of most of a U.N. peacekeeping operation (UNAMIR) created the previous fall to aid with governmental transition under the Arusha accord. As reports of the genocide spread, the Security Council voted in mid-May to supply a more robust force, including more than 5,000 troops. By the time that force arrived in full, however, the genocide had been over for months. In a separate French intervention approved by the U.N., French troops entered Rwanda from Zaire in late June. In the face of the RPF’s rapid advance, they limited their intervention to a “humanitarian zone” set up in southwestern Rwanda, saving tens of thousands of Tutsi lives but also helping some of the genocide’s plotters–allies of the French during the Habyarimana administration–to escape.

In the aftermath of the Rwandan genocide, many prominent figures in the international community lamented the outside world’s general obliviousness to the situation and its failure to act in order to prevent the atrocities from taking place. As former U.N. Secretary-General Boutros Boutros-Ghali told the PBS news program “Frontline”: “The failure of Rwanda is 10 times greater than the failure of Yugoslavia. Because in Yugoslavia the international community was interested, was involved. In Rwanda nobody was interested.” Attempts were later made to rectify this passivity. After the RPF victory, the UNAMIR operation was brought back up to strength; it remained in Rwanda until March 1996, as one of the largest humanitarian relief efforts in history.

In October 1994, the International Criminal Tribunal for Rwanda (ICTR), located in Tanzania, was established as an extension of the International Criminal Tribunal for the former Yugoslavia (ICTY) at The Hague, the first international tribunal since the Nuremberg Trials of 1945-46 and the first with the mandate to prosecute the crime of genocide. In 1995, the ICTR began indicting and trying a number of higher-ranking people for their role in the Rwandan genocide; the process was made more difficult because the whereabouts of many suspects were unknown. The trials continued over the next decade and a half, including the 2008 conviction of three former senior Rwandan defense and military officials for organizing the genocide.

Japanese-American Internment Camps

Two months after the Japanese bombing of Pearl Harbor, U.S. President Franklin D. Roosevelt signed Executive Order 9066 ordering all Japanese-Americans to evacuate the West Coast. This resulted in the relocation of approximately 120,000 people, many of whom were American citizens, to one of 10 internment camps located across the country. Traditional family structure was upended within the camps, as only American-born children were allowed to hold positions of authority. Some Japanese-American citizens of undisputed loyalty were allowed to return to the West Coast beginning in 1945, and the last camp closed in March 1946. In 1988, Congress awarded restitution payments to each survivor of the camps.

The relocation of Japanese-Americans into internment camps during World War II was one of the most flagrant violations of civil liberties in American history. According to the census of 1940, 127,000 persons of Japanese ancestry lived in the United States, the majority on the West Coast. One-third had been born in Japan, and in some states could not own land, be naturalized as citizens, or vote. After Japan bombed Pearl Harbor in December 1941, rumors spread, fueled by race prejudice, of a plot among Japanese-Americans to sabotage the war effort. In early 1942, the Roosevelt administration was pressured to remove persons of Japanese ancestry from the West Coast by farmers seeking to eliminate Japanese competition, a public fearing sabotage, politicians hoping to gain by standing against an unpopular group, and military authorities.

On February 19, 1942, Roosevelt signed Executive Order 9066, which forced all Japanese-Americans, regardless of loyalty or citizenship, to evacuate the West Coast. No comparable order applied to Hawaii, one-third of whose population was Japanese-American, or to Americans of German and Italian ancestry. Ten internment camps were established in California, Idaho, Utah, Arizona, Wyoming, Colorado, and Arkansas, eventually holding 120,000 persons. Many were forced to sell their property at a severe loss before departure. Social problems beset the internees: older Issei (immigrants) were deprived of their traditional respect when their children, the Nisei (American-born), were alone permitted to hold positions of authority within the camps. Some 5,589 Nisei renounced their American citizenship, although a federal judge later ruled that renunciations made behind barbed wire were void. Some 3,600 Japanese-Americans entered the armed forces from the camps, as did 22,000 others who lived in Hawaii or outside the relocation zone. The famous all-Japanese 442nd Regimental Combat Team won numerous decorations for its deeds in Italy and Germany.

The Supreme Court upheld the legality of the relocation order in Hirabayashi v. United States and Korematsu v. United States. Early in 1945, Japanese-American citizens of undisputed loyalty were allowed to return to the West Coast, but not until March 1946 was the last camp closed. A 1948 law provided for reimbursement for property losses by those interned. In 1988, Congress awarded restitution payments of twenty thousand dollars to each survivor of the camps; it is estimated that about 73,000 persons will eventually receive this compensation for the violation of their liberties.

10 Home Raids

Soon after the bombing of Pearl Harbor, FBI agents raided homes of the Issei (i.e., first-generation immigrants from Japan). The American government also froze the assets of anyone connected to Japan. These actions violated people’s rights to their property, invaded people’s privacy, and resulted in the arrest of 1,212 innocent Issei—and this was just the initial round-up.

Irreplaceable family heirlooms were confiscated, never to be returned. Potentially dangerous items and objects with a special connection to Japan were labeled “contraband.” Possession of contraband was illegal because it showed allegiance to the enemy. Anyone caught holding on to their precious family keepsakes was arrested.

Targets included first-generation immigrants and Japanese-American citizens—farmers, teachers, business owners, doctors, bankers, and various other productive members of society. Many had already had their assets frozen on July 26, 1941, in response to a Japanese invasion in Asia months before the Pearl Harbor bombings.

These freezes, forcible seizures of property, and undeserved arrests were only the beginning of injustices experienced by loyal Japanese Americans.

9 Forced Evacuation

Registration was the first step to evacuation. After registering, Japanese Americans were expected to follow strict rules, such as curfew and travel restrictions. They were eventually ordered to abandon their homes. Those whose assets weren’t frozen weren’t given long to sell their businesses and property. Belongings were let go at a fraction of their worth, if they could be sold at all.

Some Japanese Americans avoided this fate by moving farther east. Approximately 150,000 Hawaiians also avoided internment. Almost 40 percent of Hawaiian islanders were Japanese-American. Though the racial connection caused fear, powerful Hawaiians demanded that those of Japanese ancestry be left alone. Many labored on the pineapple and sugar plantations, and they were essential to local economic success. The population of the West Coast received no such protection, and they suffered immensely for it.

8 Assembly Centers Built For Animals


When evacuated, Japanese Americans were only allowed to take what they could carry. Each internee was sent to one of 16 assembly centers. From there, they were assigned to one of 10 internment camps. The most well-known is Manzanar War Relocation Center.

Racetracks and fairgrounds were the types of environments used for the assembly centers. Internees stayed in animal stables and stalls where livestock had been kept recently. The stench of manure rose up from the ground, dust blew inside, and people were forced to literally live like animals. Many of these units didn’t even have roofs overhead. Health care, food, and general cleanliness were disgustingly low-quality.

7 Communal Living


The treatment of interned Japanese Americans was similar to that of European POWs. Sometimes, family members would be sent to separate barracks, and sometimes they’d be sent to separate camps.

Internees were forced to share living quarters with strangers. They could not even get dressed in privacy. Since the barracks didn’t include toilets, everyone had to wait in line to use communal latrines, which did not include partitions. Showers were taken in open areas, meant to service many people at once, rather than to accommodate modesty. Even running water had to be acquired from a communal source.

Living in close quarters and sharing so much, on top of the terrible housing conditions, gave easy rise to sickness. Proper medical care was rarely accessible. Numerous people died or experienced great suffering for lack of the necessary medical treatment. The physical and emotional trauma from internment became a permanent part of the people’s lives.

6 The All-Japanese Regiment

Not even World War I veterans, who’d fought courageously and honorably for the American cause, could avoid being interned. They were pigeonholed as enemy aliens.

One way to escape the camps, however, was to enlist in the 442nd Infantry Regimental Combat Team. Everyone in the regiment was Japanese-American. Many who enlisted saw it as an opportunity to prove their loyalty to America. Internees were classified as 4-C, or enemy aliens, whereas soldiers were seen as allegiant to America. Some camps protested the enlistment of their internees, believing that the 442nd would be sent only on the most dangerous missions. The military still found all the volunteers it needed.

Soldiers of the 442nd showed unbelievable bravery and are highly renowned to this day. During the war, 650 of them died. Twenty in the regiment received the Medal of Honor—in the year 2000.

5 Desert Prisons

Most of the assembly centers and camps were built on barren land. Internees tried growing crops in the desert, as the government wanted the camps to support themselves, but it didn’t always work.

The interned Japanese Americans were paid low wages for their labor. In the summer, desert heat sent the temperature above 38 degrees Celsius (100 °F), and winter temperatures plummeted below freezing. People who’d done nothing wrong were kept behind barbed-wire fences, in desolate camps patrolled by military police. Armed guards kept constant watch and shot anyone suspected of attempting escape. “Troublemakers” were separated from their families and sent to more unpredictable environments. The government paid little attention to the grievances of Japanese Americans. Allegedly, one of the reasons for internment was to protect the inmates from an American public hostile and violent toward the Japanese. But one internee, famously illustrating how it felt to be forced into these camps, posed the question: “If we were put there for our protection, why were the guns at the guard towers pointed inward, instead of outward?”

4 Death As Punishment

Attempting escape, resisting orders, and treason were all punishable by death in internment camps. Guards would face little consequence for killing without just cause.

A mentally ill man in his mid-forties, Ichiro Shimoda, was shot trying to escape in 1942. He’d attempted suicide twice since entering the camp, and the guards were well aware of his mental illness. That same year, two Californians were killed during an alleged escape attempt from the Lordsburg, New Mexico, camp. It was later revealed that Hirota Isomura and Toshiro Kobata were both extremely weak upon arrival—too weak to walk, much less escape.

A handful of guards went to court for their wrongdoings but with disappointing results. One guard was tried for the 1943 murder of an elderly chef named James Hatsuki Wakasa. He was found not guilty. Private Bernard Goe was also tried after killing Shoichi James Okamoto. Goe was acquitted and fined for unauthorized use of government property. The amount: $1—the cost of the bullet used to kill the victim.

3 Expatriation

After World War II ended and the internment camps closed, 4,724 Japanese Americans were permanently relocated to Japan. The majority were US citizens or resident aliens. Nearly all of the expatriated citizens were 20 years old or younger.

Teachers in the internment camps had taught them how to read and write Japanese and to be proud of their heritage so they’d have an easier time assimilating. They were transported directly from the internment camps to ships and then overseas to their new homeland of Japan.

More than 20,000 Japanese Americans requested to expatriate between 1941 and 1945. The longer that internment continued, the more requests were filed. Asking to leave the US was a form of nonviolent protest. Those who requested expatriation weren’t forced to follow through once internment was over. However, we’ll never know what the thousands relocated to Japan might have contributed to American society had they remained.

2 Rebranding

Today, we call them “internment camps.” A more accurate term would be “concentration camps.” They were called exactly that by then-President Roosevelt as he confidently endorsed them. The name “enemy alien internment camps” was also used to describe these centers.

The modern wording stems from how they weren’t the vicious death camps experienced in Europe, which is how most people view concentration camps today. Internees enjoyed weddings, gardening, painting, sports, clubs, and even newspapers. There were no gas chambers. Inmates were not doomed to genocide. Still, “internment camp” doesn’t do justice to the horrors experienced within them. Japanese Americans were uprooted from their homes and treated like criminals. They experienced enormous loss. They suffered great physical and emotional trauma. A racial minority was concentrated in specific areas for the security of the nation, imprisoned in deplorable conditions, and stripped of their dignity. They were living in concentration camps.

1 Lack Of Remorse

Anti-Japanese sentiment remained even after the last camp closed in March 1946. Former internees who returned home for their assets were beaten and even killed. Neighborhood signs declared that “Japs” weren’t welcome anymore, warning them to keep away. Not only did they lose their belongings, they lost their sense of belonging. They weren’t even welcome to rebuild the lives they once knew.

Worsening the matter, the American government was slow to admit its mistake. Fred Korematsu challenged the legality of Executive Order 9066 in 1944. He lost in the Supreme Court by a 6–3 vote; internment was rationalized as a wartime necessity.

A formal US apology and recompense was finally offered through the Civil Liberties Act of 1988. Former inmates became eligible for a one-time restitution payment of $20,000. Many of their losses greatly exceeded that value.

There’s no way to truly make up for the way Japanese Americans were treated during the World War II era, but we can be more considerate of the rights of all Americans in the future.

McCarthyism (American history). Written by: Paul J. Achter

Date: c. 1950 - c. 1954

McCarthyism, name given to the period of time in American history that saw Wisconsin Sen. Joseph McCarthy produce a series of investigations and hearings during the 1950s in an effort to expose supposed communist infiltration of various areas of the U.S. government. The term has since become a byname for defamation of character or reputation by means of widely publicized indiscriminate allegations, especially on the basis of unsubstantiated charges.

McCarthy was elected to the Senate in 1946 and rose to prominence in 1950 when he claimed in a speech that 205 communists had infiltrated the State Department. McCarthy’s subsequent search for communists in the Central Intelligence Agency, the State Department, and elsewhere made him an incredibly polarizing figure. After McCarthy’s reelection in 1952, he obtained the chairmanship of the Committee on Government Operations of the Senate and of its Permanent Subcommittee on Investigations. For the next two years he was constantly in the spotlight, investigating various government departments and questioning innumerable witnesses about their suspected communist affiliations. Although he failed to make a plausible case against anyone, his colourful and cleverly presented accusations drove some persons out of their jobs and brought popular condemnation to others.

McCarthyism both reached its peak and began its decline during the “McCarthy hearings”: 36 days of televised investigative hearings led by McCarthy in 1954. After first calling hearings to investigate possible espionage at the Army Signal Corps Engineering Laboratories in Fort Monmouth, New Jersey, the junior senator turned his communist-chasing committee’s attention to an altogether different matter, the question of whether the Army had promoted a dentist who had refused to answer questions for the Loyalty Security Screening Board. The hearings reached their climax when McCarthy suggested that the Army’s lawyer, Joseph Welch, had employed a man who at one time had belonged to a communist front group. Welch’s rebuke to the senator—“Have you no sense of decency, sir, at long last? Have you left no sense of decency?”—discredited McCarthy and helped to turn the tide of public opinion against him. Moreover, McCarthy was also eventually undermined significantly by the incisive and skillful criticism of a journalist, Edward R. Murrow. Murrow’s devastating television editorial about McCarthy, broadcast on his show See It Now, cemented him as the premier journalist of the time. McCarthy was censured for his conduct by the Senate, and in 1957 he died. While McCarthyism proper ended with the senator’s downfall, the term still has currency in modern political discourse.

During the late 1940s and early 1950s, the prospect of communist subversion at home and abroad seemed frighteningly real to many people in the United States. These fears came to define–and, in some cases, corrode– the era’s political culture. For many Americans, the most enduring symbol of this “Red Scare” was Republican Senator Joseph P. McCarthy of Wisconsin. Senator McCarthy spent almost five years trying in vain to expose communists and other left-wing “loyalty risks” in the U.S. government. In the hyper-suspicious atmosphere of the Cold War, insinuations of disloyalty were enough to convince many Americans that their government was packed with traitors and spies. McCarthy’s accusations were so intimidating that few people dared to speak out against him. It was not until he attacked the Army in 1954 that his actions earned him the censure of the U.S. Senate.

THE COLD WAR

In the years after World War II ended, events at home and abroad seemed to many Americans to prove that the “Red menace” was real. In August 1949, for instance, the Soviet Union exploded its first atomic bomb. Later that year, Communist forces declared victory in the Chinese Civil War and established the People’s Republic of China. In 1950, North Korea’s Soviet-backed army invaded its pro-Western neighbor to the south; in response, the United States entered the conflict on the side of South Korea.

Did You Know? Along with the Army-McCarthy hearings, journalist Edward R. Murrow’s exposés of McCarthyism played an important role in the senator’s downfall. On March 9, 1954, millions of Americans watched as the national news program "See It Now" attacked McCarthy and his methods.

At the same time, the Republican-led House Un-American Activities Committee (known as HUAC) began a determined campaign to extirpate communist subversion at home. HUAC’s targets included left-wingers in Hollywood and liberals in the State Department. In 1950, Congress passed the McCarran Internal Security Act, which required that all “subversives” in the United States submit to government supervision. (President Truman vetoed the Act—he said it “would make a mockery of our Bill of Rights”—but a Congressional majority overrode his veto.)

JOSEPH MCCARTHY AND THE RISE OF MCCARTHYISM

All of these factors combined to create an atmosphere of fear and dread, which proved a ripe environment for the rise of a staunch anticommunist like Joseph McCarthy. At the time, McCarthy was a first-term senator from Wisconsin who had won election in 1946 after a campaign in which he criticized his opponent’s failure to enlist during World War II while emphasizing his own wartime heroics.

In February 1950, appearing at the Ohio County Women’s Republican Club in Wheeling, West Virginia, McCarthy gave a speech that propelled him into the national spotlight. Waving a piece of paper in the air, he declared that he had a list of 205 known members of the Communist Party who were “working and shaping policy” in the State Department.

The next month, a Senate subcommittee launched an investigation and found no proof of any subversive activity. Moreover, many of McCarthy’s Democratic and Republican colleagues, including President Dwight Eisenhower, disapproved of his tactics (“I will not get into the gutter with this guy,” the president told his aides). Still, the senator continued his so-called Red-baiting campaign. In 1953, at the beginning of his second term as senator, McCarthy was put in charge of the Committee on Government Operations, which allowed him to launch even more expansive investigations of the alleged communist infiltration of the federal government. In hearing after hearing, he aggressively interrogated witnesses in what many came to perceive as a blatant violation of their civil rights. Despite a lack of any proof of subversion, more than 2,000 government employees lost their jobs as a result of McCarthy’s investigations.

“HAVE YOU NO SENSE OF DECENCY, SIR?”

In April 1954, Senator McCarthy turned his attention to “exposing” the supposed communist infiltration of the armed services. Many people had been willing to overlook their discomfort with McCarthyism during the senator’s campaign against government employees and others they saw as “elites”; now, however, their support began to wane. Almost at once, the aura of invulnerability that had surrounded McCarthy for nearly five years began to disappear. First, the Army undermined the senator’s credibility by showing evidence that he had tried to win preferential treatment for his aides when they were drafted. Then came the fatal blow: the decision to broadcast the “Army-McCarthy” hearings on national television. The American people watched as McCarthy intimidated witnesses and offered evasive responses when questioned. When he attacked a young Army lawyer, the Army’s chief counsel thundered, “Have you no sense of decency, sir?” The Army-McCarthy hearings struck many observers as a shameful moment in American politics.

THE FALL OF JOSEPH MCCARTHY

By the time the hearings were over, McCarthy had lost most of his allies. The Senate voted to condemn him for his “inexcusable,” “reprehensible,” “vulgar and insulting” conduct “unbecoming a senator.” He kept his job but lost his power, and died in 1957 at the age of 48.

SENATOR JOSEPH MCCARTHY, MCCARTHYISM, AND THE WITCH HUNT

On November 14, 1908, Joseph McCarthy was born into a Roman Catholic family as the fifth of nine children in Appleton, Wisconsin. Although McCarthy dropped out of grade school at the age of fourteen, he returned to diligently finish his studies in 1928, permitting him to attend Marquette University. Once accepted, he began his journey to become what many historians consider to be one of the least qualified, most corrupt politicians of his time. After receiving his law degree from Marquette University, McCarthy dabbled in unsuccessful law practices and indulged in gambling along the way for extra income. Despite being a Democrat early in his political years, he quickly switched to the Republican Party after being overlooked as a candidate in the Democratic Party for district attorney. His dirty campaign to win the position of circuit court judge proved to be an ominous foreshadowing of his later era of “McCarthyism.”

To stimulate his political career, McCarthy quit his job as circuit court judge and joined the Marines during World War II. After his short military career, McCarthy ran as the Republican candidate for the Wisconsin Senate seat, using propaganda and erroneous accusations against his opponent, Robert La Follette, to promote his own campaign. Damaging La Follette’s reputation by claiming he hadn’t enlisted in the military during the war, McCarthy won the election and became senator.

As re-election began to loom closer, McCarthy, whose first term was unimpressive, searched for ways to ensure his political success, resorting even to corruption. Edmund Walsh, a close fellow Roman Catholic and anti-communist, suggested a crusade against so-called communist subversives. McCarthy enthusiastically agreed and took advantage of the nation’s wave of fanatic terror against communism, and emerged on February 9, 1950, claiming he had a list of 205 people in the State Department who were known members of the American Communist Party. The American public went crazy with the thought of seditious communists living within the United States and roared for the investigation of the underground agitators. These people on the list were in fact not all communists; some had proven merely to be alcoholics or sexual deviants. Regardless, McCarthy relentlessly pushed through and became the chairman of the Government Committee on Operations of the Senate, widening his scope to “investigate” dissenters. He continued to investigate for over two years, relentlessly questioning numerous government departments, and the panic arising from the witch-hunts and fear of communism became known as McCarthyism.

Joseph McCarthy then accused several innocent citizens, most notably Owen Lattimore, of being associated with communism. Along the way, he had Louis Budenz, the former editor of The Daily Worker, back his accusations with evidence that was circumstantial at best, for Budenz was only using information he had heard from other people as much as 13 years prior. Another victim of McCarthy’s spurious communist accusations was Drew Pearson, a critic who discredited McCarthy’s accusations regularly through columns and radio broadcasts. McCarthy made seven speeches to the Senate about Pearson, which resulted in the loss of sponsors for Pearson’s show. Money was also raised to help numerous men sue Pearson; he was found not liable on all charges.

McCarthy’s downfall finally began in October of 1953, when he started to investigate “communist infiltration into the military.” This was the final straw for then-President Dwight D. Eisenhower, who realized that McCarthy’s movement needed to be stopped. The Army fired back at the accusations, sending information about McCarthy and his advisers abusing congressional privileges to known critics of McCarthy. Reporters, Drew Pearson included, and other critics soon hopped on board, publishing unflattering articles about Joseph McCarthy and his methods of seeking out the supposed communists in America.

Through the televised investigations into the United States Army and the reporters’ attacks, the nation grew to realize that McCarthy was “evil and unmatched in malice.” He lost his chairmanship of the Government Committee on Operations of the Senate, and in December of 1954 a censure motion, a formal reprimand from a powerful body, was issued condemning his conduct by a vote of 67 to 22. The media subsequently lost interest in his communist allegations, and McCarthy was virtually stripped of his power. He died in May of 1957 after being diagnosed with cirrhosis of the liver due to heavy drinking. The resounding effects of McCarthy’s era symbolized the pure terror of communism during the time due to the Cold War. Although it came to an end in a few short years, it contributed to the growing dissension between the Soviets and the United States.

History.com

Throughout the 1940s and 1950s America was overwhelmed with concerns about the threat of communism growing in Eastern Europe and China. Capitalizing on those concerns, a young senator named Joseph McCarthy made a public accusation that more than two hundred “card-carrying” communists had infiltrated the United States government. Though eventually his accusations were proven to be untrue, and he was censured by the Senate for unbecoming conduct, his zealous campaigning ushered in one of the most repressive times in 20th-century American politics.

While the House Un-American Activities Committee had been formed in 1938 as an anti-Communist organ, McCarthy’s accusations heightened the political tensions of the times. Known as McCarthyism, the paranoid hunt for infiltrators was notoriously difficult on writers and entertainers, many of whom were labeled communist sympathizers and were unable to continue working. Some had their passports taken away, while others were jailed for refusing to give the names of other communists. The trials, which were well publicized, could often destroy a career with a single unsubstantiated accusation. Among those well-known artists accused of communist sympathies or called before the committee were Dashiell Hammett, Waldo Salt, Lillian Hellman, Lena Horne, Paul Robeson, Elia Kazan, Arthur Miller, Aaron Copland, Leonard Bernstein, Charlie Chaplin and Group Theatre members Clifford Odets, Elia Kazan, and Stella Adler. In all, three hundred and twenty artists were blacklisted, and for many of them this meant the end of exceptional and promising careers.

During this time there were few in the press willing to stand up against McCarthy and the anti-Communist machine. Among those few were comedian Mort Sahl, and journalist Edward R. Murrow, whose strong criticisms of McCarthy are often cited as playing an important role in his eventual removal from power. By 1954, the fervor had died down and many actors and writers were able to return to work. Though relatively short, these proceedings remain one of the most shameful moments in modern U.S. history.

Islamophobia: the othering of Europe’s Muslims

Issue: 146 Posted on 11th April 2015 Hassan Mahamdallie

Islamophobia has become the predominant form of racism in Europe today. It is proving to be potent and multifaceted, manifesting itself at state, popular and party political level. It represents a profoundly divisive force, not least because the “Muslim question” is a central component of the “war on terror” characterised by those prosecuting it as an elemental struggle for the very survival of Western civilisation and Enlightenment values. This has thrown significant sections of the European left and liberal intelligentsia into confusion and reaction. Many of them have in effect abandoned Muslims to their fate and/or convinced themselves of the necessity for military interventions and draconian security measures to eradicate “Islamist terrorism” at home and abroad. The form of racism Islamophobia most resembles is anti-Semitism in that it seeks to “other” and then victimise a minority group on the basis that their culture and essential beliefs are a fundamental threat to the rest of society. As the late Edward Said observed, “Hostility to Islam in the modern Christian West has historically gone hand in hand with, has stemmed from the same source, has been nourished at the same stream as anti-Semitism”.1 Existing forms of racism, whether it be the persecution of the Roma peoples, racism against people based on their skin colour or xenophobia against migrants, not only remain embedded in society; they have been revitalised by the growth of Islamophobia. The erosion of civil liberties and freedoms, such as the expansion of the surveillance state, although impacting first on Muslims, represents a much wider threat.

The terrorist attack in Paris in January 2015 and its aftermath have produced another ratcheting up of punitive measures principally aimed at Muslims, as panicked governments across Europe realise they can’t protect their populations from future attacks carried out by heavily armed “self-starters” similar to those who carried out the killings at Charlie Hebdo’s offices and the kosher supermarket. The hostile political climate has become such that mass expressions of anti-Muslim hatred, around which other social and economic grievances coalesce, can seemingly spring out of nowhere. Pegida (Patriotic Europeans Against the Islamisation of the West) grew from a handful of people to demonstrations of 25,000 in a matter of weeks in Dresden, a city in the eastern province of Saxony. The movement, although initiated by far-right hooligans, quickly attracted a mass of older, middle class protesters, fearful of losing their pensions and savings, yet content to march under the ethnically exclusive slogan “Wir sind das Volk” (we are the people).2

A carnival of reaction

The wellsprings of Islamophobia have been flowing for some time. Racism is not a set of ideas that float above society; it is expressed within particular historical circumstances and social relations. Islamophobia is made real in national and localised political and economic antagonisms—as the rise of Pegida shows—but its primary driver is the Western powers’ political and military interventions in the Middle East and other Muslim countries. It would be naive or disingenuous to think that the consequences of these events, and the violence, misery and instability on a huge scale inherent within them, could somehow be confined to the region.
Although it has been previously argued in this journal that “racism towards Muslims pre-dates 9/11 and the ensuing warmongering” and that “it has far more to do with domestic social processes than a singular focus on the ‘war on terror’ would allow”,3 clearly Islamophobia has intensified in the present period and closely follows the contours of events in the Middle East and manifestations of its violence on Europe’s streets. In the present period (after 9/11 and the invasion of Afghanistan) a series of terrorist attacks have hit the capital cities of Europe—Madrid, London, Stockholm, Paris, Brussels and Copenhagen—mounted by individuals or groups who have stated their actions to be in retaliation against, or out of revenge for, Western military operations in Muslim territories, abuses such as Abu Ghraib and the CIA torture programme and the oppressive treatment of Muslims in Europe. This is not an explanation (nor a justification) for why these particular individuals decided to turn to violent methods, but it does help us identify the source of their grievances. As the former Labour deputy prime minister John Prescott stated in his typically blunt manner: I was with Tony Blair on Iraq. We were wrong. They told us it wasn’t regime change. It was. And that’s exactly what the Americans have had. Now Tony, unfortunately, is still into that. I mean the way he’s going now, he now wants to invade everywhere. He should put a white coat on with a red cross and let’s start the bloody crusades again. When I hear people talking about how people are radicalised, young Muslims. I’ll tell you how they are radicalised. Every time they watch the television where their families are worried, their kids are being killed and murdered and rockets firing on all these people, that’s what radicalises them.4 The Labour Party immediately distanced themselves from Prescott. This is to be expected given that when in government they not only prosecuted the invasions of Afghanistan and Iraq but put in place the basis for the subsequent scapegoating of Muslims. The ideological consensus across Europe’s governments is of a continent disarmed from within by multiculturalism and over-tolerant attitudes that has left society defenceless against the encroachment of Islamic fundamentalist ideas harboured within Muslim populations.5 Hence, David Cameron’s watershed speech to the annual Munich security conference in 2011 in which he criticised “the doctrine of state multiculturalism”, saying that we need “a lot less of the passive tolerance of recent years and much more active, muscular liberalism”.6 Cameron was lining up with other European leaders at the time, notably Nicolas Sarkozy in France, Angela Merkel in Germany and José Aznar in Spain who had already made similar speeches. One of preoccupations of the right (and their new fellow travellers of the former left) has been the very term “Islamophobia”, which they have sought to delegitimise. The term is anathema to both groups. They argue that even recognising that Islamophobia exists is tantamount to surrendering ground to the enemy. The right to vilify and denigrate the religious beliefs of a minority group has perversely come to symbolise the dividing line between democracy and totalitarianism. 
As Voltaire, decrying the French state’s violent persecution of the minority Protestant religion at the end of the 18th century, asked, “What I want to know is, on which side is the horror of fanaticism?”7 In the wake of the Charlie Hebdo attack France’s prime minister Manuel Valls stated: “I refuse to use this term ‘Islamophobia’, because those who use this word are trying to invalidate any criticism at all of Islamist ideology. The charge of ‘Islamophobia’ is used to silence people”.8 This argument flies in the face of reality, particularly in the French context. Muslims in Europe (and the United States) have been caught in an unbridled vitriolic firestorm in which their religion, ethnic backgrounds and cultures have become merged into a series of negative stereotypes, distilled, for example, into the Charlie Hebdo cartoons of the prophet Muhammad that recall classic anti-Semitic caricatures or vented in the constant stream of bigotry broadcast by Fox News. The notion that anyone in a position of power with or access to the media has held back from attacking Muslims and their beliefs for fear of being accused of racism is absurd. In fact, negative portrayals of Muslims are so widespread that those who articulate them are assured they will not be “called out”. It is quite something to realise that Islamophobia has become such common currency that it is effectively cloaked in invisibility. As the author and commentator Reza Aslan has pointed out of the US, “Islamophobia has become so mainstream in this country that Americans have been trained to expect violence against Muslims—not excuse it, but expect it. And that’s happened because you have an Islamophobia industry in this country devoted to making Americans think there’s an enemy within”.9 Aslan’s point was tragically reinforced when the murder of three young Arab-American Muslims in Chapel Hill, North Carolina, in February 2015, and the likely Islamophobic motive of their killer were markedly downplayed by the media. In the US Muslims have been the target of that which Nathan Lean describes as an “Islamophobia industry” using “lurid imagery, emotive language, charged stereotypes, and repetition, to exacerbate fears of a larger-than-life, ever- lurking Muslim presence”.10 Lean shows how the “industry” links the far-right, evangelical Christians, the Tea Party and various extreme fringe groups. It is heavily funded by powerful backers from right wing foundations and business interests and networked internationally. One of its recurring themes is the “Islamisation of America”, echoed in Europe by those warning of a dystopian “Eurabia”. The “thought experiments” of the Eurabia proponents contain the seeds of ethnic cleansing. The novelist Martin Amis casually remarks to a reporter that Muslims are “gaining on us demographically at a huge rate. A quarter of humanity now and by 2025 they’ll be a third. Italy’s down to 1.1 child per woman. We’re just going to be outnumbered”.11 Canadian writer Mark Steyn argues in a bestselling book that the war in Bosnia was caused by Muslims outbreeding their Serb counterparts. His conclusion: “In a democratic age you can’t buck demography— except through civil war. The Serbs figured that out—as other Continentals will in the years ahead: if you can’t outbreed the enemy, cull ’em. 
The problem Europe faces is that Bosnia’s demographic profile is now the model for the entire continent”.12 Who can hold that these words have no consequence when you realise that Steyn has plucked this grotesquery straight from the mouth of the Butcher of Srebrenica Ratko Mladic who explicitly justified his war crimes thus: “The Islamic world does not have the atomic bomb, but it has the demographic bomb. The whole of Europe will be swamped by Albanians and Muslims”?13 This obsession with breeding and protecting the gene pool indicates that this “new” racism is quite capable of incorporating older forms of biological racism. The reality? Muslims make up 4 percent of Europe’s population and in no country do they make up more than 7 percent (in the US the figure is between 0.2 and 0.6 percent). The majority of Europe’s Muslims lack (or are denied) meaningful political and economic influence and power at a national level. They are among the most deprived members of the working class; suffering discrimination, structural unemployment and the effects of poverty. A 2014 report based on Office for National Statistics data found that Muslims are facing the worst job discrimination of any minority group in Britain and have the lowest chance of being in work or in a managerial role. Researchers found that “Muslim men were up to 76 percent less likely to have a job of any kind compared to white, male British Christians of the same age and with the same qualifications. And Muslim women were up to 65 percent less likely to be employed than white Christian counterparts… Of those in work, the researchers found only 23 percent and 27 percent of Muslim Bangladeshis and Muslim Pakistanis, respectively, had a salaried job”.14 One of the researchers, Dr Nabil Khattab, found that Britain’s Muslims face both an ethnic and religious penalty in the job market. He concluded the situation was: likely to stem from placing Muslims collectively at the lowest stratum within the country’s racial or ethno-cultural system due to growing Islamophobia and hostility against them… They are perceived as disloyal and as a threat rather than just as a disadvantaged minority… Within this climate, many employers will be discouraged from employing qualified Muslims, especially if there are others from their own groups or others from less threatening groups who can fill these jobs.15 Khattab added: “The main components of this discrimination are skin colour and culture or religion. But colour is dynamic, which means white colour can be valued in one case, but devalued when associated with Muslims. Equally, having a dark skin colour—Hindu Indians, for example—is not always associated with any significant penalty.” Other research demonstrated job hunters with identifiable Muslim names had to send out nearly twice as many job applications before they got a positive response than those who had “white” names.16 Muslims over the age of 50 are more likely to suffer bad health than their peers in the general population. Nearly half of the entire Muslim population live in the ten most deprived local authority districts in England. Some 5 percent of Muslims are in hostels or temporary shelters for the homeless (general population figure 2.2 percent). Muslims are much more likely to live in social housing than the general population, and less likely to own their own home. Muslims are over 13 percent of the prison population (roughly 11,000 out of a prison population of 86,000, 8,000 of which are British black or South Asian). 
Overall there is greater disproportionality in the number of black people in prisons in the UK than in the US. Black and Muslim prisoners both report being perceived through racialised stereotypes; black prisoners through the lens of gangs and drugs and Muslim offenders through the lens of extremism and terrorism.17 The incarceration figures in France are even more stark: 70 percent of France’s prison population is Muslim, even though they make up around 7 percent of the population. The figure is even higher in prisons that serve Paris. Those who carried out terrorist attacks in Paris, Toulouse (and Brussels) all had backgrounds as petty criminals, and appear to have made up their minds to carry out the murders either in prison or upon their release. This is not to argue a direct causal link between being jailed and carrying out murderous attacks, but neither can we ignore the backgrounds and position in society of the attackers.18 Much has been written of Muslims’ closed societies, refusal to integrate and incompatible belief systems. However, numerous surveys have shown that Britain’s Muslims see themselves as British, identify with “British values”, are opposed to violence and, despite popular belief (notwithstanding their socio-economic circumstances), feel part of society. The only indicators upon which they depart from general attitudes is when it comes to defending their religion.19 How then have large elements of those who regard themselves as progressive and on the left come to the position by which they view this marginalised, vilified and oppressed section of the working class with such suspicion and animosity? A 2014 Pew Global Attitudes Europe survey found that although distrust of Muslims was mainly held by those who consider themselves as holding right wing ideas, a significant percentage of those who aligned with general left wing ideas also held negative views. So in France 47 percent of those on the right held anti-Muslim views as did 17 percent of those on the left. In Spain the figure was 54 percent of the right and 38 percent of the left. In Germany it was 47 percent of the right and 20 percent of the left and, in the UK 34 percent of the right and 19 percent of the left.20 Significant sections of the left and anti-racist groups have convinced themselves through a variety of baleful political misjudgements that the fundamental dividing line in Western society is between secularism and religious obscurantism. They believe that the principal enemy of the values emerging from the Enlightenment is not war, neoliberalism, austerity and the far-right, but Islam and its followers. This has led to the “othering” of Europe’s Muslims, and its corollary—the “comfort” of belonging to a (supposedly) superior group defined by shared beliefs, values and culture. This position relies partly on a reductive caricature of both the Enlightenment and religious ideas. Scholars such as Jonathan Israel have revealed a contradictory and ambiguous view of Islam among Enlightenment philosophers, many of whom took the study of the particulars of Islam extremely seriously.21 As Chris Harman wrote: They were far from seeing it [Islam] as do the B52 liberals who claim to be the heirs of the Enlightenment today. 
As Israel says, in “radical texts” the “image of Islam” was of “a pure monotheism of high moral calibre which was also a revolutionary force for positive change and one which proved from the outset to be both more rational and less bound to the miraculous than Christianity or Judaism”.22 Islam, like other religions, has its philosophic framework and textual approaches (hermeneutics) that cannot be reduced to a bundle of irrationalism and superstition. In this context setting up a false binary between the secular and religious ignores the philosophical advance that monotheistic religions such as Islam were able to achieve, contributing to the rationalism underpinning the Enlightenment itself. This advance in human thought was recognised by Edward Gibbon in his sympathetic chapter on the prophet Muhammad in The History of The Decline and Fall of the Roman Empire (published between 1776 and 1788): The creed of Mahomet is free from suspicion or ambiguity; and the Koran is a glorious testimony to the unity of God. The prophet of Mecca rejected the worship of idols and men, of stars and planets, on the rational principle that whatever rises must set, that whatever is born must die, that whatever is corruptible must decay and perish… These sublime truths, thus announced in the language of the prophet, are firmly held by his disciples, and defined with metaphysical precision by the interpreters of the Koran… The first principle of reason and revolution was confirmed by the voice of Mahomet: his proselytes, from India to Morocco, are distinguished by the name of Unitarians; and the danger of idolatry has been prevented by the interdiction of images.23 Despite this, there are those who regard themselves as the inheritors of Gibbon and the Enlightenment who believe the left project to be an exclusively secular journey. They view the emergence of a Muslim religious identity, particularly in the West, as an unambiguously backward development. For them, the “good old days” (before the Satanic Verses affair at the end of the 1980s) when Asians were Asians you could have a beer with after a demo, worthy of the attention of the anti-racist left, have been usurped by a Muslim identity that places its adherents beyond the pale and undeserving of support. This turns the entire anti-racist tradition on its head. Even putting aside the principle of solidarity, the notion that those who hold to an Islamic identity are an undifferentiated mass prone to backward ideas, who must somehow pass a “secular test” before they can either be supported or be involved in progressive struggles, is fundamentally a reflection of the dominant discourse of the right. The French political scientist François Burgat, who specialises in the emergence of political Islam, has taken the left in France to task (although his criticism applies more widely) for abandoning anti-imperialism and anti-racism and collapsing into abstention, indifference, hostility and denial: The political right has found in the Islamic spectre a confirmation of some of its old prejudices towards Islam, the Third World and Arabs. The left is in principle more inclined to accept the emergence of the “other”, yet it too has made a spectacular mistake: although it is capable of recognising Arabs, it loses its bearings and ability to be rational when dealing with Muslims. Its anti-clericalism focuses on the religious content of a phenomenon. Once the left has retreated behind its supercilious (should one say fundamentalist?) 
attachment to the symbols of its “secularism”, it becomes incapable of admitting that the universalism of republican thought might be challenged in part or in whole, and that someone might one day dare to write a piece of history in a vocabulary that is not its own.24

Why single out Muslims from other religious minorities and deny them the capacity to make their own history? For example, why should a Muslim woman who wears a headscarf be regarded any differently from a Sikh man who wears a turban as an outward sign of his religiosity, or a Jewish man who wears a kippah? It is also simplistic to say that young Pakistanis and Bangladeshis, for example, swapped their Asian identity for a Muslim one after falling into the arms of reactionary Imams during the Rushdie affair.25 This is to confuse a political banner that unifies minority groups fighting back against oppression with their individual cultural or religious beliefs. Although Pakistanis and Bangladeshis rightly forged a unity with others in the 1970s as “black” or “Asian”, that does not automatically imply that they gave up their religious identities to do so. The reality was more complex. As Tariq Mehmood, one of the Bradford 12, put on trial for defending their community from the National Front in the early 1980s, recounts:

Most of the people in the youth movements were religious, but religion was not an issue for the members, it was their own affair. Many Sikhs, Hindus and Christians helped to protect mosques, as Muslims did of temples when they were attacked. We had very close relationships with gurdwaras and mosques whom we were always calling upon to support us in our actions. There were many among the Muslim [members] who kept all fasts… The unity was in anti-racism and anti-imperialism. Even among these groups there were believers and non-believers all working together. Ishaq Mohammed Kazi came to me about the question of God. Two weeks later he was in jail as part of the Bradford 12. Religion was important to many—weddings, funerals, etc. People celebrated or commemorated in their own ways. Any divisions were political, either Labour Party or left party. Or else caste or national.26

In the period before this the previous generation of Muslims who migrated to Britain in the 1960s and 1970s entered into a series of struggles in the workplace, in the community and against racism and fascism; yet we should not forget that at the same time they were clubbing together to buy the empty premises that laid the foundation for the early network of mosques.27

Islamophobia and racism

Writer and broadcaster Kenan Malik describes himself as a holder of secular universalist Enlightenment views. He defines himself against a left represented by this journal whom he regards as abandoning these principles post-Rushdie in favour of multiculturalism and identity politics. In an oft-cited article “The Islamophobia Myth” Malik asserts that “there is a fundamental difference between race and religion. You can’t choose your skin colour; you can choose your beliefs. Religion is a set of beliefs. I can be hateful about other beliefs, such as conservatism or communism. So why can’t I be hateful about religion too?”28 However, this seemingly neat distinction can only survive if separated from reality, or to quote Marx, existing “as an independent realm in the clouds”.29 First, Malik, in an effort to make a convincing argument, sets aside the basic anti-racist insight that “race” is a social construct that has no scientific basis.
As such it is open to wide and differing interpretation. So under British law a racial group is defined as “any group of people who are defined by reference to their race, colour, nationality (including citizenship) or ethnic or national origin”. In 1982 a House of Lords judgement expanded this definition to include “ethno-religious” groups including Sikhs and Jews, arising from the case of a Sikh school student who was sent home from a Birmingham school because he was wearing a turban. It is the lack of inclusion of Muslims under this protective category that the BNP, EDL and others have exploited; allowing them to use racist and inflammatory language against Muslims that they would be prosecuted for if aimed at black people and Jews. A similar situation exists in other European countries. Secondly, recent history shows that reactionary forces are wholly capable of collapsing the distinctions between race and religion into one another, with terrible consequences. Consider the worst example of ethnic cleansing since the Second World War—the Bosnian War (1992-5). The Muslim population of Bosnia were massacred and driven out by Serb ultra-nationalist forces despite both ethnic groups sharing the same racial phenotype, the same language root and a common culture (apart from their religious denominations). Indeed, as Sejad Mekic writes, “over centuries, Bosnians had gone beyond tolerance to embrace synthetic, eclectic religious norms, with each religious group often borrowing customs and rituals from its rivals”.30 The massacre could happen because the Serb leadership were able to “racialise” the Muslim population in the eyes of their Serb counterparts. The subsequent war led to the deaths of 100,000 people and 2 million driven from their homes. The third rebuttal of Malik’s position is that the vast majority of Muslims living in Europe (and the US) also belong to racial “types” that have been the main objects of racism and discrimination throughout recent history. Although Europe’s Muslims are very diverse in their origins, nationalities, histories, culture, political and religious allegiances, the majority are of Asian or African heritage. Seven out of ten British Muslims are South Asian with the others being mostly of African or Arab descent. Most Muslims in France have roots in North Africa, around two thirds of German Muslims are of Turkish descent, the Dutch Muslim population is made up principally of those of Moroccan and Turkish origin as well as refugees from the Middle East and Africa, with Muslims in Scandinavia also being drawn from displaced people from war zones such as Palestine, Somalia and Iraq. The effect of Islamophobia has been to overlay a negative religious identity on top of a pre-existing negative racial identity. The two have become merged and mutually reinforcing. Naser Meer and Tariq Modood write that Islamophobia has: A religious and cultural dimension, but equally clearly, bares a phenotypical component. For while it is true that “Muslim” is not a (putative) biological category…neither was “Jew”. It took a long, non-linear history of racialisation to turn an ethno-religious group into a race. More precisely the latter did not so much as replace the former but superimposed itself. 
As they point out in relation to Bosnia: “The ethnic cleanser, unlike an inquisitor, wasted no time in finding out what people believed, if and how often they went to a mosque and so on: their victims were racially identified as Muslims”.31 Regardless of all this, Kenan Malik argues in his article that there is no proof of a direct link between hostility towards Islam and attacks on Muslims. “Should we treat every attack on a Muslim as Islamophobic? If an Afghan taxi driver is assaulted, is this a racist attack, an Islamophobic incident or simply a case of random violence?”32 The nature of racist attacks on Muslims shows that Malik’s distinctions are not at all apparent to the perpetrators. Reports of attacks describe physical violence and intimidation accompanied by insults combining racism, xenophobia and Islamophobia, such as “Paki”, “Go back to where you came from”, and “Terrorist”.33 This demonstrates the “ethno-religious” nature of Islamophobia argued above. Indeed “white” Muslims on the receiving end of hate attacks can find their race transmuted to fit the phenotype associated with being a Muslim. In one reported incident a white British Muslim woman who had a car driven at her reported the perpetrator shouting “I’m going to pop you Muslim” before calling her a “fucking Paki bastard”. As she concluded, “it doesn’t matter how white you are”.34 This racialisation is also evident in bias against Muslims in the labour market. Researchers undertaking a detailed study of the effect of discrimination against Muslims were surprised to find that: “White Muslims also experience an employment penalty, other things being equal”.35 Analysis of the now familiar “spikes” in attacks on Muslims show their attackers regard them as deserving collective punishment. Even taking into account disputes over the accuracy of figures (including under-reporting) and their interpretation, a pattern can be clearly discerned. After the 2005 London bombings the Metropolitan Police reported that “religious hate crimes, mostly against Muslims, have risen six-fold in London since the bombings” compared to the previous year.36 A 2014 report by the monitoring group Tell MAMA (Measuring anti-Muslim Attacks) and researchers at Teesside University reported that: One of the most significant events in the field of anti-Muslim hate crime over the past few years was doubtless the ruthless murder of Lee Rigby, and the ensuing anti-Muslim backlash. While different agencies reported different rates of increase—Tell MAMA found a 373 percent increase over the course of a week relative to the week before— one London Borough Commander suggested that there had been an eight-fold increase in parts of London, and Home Office Statistics suggesting a low estimate of a 63 percent increase in the West Midlands—it is clear that anti- Muslim hate crime spiked after this.37 Immediately after the January 2015 Paris attack it was reported that: 26 mosques around France have been subject to attack by firebombs, gunfire, pig heads, and grenades as Muslims are targeted with violence in the wake of the Paris attacks. France’s National Observatory Against Islamophobia reports that since last Wednesday a total of 60 Islamophobic incidents have been recorded, with countless minor encounters believed to have gone unreported.38 A feature of Islamophobia is the disproportionate level of attacks on Muslim women, particularly those wearing outward signs of religiosity. 
This not only demonstrates that the attacks are motivated by anti-Islamic sentiment, but also underlines the bankruptcy of pseudo-feminist/Enlightenment arguments against the hijab, niqab and abaya (dress). The Islamophobic view of Muslim women specifically as carriers of fundamentalist ideas, and of their clothing as signifiers of their intent, has made them a target for discrimination, abuse and violence. Attacks and threats against Muslim women account for 58 percent of all incidents reported to Tell MAMA. Of these, 80 percent were visually identifiable as Muslim—“wearing hijab, niqab or other clothing associated with Islam”.39 As Liz Fekete has pointed out: The call to ban the hijab in the name of individual autonomy relies on essentialist arguments about Islam that deny any personal autonomy to Muslim women and girls… A debate about the furthering of Enlightenment values leads to the exclusion of Muslim women and girls from the culture of civil rights. Because veiled women are not, in the eyes of their “liberators”, autonomous beings (they are either representatives of, or victims of, a fundamentalist culture), they are denied political agency altogether.40 It is important that the racist reality of Islamophobia is acknowledged against all those who seek to deny it in order to wield it. However, it is also crucial that its nature—how it resembles or differs from other racisms—is understood, so that it may be effectively opposed. Contrary to the right wing conspiracy theory that the term Islamophobia was an invention of the mullahs of the Iranian Revolution to deflect attention away from their theocratic excesses, the term seems to have been first used a century ago, but became common currency in 1997 with the publication in Britain of the report “Islamophobia: A Challenge for Us All” by the Runnymede Trust.41 The report defined Islamophobia as an “unfounded hostility towards Islam [and] the practical consequences of such hostility in unfair discrimination against Muslim individuals and communities, and to the exclusion of Muslims from mainstream political and social affairs”. The report’s authors conceded that “the term is not, admittedly, ideal”, before explaining that: The word “Islamophobia” has been coined because there is a new reality which needs naming: anti-Muslim prejudice has grown so considerably and so rapidly in recent years that a new item in the vocabulary is needed so it can be identified and acted upon. In a similar way there was a time in European history when a new word, anti- Semitism, was needed and coined to highlight the growing dangers of anti-Jewish hostility.42 However, the term has been subject to scrutiny from the left. The anti-racist educationalist Robin Richardson has pointed out that: The use of the word Islamophobia on its own implies that hostility towards Muslims is unrelated to, and basically dissimilar from, other forms of hostility such as racism, xenophobia, sectarianism, and such as hostility to so-called fundamentalism. Further, it may imply there is no connection with issues of class, power, status and territory; or with issues of military, political economic competition and conflict.43 As the racism of imperialism was rooted in its earlier mode—the racism that justified transatlantic slavery and colonialism, so this new racism also draws upon historic foundations. Islamophobia in its extreme forms reveals pseudo-biological and racial justifications and imperious attitudes to Muslims and their religion and cultures that predate our times. 
“Muslim” and “Islam” carry powerful historical associations, particularly in the West. As Talat Ahmed has argued, “the demonisation of contemporary Muslims utilises imagery from previous periods and reinvents it to fit current needs. In the process older forms of racism can be accommodated and given space to spew their bile relatively unhindered”.44 It is possible to avoid regarding Western attitudes towards Islam and Muslim societies as monolithic and wholly negative, while recognising that colonial expansion into the Middle East particularly cast Islam and Muslim societies as inferior. The academic Aziz Al-Azmeh argues that during the colonial period the dominant “orientalist discourse” created the monolithic figure of homo islamicus in contrast to the perceived essence of Western civilisation: reason, freedom and perfectibility: To reason corresponded enthusiastic unreason, politically translated as fanaticism, a major concern of 19th century scholars and colonialists as of today’s television commentators. That notion provided an explanation for political and social antagonism to colonial and post-colonial rule, by reducing political and social movements to motivations humans share with animals. Freedom was contrasted with “a total abandonment of individuality to the exclusive worship of an abstract god…the subjection of individuality to collectivity”, and while Western thinkers convinced themselves to be well along the evolutionary path to a perfect higher civilisation, Islam could be looked back upon as a flawed anomaly characterised by “despotism, unreason, belief, stagnation, medievalism”.45 In regard to the 19th century notion that Muslim countries are inherently primitive, and the contemporary argument that military intervention can force backward states onto the higher path of Western modernity, one must ask how these particular countries might have progressed if they had been spared colonialisation and had been allowed to develop independently. As the early 20th century radical anthropologist Franz Boas, describing the effect of colonialism, put it: “The rapid dissemination of Europeans over the whole world destroyed all promising beginnings which had arisen in various regions”.46 To point this out is not a denial of historical progress, or the dismissal of the Enlightenment as merely a Eurocentric discourse. In fact it is a prerequisite for restoring the potentiality of the Enlightenment as a project yet to be fully realised. As Alex Callinicos has argued: Really overcoming Eurocentrism depends chiefly on the historian taking two steps, one ethico-political, the other conceptual. First, no historic discourse can hope to attain genuine universality unless it involves the recognition of, and gives proper weight to, the crimes perpetrated during the establishment and maintenance of Western domination over the globe… Such a moral reorientation must be accompanied, secondly, by the conceptual decentring of historical discourse. 
This involves, above all, the refusal to treat the pattern of development associated with any particular region or country as a model in terms of which happenings elsewhere are to be understood.47 Boas also touched upon this concept of “decentring” as a way of gaining understanding of the complexity of human development: It is somewhat difficult for us to recognise that the value which we attribute to our civilisation is due to the fact that we participate in this civilisation…but it is certainly conceivable that there may be other civilisations, based perhaps on different traditions and on a different equilibrium of emotion and reason, which are of no less value than ours.48 There is no one template for human development, with universal ideals being the sole property of the West. It is not lost on Muslims (and those who defend them) that those presently bludgeoning them into submission with “Enlightenment values” are simultaneously undermining those self-same values. In its earliest phase modern imperialism generated a racism based on the ideology that the “races” and societies of Asia and Africa were inferior and in need of the civilising influence of the occupying Western powers. The “new imperialism” we are in the midst of, focused as it is on military conflicts to secure access to and influence over the strategic resources and territories of the Middle East, echoes these old prejudices, but as part of a new set of interlocking racist ideas. In the colonial period Muslims could be cast as the savage “enemy without”—today the “enemy” is within. Modern Islamophobia relies on the presence of Muslim populations in Europe; the result of post Second World War labour immigration and to a lesser extent, settlement of Muslims as asylum seekers. This represents both a continuation of previous Islamophobic ideas, and a sharp reconfiguration in the present period. Islamophobia provides the singular and distorting prism through which Muslims are increasingly scrutinised, from Muslim involvement in the education system (the 2014 Trojan Horse affair where Muslims were accused of plotting to take over schools in Birmingham), local politics (the usurping of Tower Hamlets council) and the racialisation of crime (Pakistani men and child abuse). National security and linked issues such as immigration and Britishness are reoccurring themes in domestic politics. This “othering” of an identifiable minority paves the way for the politics of scapegoating and division. Anti-Muslim sentiment, as the Pegida movement shows, has the potential to act as a fleeting and illusory outlet for the discontent felt by those suffering from the neoliberal economic assault. Far-right and xenophobic parties and formations, feeding off the widespread anxiety and despair produced by neoliberal economic policies, have seized on “the Muslim problem”, recast it for their own ends and amplified it to a national level. Islamophobia has been incorporated into parties such as the Front National, which has largely (for public consumption at least) abandoned its anti-Semitic roots in favour of a virulent anti-Muslim agenda. 
It has been highlighted in this journal and elsewhere that there is a growing social and political instability caused by the retreat of mainstream politics from the public domain, of which far-right formations have been, thus far, the chief beneficiary.49 The failure of much of the left and anti-racist movements consistently to oppose Islamophobia is further compounded when one considers that existing manifestations of racism have become reinvigorated by the scapegoating of Muslims. State institutions such as the police, previously on the retreat around racism, have taken advantage of the present situation and the granting of new anti-terrorism powers to return to “traditional” methods of discrimination, such as the revival of mass racial profiling and stop and search operations. For example, it recently came to light that the New York Police Department set up a secretive “Demographics Unit” in 2001. An Associated Press investigation found that: “Starting shortly after the September 11 terrorist attacks, officers infiltrated Muslim communities and spied on hundreds or perhaps thousands of totally innocent Americans at mosques, colleges, and elsewhere.” These officers “put American citizens under surveillance and scrutinised where they ate, prayed and worked, not because of charges of wrongdoing but because of their ethnicity… Informants were paid to bait Muslims into making inflammatory statements”.50 The NYPD later had to admit the operation had failed to produce any significant terrorism leads. Criminologists have observed how the British police’s racial profiling has shifted to harness new possibilities opened up by the “war on terror”: Recent evidence…has suggested that perceptions of Asian and particularly Muslim people have undergone a transformation. Stereotypes, which assumed that Asian people were conformist, are now thought to be less applicable and, rather, the very stereotypes assumed to explain law-abiding behaviour (eg family pressures, tight knit communities and high levels of social control) are now thought to promote criminal and deviant activity amongst Asian youth… The shift in the perception of such groups has been located in both local and global notions of Asian youth as increasingly involved in gangs, violent, disorderly, riotous and, more recently, as potential terrorists.51 There are other signs that Islamophobia is feeding back into general levels of racist ideas. Analysis of data from the authoritative British Social Attitudes Survey, reported in the Guardian in 2014, found that “the proportion of Britons who admit to being racially prejudiced has risen since the start of the millennium, raising concerns that growing hostility to immigrants and widespread Islamophobia are setting community relations back 20 years”.52 The data showed that there has been “a broad decline in the proportion of people who said they were either ‘very or a little prejudiced’ against people of other races—from a high of 38 percent in 1987 to an all-time low of 25 percent in 2001. However, in 2002, following the 9/11 attacks in New York and the invasion of Afghanistan, there was a sharp rise in self-reported racial prejudice. 
Over the next 12 years that upward trend continued to a high of 38 percent in 2011.” Tariq Modood, commenting on the report, said: “I don’t think there is any doubt that hostility to Muslims and suspicion of Muslims has increased since 9/11, and that is having a knock-on effect on race and levels of racial prejudice”.53 Alongside, but not separate to, the rise in Islamophobia has been a spiralling debate on immigration into the European Union. One of the consequences, which bridges hostility towards Muslims with xenophobia, has been the insistence by various states that a prerequisite of citizenship is declared allegiance to what are called “core national values”—a measure clearly targeted principally at Muslims. Policies towards asylum seekers are also being refashioned. European states have been eager for some time to have humanitarian agencies set up refugee camps inside or on the borders of conflict zones such as Syria in an effort to avoid a commitment to granting “in country” asylum. The British government, going one step further, to avoid accepting male Muslim refugees (seen as potential terrorists) have drawn up criteria that allow for a handful of “women and girls at risk of sexual violence; the elderly; the disabled and survivors of torture” the chance to be granted asylum.54 The mutually reinforcing effects of anti-Muslim, racist and scapegoating politics have already changed the political landscape in Europe. Liz Fekete has pointed out: The influence of xenophobic and Islamophobic parties, either as junior partners in coalition governments or as the recipients of the public vote, is unprecedented, and reflects a major realignment of forces that has taken place as a direct consequence of the war on terror. With its aggressive call for “integration” (meaning assimilation), to be achieved through “the scrubbing out of multiculturalism”, the realigned right—whose elements range from post- fascist to liberals and even some social democrats—is using state power to reinforce fears about “aliens” and put in place legal and administrative structures that discriminate against Muslims.55 There is no reason to believe that Britain is automatically immune from this right wing populist trend. Nigel Farage, the leader of UKIP, has already signalled the attraction of riding the Islamophobic wave, when, following the Paris attacks, he said mass immigration had “made it frankly impossible for many new communities to integrate…We do have, I’m afraid, I’m sad to say, a fifth column that is living within our own countries, that is utterly opposed to our values. We’re going to have to be a lot braver and a lot more courageous in standing up for our Judeo-Christian culture”.56 Farage, hoping that the forthcoming general election will gift him with leverage in a future coalition government, will be aware of the Sweden Democrats, who have gained rapid electoral success by pushing a platform combining hostility to the European Union with an anti-immigrant, anti-multiculturalism and anti-Muslim rhetoric. The Sweden Democrats’ ability to alter national politics was demonstrated when in December they wielded their parliamentary vote to bring down the two month old centre-left government and force a new general election (although the mainstream parties later got together to reverse this outcome).57

Native American Genocide

The Cherokees in 1828 were not nomadic savages. In fact, they had assimilated many European-style customs, including the wearing of gowns by Cherokee women. They built roads, schools and churches, had a system of representational government and were farmers and cattle ranchers. A Cherokee alphabet, the "Talking Leaves," was created by Sequoyah.

In 1830 the Congress of the United States passed the "Indian Removal Act." Although many Americans were against the act, most notably Tennessee Congressman Davy Crockett, it passed anyway. President Andrew Jackson quickly signed the bill into law. The Cherokees attempted to fight removal legally by challenging the removal laws in the Supreme Court and by establishing an independent Cherokee Nation. At first the court seemed to rule against the Indians. In Cherokee Nation v. Georgia, the Court refused to hear a case extending Georgia's laws on the Cherokee because they did not represent a sovereign nation. In 1832, the U.S. Supreme Court ruled in favor of the Cherokee on the same issue in Worcester v. Georgia. In this case Chief Justice John Marshall ruled that the Cherokee Nation was sovereign, making the removal laws invalid. The Cherokee would have to agree to removal in a treaty. The treaty then would have to be ratified by the Senate.

"I would sooner be honestly damned than hypocritically immortalized," said Davy Crockett. His political career destroyed because he supported the Cherokee, he left Washington, D.C. and headed west to Texas.

By 1835 the Cherokee were divided and despondent. Most supported Principal Chief John Ross, who fought the encroachment of whites starting with the 1832 land lottery. However, a minority (less than 500 out of 17,000 Cherokee in North Georgia) followed Major Ridge, his son John, and Elias Boudinot, who advocated removal. The Treaty of New Echota, signed by Ridge and members of the Treaty Party in 1835, gave Jackson the legal document he needed to remove the Cherokee. Ratification of the treaty by the United States Senate sealed the fate of the Cherokee. Among the few who spoke out against the ratification were Daniel Webster and Henry Clay, but it passed by a single vote. In 1838 the United States began the removal to Oklahoma, fulfilling a promise the government made to Georgia in 1802. Ordered to move on the Cherokee, General John Wool resigned his command in protest, delaying the action. His replacement, General Winfield Scott, arrived at New Echota on May 17, 1838 with 7000 men. Early that summer General Scott and the United States Army began the invasion of the Cherokee Nation.

In one of the saddest episodes of our brief history, men, women, and children were taken from their land, herded into makeshift forts with minimal facilities and food, then forced to march a thousand miles (some made part of the trip by boat in equally horrible conditions). Under the generally indifferent army commanders, human losses for the first groups of Cherokee removed were extremely high. John Ross made an urgent appeal to Scott, requesting that the general let his people lead the tribe west. General Scott agreed. Ross organized the Cherokee into smaller groups and let them move separately through the wilderness so they could forage for food. Although the parties under Ross left in early fall and arrived in Oklahoma during the brutal winter of 1838-39, he significantly reduced the loss of life among his people. About 4000 Cherokee died as a result of the removal. The route they traversed and the journey itself became known as "The Trail of Tears" or, as a direct translation from Cherokee, "The Trail Where They Cried" ("Nunna daul Tsuny").

(Illustration: painting by Robert Lindneux, Woolaroc Museum)

Ironically, just as the Creeks killed Chief McIntosh for signing the Treaty of Indian Springs, the Cherokee killed Major Ridge, his son and Elias Boudinot for signing the Treaty of New Echota. Chief John Ross, who valiantly resisted the forced removal of the Cherokee, lost his wife Quatie during the western movement of the Cherokee. And so a country formed fifty years earlier on the premise "...that all men are created equal, and that they are endowed by their Creator with certain unalienable rights, among these the right to life, liberty and the pursuit of happiness.." brutally closed the curtain on a culture that had done no wrong.

The Legend of the Cherokee Rose

No better symbol exists of the pain and suffering of the Trail Where They Cried than the Cherokee Rose. The mothers of the Cherokee grieved so much that the chiefs prayed for a sign to lift the mothers' spirits and give them strength to care for their children. From that day forward, a beautiful new flower, a rose, grew wherever a mother's tear fell to the ground. The rose is white, for the mothers' tears. It has a gold center, for the gold taken from the Cherokee lands, and seven leaves on each stem that represent the seven Cherokee clans that made the journey. To this day, the Cherokee Rose prospers along the route of the "Trail of Tears".

The Cherokee Rose is now the official flower of the State of Georgia.

Inaccuracies That Hurt or Insult

Much of the bias in American Indian educational literature results in the perpetuation of stereotypical myths. The negative stereotypes fostered by textbooks are sustained and elaborated by communications media, the primary culprits being television programs, movies, and novels.

Stereotypes sell. To this day, consumers recognize the stylized Indian chief on cans of Calumet Baking Powder and the kneeling Indian maiden on packages of Land O' Lakes butter. The athletic fortunes of the Braves, Indians, Chiefs, Redskins, and Black Hawks are followed by professional sports fans across the country (Steele, 1996). Teachers should challenge and examine cultural stereotypes. Discuss with children the television programs and movies depicting Indians as romantic images of the frontier past.

Many teachers are accustomed to teaching "A" is for Apple and "I" is for Indian. If you want children to grow up respecting American Indian people, do not equate Indians with things. Actually, the culturally responsive term is "I" is for Indigenous.

Using the past tense in teaching lessons about Indians is an error; for example, studying, "how the Indians lived." Saying "lived" suggests that there are no more Indians living today. Indian people are very much a part of today, and each tribe has a name and separate culture.

Language

Xenophobic-coded language includes terms created by monoculturalists or triumphalists to combat multicultural education, such as "politically correct" or "politically incorrect." Another way language shapes our perspectives can be found in many history books: Columbus "discovered" America; Indians were "moved" to Oklahoma; slaves were "brought" to America; the transcontinental railroad was built, conveniently omitting the Chinese laborers who built most of it and the oppression they suffered.

Ethnocentric perspective is the inability to separate ourselves from our own cultural backgrounds and biases to understand the behaviors of others. It's the belief that members of one's group are superior to the members of other groups. It shows up, for example, when someone makes references to: "first schools" in America; "first town" in the state; "first church" in town; "New World" versus "Old World" (who is it new to?); "uncivilized tribes," "civilized tribes," "primitive," "greatest nation in the world" (what about tribal nations?), and so forth.

Refrain from using "uncivilized" when referring to any tribal culture. Use the words "unique" or "different" instead. Try not to suggest that cultural groups are superior or inferior.

Lumping all Indians together is a mistake. Tribes are sovereign nations and are as different from one another as other nations are from Swedes. The word "Indian" is a misnomer. "Indian" came from Christopher Columbus' mistaken belief that he had actually reached the East Indies. As erroneous as this term is, it is the most commonly used name for the Indigenous peoples of North America. The best way to refer to Indigenous peoples is by their tribal names. For example, if you mean, "Comanches lived this way a long time ago," then say so.

Often teachers are unaware of the impact of saying to their class, "all right children, let's all sit Indian style in a circle." This directive suggests that there is a common style or way in which all Indians sit down on the ground and in a circle. This is yet another example of a Hollywood-invented stereotype. Nobody would dare think of saying the same thing to other ethnic groups: "sit down Black style," "sit down Mexican style," "sit down Asian style," and so forth. The way to address this exercise is to say, "sit down in a circle with your legs crossed in front of you."

Avoid allowing children to imitate indigenous people by saying "how" or "ugh." These words are oftentimes used as if they were common phrases in conversation among indigenous peoples. Ask yourself, did all Indians really hold up their hand and ask "How?" How what? Is not "how" also an English word? Asking "how?" or grunting "ugh" are insulting, nonsensical, verbal symbols of Indianness. So is yelling "Geronimo" when jumping off a diving board (Mihesuah, 1996). What about the terminology suggested when someone says or refers to "Montezuma's Revenge"?

Other Proverbial Stereotypes

Little has changed for many school children when it comes to prevailing and proverbial stereotypes of Indian people. According to Slapin and Seale (1992), some children who go back on their promises are called "Indian givers." "Ten Little Indians" is still a popular counting song. Non-Indian children still dress as "Indians" for Halloween. Around Thanksgiving, teachers all over the United States routinely trim their bulletin boards with befeathered headdresses and "Indian" costumes. Books about American Indians are still written, published, and promoted by non-Indians.

In visual representations, look for tokenism; European versions of physiology presented as the norm; emphasis on stereotypical but basically irrelevant emblems or symbols; and inaccuracies and a lack of cultural differentiation.

Refrain from teaching that Columbus was a hero without examining his relations with indigenous peoples. The same can be said about George Armstrong Custer, Andrew Jackson, George Washington, William Henry Harrison, Teddy Roosevelt, and others who believed Indians to be inferior to Europeans (Mihesuah, 1996).

Also, heroines like Sacajawea, Pocahontas, and other indigenous women in U.S. history were elevated to royalty status as princesses and queens. To suggest that indigenous females were princesses and queens is contrary to many tribal beliefs; it is a European concept. Moreover, many tribes are matriarchies as well as patriarchies.

As Pauline Turner Strong observes, "Disney has created a marketable New Age Pocahontas to embody our millennial dreams for wholeness and harmony, while banishing our nightmares of savagery without and emptiness within" (Bird, 1996, p. 3). Dressing up your children to demonstrate, reenact, and/or play cowboys and Indians is playing genocide; it is a reminder of the cultural genocide of the indigenous people of North America. This is as dishonest as playing "happy mammy and plantation owner's wife." After all, Pilgrims, Puritans, and other colonists thought that Indians were heathens and savages, and according to some, the devil's disciples. Within 50 years of the "Thanksgiving Feast," thousands of Indians were dead at the hands of colonists and disease. Thanksgiving indeed. In fact, many indigenous people recognize Thanksgiving as a "Day of Mourning" (Mihesuah, 1996).

A common mistake teachers make is to have children make headdresses to wear, or to cut out construction-paper headbands to be placed on children's heads in so-called Hollywood "Indian style." This again is a stereotype, as not all indigenous people wore headdresses. Most such headdresses suggest war bonnets, sometimes full-length, which represented valor and the honorable accomplishments of a plains warrior. Many headdresses are from tribes other than the typical plains tribes.

The style, materials, and significance of headdresses varied from tribe to tribe and warrior to warrior, depending on the warrior's status among the people. Not every tribal member wore a feather or headdress. Indigenous men certainly did not wear headdresses to play around in; neither should children.

Be very cautious when teaching children about song and dance. Singing or dancing "Indian style" and "having a pow-wow" is, to many children today, "cool." On the other hand, many traditional people do not see their tribal cultures as cool; these traditions should be treated with respect and honor.

Children should not dance Hollywood Indian-style, nor should they beat on a drum and try to sing traditional songs. Social and traditional songs and dances have religious meaning for many tribes, and any attempt at imitation is ridicule. The ability to beat on the drum and sing songs is earned through tribal rites of passage.

The Declaration of Independence refers to indigenous people as "merciless savages." George Washington bought and sold Indian lands without tribes' permission, fought and killed Indians without mercy, and owned almost 500 African American slaves. In his book, The Winning of the West, Theodore Roosevelt wrote that Indians are "filthy," "lecherous," and "faithless," in addition to living lives that "were but a few degrees less meaningless, squalid and ferocious than that of the wild beasts with whom they held joint ownership" (Mihesuah, 1996).

Teach students the other side of the story—how the West was lost. If possible, chronicle each tribal group and their history and relationship with the U.S. government from their beginning to the present.

Teachers' understanding of the social world is based on their own life experiences. Sleeter wrote that "what White people know about the social world is generally correct, but only for understanding White people" (1992, p. 211). She referred to the perspective of most white teachers about race as "dysconscious racism," a term she borrowed from Joyce King. King wrote that "Dysconscious racism is a form of racism that tacitly accepts dominant white norms and privileges. It is not the absence of consciousness (that is, unconsciousness) but an impaired consciousness or distorted way of thinking about race as compared to, for example, critical consciousness" (1991, p. 135).

A problem arises when persons, symbols, or behaviors project a negative or stereotypical image. Sports mascots, logos, and related paraphernalia too often reflect stereotypes rather than authenticity. Various mascot sponsors have invented Indian characters that have nothing to do with the reality of indigenous peoples' lives, past or present. These mascots and logos, and the images that they convey, homogenize hundreds of indigenous cultures, robbing them of their distinctive identities and distorting their roles in U.S. history.

My response has always been that Indian mascots, logos, nicknames, and gestures offend tens of thousands of American Indians because these invented media images prevent millions from understanding an authentic human experience. These sporting acts are examples of cultural violence that have contributed to negative, distorted images of indigenous peoples (Pewewardy, 1991). The bottom line is whether indigenous people themselves feel respected and/or honored by being used as mascots or logos for sports teams.

Trans-Atlantic Slave Trade

Unlike most twentieth-century cases of premeditated mass killing, the African slave trade was not undertaken by a single political force or military entity during the course of a few months or years. The transatlantic slave trade lasted for 400 years, from the 1450s to the 1860s, as a series of exchanges of captives reaching from the interior of sub-Saharan Africa to final purchasers in the Americas. It has been estimated that in the Atlantic slave trade, up to 12 million Africans were loaded and transported across the ocean under dreadful conditions. About 2 million victims died on the Atlantic voyage (the dreaded "Middle Passage") and in the first year in the Americas. The crews of the slave ships could not have remained unaffected by the evil in which they participated. This "oppositeness" of black and white is at the very core of our history as a nation, and was the catalyst for one of the most significant social forces operating today.

In All Times, All Peoples: A World History of Slavery (Harper), Milton Meltzer reminds us that, "White, black, brown, yellow, red — no matter what your color, it's likely that someone in your family, way back, was once a slave." According to Meltzer, slavery has been a part of human history since the development of farming made it profitable to use captured enemies to do the work of the victors. Prior to that time, defeated enemies were killed because there was not enough food to feed the extra mouths. What distinguished the European/American brand of bondage was the link that was forged between slavery and race. Meltzer writes: "Up until 300 years ago, there seems to have been no connection between race and slavery. But just about 300 years ago arose the mistaken belief that whites were superior to people of any other color, and that this superior race had the right to rule others. That racist belief — shared by many of the Founding Fathers — justified the enslavement of Blacks."

Because we are still living with the consequences of the widespread acceptance of that rationale, it is important for all of us to confront this painful history, just as it is important for all of us to confront the history of the Jewish Holocaust and the accounts of "ethnic cleansing" in Bosnia. In these days, when armed hate groups are appearing more and more frequently on the landscape, we need to remember how easy it is for ordinary people, by simply refusing to see evil, to help it thrive.

The transatlantic slave trade is unique within the universal history of slavery for three main reasons:

• its duration - approximately four centuries
• those victimized: black African men, women and children
• the intellectual legitimization attempted on its behalf - the development of an anti-black ideology and its legal organization, the notorious Code noir.

As a commercial and economic enterprise, the slave trade provides a dramatic example of the consequences resulting from particular intersections of history and geography. It involved several regions and continents: Africa, America, the Caribbean, Europe and the Indian Ocean. The transatlantic slave trade is often regarded as the first system of globalization. According to French historian Jean-Michel Deveau, the slave trade and consequently slavery, which lasted from the 16th to the 19th century, constitute one of "the greatest tragedies in the history of humanity in terms of scale and duration".

The transatlantic slave trade was the biggest deportation in history and a determining factor in the world economy of the 18th century. Millions of Africans were torn from their homes, deported to the American continent and sold as slaves.

Triangular Trade

The transatlantic slave trade, often known as the triangular trade, connected the economies of three continents. It is estimated that between 25 and 30 million people, men, women and children, were deported from their homes and sold as slaves in the different slave trading systems. In the transatlantic slave trade alone, the number of those deported is believed to be approximately 17 million. These figures exclude those who died aboard the ships and in the course of wars and raids connected to the trade.

The trade proceeded in three steps. The ships left Western Europe for Africa loaded with goods which were to be exchanged for slaves. Upon their arrival in Africa, the captains traded their merchandise for captives. Weapons and gunpowder were the most important commodities, but textiles, pearls and other manufactured goods, as well as rum, were also in high demand. The exchange could last from one week to several months. The second step was the crossing of the Atlantic. Africans were transported to America to be sold throughout the continent. The third step connected America to Europe. The slave traders brought back mostly agricultural products, produced by the slaves. The main product was sugar, followed by cotton, coffee, tobacco and rice.

The circuit lasted approximately eighteen months. In order to transport the maximum number of slaves, the ship's steerage was frequently removed. Spain, Portugal, the Netherlands, England and France were the main triangular trading countries.

Resistances and abolitions

The first fighters for the abolition of slavery were the captives and slaves themselves, who adopted various methods of resistance throughout their enslavement, from their capture in Africa to their sale and exploitation on plantations in the Americas and the Caribbean. Rebellion and suicide were often used as the main forms of resistance. The American colonies were frequently disrupted by slave revolts, or the threat of revolt. The administrators of the British and French colonies in the 1730s observed that a "wind of freedom" was blowing in the Caribbean, thereby indicating the existence of a veritable resistance to slavery. This was to materialize some 50 years later with the slave rebellion in Santo Domingo. As early as the late seventeenth century, individuals, as well as the various abolitionist societies that had been established, began condemning slavery and the slave trade. This impetus essentially originated from the English-speaking countries. Up until the end of the nineteenth century, British, French and North American abolitionists devised a set of moral, religious and occasionally economic arguments as a means of combating the slave trade and slavery.

An irreversible process

The destruction of the slavery system began in the French colony of Santo Domingo towards the end of the eighteenth century. This long-running process lasted until 1886 in Cuba and 1888 in Brazil. The slave rebellion on Santo Domingo in August 1791 profoundly weakened the Caribbean colonial system, sparking a general insurrection that led to the abolition of slavery and the independence of the island. It marked the beginning of a triple process of destruction of the slavery system, the slave trade and colonialism. Two outstanding decrees of abolition were produced during the nineteenth century: the Abolition Bill passed by the British Parliament in August 1833 and the French decree signed by the Provisional Government in April 1848. In the United States, the Republican president Abraham Lincoln extended the abolition of slavery to the whole Union in the wake of the Civil War in 1865. The abolition of slavery - which at the time concerned approximately 4 million people - became the 13th Amendment to the Constitution of the United States.

Racial Stereotypes of the Civil War Era

Dominating the wartime debate in the North about what should be done with slaves were three recurring questions: Could they learn? Would they work? Could they be civilized? A fourth was added as the war continued and the need for troops became more desperate: Can they and will they fight?

These questions reflected stereotypes about race that could be traced back at least as far as questions raised by Thomas Jefferson in Notes on the State of Virginia. Jefferson had conjectured that blacks were inferior to whites with respect to their capacity for reason, imagination, and sentiment. By the 19th century, American "craniometrists" claimed to use science to prove that blacks and whites were from different species. The popularity and persistence of these stereotypes in the North can be seen in articles and cartoons published in Northern newspapers that represented African-Americans as unwilling to work and incapable of learning and being civilized. Often represented as pathetic, comically or exotically picturesque, or hilariously or horrifyingly grotesque, African-Americans were seen as the "other." White Northerners liked to think of themselves as a hard-working, educated, and moral people, and African-Americans were typically stereotyped as lazy, ignorant, and uncivilized: an inverse image of what it meant to be an American. This website explores the ways in which these stereotypes were sometimes reinforced and sometimes rebutted by Northern newspapers and by the letters of freedmen and their teachers. It also suggests the ways these images helped shape Northern discussions about the war.

The Bosnian Genocide

Genocide in Bosnia (1992-1995)

Although many different ethnic and religious groups had resided together for 40 years under Yugoslavia's repressive communist government, this changed when the country began to collapse during the fall of communism in the early 1990s. The republics of Slovenia and Croatia declared independence, and war quickly followed between Serbia and these breakaway republics. Ethnic tensions were brought to the forefront, and people who had lived peacefully for years as neighbors turned against each other and took up arms. When Bosnia attempted to secede, Serbia – under Slobodan Milošević's leadership – invaded with the claim that it was there to "free" fellow Serbian Orthodox Christians living in Bosnia.

Starting in April 1992, Serbia set out to “ethnically cleanse” Bosnian territory by systematically removing all Bosnian Muslims, known as Bosniaks. Serbia, together with ethnic Bosnian Serbs, attacked Bosniaks with former Yugoslavian military equipment and surrounded Sarajevo, the capital city. Many Bosniaks were driven into concentration camps, where women and girls were systematically gang-raped and other civilians were tortured, starved and murdered.

In 1993, the United Nations (UN) Security Council declared that Sarajevo, Gorazde, Srebrenica and other Muslim enclaves were to be safe areas, protected by a contingent of UN peacekeepers. But in July 1995, Serbs committed the largest massacre in Europe since World War II in one such area, Srebrenica. An estimated 23,000 women, children and elderly people were put on buses and driven to Muslim-controlled territory, while 8,000 "battle-age" men were detained and slaughtered. The so-called safe area of Srebrenica fell without a single shot fired by the UN.

In 1994, NATO initiated air strikes against Bosnian Serbs to stop the attacks. In December 1995, U.S.-led negotiations in Dayton, Ohio (the Dayton Peace Accords) ended the conflict in Bosnia, and a force was created to maintain the ceasefire. Since the end of the conflict, the International Criminal Tribunal for the former Yugoslavia (ICTY) at The Hague has charged more than 160 persons. Convictions have included Serbs, Croats and Bosniaks, though Serbs and Bosnian Serbs have faced the majority of charges. In 2001, former President Milošević was captured, but he died in his cell in 2006. Radovan Karadžić, the supreme commander of the Bosnian Serb armed forces, was captured in 2008 and is being tried in The Hague on genocide charges. Ratko Mladić, chief of staff of the Bosnian Serb Army, was captured in May 2011 and is charged with 11 counts, including genocide and crimes against humanity.

In 1991, Yugoslavia began to break up along ethnic lines. When the republic of Bosnia and Herzegovina (Bosnia) declared independence in 1992, the region quickly became the central theater of fighting. The Serbs targeted Bosniak and Croatian civilians in a campaign of ethnic cleansing. The war in Bosnia claimed the lives of an estimated 100,000 people and displaced more than two million. The height of the killing took place in July 1995 when 8,000 Bosniaks were killed in what became known as the Srebrenica genocide, the largest massacre in Europe after the Holocaust.

Precursors to Genocide

The federal state of Yugoslavia was formed at the end of World War II, comprising Bosnia, Serbia, Montenegro, Croatia, Slovenia, and Macedonia, with numerous ethnic groups making up the population, including Orthodox Christian Serbs, Muslim Bosniaks, Catholic Croats, and Muslim ethnic Albanians. Tensions in the Balkans were common, but once Josip Broz Tito came to power in 1943, he ruled with an iron fist and was typically able to keep them in check through a dictatorship. Though he was considered a "benevolent dictator," and was at times quite ruthless, Tito's efforts ensured that no ethnic group dominated the country, banning political mobilization and seeking to create a unified Yugoslav identity. However, after his death in 1980, the order he imposed began to unravel. The various ethnic groups and republics inside Yugoslavia sought independence, and as the end of the Cold War neared, the country spiraled out of control. Serb nationalism was fueled as Slobodan Milošević rose to power in 1987. Milošević used nationalist feelings to his advantage, making changes to the constitution favoring Serbs, creating a military that was 90 percent Serbian, and extending his power over the country's financial, media, and security structures. With the help of Serbian separatists in Bosnia and Croatia, he stoked ethnic tensions by convincing Serbian populations that other ethnic groups posed a threat to their rights.

Ethnic Cleansing Begins

Yugoslavia began to collapse in June 1991 when the republics of Slovenia and Croatia declared independence. The Yugoslav army, largely composed of Serbs, invaded Croatia under the guise of trying to protect ethnic Serb populations there. They took the city of Vukovar, carrying out mass executions of hundreds of Croat men and burying them in mass graves. This was the beginning of the ethnic cleansing that characterized the atrocities committed during the Yugoslav Wars.

Bosnia came next in April 1992. Following its declaration of independence, Serbian forces, accompanied by Bosnian Serbs, attempted to ethnically cleanse the territory of Bosniaks. Using former Yugoslavian military equipment, they surrounded Sarajevo, Bosnia's capital city. Snipers hid in the hills and shot at civilians as they tried to get food and water. Mass executions, concentration camps, rape and sexual violence, and forced displacement were all extremely prevalent. The "siege of Sarajevo" is considered to be one of the most dramatic and representative episodes of Yugoslavia's breakup, with thousands killed over the course of nearly four years. Attempts at mediation by the European Union were unsuccessful, and the United Nations (UN) refused to intervene, aside from providing limited troop convoys for humanitarian aid. Later on, the UN tried to establish six "safe areas," including Srebrenica and Sarajevo, but these were ineffective. Peacekeepers did not have the capabilities to truly protect the people seeking refuge there, and all except Sarajevo eventually fell under Serb control.

Genocide at Srebrenica

In July 1995, Serb forces, led by General Ratko Mladić, descended upon the town of Srebrenica and began shelling it. At this point, the enclave was protected by only 450 Dutch peacekeepers, lightly armed, short on fuel, and equipped with expired ammunition; their force was so weak that a Dutch commander had reported a month prior that the unit was no longer militarily operational. The peacekeepers requested support from the North Atlantic Treaty Organization (NATO) but were denied. Srebrenica fell to the Serbs in one day.

Mladić expelled 25,000 women and children from the town, while his forces tried to hunt down approximately 15,000 Bosniak men who had tried to escape to safety in central Bosnia. Up to 3,000 were killed, either by gunshot or by decapitation, while trying to escape. Many Bosniaks sought refuge at a UN base in nearby Potocari, but they were not safe there for long. Serb forces caught up with them by the afternoon, and the next day buses arrived at Potocari to take them away, again separating the children and women from the men. Serb troops forced the Dutch peacekeepers to hand over their uniforms and helmets so that they could use them to lure civilians out of hiding and trick them into thinking they were headed to safety.

At the end of the four-day massacre, up to 8,000 men and teenage boys had been killed, and many women were subjected to torture, rape, and other forms of sexual violence. Thousands were buried in mass graves. In order to conceal their crimes, Serb forces dug up the original graves of many victims and moved them across a large piece of territory. There were clear indications that an attack at Srebrenica was being planned, yet the international community did not equip the peacekeeping forces there with the support necessary to protect the thousands who either lost their lives or were terrorized. The atrocities committed at Srebrenica are considered to be the worst on European soil after the Holocaust.

The Response

While the war was widely covered in the press and individual policymakers at times took strong stands against human rights abuses in Bosnia, in general the UN, the European Union, the United States and Russia minimized the aggressive nature of the conflict and treated the fighting as a conflict between equal warring parties. Seeking to avoid the moral responsibilities of responding to a genocide, many of these countries referred to the conflict as "ethnic cleansing" rather than "genocide."

The U.S. Response

Up until 1995, the American government refused to take the lead on Bosnia. The U.S. resisted sending in its own troops and vetoed Security Council draft resolutions to increase the number of UN peacekeepers. During his campaign, Bill Clinton criticized the Bush administration for its lack of action, but after he was elected in 1992, his administration followed the same pattern. In 1995, American foreign policy toward Bosnia changed. Evidence of the atrocities being committed, including those at Srebrenica, was becoming common knowledge, and the United States' lack of action was becoming an embarrassment. President Clinton told his national security advisers that the war was "killing the U.S. position of strength in the world," and he did not want failure in Bosnia to tarnish his chances at re-election. Despite all efforts to keep American troops out of Europe, he eventually realized that there was no effective way to end the war without them.

The International Response

The UN was hesitant to directly fight the Bosnian Serbs for fear of compromising its neutrality between nations and groups. The international community finally responded to the war after Serb forces took the town of Zepa and shelled a crowded Sarajevo market. Senior representatives of the United States and its allies agreed to deploy NATO forces to Gorazde and defend the town's civilian population. This plan was later extended to include the cities of Bihac, Sarajevo and Tuzla. In August 1995, after the Serbs refused to comply with a UN ultimatum, NATO forces, in conjunction with Bosnian and Croatian forces, began an aerial bombing campaign. With Serbia's economy crippled by UN trade sanctions and its military forces under assault in Bosnia after three years of warfare, Milošević agreed to enter negotiations that led to a ceasefire. By the end of the war, roughly 100,000 people had died.

Aftermath

In November 1995, the Dayton Accords were signed in Dayton, Ohio, officially ending the war in Bosnia. This peace agreement established two semi-autonomous entities within Bosnia-Herzegovina: the Federation of Bosnia-Herzegovina, inhabited primarily by Bosniaks and Bosnian Croats, and the Republika Srpska (which includes Srebrenica), dominated by Serbs, both with their own political structures, economies, and educational systems, though connected through a central government.

Refugees were guaranteed the right to return to their pre-war homes, but only a small number of Bosniaks opted to go back to Srebrenica, which had been re-inhabited by Bosnian Serbs who had also been internally displaced by the war. An influx of international assistance came after the fighting, including reconstruction efforts by non-governmental organizations, UN agencies, and foreign governments and militaries, and over $14 billion in aid.

Dayton's Drawbacks

The Dayton Accords were successful in stopping the violence and allowing the region to establish some form of normality, but they have turned out to be somewhat of a band-aid solution that set the stage for further divisions between Bosnia's ethnic groups. For instance, Bosnia has a three-member presidency requiring one Croat, one Bosniak, and one Serb to represent their constituencies, but because each member is able to veto legislation seen as threatening to his own group's interests, it has been nearly impossible to reach consensus on most of the important issues at the central-government level. Furthermore, this type of system still excludes other minority groups in the country, such as the Roma and Jews.

The fact is that the Dayton Accords were not meant to be a long-term solution to the problems of the country; they were meant to stop the killing and secure peace. Eventually they were supposed to be replaced with a more streamlined government structure. The hope was that, in working together and creating a unified Bosnian identity, the mistrust between ethnic groups would fall away; this has not been the case. Though they may live side by side, Bosniaks, Croats, and Serbs essentially lead segregated lives. People identify themselves through their ethnicity rather than their citizenship. The legacy of the Dayton Accords is evident within Bosnia-Herzegovina, as its economic development has lagged behind that of its Balkan counterparts. Unemployment remains a problem for a large portion of the country, and corruption is very prevalent. The country is currently trying to join the European Union, but the failure of Serb, Bosniak, and Croat leaders to agree on the details of a reform program has delayed its application for membership.

Criminal Tribunal

The UN Security Council passed Resolution 827 establishing the International Criminal Tribunal for the former Yugoslavia (ICTY) in The Hague, Netherlands, in May 1993, before the war had even ended, after being briefed on reports of massacres, rape and torture, extreme violence in the cities, and the massive suffering of the hundreds of thousands who had been expelled from their homes. The ICTY was formed to end the impunity of the perpetrators of mass atrocities, and it was the first tribunal to prosecute genocide. It has also given survivors of rape, torture, and other heinous crimes the opportunity to tell their stories of what they experienced and what happened to their loved ones, and to be heard.

The ICTY was slow to start. A chief prosecutor was not named until 1994, and even after that, the governments of Serbia and Croatia refused to turn over their war crimes suspects or share information with the tribunal until their prospects for EU membership were jeopardized by their lack of cooperation. NATO showed its weakness again when members failed to arrest suspects in Bosnia out of fear of endangering their forces. However, since delivering its first sentence in 1996, the ICTY has convicted more than 60 people involved in crimes against various ethnic groups in Bosnia, Croatia, Serbia, Kosovo, and Macedonia. More than 160 have been charged, including high- and mid-level political, military, and police leaders from multiple sides of the conflict. It was ruled in 2001 that genocide occurred in Srebrenica, and in 2007 the International Court of Justice stated that Serbia violated the Genocide Convention by not doing enough to prevent it.

Former leader Slobodan Milošević received three indictments from the ICTY: for war crimes and crimes against humanity in Kosovo in 1999, for war crimes and crimes against humanity in Croatia between 1991 and 1992, and for genocide, crimes against humanity, and war crimes in Bosnia between 1992 and 1995. His trial, delayed multiple times due to his health, began in February 2002, and he pled not guilty to all 66 counts of war crimes, crimes against humanity, and genocide. In 2006, he was found dead in his cell in The Hague, months before his trial was expected to end. After evading arrest for over a decade, Ratko Mladić, the man accused of leading the siege of Sarajevo and orchestrating the genocide at Srebrenica, began his trial in 2012, and it is expected to end in 2015. He faces 11 charges, including two counts of genocide, and has pled not guilty to all of them. His behavior in the courtroom has reportedly ranged from unremorseful to sarcastic to mocking, at times making gestures at the witnesses. The defense portion of the trial began in 2014, arguing that he was simply following orders – a common justification offered by those who have committed mass atrocities.

Finding Justice

Many survivors have had to live their lives not knowing what happened to their family members. Over 20,000 people are still missing. When Serb forces dug up graves with bulldozers and trucks in Srebrenica in an attempt to move the bodies and hide their crimes, many of the bodies were scattered. As such, finding the remaining missing persons has been extremely difficult, and those who are found are almost impossible to identify due to the condition of their remains. In 1995, President Bill Clinton founded the International Commission on Missing Persons (ICMP) to aid in the search for and identification of missing persons found at disaster sites or in war zones, using forensic methods that match the DNA of survivors to the unearthed remains. So far, the ICMP has been successful in identifying nearly 7,000 bodies in Srebrenica.

Recognizing Genocide

While both the ICTY and the ICJ have considered the atrocities committed in the former Yugoslav region to constitute genocide, this has not been a shared sentiment around the world. Notably, both Russia and Serbia have denied that the Srebrenica massacre amounted to genocide. In July 2015, the UN Security Council held a meeting in preparation for the 20th anniversary of Srebrenica, and Serbia reportedly asked Russia to veto a draft resolution that would formally condemn the massacre as genocide. Russia used its veto to kill the resolution, stating that calling the crimes a genocide would prompt further tensions in the region. Serbia has acknowledged that the crimes at Srebrenica occurred but has never used the word genocide to describe them. Arrests for Srebrenica-related crimes were not made in Serbia until March of 2015. Denial also runs strong in the Serb-dominated Republika Srpska, with Bosnian Serb leader Milorad Dodik calling Srebrenica "the greatest deception of the 20th century."

U.S. Ambassador to the UN Samantha Power was a journalist in Sarajevo when the attack on Srebrenica occurred and a first-hand witness to the suffering that the war caused. In response to Russia’s veto, she said, “It mattered hugely to the families of the victims of the Srebrenica genocide. Russia’s veto is heartbreaking for those families and it is a further stain on this Council’s record”. Denialist rhetoric trivializes the experiences of victims and survivors, and minimizes the true weight of what occurred during the 1990s. Reconciliation cannot be possible without recognition of the crimes committed. Nothing can bring back their loved ones or erase their trauma, but by acknowledging these events as what they are, the survivors can begin the healing process and find closure for what they experienced.