
The Return of the Taliban: What Now for the Women of Afghanistan?

By Rasna Warah

There have been a lot of knee-jerk reactions – particularly from liberals – to the United States’ hasty withdrawal from Afghanistan. Those who oppose US military intervention in foreign lands say the withdrawal could not have come soon enough – that invading Afghanistan in 2001 after the 9/11 terror attacks on New York and Washington was a mistake, and that staying on in (“occupying”) the country was an even bigger one. They argue that US military interventions in Korea, Vietnam, Somalia and other places have been disastrous, and that they reek of imperialism.

Well and good. But everyone who has something to say about the poorly planned US withdrawal from Afghanistan, including the Taliban and President Joe Biden, has failed to answer these questions: What would the women of Afghanistan have wanted? Why were they not consulted before the US president made the unilateral decision to pull out troops from Afghanistan? And what gives Biden and the all-male Pashtun-dominated Taliban leadership the right to make decisions on women’s behalf?

I was in Kabul in 2002, some three months after the US invaded the country and ousted the Taliban from the capital city. I spoke with many women there who told me that they were relieved that the Taliban had left because life under the misogynistic movement had become unbearable for women and girls. Girls were not allowed to have an education so girls’ schools had to be run secretly from homes. The Taliban were known for barbaric public executions and for flogging women who did not wear burqas or who were accused of adultery. Theirs was an austere, cruel rule where people were not even allowed to sing, dance, play music or watch movies.

More than two decades of war, beginning with the Russian invasion of Afghanistan in 1979 and the subsequent US-backed insurgency of the Mujahideen (mujahidun in Arabic—“those engaged in jihad”) in the 1980s (which later transformed into the Taliban movement), not to mention the US invasion of Afghanistan after the 9/11 terror attacks, had left Kabul’s physical infrastructure in ruins. Entire neighbourhoods had been reduced to rubble and no one quite remembered any more whose army had destroyed which building. The only buildings still left standing were the mosques and the Soviet-built apartment blocks housing civil servants. In 2002, Kabul Municipality had estimated that almost 40 per cent of the houses in the city had been destroyed in the previous fifteen years. Solid waste disposal barely met minimum standards, and running water and electricity were luxuries in most homes.

After the Taliban fled the capital and went underground, an estimated 3 million girls went back to school. At that time, the average Afghan child could expect only about 4 years of schooling. By 2019, this figure had risen to 10 years. Today, more than 13 per cent of adult women in Afghanistan have a secondary school education or higher. Women’s participation in the political sphere also increased dramatically; in 2019, nearly a third (27.2 per cent) of parliamentary seats were held by women.

No wonder women around the world were shocked and dismayed to see how easily Afghan women and girls were sacrificed and abandoned by the world’s leading powers. “My heart breaks for the women of Afghanistan. The world has failed them. History will write this,” tweeted the Iranian journalist and activist Masih Alinejad on 13 August 2021.

As Taliban fighters were gaining control of the capital Kabul on Sunday, 15 August 2021, an unnamed woman living in the city wrote the following in the Guardian:

As a woman, I feel I am the victim of this political war that men started. I felt like I can no longer laugh out loud, I can no longer listen to my favourite songs, I can no longer meet my friends in our favourite café, I can no longer wear my favourite yellow dress or pink lipstick. And I can no longer go to my job or finish the university degree that I worked for years to achieve.

There have been reports of Taliban fighters abducting and marrying young girls, and ordering women not to report to work. Afghan female journalists fear for their lives; many have gone into hiding. The sale of burqas has apparently skyrocketed.

The argument that women in other countries also suffer at the hands of men and experience gender-based violence does not fly with many Afghan women who have been fighting for the rights of women for the last two decades. For one, there is no law in any country in the world, as far as I know, that denies women an education or bans them from working outside the home. Women in these countries may not yet be truly free, but at least they can rely on the law to protect them. All the gains Afghan women have made over the last two decades will now be lost. I do not for one second believe that the rebranded Taliban emerging in Afghanistan have become feminists overnight, despite their pro-women rhetoric at press conferences. Mahbouba Seraj, an Afghan women’s rights leader, told TRT World that what is happening in Afghanistan is “going to put the country two hundred years back.” “I am going to say to the whole world—shame on you!” she stated.

A series of failures

This is not the first time the US has abandoned Afghanistan. After Russian forces withdrew from Afghanistan in 1989, the US pulled out as well, leaving the Mujahideen, which it had been funding, to their own devices. Yet, in 1979, when Russian forces entered Afghanistan, the US National Security Advisor Zbigniew Brzezinski had described the Mujahideen as “soldiers of God”, and told them, “Your cause is right and God is on your side.” The Mujahideen transformed into the Taliban, who imposed their severe rule on Afghans during the latter part of the 1990s. The country also became a den for terrorist organisations like Al Qaeda. The US essentially created a monster that launched the 9/11 attacks 22 years later.

Afghanistan has had a long and turbulent history of conquests by foreign rulers, and has often been described as the “graveyard of empires”. But it has not always been anti-women. In 1919, King Amanullah Khan introduced a new constitution and pro-women reforms. The last monarch, Zahir Shah (1933-1973), also ensured that women’s rights were respected through various laws. But Zahir Shah was deposed in 1973, and after the communist coup of 1978 the Soviet Union installed a puppet leader. This gave rise to the anti-Soviet Mujahideen, who gained control of the country in the 1990s and eroded many of the rights women had been granted.


There are many parallels with Somalia, which also enjoyed Russian support under President Siad Barre. When the Soviets switched sides and began supporting Ethiopia’s Mengistu Haile Mariam, the US gained more influence, but it could not install democracy in a country that had descended into warlordism after Barre was ousted in 1991. After American soldiers were killed in Mogadishu during the country’s civil war in 1993, the US withdrew from Somalia completely. Conservative forces supported by some Arab countries filled the void. When a coalition of Islamic groups took over the capital in 2006, they were quickly ousted by US-backed Ethiopian forces. Al Shabaab was born. As in Afghanistan, the US had a hand in creating a murderous group that had little respect for women.

After the US invasion in 2001, instead of focusing on stabilising and rebuilding Afghanistan, President George W. Bush set his eyes on invading Iraq on the false pretext that the Iraqi dictator Saddam Hussein had links to Al Qaeda and was harbouring weapons of mass destruction. That war in 2003 cost the US government its reputation in many parts of the Muslim world, and turned the world’s attention away from Afghanistan. Bush will also be remembered for illegally renditioning and detaining Afghans and other nationals suspected of being terrorists at the US naval base in Guantanamo Bay. This ill-advised move, which will forever remain a blot on his legacy, has been used as a radicalisation propaganda tool by groups such as the Islamic State in Iraq and Syria (ISIS).

The international community is now sitting back and doing nothing, even as it becomes increasingly evident that the world is witnessing a humanitarian catastrophe that will have severe political repercussions, both within the region and globally. The community of nations, including the UN Security Council, can do little except plead with the Taliban not to discontinue essential services – a tall order given that three-quarters of Afghanistan’s budget was funded by foreign (mostly Western) aid. The Taliban were allowed to take over the country without a fight, and all the UN Secretary-General could do was issue statements urging neighbouring countries to keep their borders open to the thousands of Afghans fleeing the country.

The mass exodus of Afghans, as witnessed at Kabul’s international airport, is a public relations disaster for the Taliban. It shows that not all Afghans welcome the Taliban’s return. As the poet Warsan Shire wrote about her homeland Somalia, “no one leaves home unless/home is the mouth of a shark”. Afghanistan has once again become a failed state.

The longest war

The impact of the Taliban’s capture of the country is already being felt. The exodus of Afghans is creating a refugee crisis like the one witnessed in 2015 during the civil war in Syria – a crisis essentially of the US and its NATO allies’ own making. This will likely generate anti-immigration and anti-Muslim sentiments in the US and Europe, and embolden racist right-wing groups. It is also possible that Afghanistan will become the site of a new type of Cold War, with Russia and China forming cynical alliances with the Taliban in order to destabilise the West and to exploit Afghanistan’s vast natural resources, which remain largely untapped. Girls’ education will be curtailed. No amount of reminding the Taliban that Prophet Mohammed’s wife Khadija was a successful businesswoman, and that his third wife Aisha played a major role in the Prophet’s political life, will change their minds about women. Women and girls are looking at a bleak future as the Taliban impose punitive restrictions on them that even the expansionist Muslim Ottoman Empire did not dare enforce in its heyday. Afghanistan will become a medieval society where women remain voiceless and invisible.

The worst-case scenario – one that is just too horrific to contemplate – is that terrorist groups like the Islamic State in Iraq and Syria (ISIS) and Al Qaeda will find a foothold in Afghanistan, and unleash a global terror campaign from there, as did Osama bin Laden more than two decades ago.


The irony is that, having invaded the country two decades before ostensibly to get rid of Islamic terrorists, the US under Biden has essentially handed over the country to the very group that had harboured terrorists like Osama bin Laden, the alleged mastermind of the 9/11 attacks. “President Joe Biden will go down in history, fairly or unfairly, as the president who presided over a humiliating final act in the American experiment in Afghanistan,” wrote David E. Sanger in the New York Times. (To be fair, it was not Biden who first opened the doors to the Taliban; President Donald Trump invited the Taliban to negotiations in Doha in 2018, which lent some legitimacy to a group that had previously been labelled as a terrorist organisation.)

Dubbed “America’s longest war”, the US military mission in Afghanistan has cost US taxpayers about US$2 trillion, one quarter of which has gone towards reconstruction and development, though critics have pointed out that the bulk of this money was used to train the Afghan military and police, and was not used for development projects. The military mission in Afghanistan has also come at a huge human cost; 3,500 soldiers and other personnel from 31 NATO troop-contributing countries and 4,400 international contractors, humanitarian workers and journalists were killed in Afghanistan between 2001 and 2020. Thousands of Afghan lives have also been lost. The United Nations Assistance Mission in Afghanistan estimates that at least 100,000 Afghans have been killed or wounded since 2009.

Was the US and NATO intervention in Afghanistan worth it? Should the US and NATO have stayed a bit longer, until the country had well-functioning and well-resourced institutions and until they were sure that the Taliban had been completely rooted out? I think so, because I believe that ousting the Taliban was as ethically correct as eliminating ISIS and defeating the German Nazis. The problem in Afghanistan is that the Taliban were never defeated; they simply went underground.

There is no doubt that the “liberation” or “occupation” of Afghanistan by the US-dominated NATO mission in Afghanistan brought about some tangible benefits, including rebuilt and new infrastructure, the growth of a vibrant civil society and more opportunities for women. But the US’s support of Western-backed Afghan governments that are generally viewed as corrupt by the majority of Afghans may have handed the Taliban the legitimacy and support they seem to be enjoying among the country’s largely poor rural population, just as installing highly corrupt Western-backed governments in Somalia in the last fifteen years gave Al Shabaab more ammunition to carry out its violent campaign. The Taliban is also recognised by some neighbouring countries, notably Pakistan, which is believed to be one of its funders, and which receives considerable military and other support from the US. This raises questions about why the US is aiding a country that is working against its interests in another. This Taliban-Pakistan alliance will no doubt be watched closely by Pakistan’s rival, India.

Afghanistan, unfortunately, is a sad reminder of why no amount of investment in infrastructure and other “development” projects can fix something that has been fundamentally broken in a country. Like Iraq after the 2003 US-led invasion, it may fragment along tribal or sectarian lines and revert to a civil war situation. Under the Taliban “government”, Afghanistan may become a joyless place where people are not allowed to listen to music, dance or watch movies – where enforcement of a distorted interpretation of Islam casts a dark shadow on the rest of the Muslim world. And Afghan women and girls will once again pay the heaviest price.

Published by the good folks at The Elephant.

The Elephant is a platform for engaging citizens to reflect, re-member and re-envision their society by interrogating the past, the present, to fashion a future.

Follow us on Twitter.

Deconstructing Race and Gender for the African Traveller

By Rasna Warah

The news that European Union countries could deny visas to Africans, the majority of whom (90 per cent of those vaccinated) have received the Covishield vaccine produced by the Serum Institute of India, has once again highlighted how disadvantaged Africans are when it comes to travelling abroad. I don’t want to go into the intricacies of why the EU has made this decision, but I would like us to explore what travelling abroad will mean for Africans during and after the COVID-19 pandemic. Will vaccinations determine who can and cannot travel? Given that less than 2 per cent of the African population is currently fully vaccinated, will this mean that the majority of Africans wishing to travel abroad will have to wait at least a year or two before they can do so? And if Covishield is not approved by the EU, does this mean that those who, like me, received two doses of the vaccine will be permanently barred from entering Europe?

Believing that the pandemic would not negatively impact Africa was just wishful thinking. While the numbers of infections and fatalities have been low compared to other regions, the economic shock has been equally – if not more – devastating. Loss of income has already impoverished millions of Africans as lockdowns continue with new waves of the pandemic. Moreover, we are – and have always been – at the receiving end of decisions made on other continents (the decision to colonise Africa was taken in Berlin by European powers) – decisions that determine what Africans should or should not do. We are not allowed to make decisions on our own behalf. African countries, including Kenya, for example, did not stop flights from Europe or North America – the epicentres of the pandemic in the first and second waves – but these regions were quick to stop flights from African countries. Nor did we impose “vaccine passports” on citizens of these regions that would allow them to gain entry into our countries. As one of my Twitter followers explained, this should not surprise us because it is the mighty dollar and the euro that determine how Africans treat those who control both currencies.

How could it be any other way? Citizens of African countries are subjected to the most stringent visa conditions for entry into Europe or North America. Those of us who have applied for a visa to a European country, the United States or Canada know how painful and humiliating the process can be. From providing mountains of documentation, including bank statements, to show that one is not a potential illegal immigrant, to bearing the cost of exorbitant non-refundable visa fees, the visa application process is designed to deter Africans from travelling to these countries. This has significantly diminished the travel experience of Africans.

In her book, Travelling While Black: Essays Inspired by a Life on the Move, the Kenyan writer Nanjala Nyabola describes visas as “a cruel and unusual invention” and “a power play, a cash grab, and a half-assed invitation to enter but not belong”. Nyabola not only unravels the experiences of Africans travelling abroad and within the continent but also exposes the “insidious racisms that shape the politics of human mobility”. As she emphasises in her foreword, the book is not a travel memoir, but essays inspired by travel – a book that tells uncomfortable stories that make us think about why they make us uncomfortable. As she so eloquently puts it: “In this book I want to sit in the discomfort of being a black woman and having our intersectional pain ignored . . . I want to reflect on what it means to be at home, and to be un-homed.”

The book begins with her experiences as a humanitarian worker in Haiti, the first black republic and one of the world’s poorest countries, where she learned about “the cultural construction of race”. In a country practically run by NGOs that are managed mostly by white people, she questions why she had to bend and adapt to their whims. Why were the Haitians not running the show?

Much has been written about the inadequacies of aid to Haiti, also known as “Republic of NGOs” (more on this in my forthcoming book), but not quite with the insider-outsider perspective of Nyabola, a black humanitarian worker in a non-African black country where white foreigners have more say than the locals. She concludes that those claiming to help impoverished Haitians should do so not because they feel bad for them, but “because we want them to experience the same fullness of life that we ourselves aspire to”.

Many of the essays in the book focus on another type of traveller – the African refugee or migrant who risks all by making the perilous journey across the Mediterranean in the hopes of reaching Europe. She questions the absurd practice of placing refugees in camps where they are denied freedom of movement and are not allowed to earn an income or to work. Most refugees seeking asylum before the 1951 Convention Relating to the Status of Refugees came into effect, she notes, were not crammed into camps. Jews seeking asylum in Europe and North America during World War II were allowed to integrate socially and economically into the societies that accepted them. Why and how did this change? And why are an increasing number of Africans and Asians entering Europe illegally when there are legal ways to do so? Well, says Nyabola, it’s because “legal and safe passage to Europe has disappeared, for all but a small sliver of the world’s population”.


Travelling while black also proves to be a challenge in Asia. On a physically demanding hike on Mount Everest, Nyabola encountered “being raced” by her Nepali guide, who refused to attend to her even when she fell dangerously ill simply because she was black. How can people who themselves do not enjoy white privilege become racist? Is the racism of white people different from that of those who also experience white racism? Nyabola tries to explain the difference by making a distinction between “racism” and “being raced”, the latter a phenomenon that black Africans who visit Asian countries often experience. She explains:

I think there is a qualitative difference between racism and being raced. Racism, I think, is more sinister and deliberate. But being raced or racing other people is something that people do because they aren’t paying attention. It’s cultural laziness: we create all these shorthands that allow us to process difference. . . . They have raced me – decided, based on cultural generalisations, who they think I am – in order to process my presence; and, because of the way popular culture from the West especially projects and processes black women, a lot of that is negative.

She is equally critical of Africans who treat other Africans badly. Her discomfiting experiences in South Africa, where xenophobic attacks against Somalis, Zimbabweans and other Africans have been rising in recent years, are telling, and reflective of a country that has not completely disengaged from the clutches of apartheid. South Africa challenges her belief that Africans can be at home anywhere on the continent – a belief advocated by the leading Pan-Africanists of yesteryear who envisioned independence from colonial rule as the basis for building an inclusive Africa for all Africans. “The truth is that millions of Africans are foreigners and migrants in Africa, un-homed by power and abandoned to physical or structural violence,” she admits.

There are some uplifting chapters in the book that hold out the promise of Pan-Africanism, like her trip to Gorom Gorom in Burkina Faso, where she observed “regal families undulating on their camels”, and her foray into rural Botswana, where she goes to trace the life of Bessie Head, the mixed-race South African writer whom Nyabola admires deeply. As an outsider both in the white literary world and in Botswana, Head suffered loneliness and rejection. The black American literary crowd in the United States had no time for an African woman writer. When she reached out to fellow African writers like Ngugi wa Thiong’o and Chinua Achebe, who were beginning to be recognised in the West as African literary giants, “their responses were curt and perfunctory”. The chapter on Bessie Head’s life will no doubt resonate with female African writers for whom the doors of big established publishing houses are permanently closed.

Nanjala Nyabola’s book does, however, open new worlds to African women travellers like her who are reflecting on how their race and gender have shaped their experiences of dislocation, exile, belonging and not belonging.


By Rasna Warah

Most of us have seen images of the calamity taking place in Indian cities – overflowing crematoriums; countless burning bodies on wooden pyres; smoke from the flames engulfing entire neighbourhoods; COVID-19 patients dying outside hospitals or in ambulances due to lack of hospital beds or oxygen cylinders. The visuals are compelling, and Indian and international TV channels have not spared any aspect of this unfolding tragedy from being aired on television. CNN even sent its international war correspondent to India – not its health or science correspondent – which speaks volumes.

There is no doubt that the dramatic spike in COVID-19 deaths in India in the last couple of weeks is a catastrophe that could have been avoided. Analysts have pointed to complacency on the part of both Indians and their government in allowing strains of the coronavirus to spread at crowded political rallies and at the recent Kumbh Mela festival, the largest religious gathering in the world. Prime Minister Narendra Modi has even been accused by Indian activist Arundhati Roy of allowing this “crime against humanity” to take place.

But the images being shown on international television channels are making many Africans uncomfortable. Africans know what it means to be the object of what many perceive as “disaster porn” – a voyeuristic peek (in the name of “news”) into the dying moments of those suffering from a disaster. Now Indians are getting the same treatment.

For decades, the world has been fed images of Africans dying from conflict, famines, and other calamities. The photo of the “starving African baby” often appears on the front pages of leading newspapers in the West. Editors argue that such images save lives because they usually lead to a humanitarian response. But Africans know that these images belie a racism that is not immediately apparent because it is often couched in the language of compassion. They know that if the person dying was an American or a European, he or she would not be filmed or photographed in this way because it would rob them of their privacy and dignity. This is why we never saw the mass funerals of Italians when the coronavirus pandemic was at its peak in Italy. Nor did the world see the bodies of the nearly 3,000 people who plunged to their deaths or were burnt alive during the horrific terrorist attack on the World Trade Centre’s twin towers on 11 September 2001.

The disaster unfolding in India is one that Western journalists had prepared for, but which did not materialise – until now. As one person on Twitter quipped, the high COVID-19 infection and death rates in the United States and Europe robbed Western journalists of an opportunity to portray it as a Third World problem. Everyone, from billionaire philanthropist Bill Gates to the United Nations Secretary-General Antonio Guterres, had predicted that millions of Africans would die from the virus, and they were all geared up for a fundraising campaign for the continent. But on this one, Africans disappointed them; they just did not die in sufficient numbers. India has now given Western journalists an opportunity to focus their voyeuristic gaze elsewhere.


Various theories have been proposed about why Africans are not getting infected in large numbers: Africa’s youthful population, which is apparently more disease-resistant; its largely rural population, which has more exposure to fresh air and is therefore less likely to inhale the virus; and the continent’s hot climate, which is not conducive to the virus’s longevity (even though not all of Africa has high temperatures; many parts of the continent experience near-zero temperatures in the winter).

This “othering” of non-white people is also reflected in COVID-19 reporting. Indi Samarajiva, writing on Medium, talked about a New York Times article in which a journalist wondered whether “there is a genetic component in which the immune systems of Thais and others in the Mekong River region are more resistant to the coronavirus”. Samarajiva called this racist reporting. “Instead of looking at what the Thai people did, they’re asking if it’s something in their veins. Because Thai people couldn’t possibly just be competent, it must be alchemy,” he wrote.

The number of fatalities in African countries is in sharp contrast to the massive death toll in the United States, where millions have been infected and where more than half a million people have died from the disease, making America the country with the highest number of COVID-related deaths. If such horrifying figures were emanating from the African continent, there would no doubt have been a massive fundraising drive for Africans. Even televised images of thousands of middle-class Americans lining up for food donations and food stamps did not prompt international charities to organise a fundraising campaign for America’s hungry and jobless citizens.

Few analysts and media houses have dared to admit that some African countries might actually have been better at handling the pandemic than countries such as the United States, Britain, and Italy, where governments did not impose stringent measures on citizens to contain the virus. Uganda, Senegal and Rwanda, which successfully contained the virus in the initial stages through a series of rigorous measures and strategies, were not hailed as success stories in the fight against COVID-19. Nor did we hear much about Vietnam, where only a few dozen people died from the virus in the early months of the pandemic.

Kenyans have in the past called out racist depictions of tragedies occurring in their country. After the Al Shabaab terror attack on the Dusit D2 building in Nairobi in January 2019, which left 21 people dead, the New York Times carried a disturbing photo of dead bodies slumped over dining tables in a restaurant in the building. Kenyans on Twitter (KOT) reacted quickly and furiously. Under the hashtag #SomeoneTellNYTimes, many Kenyans argued that the newspaper had employed double standards – that if the images were of dead Americans, they would not have been published. Despite the protests, the New York Times refused to pull down the offending photo. Instead, it issued an unapologetic and defensive statement that said that the newspaper believed “it is important to give our readers a clear picture of the horror of an attack like this”, which “includes showing pictures that are not sensationalized but that give a real sense of the situation.” The statement further said that the newspaper takes “the same approach wherever in the world something like this happens – balancing the need for sensitivity and respect with our mission of showing the reality of events.”


The self-righteousness reflected in this statement reminded me of a photo essay published by TIME magazine in June 2010 that showed the dying moments of an 18-year-old Sierra Leonean woman called Mamma Sessay who had just given birth to twins in a rural clinic. Ten images captured Sessay’s slow and painful death as she struggled to give birth to the second twin, nearly 24 hours after giving birth to the first. It was as if the photographer anticipated her death, and decided to watch it happen before his eyes. He did not take Mamma to the nearest hospital or offer any other kind of help. As one Kenyan female blogger commented: “Here, the author and the photographer strip Mamma of all dignity, parading her in her very desperate moments for the world to see. Would these pictures have been published if she was white?”

Why is death considered a private, sombre affair when the person dying or dead is white, but a public event when that person is black or brown? It is high time the international media stopped applying this double standard in its reporting of grim news.

Published by the good folks at The Elephant.

The Elephant is a platform for engaging citizens to reflect, re-member and re-envision their society by interrogating the past, the present, to fashion a future.

Follow us on Twitter.

By Rasna Warah

I was in my mid-20s when I read Nawal el Saadawi’s The Hidden Face of Eve, a book that completely changed the way I viewed feminism. Until then, I thought I knew everything about the women’s movement. After all, I had taken several courses in Women’s Studies as an undergraduate at a very liberal university in Boston, USA, in the 1980s, so I thought I knew all there was to know about patriarchy and misogyny. But Saadawi’s book led me to question all my assumptions.

The Hidden Face of Eve could be described as a Marxist analysis of the root causes of patriarchy. It lays out in clinical detail how advanced forms of agriculture, which produced surpluses that could be sold for profit, created societies where the subjugation and seclusion of women became the norm. Her thesis is simple but devastating in its conclusion: when societies transitioned from subsistence farming and started making a profit off their land, the value of their land increased. The more land you owned, the more power you wielded. So, issues surrounding inheritance – who would be the heirs to the land – became more important. To ensure that the person inheriting your land was your biological son, you made sure that your wife (or wives) had no opportunity to have sexual relations with other men. Because only a woman knows who the father of her child is, it became imperative to ensure that she did not “stray”. Hence the veiling and seclusion of women.

When feudal agricultural societies transformed into urban capitalist ones, the impetus to control women and their bodies did not diminish; it merely took other forms. In some societies, the veil became normalised; in others, women’s “purity” was safeguarded through other means, such as female circumcision or arranged marriages. Women who dared to break these norms were dismissed as prostitutes, witches or mad.

Literary reviewers and feminists have described The Hidden Face of Eve as a book about women’s oppression in the Arab world. The tendency is to see it as an indictment of Islam’s attitude towards women. But it is far from this. On the contrary, Saadawi argued that it is not Islam that has kept women down, but a patriarchal class system that cuts across all religions. It is not men who are the problem, but a system that prevents both women and men from fulfilling their potential. Men too suffer from a patriarchal system that determines what they can and cannot do.

Saadawi emphasised that this system has its origins in the mythical “first woman”, Eve, who dared to eat “the fruit of knowledge” that was forbidden to her. For this “sin”, she was punished, labour pains during childbirth being one among many other penalties she had to endure. Since then, women have continued to be suppressed by men who prefer women to remain ignorant. Christianity, Judaism, and Islam – the major Abrahamic religions – all have an Eve story where a woman’s intellectual awakening is considered threatening. Women who “know too much” are viewed as dangerous, potentially promiscuous, women who might become bad mothers or wives.

Men too suffer from a patriarchal system that determines what they can and cannot do.

What Saadawi taught me was that the oppression of women – whether in feudal societies or in capitalist ones – is not confined to any religion or region. That when critics of Islam, like the Somali-born Dutch American author Ayaan Hirsi Ali, blame the religion for treating women badly, they must also accept that the “enlightened” Western/Christian world, which they believe has accorded women more rights, has developed other types of oppression against women that can be equally devastating. Rape and domestic violence are as prevalent in Western societies as in non-Western ones. Violence against women has become normalised because women are commodified in capitalist societies – they are valued purely on the basis of their capacity to please men. The beauty industry has capitalised on this and created a market for fashion and cosmetics. Hence the growing demand among Western women for silicone breast implants and Botox injections, and the acceptance of pornography as a legitimate form of entertainment.

This revelation – that there is no hierarchy in women’s oppression and that patriarchy and the capitalist system upon which it is built is inherently oppressive – completely changed the way I viewed the family and community structures within which I operated. I realised that being born female had already relegated me, my sisters, my mother and my grandmother to an inferior status. That the women who participated in my oppression, including my mother (who was more eager that I get married than that I obtain a university education), were only doing so because patriarchal norms left them with no choice. That when Saadawi’s mother happily watched her six-year-old daughter being circumcised, she did so because she knew that her daughter would only be accepted in her small Egyptian village if she underwent the procedure. “I did not know what they had cut off my body, and I did not try to find out,” recalled Saadawi. “I just wept and called out to my mother for help. But the worst shock of all was when I looked around and found her standing by my side.”

Women are commodified in capitalist societies – they are valued purely on the basis of their capacity to please men.

The Egyptian feminist and author, who died in March at the age of 89, and whom I had the pleasure of meeting briefly at a literary event in Nairobi a few years ago, was born in the village of Kafr Tahla outside Cairo, where it was normal for girls as young as 10 to be married off. She defied all societal expectations, did well in school, and went on to become a medical doctor, only to lose her job in the Ministry of Health when her book Women and Sex was published in 1969. Saadawi did eventually marry – three times – and had a son and a daughter. In 1981, Saadawi was among hundreds of activists imprisoned by President Anwar Sadat, and was only released from jail three months later when Sadat was assassinated. Her imprisonment did not deter her writing or her activism; she remained a strong advocate for women’s rights throughout her life. In 2011, she joined the protesters in Tahrir Square in Cairo who eventually brought to an end the regime of President Hosni Mubarak. But her staunch opposition to Egypt’s Muslim Brotherhood, which gained prominence after Mubarak’s ouster, had many questioning her democratic ideals when she celebrated the removal of President Mohamed Morsi (who supported the Muslim Brotherhood) in a military coup. (Morsi’s successor and the coup leader, Abdel Fattah al-Sisi, is viewed by many as a dictator.)

Saadawi wrote more than 50 books during her lifetime. One of her more well-known novels is Woman at Point Zero, the story of Firdaus, a sex worker sentenced to death for murdering her pimp. In this book, Saadawi shows how patriarchy is relentless in its vilification of women – even those who have allowed their bodies to be used and abused for the pleasure or benefit of men. Women seeking justice for the crimes committed against them find that justice always favours men, including those who have committed the crimes. Women like Firdaus are described as “savage and dangerous”.

Mona Eltahawy, an Egyptian American journalist, sums up how important Saadawi’s writings were to her and to other women around the world: “Patriarchy fucks us over and it has us thinking we are the insane ones, that we are the wrong ones, that we are the unworthy ones . . . And so to be told that you are not insane or unworthy . . . that is the gift of Nawal El Saadawi . . . .”


By Rasna Warah

An Australian newspaper called it “Modi’s COVID apocalypse”. The Indian activist and author Arundhati Roy calls it “a crime against humanity”. These descriptions of India’s current public health crisis may seem alarmist, but they are not far from the truth. By the end of April, India was recording more than 300,000 new COVID infections and nearly 3,000 deaths per day, a nearly 30-fold increase from September last year, when the country was reporting some 11,000 new infections per day. Media reports show overflowing crematoriums and hospitals overwhelmed by the number of patients seeking treatment. Reports of people dying in ambulances outside hospitals that did not have enough beds or oxygen cylinders reveal a healthcare system on its knees.

However, according to those witnessing the catastrophe first-hand, the horrifying images shown in the local and international media capture only a fraction of what is really happening on the ground. Even those with money and connections are unable to secure the healthcare they need. Barkha Dutt, a famous media personality in India who lost her father to COVID last week, told ITV that despite her privileges and connections, she could not get access to the treatment her father needed. She never imagined that she would become the story she has been covering for months. She said the lack of drugs and equipment in New Delhi’s hospitals is even forcing people to turn to Sikh temples, which are supplying oxygen for free to those who need it. Many families in New Delhi and other large cities are treating their sick relatives at home with oxygen cylinders, some bought at exorbitant rates on the black market. Crematoriums cannot keep up with the number of bodies arriving at their gates. The smell of death is everywhere.

Many of the current deaths are not exclusively due to the virus, but also to a lack of preparedness on the part of India’s healthcare system, which suddenly became overwhelmed due to a dramatic spike in corona cases. Analysts say the easing of restrictions and complacency on the part of Indians in general led to the crisis. People went back to work and continued with their daily lives as if there was no pandemic. The winter wedding season was in full swing in cities like New Delhi.

For its part, the government did little to avert the crisis, allowing the Kumbh Mela, the world’s largest religious gathering, held along the banks of the Ganges river, to take place. The gathering became a superspreader event, as did the many political rallies held in states like West Bengal, which were attended by hundreds of people. At one such rally, Prime Minister Narendra Modi even boasted that the presence of large numbers of people at the rallies showed that his political party, the Bharatiya Janata Party (BJP), had massive support. Social distancing and the wearing of masks were rare at these crowded meetings.

In January, Modi told leaders at the World Economic Forum that India had “saved humanity from a disaster by containing corona effectively”. He said that India had defied expectations of “a tsunami of corona infections”. Now he is having to eat his own words. Not only has India, the world’s second most populous country, become the epicentre of the disease – with new aggressive variants being reported every week – but it is in the very awkward position of having to seek aid from other countries, including its long-time rival Pakistan, which has offered to help. The UK, USA and other governments plan to send oxygen and other medical supplies to India.

India has tended to view itself as a regional economic powerhouse, and so being reduced to a recipient of humanitarian aid is having a wounding effect. This is not how Modi, whose Hindu nationalist rhetoric has ignited a “Hindu First” movement in India, would like India to be viewed. India’s prime minister now finds himself reduced to having to accept medical aid for a country that has marketed itself as a destination for medical tourism and the “pharmacy of the world” that manufactures affordable drugs for developing nations. The Serum Institute of India is currently producing a large proportion of the AstraZeneca vaccine that is being rolled out in many countries. But Modi has decided to nationalise the institute and has banned exports of the vaccine until the country sorts out its own health crisis, leaving millions of people around the world, including Kenyans, in limbo.

India’s public healthcare system was already strained before the pandemic. The government spends a measly 1 per cent of its budget on health. The medical needs of Indians are met mostly by the private sector. Nearly 80 per cent of the healthcare in urban areas is provided by private facilities. In rural areas, 70 per cent of the population relies on private clinics and hospitals, which are unaffordable for the majority. This privatisation of healthcare has come at a huge cost. Poor Indians suffer disproportionately from preventable diseases. Malnutrition rates among mothers and children are also among the highest in the world. What we are witnessing is how neglect of public healthcare systems can have long-term negative consequences, especially during a disaster or an epidemic.

India is also a lesson in how leaders can impact the spread of a disease. Since he took office, Prime Minister Modi has tried very hard to control public perceptions about his achievements and the virtues of the BJP, which he has filled with spin doctors who try to present a rosy image of India under his leadership. Several journalists have been arrested under Modi’s watch and media organisations that call him out are dismissed as unpatriotic. News channels in India are dominated by pro-government news anchors and journalists who have twisted the narrative in favour of Modi, even when he stands in the way of press freedom. In March 2020, in the early days of the pandemic, Modi asked India’s Supreme Court to stop media organisations from publishing any COVID-related news without getting government clearance first. Thankfully, because the Supreme Court is obliged to protect the rights and freedoms enshrined in India’s constitution, including freedom of the press, the court refused his request.

What we are witnessing is how neglect of public healthcare systems can have long-term negative consequences.

Like Jair Bolsonaro in Brazil and Donald Trump in the USA, Modi underplayed the scale of the pandemic and painted independent media and journalists who questioned his policies as enemies of the people. As a result, more than half a million Americans, nearly 400,000 Brazilians and some 200,000 Indians have died from COVID-19. The link between a paranoid, media-hostile leadership and negative health outcomes is evident in these cases.

Many independent journalists and observers believe that the official figures on COVID deaths and infections put out by the Indian government are a gross underestimation, and that the actual figures could be two or three times higher than those being reported. Crematoriums are carrying out more cremations under COVID protocols than the official death toll would suggest. This could be partly because many deaths are occurring at home and so are not being reported. In addition, people who die from COVID without having been tested are not recorded as having died from the disease.

Meanwhile, the BJP government is assuring India’s 1.4 billion citizens that it is doing everything to increase the supply of oxygen and raise vaccination levels among those over the age of 18, but these measures have come a little too late. The death toll is likely to rise significantly over the coming weeks.

Lack of trust in the government may be the biggest hurdle countries face as they try to contain the virus. In Kenya, the theft of COVID-19 donations last year and massive corruption scandals at the state-run medical supplies agency, KEMSA, have severely diminished citizens’ faith in the government’s willingness and ability to protect them. Moreover, apart from periodic lockdowns and curfews, there seems to be no long-term prevention strategy. Nor is anyone quite sure when vaccination will reach “herd immunity” levels; people like me who have received their first dose of the AstraZeneca vaccine under the COVAX facility – a global mechanism for the pooled procurement and distribution of vaccines for low- and middle-income countries – still don’t know for sure whether they will get their second jab, a scenario complicated by the fact that Modi has temporarily banned the Serum Institute from exporting the vaccines.

India has three important lessons for Kenya and the rest of the world.

Lesson 1: Do not neglect the public healthcare system

Countries such as South Korea and Uganda that have successfully contained the coronavirus managed to do so because the containment measures were led and funded by the public sector. Mass testing and other measures could not have taken place had the government not initiated them and ensured their successful implementation through a nationwide network of public healthcare facilities. But for this to happen, people must have faith in the government, which is sorely lacking in many countries.

The emphasis on private healthcare in countries such as Kenya and India has also left millions of poor and low-income people completely vulnerable to epidemics and pandemics. Public healthcare systems in all countries should be beefed up so that countries are not caught unawares in the future. Like public education, public health is an investment that reaps economic and social dividends in the future. COVID-19 has shown us the folly of relying solely on the private sector to meet citizens’ health needs and the importance of investing in robust public health systems that play a key role in detecting, containing and stopping the spread of infectious diseases.

Lesson 2: Do not suppress or distort scientific information and data

Donald Trump and Jair Bolsonaro consistently underplayed the threat posed by the novel coronavirus. Trump initially referred to it as a minor flu even as hospital beds were filling up and infection rates were rising. Both leaders also mocked the wearing of masks and social distancing, which American and Brazilian scientists advocated. Trump’s rallies were filled with people who ignored corona protocols. In India, some politicians even said that the pandemic was a hoax intended to prevent farmers in Punjab from organising protests against the government’s agriculture policies. By ignoring the science and peddling false information, these leaders put their countries’ citizens in immense danger. Vilifying the press – which is often the public’s main source of corona-related data and information – in the face of a pandemic is also not a good idea.

Lesson 3: Do not sacrifice public health to gain political mileage

Politicians should not sacrifice people’s lives at the altar of politics. Prime Minister Modi could have banned pilgrims from attending the Kumbh Mela, just as he ordered a nationwide lockdown early last year. But he chose not to do so because he wanted to appease Hindus and his Hindu nationalist base. In addition, he attended massive political rallies where few people wore masks, thereby facilitating the spread of the virus. He put people’s lives in danger because he wanted to score political points for his party. In the United States and Brazil, leaders chose to keep the economy running even if it meant losing hundreds of thousands of lives. In Kenya, politicians engaged in Building Bridges Initiative (BBI) rallies even as corona cases were rising. Moreover, parliamentarians are discussing BBI amendments to the constitution rather than what measures could be taken to protect Kenyans not just from the coronavirus disease and its various variants, but also from the hardships they have had to endure in the past year due to job losses and business closures. This is the type of shortsightedness and lack of compassion and vision among the country’s leadership that has led to the public health crisis facing India today.


By Rasna Warah

Never did I imagine that opposing an International Monetary Fund (IMF) loan to Kenya would be viewed by the Kenyan authorities as a criminal act. But that is exactly what transpired last week when activist Mutemi Kiama was arrested and charged with “abuse of digital gadgets”, “hurting the presidency”, “creating public disorder” and other vaguely worded offences. Mutemi’s arrest was prompted by his Twitter post of an image of President Uhuru Kenyatta with the following caption: “This is to notify the world . . . that the person whose photograph and names appear above is not authorised to act or transact on behalf of the citizens of the Republic of Kenya and that the nation and future generations shall not be held liable for any penalties of bad loans negotiated and/or borrowed by him.” He was released on a cash bail of KSh500,000 with an order prohibiting him from using his social media accounts or speaking about COVID-19-related loans.

Mutemi is one among the more than 200,000 Kenyans who have signed a petition to the IMF to halt a KSh257 billion (US$2.3 billion) loan to Kenya, which was ostensibly obtained to cushion the country against the negative economic impact of COVID-19. Kenya is not the only country whose citizens have opposed an IMF loan. Protests against IMF loans have taken place in many countries, including Argentina, where people took to the streets in 2018 when the country took a US$50 billion loan from the IMF. In 2016, Egyptian authorities were forced to lower fuel prices following demonstrations against an IMF-backed decision to eliminate fuel subsidies. Similar protests have also taken place in Jordan, Lebanon and Ecuador in recent years.

Why would a country’s citizens be against a loan given by an international financial institution such as the IMF? Well, for those Kenyans who survived (or barely survived) the IMF-World Bank Structural Adjustment Programmes (SAPs) of the 1980s and 90s, the answer is obvious. SAPs came with stringent conditions attached, which led to mass layoffs in the civil service and the removal of subsidies for essential services, such as health and education, causing increasing levels of hardship and precarity, especially among middle- and low-income groups. African countries undergoing SAPs experienced what is often referred to as “a lost development decade” as belt-tightening measures stalled development programmes and stunted economic opportunities.

In addition, borrowing African countries lost their independence in matters of economic policy. Since lenders such as the World Bank and the IMF decided national economic policy – determining things like budget management, exchange rates and public sector involvement in the economy – they became the de facto policy- and decision-making authorities in the countries that took their loans. This is why, in much of the 1980s and 1990s, the arrival of a World Bank or IMF delegation in Nairobi often got Kenyans very worried.

In those days (in the aftermath of a hike in oil prices in 1979 that saw most African countries experience a rise in import bills and a decline in export earnings), leaders of these international financial institutions were feared as much as the authoritarian Kenyan president, Daniel arap Moi, because with the stroke of a pen they could devalue the Kenyan currency overnight and get large chunks of the civil service fired. As Kenyan economist David Ndii pointed out recently at a press conference organised by the Linda Katiba campaign, when the IMF comes knocking, it essentially means the country is “under receivership”. It can no longer claim to determine its own economic policies. Countries essentially lose their sovereignty, a fact that seems to have eluded the technocrats who rushed to get this particular loan.

When he took office in 2002, President Mwai Kibaki kept the World Bank and the IMF at arm’s length, preferring to take no-strings-attached infrastructure loans from China. Kibaki’s “Look East” economic policy alarmed the Bretton Woods institutions and Western donors, who until then had had a huge say in the country’s development trajectory, but it instilled a sense of pride and autonomy in Kenyans. Sadly, this has been eroded by Uhuru and his inept cronies, who have gone on loan fishing expeditions, including massive Eurobonds worth KSh692 billion (nearly US$7 billion). Every Kenyan today carries a debt of KSh137,000, more than three times what it was eight years ago when the Jubilee government came to power. By the end of last year, Kenya’s debt stood at nearly 70 per cent of GDP, up from 50 per cent at the end of 2015. This high level of debt can prove deadly for a country like Kenya that borrows in foreign currencies.

When the IMF comes knocking, it essentially means the country is “under receivership”.

The Jubilee government would have us believe that the fact that the IMF agreed to this loan is a sign that the country is economically healthy, but as Ndii noted, quite often the opposite is true: the IMF comes in precisely because a country is in a financial crisis. In Kenya’s case, this crisis has been precipitated by reckless borrowing by the Jubilee administration that has seen Kenya’s debt rise from KSh630 billion (about $6 billion at today’s exchange rate) when Kibaki took office in 2002, to a staggering KSh7.2 trillion (about US$70 billion) today, with not much to show for it, except a standard gauge railway (SGR) funded by Chinese loans that appears unable to pay for itself. As an article in a local daily pointed out, this is enough money to build 17 SGRs from Mombasa to Nairobi or 154 superhighways like the one from Nairobi to Thika. The tragedy is that many of these loans are unaccounted for; in fact, many Kenyans believe they are taken to line individual pockets. Uhuru Kenyatta has himself admitted that Kenya loses KSh2 billion a day to corruption in government. Some of these lost billions could actually be loans.

IMF loans with stringent conditions attached have often been presented as being the solution to a country’s economic woes – a belt-tightening measure that will instil fiscal discipline in a country’s economy by increasing revenue and decreasing expenditure. However, the real purpose of these loans, some argue, is to bring about major and fundamental policy changes at the national level – changes that reflect the neoliberal ethos of our time, complete with privatisation, free markets and deregulation.

The first ominous sign that the Kenyan government was about to embark on a perilous economic path was when the head of the IMF, Christine Lagarde, made an official visit to Kenya shortly after President Uhuru was elected in 2013. At that time, I remember tweeting that this was not a good omen; it indicated that the IMF was preparing to bring Kenya back into the IMF fold.

Naomi Klein’s book, The Shock Doctrine, shows how what she calls “disaster capitalism” has allowed the IMF, in particular, to administer “shock therapy” on nations reeling from natural or man-made disasters or high levels of external debt. This has led to unnecessary privatisation of state assets, government deregulation, massive layoffs of civil servants and reduction or elimination of subsidies, all of which can and do lead to increasing poverty and inequality. Klein is particularly critical of what is known as the Chicago School of Economics that she claims justifies greed, corruption, theft of public resources and personal enrichment as long as they advance the cause of free markets and neoliberalism. She shows how in nearly every country where the IMF “medicine” has been administered, inequality levels have escalated and poverty has become systemic.

Sometimes the IMF will create a pseudo-crisis in a country to force it to obtain an IMF bailout loan. Or, through carefully manipulated data, it will make the country look economically healthy so that it feels secure about applying for more loans. When that country can’t pay back the loans, which often happens, the IMF inflicts even more austerity measures (also known as “conditionalities”) on it, which lead to even more poverty and inequality.

IMF and World Bank loans for infrastructure projects also benefit Western corporations, which hire experts to ensure that they secure government contracts for the big infrastructure projects funded by these international financial institutions. In his international “word-of-mouth bestseller”, Confessions of an Economic Hit Man, John Perkins explains how, in the 1970s, when he worked for an international consulting firm, he was told that his job was to “funnel money from the World Bank, the US Agency for International Development and other foreign aid organisations into the coffers of huge corporations and the pockets of a few wealthy families who control the planet’s resources”.

Sometimes the IMF will create a pseudo-crisis in a country to force it to obtain an IMF bailout loan.

The tools to carry out this goal, his employer admitted unashamedly, could include “fraudulent financial reports, rigged elections, payoffs, extortion, sex and murder”. Perkins showed how in the 1970s, he became instrumental in brokering deals with countries ranging from Panama to Saudi Arabia where he convinced leaders to accept projects that were detrimental to their own people but which enormously benefitted US corporate interests.

“In the end, those leaders become ensnared in a web of debt that ensures their loyalty. We can draw on them whenever we desire – to satisfy our political, economic or military needs. In turn, they bolster their political positions by bringing industrial parks, power plants, and airports to their people. The owners of US engineering/construction companies become fabulously wealthy,” a colleague told him when he asked why his job was so important.

Kenyans, who are already suffering financially due to the COVID-19 pandemic which saw nearly 2 million jobs in the formal sector disappear last year, will now be confronted with austerity measures at precisely the time when they need government subsidies and social safety nets. Season Two of SAPs is likely to make life for Kenyans even more miserable in the short and medium term.

We will have to wait and see whether overall dissatisfaction with the government will influence the outcome of the 2022 elections. However, whoever wins that election will still have to contend with rising debt and unsustainable repayments that have become President Uhuru Kenyatta’s most enduring legacy.

Published by the good folks at The Elephant.

The Elephant is a platform for engaging citizens to reflect, re-member and re-envision their society by interrogating the past, the present, to fashion a future.

Follow us on Twitter.


By Rasna Warah

The most tragic thing about the late Tanzanian president, John Pombe Magufuli, is that he and countless other Tanzanians might have died from a virus that he denied existed in his country. Although Tanzanian officials insist that the 61-year-old leader died from heart-related complications, it is widely rumored that he ultimately succumbed to COVID-19. Many of his top government officials have died from the disease in recent months. His mysterious disappearance at the end of February also suggested that he did not want his illness to be made public. As one Kenyan commentator put it, “Magufuli is a sad lesson of the illusion of invulnerability and indestructibility that newly-minted dictators revere.”

It did not have to end this way. When he was first elected president in 2015, Magufuli was lauded for his no-nonsense approach to corruption and for reducing wastage and grandiosity in government, earning him the moniker of “Bulldozer.” Jaded Kenyans, who have in recent years seen millions of dollars being stolen or misappropriated under the debt-ridden government of Uhuru Kenyatta, looked on with envy at their neighbor. Magufuli restricted unnecessary foreign travel by his cabinet members and was known to personally inspect hospitals and other public service facilities to see if they were doing their job. The Tanzanian president did not even shy away from telling off so-called “development partners.” He was probably the only African leader to have told China off for its punitive infrastructure loans (see the ongoing controversy over whether China will seize the Port of Mombasa if Kenya defaults on its loans). Kenyans even created the hashtag #WhatWouldMagufuliDo to remind their own leaders what they were doing wrong.

But as the years went by, Magufuli became increasingly authoritarian, though not in the usual way of African dictators, which often involves the use of military might and the rigging of elections, but in a bizarre Donald Trump kind of way. Despite being a highly qualified scientist (he held a PhD in Chemistry), Magufuli turned his back on science last year by announcing that there was no coronavirus in Tanzania. He refused to impose lockdowns and social distancing measures (let alone consider less extreme response strategies), and stopped releasing public health data on the number of COVID-19 cases in Tanzania. This not only made Tanzanians more vulnerable to the virus, but also posed health risks to countries that shared a border with Tanzania. Schools and restaurants remained open even as stories of people dying in hospitals from “pneumonia” were spreading.

The cracks had already begun to appear within months of his presidency. Shortly after assuming office, Magufuli started displaying the hallmarks of a tin-pot dictator. He banned political activities on the pretext that they interfered with “nation building.” He curtailed media freedom and arrested opposition leaders. Journalists who opposed his policies were harassed. In 2017, he decreed that girls who fell pregnant would not be allowed to attend school, a policy that shocked education officials and women’s rights activists alike. He also started displaying xenophobic tendencies that portrayed foreigners as the enemies of the Tanzanian people.

Commenting on his rule, the researcher Abdullahi Boru Halakhe described the difference between Magufuli and his widely respected and revered predecessor Julius Nyerere as the following: “Nyerere was an outward-looking globalist who saw Tanzania as a leader in world affairs. He invited people of African and non-African origin to witness Tanzania’s nascent experiment with an alternative model of governance and economic independence that was not controlled and exploited by global capital. Magufuli, on the other hand, is an inward-looking provincial nativist who wants a Tanzania for Tanzanians alone.”

Despite its poverty and dependence on donor aid, Tanzania has always been viewed as a nation that does not suffer from the sicknesses of its neighbor Kenya, where an avaricious and visionless political elite has no qualms about hollowing out the state for its own benefit, or of Uganda, where an ageing dictator is ruthlessly crushing a youthful opposition to maintain a hold on power. When a Tanzanian is in the room, Kenyans feel slightly uncomfortable, even ashamed, because we know that unlike Kenya, Tanzania is held together by an ideology that is not centered around primitive wealth accumulation and individualism, and also because we have never had a visionary leader like Nyerere. Fondly called “Mwalimu” (meaning teacher in Kiswahili), he once said that “We, the people of Tanganyika [what Tanzania was called before its union with Zanzibar in 1964], would like to light a candle and put it on top of Mount Kilimanjaro which would shine beyond our borders giving hope where there was despair, love where there was hate, and dignity where there was before only humiliation.”

While Nyerere’s African socialist experiment of Ujamaa was largely an economic failure in Tanzania, it left a psychological and social legacy of brotherhood and unity among the people. While Kenyans often deride Tanzania for its socialist tendencies that have only spawned more poverty, they have always maintained a certain amount of respect for the country and its leaders, even those who ended up being corrupt—because we know that despite its poverty, Tanzania has always held the moral high ground on affairs to do with the continent. When Kenya was secretly supporting the apartheid regime in South Africa in the 1970s, Tanzania was the frontline state that was actively arming and hosting the African National Congress and Pan-Africanist Congress’s anti-apartheid struggle. When Kenya almost descended into an ethnic-based civil war after the 2007 elections, it was the Tanzanian president, Jakaya Kikwete (among other eminent African leaders), who was brought in to mediate with the warring political factions. Kenyans know that when confronted with an ethical dilemma, Tanzanians will always strive to do the right thing.

When I was in Dar es Salaam for a meeting just after the Kenyan presidential election in 2013, it was a Tanzanian taxi driver who reminded me that Kenyans had made a mistake by voting in two people who had been indicted for crimes against humanity at the International Criminal Court. “Why did they do it?” he asked me. It was a question laced with incredulity, as if to remind me that despite claiming to be the economic powerhouse of the region (although Tanzania and Ethiopia’s economic growth rates have surpassed it in recent years), Kenyans lacked a moral conscience.

The death of a sitting president would most likely have led to a power vacuum in African countries such as Kenya, where ethnic kingpins would no doubt have jostled for power and positions and created conditions for conflict. But Tanzania, once again, has shown us the way. Vice President Samia Suluhu Hassan was sworn in without much fuss or opposition as Tanzania’s president the day after Magufuli’s death, becoming the first female head of state in Eastern Africa, and a potential role model for aspiring women politicians across the region.

Her immediate task will no doubt be to tackle the COVID-19 pandemic that has led to a deadly third wave in many African countries, and to convince the international public health community that Tanzania is back on board. Since she is relatively unknown, even within Tanzania, she will also need to carve her own unique identity as an African leader who is not perceived as a continuation of her predecessor’s legacy.

– This post is from a new partnership between Africa Is a Country and The Elephant. We will be publishing a series of posts from their site once a week.


By Rasna Warah

When former prime minister Mohamed Abdullahi Farmaajo was elected president of the Federal Government of Somalia in 2017, many lauded his victory. Farmaajo was viewed as a nationalist who was not beholden to clan interests, and many believed that, unlike his predecessor, Hassan Sheikh, whose tenure was marred by corruption allegations and in-fighting, he would bring together a country that has remained fragmented along clan lines and endured internal conflicts for decades. He was also perceived as someone who would address the corruption that has been endemic in every Somali government since the days of President Siad Barre.

Sadly, Farmaajo’s tenure did not result in significant transformation of Somali governance structures or politics. On the contrary, his open hostility towards leaders of federal states – notably Jubbaland, where he is said to have interfered in elections by imposing his own candidate – and claims that corruption in his government had increased, not decreased, left many wondering if he had perhaps been overrated. Now opposition groups have said that they will not recognise him as the head of state as he has failed to organise the much-anticipated one-person-one-vote election that was due this month, which would have either extended or ended his term. This apparent power vacuum has caused some jitters in the international community, whose backing Farmaajo has enjoyed.

However, it would be naïve to assume that Farmaajo’s exit is a critical destabilising factor in Somalia, because, frankly, the president in present-day Somalia is merely a figurehead; he does not wield real power. The government in Mogadishu has had little control over the rest of the country, where clan-based fiefdoms and federal states do pretty much what they want, with little reference to Mogadishu. National security is largely in the hands of the African Union Mission in Somalia (AMISOM) forces, not the Somali National Army.

The concept of a state that delivers services to citizens has also remained a mirage for most Somalis who are governed either by customary law known as xeer or the Sharia. Some have even argued that with its strict codes and hold over populations through systems of “tax collection” or “protection fees” combined with service delivery, Al Shabaab actually offers a semblance of “governance” in the areas it controls – even if these taxes are collected through extortion or threats of violence.

In much of Somalia, services such as health and education are largely provided by foreign faith-based foundations, non-governmental organisations or the private sector, not the state. Many hospitals and schools are funded by foreign (mostly Arab) governments or religious institutions. This means that the state remains largely absent in people’s lives. And because NGOs and foundations can only do so much, much of the country remains unserviced, with the result that Somalia remains one of the most underdeveloped countries in the world, with high levels of illiteracy (estimates indicate that the literacy rate is as low as 20 per cent). State institutions, such as the Central Bank and revenue collection authorities, are also either non-existent or dysfunctional.

Efforts by the United Nations and the international community to bring a semblance of governance by supporting governments that are heavily funded by Western and Arab countries have not helped to establish the institutions necessary for the government to run efficiently. On the contrary, some might argue that foreign aid has been counter-productive as it has entrenched corruption in government (as much of the aid is stolen by corrupt officials) and slowed down Somalia’s recovery.

Foreign governments have also been blamed for destabilising Somalia. The US-backed Ethiopian invasion of Somalia in 2006, which succeeded in ousting the Islamic Courts Union (ICU) – which had successfully brought about a semblance of governance in Somalia through a coalition of Muslim clerics and businessmen – spawned radical groups like Al Shabaab, which have wreaked havoc in Somalia ever since. Kenya’s misguided “incursion” into Somalia in 2011 had a similar effect: Al Shabaab unleashed its terror on Kenyan soil, and Kenya lost its standing as a neutral country that does not intervene militarily in neighbouring countries. Certain Arab countries, notably Qatar and the United Arab Emirates, have also been accused of interfering in Somalia’s elections by sponsoring favoured candidates.

All of Somalia’s governments since 2004, when a transitional government was established, have thus failed to rebuild state institutions that were destroyed during the civil war or to deliver services to the Somali people. In its entire eight-year tenure, from October 2004 to August 2012, the Transitional Federal Government (TFG) did not have the capacity to become a fully functioning government, with a fully-fledged revenue collecting authority and robust ministries. Ministers had no portfolios and ministries had skeletal staff. The national army was weak and under-funded, and since 2007, the government has relied almost exclusively on African Union soldiers for security, though some donors, notably Turkey, have attempted to revive the Somali National Army.

Somalia’s first post-transition government was elected in 2012 under a United Nations-brokered constitution. Hassan Sheikh was elected as president with much enthusiasm and in the belief that things would be different under a government that had the goodwill of the people. In his first year in office, President Hassan Sheikh was named by TIME magazine as one of the world’s 100 most influential people. Somalia expert Ken Menkhaus called his election “a seismic event” that “electrified Somalis and both surprised and relieved the international community”. However, it would not be long before his government would also be marred by corruption allegations.

What governance model should Somalia adopt?

There has been some debate about which type of governance model is most suitable for a country that is not just divided along clan/regional lines, but where the lack of functioning secular institutions threatens nation-building.

Federalism, that is, regional autonomy within a single political system, has been proposed by the international community as the most suitable system for Somalia as it caters for deep clan divisions by allocating the major clans semi-autonomous regional territories. The 4.5 formula for government representation proposed by the constitution based on the four largest clans (Darod, Hawiye, Dir and Rahanweyne) and 0.5 positions for minorities does acknowledge the reality of a clan-based society, but as Somalia’s recent history has shown, clan can be, and has been, manipulated for personal gain by politicians. As dominant clans seek to gain power in a federated Somalia, there is also the danger that the new federal states will mimic the corruption and dysfunction that has prevailed at the centre, which will lead to more competition for territories among rival clans and, therefore, to more conflict.

Several experts have also proposed a building block approach, whereby the country is divided into six local administrative structures that would eventually resemble a patchwork of semi-autonomous territories defined in whole or in part by clan affiliation. In one such proposal, the Isaaq clan would dominate Somaliland in the northwest; the Majerteen in present-day Puntland would dominate the northeast; the heterogeneous Jubbaland and Gedo regions bordering Kenya would have a mixture of clans (though there are now fears that the Ogaden, who are politically influential along the Kenya border, would eventually control the region); a Hawiye-dominated polity would dominate central Somalia; the Digil-Mirifle would centre around Bay and Bakol; and Mogadishu would remain a cosmopolitan administrative centre.

Somaliland offers important lessons on the governance models that could work in a strife-torn society divided along clan lines and where radical Islamist factions have taken root. Since it declared independence from Somalia in 1991, Somaliland has remained relatively peaceful and has had its own government and institutions that have worked quite well and brought a semblance of normality in this troubled region.

After Siad Barre ordered an attack on Hargeisa following opposition to his rule there, Somaliland decided to forge its own path and disassociate from the dysfunction that marked both the latter part of Barre’s regime and the warlordism that replaced it during the civil war. It then adopted a unique hybrid system of governance, which incorporates elements of traditional customary law, Sharia law and modern secular institutions, including a parliament, a judiciary, an army and a police force. The Guurti, the upper house of Somaliland’s legislature, comprises traditional clan elders, religious leaders and ordinary citizens from various professions who are selected by their respective clans. The Guurti wields enormous decision-making powers and is considered one of the stabilising factors in Somaliland’s inclusive governance model. Michael Walls, the author of A Somali Nation-State: History, Culture and Somaliland’s Political Transition, has described Somaliland’s governance model as “the first indigenous modern African form of government” that fuses traditional forms of organisation with those of representative democracy. However, Somaliland’s governance model is far from perfect: the consensual clan-based politics has hindered issue-based politics, eroded individual rights and led to the perception that some clans, such as the dominant Isaaq clan, are favoured over others. Tensions across its eastern border with Puntland also threaten its future stability.

In addition, because it is still not recognised internationally as a sovereign state, Somaliland is denied many of the opportunities that come with statehood. It cannot easily enter into bilateral agreements with other countries, get multinational companies to invest there or obtain loans from international financial institutions, though in recent years it has been able to overcome some of these obstacles.

Somaliland is also not recognised by the Federal Government of Somalia, which believes that Somaliland will eventually relent and reunite with Somalia – a prospect that seems highly unrealistic at this time. This is one reason why the Somali government gets so upset when Kenyan leaders engage with Somaliland leaders, as happened recently when Mogadishu withdrew its ambassador from Nairobi after President Uhuru Kenyatta met with the Somaliland leader Musa Bihi Abdi at State House. Raila Odinga’s recent call to the international community to recognise Somaliland as an independent state has been welcomed by Somalilanders, but is viewed with suspicion by the federal government in Mogadishu.

Nonetheless, there has been some debate about whether Somaliland’s hybrid governance model, which incorporates both customary and Western-style democracy, is perhaps the best governance model for Somalia. Is the current Western- and internationally-supported political dispensation in Somalia that has emerged after three decades of anarchy a “fake democracy”? Can Somalia be salvaged through more home-grown solutions, like the one in Somaliland? Should Somalia break up into small autonomous states that are better able to govern themselves?

Balkanisation is usually a pejorative political term referring, according to Wikipedia, to the “disorderly or unpredictable fragmentation, or sub-fragmentation, of a larger region or state into smaller regions or states, which may be hostile or uncooperative with one another”. While usually associated with increasing instability and conflict, balkanisation could nonetheless still be the only solution for a country that has been unable to unite or to offer hope to its disillusioned citizens for more than three decades.

As Guled Ahmed of the Middle East Institute notes, “the 1995 Dayton accords, which ended the Bosnian war, paved the way for ethnic balkanisation of former Yugoslavia into six countries. This resulted in peace and stability and prosperity. So if Eastern European countries can separate along ethnicism, why not balkanise Somalia with multi-ethnicism just like the former Yugoslavia to achieve peace and stability and fair elections based on one person one vote?”

Ahmed told me that balkanisation would also eliminate Al Shabaab (which has been fighting the government in Mogadishu for the last 14 years) as the independent states created would be more vigilant about who controls their territories and also because people will have more ownership of their government. Somali refugees languishing in Kenya, Ethiopia and elsewhere might also be tempted to finally return home.

Balkanisation can, however, be messy – and bloody. But Somalia need not go down that route. A negotiated separation could still be arrived at peacefully with the blessing of the international community. If the international community is serious about peace and stability in Somalia, it should pave the way for these discussions. Sometimes divorce is preferable to an acrimonious marriage.



By Rasna Warah

At a time when “social distancing” is becoming the norm due to the coronavirus pandemic, it may appear self-indulgent to reminisce about a period when going to the cinema was a regular feature of East African Asians’ lives. But perhaps now that the world is changing – and many more people are watching movies at home on Netflix and other channels – it is important to document the things that have been lost in the war against COVID-19 and with the advent of technology. One of these things is the thrill of going to the cinema with the family. What has also been lost is an urban culture embedded in East Africa’s South Asian community – a culture where movie-going was an integral part of the social fabric of this economically successful minority.

Those who pass the notorious Globe Cinema roundabout, which is often associated with pickpockets and street children, might be surprised to learn that the Globe Cinema (which no longer shows films but is used for other purposes, such as church prayer meetings) was once the place to be seen on a Sunday evening among Nairobi’s Asian community. I remember that cinema well because in the 1970s my family used to go there to watch the latest Indian – or, to be more specific, Hindi (India also produces films in regional languages like Telugu, Bengali and Punjabi) – blockbuster at 6 p.m. on Sundays. Sunday was movie day in my family, and going to the cinema was a ritual we all looked forward to. The Globe Cinema was considered one of the more “posh” cinemas in Nairobi; not only was it more luxurious than the others, but it also had better acoustics.

As veteran journalist Kul Bhushan writes in a recent edition of Awaaz magazine (which is dedicated entirely to Indian cinema in East Africa from the early 1900s to the 1980s), “Perched on a hillock overlooking the Ngara roundabout, the Globe became the first choice for cinemagoers for new [Indian] releases as it became the venue to ogle and be ogled by the old and the young.”

Indian movies were – and are – the primary source of knowledge about Indian culture among East Africa’s Asian community. The early Indian migrants had little contact with the motherland, as trips back home were not only expensive but the sea voyage from Mombasa to Bombay or Karachi took weeks. (At independence in 1947, the Indian subcontinent became two countries – India and Pakistan – hence the reference to Indians in East Africa as “Asians”.) So they relied on Indian films to learn about the customs and traditions of the country they or their ancestors had left behind.

Exposure to Indian languages and culture through films was one way Indians abroad or in the diaspora retained their identity and got to learn about their traditions and customs. I got to learn about the spring festival of Holi and goddesses such as Durga from watching Indian films. I also learnt Hindi, or rather Hindustani – a mix of Hindi (which is Sanskrit-based) and Urdu (which is also Sanskrit-based but which borrows heavily from Persian and Arabic) – which is the lingua franca of Northern India and Pakistan, and which is the language most commonly used in the so-called Hindi cinema.

On the other hand, the sexist culture portrayed in the majority of Indian films also reinforced sexual discrimination among East African Asians. The ideas that women are subservient to men, and that it is the woman who must sacrifice her own needs and desires for the “greater good” of the family/community, were – and still are – dominant in Indian cinema. Love stories portrayed in films – where young lovebirds defy societal expectations and cross class, religion or caste barriers – were not supposed to be emulated; they were considered pure entertainment and not reflective of a society where arranged marriages were and still are the norm. I heard many stories of how if an Asian woman dared to cross racial, religious or caste barriers she was severely reprimanded or stigmatised.

Watching Indian movies was also one way of keeping up with the latest fashions. Men and women often tried to copy the hairstyles and clothes of their favourite movie stars. When the hugely successful film Bobby was released in 1973, many girls adopted the hairstyle of the lead actress (who was barely 16 when she starred in the film) Dimple Kapadia. (I used to have a blouse at that time that was a replica of the one the actress wore in the film.) When the famous film star Sharmila Tagore dared to wear a revealing swimsuit in the 1967 film An Evening in Paris, she opened the door for many Indian women to go swimming without covering themselves fully. Since music often defined the success of a film, top playback singers, such as Lata Mangeshkar, Kishore Kumar and Muhammad Rafi, were held in high regard, and people flocked to watch their live concerts in Nairobi. Wealth and opulence were in full display at these events.

The Golden Age

The 60s, 70s and 80s are often described as the Golden Age of Indian/Hindi cinema. Nairobi, Mombasa and Kisumu, where there were large concentrations of Asians, had many cinemas devoted to showing films made in Bombay (now Mumbai) – often referred to as Bollywood. This was the time when actors and actresses like Rajesh Khanna, Hema Malini, Amitabh Bachchan and Sridevi became superstars.

Cinemas in Nairobi were always full, especially on weekends when Asian families flocked to the dome-like Shan in Ngara, to Liberty in Pangani, or to the Odeon or the Embassy in the city centre. (Except for the Shan cinema, all the others are no longer cinema halls but are used for other purposes. Shan was rescued from decrepitude by the Sarakasi Trust, which changed its name to The Dome; it is now used for cultural activities.) Over the years, an increasing number of Africans began watching Indian films. Oyunga Pala, the chief curator at The Elephant, recalls going to the Tivoli cinema in Kisumu, where he first got to see Amitabh Bachchan in action.

“Right next to the Liberty Cinema was situated the clinic of a very popular Indian doctor,” recalls Neera Kapur-Dromson in an article published in the Indian cinema edition of Awaaz. “The small waiting room was always crammed with patients. But that never deterred him from taking ample breaks to enjoy a few scenes of the film being screened…”

But for Asian teenage girls and boys in Nairobi, the place to be seen on a Sunday evening was the Belle Vue Drive-In cinema on Mombasa Road. Young Asian men would show off their (fathers’) cars and young women would display the latest fashions – all in the hope of catching the attention of a potential mate. Food was shared – and sometimes even cooked – on the gentle slopes of the parking spots. Going to the Drive-In was like going for a picnic. And as the lights dimmed, the large bulky speakers were put on full volume so that everyone (usually father, mother, and three or four kids in the back seat) in the car – and beside it – could hear the dialogues. Fox Drive-In cinema on Thika Road was also a popular joint, but mainly with the younger crowd who preferred watching the Hollywood movies which were a regular feature there.

It was the same in Kampala. Vali Jamal, recalling his youthful days in Uganda’s capital city, says that the Sunday outing to the Drive-In was the only time there was a traffic jam in Kampala. “Idi Amin got caught in one of them, driving back to Entebbe with his foreign minister Wanume Kibedi,” he writes. “‘Where are we?’ quoth the president, ‘In Bombay?’ And the expulsion happened.”

He continues: “Well, let me not exaggerate, but South Asian wealth was on display on the Sundays accompanied by their notions of exclusion, and let us not forget that those two variables – income inequality and racial arrogance – figured heavily in Amin’s decision to expel us.” (In August 1972, President Idi Amin expelled more than 70,000 Asians from Uganda.)

Urban conversations

In her book, Reel Pleasures: Cinema Audiences and Entrepreneurs in Twentieth Century Urban Tanzania, Laura Fair describes how the Sunday evening shows became a focal point of urban conversations among Tanzania’s Asian community. They were meeting points, like temples, mosques or churches, where people sought affirmation.

As in Kenya, Sunday shows in Tanzania were family and community bonding events. “Cinema halls were not lifeless chunks of brick and mortar; they resonated with soul and spirit. They were places that gave individual lives meaning, spaces that gave a town emotional life. Across generations, cinemas were central to community formation,” says the author. Indian cinema thus played an integral role in the social lives of the South Asian community in East Africa.

It all started in the 1920s when Kala Savani, a textile trader, imported a hand-cranked projector and began showing silent Indian films in a rented warehouse in the coastal town of Mombasa. In 1931, when two brothers, Janmohamed Hasham and Valli Hasham, built the Regal Cinema, Savani began renting the venue to show Hindi films. Two years later, he built his own 700-seat Majestic Cinema in Mombasa, which showed Indian films and also hosted live shows.

The late Mohanlal Savani was a man of vision, recalls his son Manu Savani in an article chronicling how his father expanded movie-viewing in East Africa. “As time progressed Majestic became an established cinema on the Kenyan coast. The owners of Majestic also became fully fledged film distributors with links stretching, to start with, to Uganda and [what was then known as] Tanganyika.”

Famous Indian movie stars began gracing these cinemas in order to increase their fan following. Notable among these were the legendary Dilip Kumar, a 1950s heartthrob whose portrayal of jilted lovers set many a heart fluttering, and Asha Parekh, who made her name in tragic love stories such as Kati Patang.

Indian cinema had wide appeal not just in Kenya, but also in neighbouring Zanzibar, where the urban night life was dominated by Indian movies. Many a taraab tune came directly from the hit songs of Indian movies. As opposed to Western movies (often referred to as English movies), Indian films appealed to Swahili sensibilities, with their focus on values such as modesty, respect for elders and morality.

In Zanzibar, Lamu and other coastal areas where segregation between the sexes was strictly observed, there were special zenana (women-only) shows, where women dressed up in their finest to join other women in watching Indian and Egyptian films. For many Asian and Swahili women, the zenana afternoon show was a rare opportunity to leave their cloistered existence and let their hair down, and also to meet up with friends outside the confines of their homes. (I once went to a women-only show at Nairobi’s Shan cinema on a Wednesday afternoon with my grandmother when I was about eight or nine years old and I can tell you there was less movie-watching and more talking and gossiping going on during the show.)

Unfortunately, the old cinemas in Zanzibar are no more, which is surprising because the island is host to the Zanzibar International Film Festival. Cine Afrique, the only standing cinema in Zanzibar when I visited the island in 2003, was a pale shadow of its former self, with its cracked ceiling and broken seats. I believe it has now been demolished to pave the way for a mall. The Empire, another famous cinema on the island, is now a supermarket and the once impressive Royal Cinema is in an advanced stage of decay.

The decline of the movie theatre

There are many reasons for the decline of Indian movie theatres in East Africa, among them piracy, declining South Asian populations and technologies that allow people to watch movies from the comfort of their homes. The introduction of multiplex cinemas in shopping malls has also lessened the appeal of stand-alone cinemas, making movie-going less of an “event” and more of something that can be done alongside other activities. Indian cinema has also evolved. Unrequited love, family dramas, good versus evil and the “angry young man” genre popularised by Amitabh Bachchan – constant themes in the “masala” Indian films of the 70s and 80s – have been replaced by more sophisticated and nuanced plots, perhaps in response to a large Indian diaspora in the West that prefers stories that are realistic and reflective of its own lives. The escapism of the Indian cinema of yesteryear has given way to realism, which makes cinema-going less “entertaining”.

Indian actors and actresses are also getting more roles in films made in Hollywood, and American and British films are increasingly finding India to be an interesting backdrop or subject for their movies, as evidenced by the huge success of films like Slumdog Millionaire. This has expanded the scope and definition of what constitutes an “Indian movie”.

Some would say that Indian cinema has actually deteriorated, with its emphasis on semi-pornographic dance routines and plots revolving around upper class people and their angst. So-called “art cinema” produced by award-winning directors like Satyajit Ray and Shyam Benegal, which portrays the lives of the downtrodden and addresses important social issues, or distinctly feminist films like Parama (directed by Aparna Sen), which explores the inner worlds of Indian women, are few and far between.

But as any Indian movie buff will tell you (and I include myself in this group), the experience of watching an Indian film in a cinema cannot be matched on a TV or computer screen. Indian cinema in its heyday was a feast for the eyes. If you wanted to enter the magical world of Indian cinema, complete with elaborate and well-choreographed dances, heart-stirring music and emotion, you saw Indian films in a movie theatre.

Alas, those days are fast disappearing thanks to terrorism, technology and now COVID-19. And along with this, a distinctly East African urban culture has been lost forever.

Published by the good folks at The Elephant.

The Elephant is a platform for engaging citizens to reflect, re-member and re-envision their society by interrogating the past, the present, to fashion a future.

Follow us on Twitter.


Supporters of the “handshake” between Raila Odinga and Uhuru Kenyatta often portray it as a Nobel Peace Prize-winning moment, like the one when Ethiopian Prime Minister Abiy Ahmed Ali extended an olive branch to Eritrea, thereby ending a 16-year deadlock and hostilities. Some have even described it as “a seminal moment” in Kenya’s history where two political rivals put aside their differences for the sake of the country. The rapprochement is often couched in the language of reconciliation, of two dynastic families coming together by turning old hatreds into friendship, a truce between former foes.

However, those who believe that Odinga and Uhuru averted a civil war by uniting the country have totally missed the point. First of all, whether or not Raila Odinga had a personal hatred or historical grievance against Uhuru Kenyatta and his family is completely irrelevant. You do not form an opposition party because you have a personal vendetta against the leader of the ruling party. In fact, you do not form any party because you want to hurt your enemy. Political parties should be based on certain ideals and an ideology, not on the personal ambitions or pet peeves of their leaders.

The majority of those who consistently oppose the Jubilee government do so not because they hate Uhuru Kenyatta or William Ruto. They do so because they believe that their government is on the wrong path and is actually causing harm to the majority of the country’s population through a variety of laws and policies that have crippled the country economically and curtailed certain freedoms.

Personally, I have no grudge against Uhuru Kenyatta except that he has used his family’s name, connections, wealth and power to gain political leadership. Without these, he might never have ascended to power. I believe he has yet to grow into his own man – on his own, not on his family’s strength.

I am sure if I met him we might even get along. He seems to be the kind of guy you could have a good laugh with over a beer. Being age-mates born in the flush and optimism of independence, I think we might even have a lot in common (something I cannot say about his deputy, Ruto). But would I entrust my drinking buddy with the title deed of my property or with the welfare of my grandmother? Probably not, because I know he is surrounded by predators.

I have voted for Raila Odinga in every election since 2007, not because I particularly like the guy. In fact, I think if we met over a beer we might have little to talk about other than politics. I think I might even have to massage his ego to get a word in. But I voted for him because I believed he represented a progressive worldview and ideology that were in line with my thinking. I felt that despite his flaws (like his relentless ambition for political power even though he wields considerable influence across the country) he still stood for something – and was willing to die for this something if necessary. He was the stuff heroes are made of.

But the handshake shattered my vision of him. Like many people in the opposition, I felt betrayed, used, discarded. A trust was broken. And repairing this trust will not be easy.

Raila’s diehard supporters tell me that I miss the point, that Raila is a master strategist who will enforce his reformist agenda through the handshake. They say that Raila is known to neutralise or demolish a ruling party by infiltrating or joining hands with it, as he did with Moi’s KANU – in essence, he destroys the ruling party from within.

I am not buying this theory. If anything, the handshake has not only legitimised Uhuru’s government but has also strengthened it. What is worse, the opposition as we knew it has been completely neutered and emasculated. As the Linda Katiba campaign led by Martha Karua and others has stated, “Since the handshake our democracy has been distorted and the role of the minority party (opposition) completely eroded. The opposition’s more government than the government.”

Why is this dangerous for a fragile democracy like ours? Well, because in any healthy democracy, the role of the opposition is critical to keep the ruling government in check. Without an opposition, we become a de facto one-party democracy, as we were in the pre-1991 Moi days. And, as those who came of age in that period will tell you, there is no such thing as a one-party democracy – such systems of governance are known as dictatorships.

Some argue that such dictatorships can be benevolent and crucial for national unity. They point to Rwanda and China where authoritarian one-party leadership has improved the standard of living of the majority and brought a semblance of order and predictability in countries that would be impossible to govern without a strong leader or party. I would tell these people that if you want to see the havoc that authoritarian leadership can bring to a nation, just look to India, once a thriving secular democracy – the largest in the world – that is now veering dangerously towards xenophobic Hindu nationalism. Closer to home, look at Uganda, where an aging former rebel leader is arresting and killing young people because he can’t stand the thought of relinquishing power.

Power grab

The Building Bridges Initiative (BBI) is similarly being portrayed as the great unifier, the glue that will hold our ethnically divided country together. What its drafters failed to recognise is that placing a Luo at the high table will not resolve ethnic differences in this country because our ethnic animosities are the result of bad politics and poor leadership, as I pointed out here.

Proponents of BBI say that it will bring about equity and stability. This may well be true. But it is what they are not saying that is most worrying. The most important omission is that the BBI is not legally constituted; the BBI team was simply an ad hoc committee that had no legal or constitutional basis. Moreover, it seems to have been formed with the specific aim of changing the 2010 constitution. There are nice-sounding words in the BBI report to show that the team really cares about the poor, women and marginalised groups, but there is no acknowledgment that these groups are already catered for in the constitution, and that it is because the Jubilee government has failed to adhere to the constitution that we still find these groups under-represented or ignored.

There is also the danger that, by throwing in some goodies here and there, the proposed amendments to the constitution, as envisioned by the BBI, would also bring in changes that actually undermine or completely negate the 2010 constitution – for instance, by creating the positions of prime minister and deputy prime ministers and adding 70 more members to the National Assembly (a wage bill we can hardly afford).

And as defenders of the 2010 constitution have repeatedly said, devolution – if properly implemented – could be the true game changer in Kenya. There is a danger that in adopting the proposed amendments and delegating more powers to the national government, we will be back in the days when the centre controlled everything, even our thoughts. Tampering with the constitution under such circumstances would be tantamount to murdering the constitution and committing suicide.

Already we are hearing about area chiefs being ordered to collect signatures for the BBI referendum. Such coercion is reminiscent of the Moi days when chiefs had a direct line to State House and were, therefore, the most dreaded officials in far-flung and remote areas of the country, where an area chief could make your life hell if you did not follow Nyayo’s orders.

To get an idea of the dangers we might be facing, just look at what has happened to Nairobi County. The Jubilee Party wholeheartedly supported a dubious candidate for Nairobi County governor and then proceeded to strip him and the county of its powers. Nairobi was militarised through an executive order and the appointment of a military officer to head the (illegally constituted) Nairobi Metropolitan Services.

Commenting on the (unconstitutional) executive order that led to this state of affairs in an article in Foreign Policy titled “Kenya’s Road to Dictatorship Runs Through Nairobi County”, Carey Baraka wrote:

The militarisation of Nairobi and the subsequent transfer of the county’s administration into the president’s office is a brazen power grab by Kenyatta; even more worrying is the fact that the moves have gone unchecked by Kenya’s parliament . . . Kenya’s history is replete with unilateral declarations from the president’s office. It is to this past that Kenyatta seeks to return.

If we had a healthy opposition, such a move would have been strongly opposed, but because we had both an inept governor and a neutered opposition, this assault on devolution was met with resignation, even jubilation.

William Shakespeare’s Lady Macbeth says to her ambitious husband who wants to be King of Scotland, “To beguile the time/Look like the time.” Macbeth takes his wife’s advice and uses deception to kill the Scottish king, whom he replaces. A reign of terror ensues, resulting in a bloody civil war. A guilt-ridden Lady Macbeth commits suicide. Macbeth is eventually killed.

No one wins in the end. The bard has lessons for us all.



I have resisted commenting on the recently launched Building Bridges Initiative (BBI) report, mainly because in Kenya today if you oppose the BBI, you are labelled as being in Deputy President William Ruto’s camp, and if you support it, you are seen as being on the side of President Uhuru Kenyatta and his new ally, former opposition leader, Raila Odinga. And since I do not belong to either of these groups, I was afraid that by commenting on the report, I might inadvertently be labelled pro-Uhuru or pro-Ruto.

Critics of the BBI have mainly focused on whether amending the constitution through the BBI process is, in fact, unconstitutional as it would bypass many of the requirements for amending the 2010 constitution, which are onerous and virtually impossible to fulfill without a national consensus. Some critics, like the Kenya Conference of Catholic Bishops, say that by giving the president power to appoint a prime minister and two deputy prime ministers, the BBI is calling for a return to an imperial presidency.

On the other hand, supporters of the BBI – particularly the “handshake” stakeholders and many commentators in the mainstream media – have lauded the BBI for being the magic pill that will unite the country and spur social and economic development.

Intellectual surrender

Having now read the abridged version of the BBI report, I can conclusively say that it has failed to address the biggest crisis facing this country – that of poor leadership. The most offensive and egregious section of the report is undoubtedly the opening Validation Statement, which places the responsibility for all that is wrong with this country squarely on the shoulders of Kenyans – not on our leaders, who got us into this mess in the first place.

The report states: “Kenyans decried the fact that Kenya lacked a sense of national ethos and is increasingly a nation of distinct individuals instead of an individually distinct nation. And we have placed too much emphasis on what the nation can do for each of us – our rights – and given almost no attention to what we each must do for our nation: our responsibilities.”

As Wandia Njoya pointed out in a recent article, what the BBI has effectively done is told Kenyans that they are to blame if their rights are violated. And if moral and ethical standards have dropped across the country, it’s not because the country’s politicians have lowered moral and ethical standards and have set a bad precedent, but because Kenyans just don’t know how to behave properly. It’s called blaming the victim.

It suggests that Kenyans are somehow wired to be evil or corrupt, that decades of state-inflicted brutality against citizens – an offshoot of a neocolonial dispensation where citizens are treated as gullible and exploitable subjects – has nothing to do with the culture of impunity we find ourselves in. That the contemptuous way in which we are treated by state institutions – at police stations, in public hospitals, in government offices – is somehow our fault. And that the example of how to behave was not established by the state and its officials that consistently fail to deliver justice to Kenyans and turn a blind eye to violence committed by state and security organs, especially against the poor. Remember, this is a country where a chicken thief can end up spending a year in jail, but a minister who has stolen billions from state coffers can get away scot-free.

Njoya writes:

We are told that discussing history is blaming colonialists and refusing to take responsibility for our own actions. That discussing ethnic privilege and patronage is attacking every single member of that ethnic group. That discussing patriarchy is blaming men. That explaining systemic causes of problems is explaining away or excusing those problems. Every public conversation in Kenya is a war against complex thinking. We have reached the point where Kenyan public conversations are pervaded by this system of intellectual simplification.

Hence the BBI’s proposal to set up a new commission to address “indiscipline in children, breakdown of marriages and general erosion of cultural values in today’s society”. Presumably, this commission will take on the role of parents, school teachers and community leaders “by mainstreaming ethics training and awareness in mentoring and counselling sessions in religious activities and through community outreach programmes”.

What is being implied here is that if only Kenyans were more religious, they might not behave so badly. (I wonder if the drafters of the report know that Kenyans are among the most religious people in the world. Yet we are consistently ranked as among the most corrupt countries on the planet.)

The BBI report recognises that ethnic divisions have polarised the country, but it does not acknowledge that ethnic polarisation is the result of a political leadership that forms opportunistic tribal alliances for its own advantage and is happy to pit one ethnic community against another in order to win elections.

Moreover, its recommendations on how to reduce ethnic animosity appear to be based on the idea that if you force different ethnic communities to live in close proximity to each other, Kenya will miraculously become a society where all ethnic groups live together in peace and harmony.

There is also this misguided belief that if the people in authority are from an ethnic group that is distinct from the ethnic group that these people lord over, there will be more accountability (a model borrowed from the Kenya Police and the colonial and post-colonial district and provincial commissioners’ templates). Hence the Ministry of Education should “adopt policy guidelines that discourage local recruitment and staffing of teachers”.

Many sociologists and behavioural scientists might argue that, in fact, if you want more accountability and cohesion in a community, the leadership should come from that same community. So, for instance, if police officers belong to the same ethnic community that they serve and protect, they are more likely to be more accountable to that community because any signs of misconduct on the part of the officer will be perceived as having a direct bearing on the welfare of that community. A bribe-taking officer is more likely to be reprimanded by his community because it is his community that suffers when he takes a bribe. A Kalenjin police officer posted in Malindi, for instance, will not care what the Giriama community he is extorting bribes from or is brutalising think of him because he is not part of them and is not accountable to them or to their community leaders and elders. This accountability is further diminished by the current practice of police officers regularly being transferred to different localities.

Similarly, in schools, particularly those in remote or marginalised areas, it is important that the teachers be from that community because they also play the role of mentors and role models. We are more likely to follow in the footsteps of someone who looks like us and who has a similar history than someone who doesn’t. Which is why Vice President-elect Kamala Harris has opened the doors to leadership for so many girls and women of colour in the United States.

This is not to say that the BBI report glosses over the problems facing marginalised communities. On the contrary, it makes it a point to highlight that “the marginalised, the under-served and the poor” are suffering and are in urgent need of “an immediate helping hand and employment opportunities to help them survive”. What the report fails to recognise is that the Constitution of Kenya 2010 was designed to ensure that such communities are not condemned to perpetual poverty. Devolution was supposed to sort out issues of marginalisation by ensuring that previously marginalised communities and counties are empowered to improve their own welfare. By making them recipients of hand-outs, the BBI has added insult to their injury.

Thankfully, the report does recommend that previous reports by task forces and land-related commissions, including the Ndung’u Land Commission and the Truth, Justice and Reconciliation Commission (TJRC), be implemented. My question is: If President Uhuru Kenyatta did not implement the recommendations of the TJRC, which handed its report to him in May 2013 shortly after he assumed the presidency, what guarantees do we have that he and his BBI team will implement the recommendations now? The president has also failed on his promise of a Sh10 billion fund for victims of historical injustices. What has changed? Clearly not the leadership (and here I mean the entire leadership, not just Uhuru’s).

Silences and omissions

Moving on to another marginalisation issue: women’s representation. We all know that Parliament has actively resisted the two-thirds gender rule spelled out in the constitution. So what epiphany has occurred now that suddenly there is an urgent desire to include more women in governance institutions? If Parliament had just obeyed the constitution, there would not be a proposal in the BBI to ensure that no more than two-thirds of members of elective or appointive bodies be of the same gender. It would be a given.

And yet while BBI gives with one hand, it takes with the other. The BBI task force proposes that the position of County Women’s Representative in the National Assembly be scrapped.

What’s worse, the BBI actually appears to welcome the recommendation of “some Kenyans” that Independent Electoral and Boundaries Commission (IEBC) commissioners be appointed by political parties. Really? If you think that the 2007, 2013 and 2017 elections were fraudulent and chaotic, then wait for serious fraud and possible violence in an election where the electoral body’s commissioners represent party interests. (If I had my way, I would disband the IEBC altogether and put together a non-partisan body comprising foreign officials to run elections in this country. Maybe then we would have some hope of a free, fair and corruption-free election.)

The BBI is also silent on the role of the IEBC in vetting candidates, and ensuring that they adhere to Chapter Six of the Constitution on leadership and integrity. Let us not forget that many of the candidates in the last two elections had questionable backgrounds, and some were even facing charges in court. Why did the IEBC not ensure that those running for office had clean records?

On the economy, or what it calls “shared prosperity”, the BBI emphasises the role of industry and manufacturing in the country’s economic development but is silent on agriculture, which currently employs about half of Kenya’s labour force and accounts for nearly 30 per cent of Kenya’s GDP, but which remains one of the most neglected and abused sectors in Kenya. It’s a miracle that our hardworking and much neglected farmers are able to feed all of us, given that they receive so little support from the government, which consistently undermines local farmers by importing cheap or substandard food and by providing farmers with few incentives.

Besides, it is highly unlikely that Kenya will become a factory for the region, let alone the world, like China, because it simply does not have the capacity to do so. Why not focus on services, another mainstay of the economy?

The BBI also talks of harnessing regional trade and cooperation and sourcing products locally but, again, we know this is simply lip service. If Uhuru Kenyatta’s government were keen on improving trade within the region, it would not have initiated a bilateral trade agreement with the United States that essentially rubbishes and undermines the country’s previous regional trade agreements with Eastern and Southern African countries and trading blocs.

On the yoke around every Kenyan’s neck – corruption – the BBI’s approach is purely legalistic and administrative. It wants speedy prosecution of cases involving corruption and wastage of public resources and it wants to protect whistleblowers. (Good luck with the latter. In my experience, no whistleblower protection policy has protected whistleblowers, not even in the United Nations.)

BBI also wants to digitise all government services to curb graft. But as the economist David Ndii pointed out at the recent launch of the Africog report, “Highway Robbery: Budgeting for State Capture”, if corruption is built into the very architecture of the Kenyan government, no amount of digitisation will help. Remember how the Integrated Financial Management Information System (IFMIS) was manipulated to steal millions from the Ministry of Devolution in what is known as the NYS scandal? Computer systems are created and run by people, and these people can become very adept at deleting their digital footprints from these systems. As the former Auditor-General, Edward Ouko, pointed out, when corruption is factored into the budget (i.e. when budgets are prepared with corruption in mind), corruption becomes an essential component of procurement and tendering processes. So let’s think of more creative and innovative ways of handling graft within government.

Which is not to say that the BBI task force has not struggled with this issue. There are various proposals to amend public finance laws to make the government more accountable on how it spends taxpayers’ money. But we know that these laws can be undermined by the very people responsible for implementing them, as the various mega-corruption scandals in various ministries and state institutions have shown.

A Trojan horse?

Many Kenyans suspect that perhaps the real and only reason for the BBI is that it will allow for the creation of new powerful positions – such as that of prime minister to accommodate both Raila Odinga and Uhuru Kenyatta – and will set the stage for a return to a parliamentary system of governance instead of the current presidential “winner-takes-all” system. But while the latter might appear to be a worthwhile endeavour, the fact that former opponents of the new constitution and the parliamentary system now appear to be endorsing both suggests that there is something more to this than meets the eye. As Prof. Yash Pal Ghai has repeatedly stated, the constitution endorsed at Bomas was premised on a parliamentary system and was only changed at the last minute to accommodate a presidential system. That is how we ended up where we are now.

It also appears strange that those who benefitted most from the presidential system now want to change the constitution. As Waikwa Wanyoike put it:

Worse, those hell-bent on immobilising the constitution have done so by conjuring up and feeding a narrative that it is an idealistic and unrealistic charter. Because they wield power, they have used their vantage points to counter most of the salutary aspects of the constitution. Uhuru Kenyatta’s consistent and contemptuous refusal to follow basic requirements of the constitution in executing the duties of his office, including his endless defiance of court orders, stands out as the most apt example here.

Yet all this is calculated to create cynicism among Kenyans about the potency of the constitution. Hoping that the cynicism will erode whatever goodwill Kenyans have towards the constitution, the elites believe that they can fully manipulate or eliminate the constitution entirely and replace it with laws that easily facilitate and legitimise their personal interests, as did Jomo Kenyatta and Moi.

If indeed we want to go back to a parliamentary system through a referendum, then we should hold the referendum when the current crop of politicians (some of whom, including Uhuru Kenyatta and William Ruto, were opposed to the 2010 constitution in the first place) are not in leadership positions because many Kenyans simply don’t trust them to do what is in Kenyans’ best interest. After all, a fox cannot be relied on to guard a chicken coop.

Already the president has urged Parliament to pass laws that conform to the BBI proposals – this even before the proposed referendum that will decide whether the majority of the country’s citizens are for or against the BBI’s raft of recommendations. In other words, the BBI proposals may become laws even before the country decides whether these laws are acceptable and are what the country needs. Are the goodies proposed in the BBI, such as providing debt relief to jobless graduates and allocating a larger share of national revenue to the counties, just enticements to lure Kenyans onto the BBI bandwagon so as to ensure that the current political establishment consolidates its hold on power? Is the BBI a Trojan horse disguised as a guardian angel? Only time will tell.

One possibility, however, is that a groundswell of public opinion against the BBI might just overturn the whole process.



“Cities are the absence of physical space between people and companies. They are proximity, density, closeness. They enable us to work and play together, and their success depends on the demand for physical connection.” – Edward Glaeser, Triumph of the City (2011)

In February this year, just before the coronavirus pandemic forced the Kenyan government to impose a partial lockdown in the country, I moved to Kenya’s capital, Nairobi, a city with a population of 4.4 million, from Malindi, a small town along Kenya’s coast with a population of just 120,000. I had been intending to move back home for several years but 2020 seemed an opportune time to do it. I had spent ten long years in Malindi and was ready to get back to the thick of things where the action was.

Now I know, for most people who live in Nairobi, the city is not “home” – the “true north” of most Nairobians, as Alexander Ikawah pointed out in a recent article, is their rural home, the place they identify most with. Ikawah says that Nairobi is just a place where “city villagers” work; where they have “houses”, not “homes”.

But I am not among these people. I was born in Nairobi, and so were my father and my grandfather. Kenyan Asians don’t typically have a rural home (Asians in Kenya were not encouraged to settle on rural or agricultural land either before or after independence and so are concentrated mainly in urban areas). And even if they have an ancestral home in India or Pakistan, they don’t tend to refer to it as “home”, nor does this ancestral home loom large in their imagination. In fact, many Kenyan Asians have never visited their “motherland”.

I have lived in London in the UK and Boston in the USA, and have travelled to many, many cities around the world – New York (my favourite city), Istanbul (a cultural delight where East meets West), Mogadishu (a wounded city with nice beaches), Kabul (wounded but with majestic snowy peak backdrops), Havana (a salsa-lover’s dream, arguably the world’s most egalitarian city), Paris (a romantic city with many bridges), Mumbai (a buzzing “maximum city” of people, people, and more people), Beijing (interesting but with high levels of air pollution), Cairo (history lives here), Florence (a beautiful outdoor museum), Johannesburg (a legacy of apartheid, not my favourite city), Dar es Salaam (a friendly coastal city with huge potential), to name a few – but for me, Nairobi is not only home, it is also the place where most of my memories reside.

I will not go into the details about my reasons for leaving Nairobi in the first place, but it had a lot to do with trying to regain some perspective on life after having led a busy treadmill-like work existence where career success depended so much on pleasing a boss and undermining colleagues to move up the career ladder. I was hoping that a break would allow me to do things I hadn’t had time for before, like writing and spending more time with my husband. I dreamed of looking out of the window and seeing palm trees swaying in the wind, and breathing in the salty Indian Ocean breeze. Oh what bliss (and it was)…until I discovered that meaningful social interaction was much more important to me than the sounds and smells of nature. Voluntary self-isolation, I discovered, is neither natural nor healthy. Human beings are wired to be social animals – that is how they survived as a species.

While living in a small sleepy town where nothing much happens gave me the freedom to pursue writing (I ended up writing three books during my self-imposed “exile”) and other interests, I had a gnawing sense that I was in danger of disconnecting and self-isolating myself from all that was meaningful in my life. I yearned for intellectual stimulation and missed cultural and literary events. I longed to go to the cinema and hang out with my family. My social interactions in Malindi were superficial; I was in danger of becoming like the many expatriate (mostly Italian and British) retirees in the town, whose lives revolve around bridge parties and afternoon siestas induced by copious amounts of wine. The truth is, I was lonely. I had not found my “tribe” in Malindi.

Then COVID-19 happened. It is unfortunate that my return to Nairobi coincided with a dusk-to-dawn curfew and partial lockdown, so my intentions of absorbing myself into city life have once again been put on hold. I am back to self-isolating.

Cities are not the problem

The coronavirus pandemic has raised questions about whether cities will lose their allure, and whether people will turn to simpler rural or small-town lives. The fact that the virus emanated from the city of Wuhan in China and spread across the world through networks of cities and transport hubs is making people wonder whether we should be seeking more dispersed and less dense forms of settlement.

However, Tomasz Sudra, a former colleague who is now retired from the United Nations Human Settlements Programme (UN-Habitat), told me that it was unfair to blame cities for COVID-19 because the virus could have been contained early if the Chinese government had not decided to suppress “bad news”.

“The medical doctor who blew the whistle on the virus and died from it was forced to confess that he was spreading false news and was arrested,” he said. “The epidemic [in China] became a pandemic because the government suppressed the free flow of information.”

Cities have been associated not only with the rapid spread of diseases, but with environmental degradation as well. The concentration of human and industrial activity in cities and the over-reliance on motorised forms of transport have been blamed for the air pollution that characterises so many of the world’s large cities. Images of smog-free cities as a result of lockdowns (especially in China, where air pollution levels are so excessive that city residents routinely wear face masks) have been circulating on social media. People are asking whether the climate crisis could be blamed on cities, and whether COVID-19 will force us to seek alternative lifestyles.

John Gray, writing in the 3 April 2020 issue of the New Statesman, says that the current crisis is a “turning point” in history. “The era of peak globalization is over. An economic system that relied on worldwide production and long supply chains is morphing into one that will be less interconnected. A way of life driven by unceasing mobility is shuddering to a stop. Our lives are going to be more physically constrained and more virtual than they were,” he predicts.

Is the city – itself a product of globalisation and the movement of goods and people from one shore or trading route to another – losing its attraction? Will there be a return to the nostalgic longing for rural life popularised by people like Mahatma Gandhi, who said that “true India” could only be found in the country’s villages? I don’t think so. The world, including India, is more urban than it was in Gandhi’s time. “True India” is no longer only in India’s villages, but in its teeming cities and towns, which currently host 34 per cent of the country’s population.

Just over a decade ago, there were more rural folk on this planet than city folk, but that changed around 2007, when the world’s urban population equalled the world’s rural population for the first time. Though some regions of the world, notably Europe, North America and Latin America, became predominantly urban much earlier (around the 1950s), the rapid urban growth rates in poorer parts of the world in the last fifty years have demonstrated that the pull of the city is stronger than ever. Cities must be offering something that villages don’t, or can’t.

I must confess that I have spent much of my professional life writing about what is wrong with cities and what can be done about it. At UN-Habitat, where I worked as an editor for more than a decade, the emphasis was on urban poverty and all its manifestations, including informal settlements (also known as slums). In 2006, UN-Habitat declared that one out of every three city dwellers lives in a slum, with sub-Saharan Africa having the largest proportion of its urban population living in slum conditions, with little or no access to water, sanitation, electricity and adequate housing. Asia hosted the largest number of slum dwellers, though some sub-regions in the continent were doing better than others. Slums, warned UN-Habitat, were threatening to become a “dominant and distinct type of settlement in cities of the developing world”.

This grim assessment was followed by another one in 2008, when UN-Habitat sounded the alarm on rising inequalities in cities, and warned that economic and social inequalities in urban areas had the potential to destabilise countries and make them economically unsustainable. Highly unequal cities – where the rich lead vastly different lives from the poor – are breeding grounds for social unrest, and social unrest disrupts economic activities, went the argument. UN-Habitat stated that pro-poor and inclusive urban development could significantly decrease these inequalities and make cities more sustainable. While the UN agency acknowledged that energy consumption in cities was impacting negatively on the environment, it made a case for mitigating the impact of carbon emissions through solutions such as environmentally-friendly public transport and the use of green energy.

Cities are not the problem; how we plan them is the central issue, said the experts.

The benefits of city life

Throughout history, cities have played a central role in creating and sustaining civilizations. Cities are not just places where economic activities are concentrated; they are also crucibles of innovation and culture. The rise and fall of cities has often been associated with the rise and fall of civilizations. Cities such as Rome and Athens had their “golden ages”; some survived a loss of status; others became relics.

In 2006, I was asked to write a short chapter on the benefits of urban living for UN-Habitat’s 2006 State of the World’s Cities report, which focused almost entirely on the gloomy topic of slums. The thinking was that there was a danger that in highlighting the problems in cities and slums, we might inadvertently throw the baby out with the bath water and that as the UN’s “City Agency”, it would be counterproductive to focus only on the negative aspects of urban life. In other words, by presenting cities as places where nasty things happen, we might actually be sending an anti-urban message to the general public and to policymakers.

Because cities were – and still are – viewed as the engines of economic development, and economic growth is generally credited for reducing poverty levels (though this has not been the case in some countries), I had to make an argument that made economic sense to governments and the public at large. So I argued that because so much economic activity in a country is concentrated in its cities, “cities make countries rich”. I further pointed out that the concentration of populations and enterprises in urban areas greatly reduces the unit cost of piped water, sewerage systems, drains, roads, and other infrastructure. Therefore, the economies of scale that cities offer are not replicable in small, less dense human settlements. Building a hospital or a road in a town or village with a population of just 50,000 is far less efficient per capita than building a hospital or road in a large urban area that hosts a population of 5 million (regardless of the ethics of making such a choice).

The central argument was that rural people don’t just up and move to a city; the main driver of rural-to-urban migration is economic opportunities and the chance to lead a better quality of life. In almost all countries, rural poverty levels are higher than urban poverty levels. (For instance, the poverty rate in rural Kenya is about 40 per cent, compared to around 28 per cent in peri-urban and urban areas.) Indeed, the data showed that despite the pathetic and hazardous living conditions in slums, people who lived in slums often viewed them as a “first step” out of rural poverty. As Edward Glaeser, a Professor of Economics at Harvard University, says in his book, Triumph of the City: How Our Greatest Invention Makes Us Richer, Smarter, Greener, Healthier, and Happier, “Cities don’t make people poor; they attract poor people. The flow of less advantaged people into cities from Rio to Rotterdam demonstrates urban strength, not weakness.”

However, villages are not stagnant places either; some, like Mumbai, which was once a fishing village, grow to become megacities (defined as cities with populations of more than 10 million). Some cities, like Nairobi, were not even villages originally; Nairobi literally grew out of nothing except a railway depot built at the beginning of the 20th century. The world’s great cities did not only grow because they were centres of trade and commerce; they also grew because they were religious, political, administrative or cultural centres, and this is what drew – and continues to draw – people to them.

Many rural people move to cities because they believe that they and their families will have better access to health and education. Cities also offer women more opportunities for social and economic mobility. Unrestrained by discriminatory customs and traditions, urban women are more likely than their rural counterparts to have access to property and other assets. Child and maternal mortality rates are also lower in cities, including in slums, compared to rural areas.

The downside is that city life exposes people to hazards such as indoor and outdoor air pollution, congestion, and crime, which significantly impact the health and lives of urban dwellers. Cities can be incubators of disease, crime and other vices, but these disadvantages have never stopped cities from growing, even when plagues and other health hazards infest cities and kill populations. The 1665 Great Plague of London, for example, killed thousands, but did not diminish London’s stature. COVID-19 has decimated populations in the city of New York – the city with the highest COVID-19-related death rate in the United States – but even images of mass graves of the disease’s victims are unlikely to deter people from moving there.

Safety nets are also weaker in cities, which is one reason why so many people in the developing world (where there are few government-funded welfare systems) identify with their rural homes, where, as Ikawah points out, social capital obtained through filial ties is much stronger (though associational life in slums, through cooperatives and self-help groups, has helped reduce some of this deficit).

Cities have also been derided for promoting mindless consumerism. They have been accused of driving a type of capitalism that encourages people to go on endless shopping expeditions to buy things they might never use or need. Large shopping malls – a distinct feature of modern cities – are filled with products that keep the wheels of capitalism moving. Alain Kamal Martial Henry predicts that the coronavirus will overthrow this “Western bourgeois model” imposed by capitalism. And this may lead to the eventual demise of cities and urban living.

The problem that has no name

I asked Daniel Biau, a former colleague who served as the Deputy Executive Director of UN-Habitat from 1998 to 2005, whether we might henceforth witness a decline in urban growth rates, and whether people will now seek to move out of large cities to places that are less dense and concentrated.

Biau was not convinced that the coronavirus pandemic will change the way people view cities. “As usual, a few journalists will write about risky cities but their alarming views will be completely ignored by ordinary people who know very well that cities are, above all, places of job opportunities, social interactions, education and cultural development,” he said.

He predicts that in the digital age, it is likely that small and medium-sized cities will grow faster than big metropolises because teleworking will become the norm. “Already in France 40 per cent of the working population is currently teleworking,” he said.

“History has shown that some cities could shrink due to economic or environmental reasons. But cities have never disappeared due to health reasons. This is why the UN should provide guidelines for the promotion of safer and healthier cities as part of the wider sustainable cities development paradigm,” added Biau in an email exchange.

Cities will exist – and continue to grow – because of human beings’ need for social interaction, physical contact and collaboration. As Glaeser points out in his book, “The strength that comes from human collaboration is the central truth behind civilization’s success and the primary reason why cities exist. We should eschew the simplistic view that better long-distance communication will reduce our desire and need to be near one another. Above all, we must free ourselves from the tendency to see cities as their buildings, and remember that the real city is made of flesh, not concrete.”

However, despite their density and diversity, cities can also be lonely places. The “little town blues” that I talked about earlier are also experienced in large cities. People living in high-rise apartment blocks in big cities or in suburbs on the periphery of cities often report not knowing their neighbours and lacking a sense of “community”.

Some believe that rapid suburbanisation since the 1950s, especially in the United States, led to increasing disillusionment among married women, whose isolated lives in well-planned (but boring) suburbs led them to question patriarchal norms and the virtues of being stay-at-home wives and mothers. This angst (described by Betty Friedan as “the problem that has no name” in her book, The Feminine Mystique) sowed the seeds of the American women’s movement in the 1960s and ’70s, and led many women to seek careers outside the home.

Some cities are better at fostering human interaction than others through carefully planned urban designs, and more people-friendly infrastructure, such as parks and other public spaces, including pedestrian-only streets. Recently, after a wave of rape cases in India, urban planners have also been thinking about how cities can be made more woman-friendly, with more street lighting and more gender-sensitive public transport. The designers of these cities understand one basic fact: cities are not about buildings and infrastructure; they are about people and communities.

The COVID-19 lockdowns have demonstrated how abnormal and disturbing self-isolation and social distancing can be. The pandemic has underscored the fact that human beings have an inherent need to interact with other human beings, even if it is at a cursory level. This physical connection with a diverse range of people from different backgrounds is what makes cities attractive, and is the reason why the city – in all its beauty and ugliness – is one of humanity’s greatest achievements.
