The Politics of Hard Times
Frederick Jackson Turner (1861-1932)
"The existence of an area of free land, its continuous recession, and the advance of American settlement westward explain American development." With these words, Frederick Jackson Turner laid the foundation for modern historical study of the American West and presented a "frontier thesis" that continues to influence historical thinking even today. Turner was born in Portage, Wisconsin, in 1861. His father, a journalist by trade and local historian by avocation, piqued Turner's interest in history. After his graduation from the University of Wisconsin in 1884, Turner decided to become a professional historian, and received his Ph.D. from Johns Hopkins University in 1890. He served as a teacher and scholar at the University of Wisconsin from 1889 to 1910, when he joined Harvard's faculty. He retired in 1924 but continued his research until his death in 1932.
Turner's contribution to American history was to argue that the frontier past best explained the distinctive history of the United States. He most cogently articulated this idea in "The Significance of the Frontier in American History," which he first delivered to a gathering of historians in 1893 at Chicago, then the site of the World's Columbian Exposition, an enormous fair to mark the four-hundredth anniversary of Columbus' voyage. Although almost totally ignored at the time, Turner's lecture eventually gained such wide distribution and influence that a contemporary scholar has called it "the single most influential piece of writing in the history of American history."
Three years before Turner's pronouncement of the frontier thesis, the U.S. Census Bureau had announced the disappearance of a contiguous frontier line. Turner took this "closing of the frontier" as an opportunity to reflect upon the influence it had exercised. He argued that the frontier had meant that every American generation returned "to primitive conditions on a continually advancing frontier line." Along this frontier -- which he also described as "the meeting point between savagery and civilization" -- Americans again and again recapitulated the developmental stages of the emerging industrial order of the 1890's. This development, in Turner's description of the frontier, "begins with the Indian and the hunter; it goes on with the disintegration of savagery by the entrance of the trader... the pastoral stage in ranch life; the exploitation of the soil by the raising of unrotated crops of corn and wheat in sparsely settled farm communities; the intensive culture of the denser farm settlement; and finally the manufacturing organization with the city and the factory system."
For Turner, the deeper significance of the frontier lay in the effects of this social recapitulation on the American character. "The frontier," he claimed, "is the line of most rapid Americanization." The presence and predominance of numerous cultural traits -- "that coarseness and strength combined with acuteness and acquisitiveness; that practical inventive turn of mind, quick to find expedients; that masterful grasp of material things... that restless, nervous energy; that dominant individualism" -- could all be attributed to the influence of the frontier.
Turner's essay reached triumphalist heights in his belief that the promotion of individualistic democracy was the most important effect of the frontier. Individuals, forced to rely on their own wits and strength, he believed, were simply too scornful of rank to be amenable to the exercise of centralized political power.
Turner offered his frontier thesis as both an analysis of the past and a warning about the future. If the frontier had been so essential to the development of American culture and democracy, then what would befall them as the frontier closed? It was on this foreboding note that he closed his address: "And now, four centuries from the discovery of America, at the end of a hundred years of life under the Constitution, the frontier has gone, and with its going has closed the first period of American history."
More than a century after he first delivered his frontier thesis, historians still hotly debate Turner's ideas and approach. His critics have denied everything from his basic assumptions to the small details of his argument. The mainstream of the profession has long since discarded Turner's assumption that the frontier is the key to American history as a whole; they point instead to the critical influence of such factors as slavery and the Civil War, immigration, and the development of industrial capitalism. But even within Western and frontier history, a growing body of historians has contested Turner's approach.
Some have long disputed the very idea of a frontier of "free land." Turner's formulation ignored the presence of the numerous Indian peoples whose subjugation was required by the nation's westward march, and assumed that the bulk of newly acquired lands were actually democratically distributed to yeoman pioneers. The numerous Indian wars provoked by American expansion belie Turner's argument that the American "free land" frontier was a sharp contrast with European nations' borders with other states.
On a more analytic level, an increasing number of Western historians have found the very concept of a frontier dubious, because it applies to too many disparate places and times to be useful. How much do Puritan New England and the California of the transcontinental railroad really have in common? Many such critics have sought to replace the idea of a moving frontier with the idea of the West as a distinctive region, much like the American South.
Where Turner told the triumphalist story of the frontier's promotion of a distinctly American democracy, many of his critics have argued that precisely the opposite was the case. Cooperation and communities of various sorts, not isolated individuals, made possible the absorption of the West into the United States. Most migrant wagon trains, for example, were composed of extended kinship networks. Moreover, as the 19th century wore on, the role of the federal government and large corporations grew increasingly important. Corporate investors headquartered in New York laid the railroads; government troops defeated Indian nations who refused to get out of the way of manifest destiny; even the cowboys, enshrined in popular mythology as rugged loners, were generally low-level employees of sometimes foreign-owned cattle corporations.
Moreover, these revisionist scholars argue, for many peoples the West has not been the land of freedom and opportunity that both Turnerian history and popular mythology would have us believe. For many women, Asians, Mexicans who suddenly found themselves residents of the United States, and, of course, Indians, the West was no promised land.
The more foreboding and cautionary tale which increasing numbers of Western historians have offered in place of Turner's account has provoked sharp controversy. "New" Western historians -- many of whom actually echo and draw upon fairly old scholarly works -- often argue that their accounts offer a more inclusive and honest reckoning of the Western past. Western historians who still adhere roughly to Turner's approach accuse their opponents of mistaking a simple-minded political correctness for good scholarship in their quest to recount only the doom and gloom of the Western past. Often the rhetoric reaches an acrimonious crescendo. But in a sense, the very acrimony of these debates takes us full circle back to Turner and his legacy, for debates about the significance of Western history are hardly ever confined to the past. In our understanding of what we are as a nation, if on no other level, the Western past continues to define us today.

http://www.pbs.org/weta/thewest/people/s_z/turner.htm

Therapeutic Laws
Legislation designed to make you feel good, not to do anything.
By Jack Shafer. Posted Saturday, Sept. 28, 1996, at 12:30 AM PT
Bill Clinton wants to be an activist chief executive, but a paradox of his own making stands in the way. In his last State of the Union address, he repudiated big government. "We know there's not a program for every problem," he said. "The era of big government is over." With the help of Dick Morris, Clinton has turned this paradox, this--let's face it--logical contradiction, into an electoral strength. Clever rhetoric has helped. But so did his embrace of what might be called "therapeutic legislation."
Therapeutic legislation is intended to make people feel good, not actually to accomplish anything. Sometimes, it addresses a virtually nonexistent problem or, at least, a problem that ranks lower on any sensible scale of national concerns than the fuss and self-congratulation would indicate. Sometimes, it addresses real, major problems, but in an almost totally symbolic manner. Often, therapeutic legislation exploits the electorate's short attention span, its capacity to become suddenly obsessed with an issue and then--especially if provided with legislative catharsis--to forget it just as quickly. In any case, therapeutic legislation costs the taxpayer little or nothing and generally offends almost no one. (In an important subclass of therapeutic legislation, however, stagily offending an unpopular interest group--e.g., the tobacco lobby--is part of the therapy.)
This week, Clinton signed another of the many therapeutic laws for which he has taken credit. This one makes stalking across interstate lines or on U.S. government property a federal offense, punishable by five years to life in prison. The law was sponsored by Sen. Kay Bailey Hutchison, Republican of Texas, proving that Democrats aren't alone in the dirty habit of pleasuring themselves this way. The anti-stalking law is typical of much therapeutic legislation in that it addresses a hunger for the federal government to do something about a matter--usually crime or education--that is properly the concern of the states. I wouldn't be so callous as to suggest that stalking isn't an urgent problem, fully worthy of immediate action by a Congress that can't pass a budget on time. But is stalking across state lines or on federal property really such a pressing concern? Undoubtedly it is terrifying when it happens (as it apparently happened to Sen. Hutchison). The reason Congress and the president have outlawed it with such a flourish, however, is as a way of expressing symbolic concern over stalking in general. Sen. Hutchison's office concedes that it has collected no information on the number of interstate stalking cases.
Indeed, if there were thousands of interstate stalkers, if they did pose a serious law-enforcement problem, Hutchison's legislation would have smoked out some sort of constituency to oppose the bill. If a stalkers' lobby itself didn't pipe up, at least civil libertarians who deplore the double-jeopardy implications of a federal stalking law would have criticized it. Instead, Hutchison's solution to the nonproblem passed 99-0 in the Senate. A law that passes with no opposition is a good bet to be therapeutic legislation. (And it is doubly hypocritical for Republicans, who claim to believe in less government and in state government, to be clotting the federal statute books with laws that mess in areas of state concern.)
Many therapeutic laws are superfluous. Some are passed unanimously. But the defining characteristic of a therapeutic bill is its thrift: It doesn't increase the budget; it requires no new taxes; and it offends no special-interest group. The anti-stalking bill cost Clinton and several hundred members of Congress absolutely nothing, but allowed them to inflate their anti-crime résumés.
A good third of Clinton's acceptance speech at the Democratic Convention was used to publicize therapeutic laws passed on his watch or new ones he wanted Congress to consider:
He called for a ban on "cop-killer" bullets; reiterated his support for a victims'-rights constitutional amendment; argued for an extension of the Family and Medical Leave law and a measure to keep moms and their babies in hospitals longer than 48 hours; promoted a measure that would place taggants in explosives; and asked for a Brady Bill amendment to keep guns out of the hands of perpetrators of domestic violence.
He touted the television V-chip; praised the Kennedy-Kassebaum law (an ultra-therapeutic law that guarantees portability of insurance but places no ceiling on the rates insurers can charge); applauded the ban on "assault" rifles; and bragged about the new FDA regulations that curb the advertising and sale of cigarettes to children.
To much applause, he deplored the fact that "10 million children live within just four miles of a toxic waste dump" (four miles?) and urged that we make it illegal "even to attempt to pollute" (whatever that means).
Clinton isn't the only therapeutic politician, just the best. Linguistic nationalists are pushing their English-first measures. The ultrapatriotic want an amendment to ban flag-burning (hell, why not just mandate flag-waving?). The spit-and-polish crowd campaigns for school uniforms. The drug warriors seek more drug-free zones. To ward off child molesters, the city of San Mateo, Calif., has proposed background checks and fingerprinting of Little League coaches, den mothers, and others who volunteer their time to children (never mind, as the Wall Street Journal reports, that less than 10 percent of all child molestations take place in an institutional setting; that most accused child molesters have no previous convictions; and that child abuse is down in the '90s). And with the continued Balanced Budget Amendment follies, Congress indulges itself in the grandest of therapeutic fantasies. If it really wants to balance the budget it should just do so, rather than passing feel-good laws that say the budget should be balanced.
No doubt somewhere in the above list I've included a law that you, dear reader, support and believe is more than merely therapeutic. Your particular law, or two, address problems fully worthy of a national fuss and Rose Garden signing ceremony. But surely even you will agree that most of these laws are merely therapeutic. We can all agree on that, without agreeing on which are the exceptions.
Therapeutic laws become props for rhetoric that might be called demagoguery, except that it disgraces the memories of Joe McCarthy and Huey Long and the ambitions of Pat Buchanan to call Clinton a demagogue. The genuine demagogue assails minorities and labels his foes Communists. The modern "semigogue" speaks liltingly about children and education and health and public safety. He artfully constructs his debate to make his foes sound as if they are against children, for gun violence, against safe streets, and for pollution. The semigogue in chief has buried Dole with so many positives during this election season, it's enough to make you long for the days of negative campaigning.
And for genuine activism. Even though my personal tastes in legislation tend toward the kind that begin, "Congress shall pass no law," I admired the old Bill Clinton who attempted to reorganize the $1 trillion health-care business and who forthrightly called for a workfare program that would cost more, not less, than simple handouts. That Clinton didn't pussyfoot around. He stood for what he believed in. He stimulated a thunderous and enlightening debate. He demonstrated to the electorate that real change is not cheap and easy.
He also got his ass kicked.

http://www.slate.com/id/2066

Paradigms of Panic
Asia goes back to the future.
By Paul Krugman, March 13, 1998
There were warning signs aplenty. Anyone could have told you about the epic corruption--about tycoons whose empires depended on their political connections and about politicians growing rich in ways best not discussed. Speculation, often ill informed, was rampant. Besides, how could investors hope to know what they were buying, when few businesses kept scrupulous accounts? Yet most brushed off these well-known vices as incidental to the real story, which was about economic growth that was the wonder of the world. Indeed, many regarded the cronyism as a virtue rather than a vice, the signature of an economic system that was more concerned with getting results than with the niceties of the process. And for years, the faint voices of the skeptics were drowned out by the roar of an economic engine fueled by ever larger infusions of foreign capital.
The crisis began small, with the failure of a few financial institutions that had bet too heavily that the boom would continue, and the bankruptcy of a few corporations that had taken on too much debt. These failures frightened investors, whose attempts to pull their money out led to more bank failures; the desperate attempts of surviving banks to raise cash caused both a credit crunch (pushing many businesses that had seemed financially sound only months before over the brink) and plunging stock prices, bankrupting still more financial houses. Within months, the panic had reduced thousands of people to sudden destitution. Moreover, the financial disaster soon took its toll on the real economy, too: As industrial production skidded and unemployment soared, there was a surge in crime and worker unrest.
But why am I telling you what happened to the United States 125 years ago, in the Panic of 1873?
Anyone who claims to fully understand the economic disaster that has overtaken Asia proves, by that very certainty, that he doesn't know what he is talking about. The truth is that we have never seen anything quite like this, and that everyone--from the country doctors at the International Monetary Fund and the Treasury Department who must prescribe economic medicine to those of us who have the luxury of irresponsibility--is groping frantically for models and metaphors to make sense of this thing. The usual round of academic and quasiacademic conferences and round tables has turned into a sort of rolling rap session, in which the usual suspects meet again and again to trade theories and, occasionally, accusations. Much of the discussion has focused on the hidden weaknesses of the Asian economies and how they produced fertile ground for a financial crisis; the role of runaway banks that exploited political connections to gamble with other people's money has emerged as the prime suspect. But amid the tales of rupiah and ringgit one also hears surprisingly old-fashioned references--to Charles Kindleberger's classic 1978 book Manias, Panics, and Crashes, and even to Walter Bagehot's Lombard Street (1873). Asia's debacle, a growing number of us now think, is at least in part a souped-up modern version of a traditional, 1873-style financial panic.
The logic of financial panic is fairly well understood in principle, thanks both to the old literary classics and to a 1983 mathematical formalization by Douglas Diamond and Philip Dybvig. The starting point for panic theory is the observation that there is a tension between the desire of individuals for flexibility--the ability to spend whenever they feel like it--and the economic payoff to commitment, to sticking with long-term projects until they are finished. In a primitive economy there is no way to avoid this tradeoff--if you want to be able to leave for the desert on short notice, you settle for matzo instead of bread, and if you want ready cash, you keep gold coins under the mattress. But in a more sophisticated economy this dilemma can be finessed. BankBoston is largely in the business of lending money at long term--say, 30-year mortgages--yet it offers depositors such as me, who supply that money, the right to withdraw it any time we like.
What a financial intermediary (a bank or something more or less like a bank) does is pool the money of a large number of people and put most of that money into long-term investments that are "illiquid"--that is, hard to turn quickly into cash. Only a fairly small reserve is held in cash and other "liquid" assets. The reason this works is the law of averages: On any given day, deposits and withdrawals more or less balance out, and there is enough cash on hand to take care of any difference. The individual depositor is free to pull his money out whenever he wants; yet that money can be used to finance projects that require long-term commitment. It is a sort of magic trick that is fundamental to making a complex economy work.
Magic, however, has its risks. Normally, financial intermediation is a wonderful thing; but now and then, disaster strikes. Suppose that for some reason--maybe a groundless rumor--many of a bank's depositors begin to worry that their money isn't safe. They rush to pull their money out. But there isn't enough cash to satisfy all of them, and because the bank's other assets are illiquid, it cannot sell them quickly to raise more cash (or can do so only at fire-sale prices). So the bank goes bust, and the slowest-moving depositors lose their money. And those who rushed to pull their money out are proved right--the bank wasn't safe, after all. In short, financial intermediation carries with it the risk of bank runs, of self-fulfilling panic.
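The fragility described here can be captured in a toy model. The sketch below is illustrative only -- the reserve ratio and fire-sale discount are invented numbers, not parameters from any actual bank or from the Diamond-Dybvig paper:

```python
def bank_survives(withdraw_frac, reserves=0.10, fire_sale=0.50):
    """A stylized bank in the Diamond-Dybvig spirit (all numbers invented).

    Total deposits are normalized to 1. `reserves` is held as cash;
    the rest sits in illiquid assets that can only be liquidated early
    at `fire_sale` cents on the dollar. Returns True if the bank can
    pay every withdrawer in full today.
    """
    cash = reserves
    if withdraw_frac <= cash:
        return True                    # ordinary day: reserves suffice
    # Forced liquidation: raise the shortfall at fire-sale prices,
    # destroying (shortfall / fire_sale) in face value of assets.
    shortfall = withdraw_frac - cash
    face_value_destroyed = shortfall / fire_sale
    return face_value_destroyed <= 1.0 - reserves  # any assets left to sell?

# A normal day is fine; a run is not:
print(bank_survives(0.05))   # True  -- reserves cover withdrawals
print(bank_survives(0.60))   # False -- fire sales exhaust the assets
```

With these made-up numbers, withdrawal demand above 10 percent forces fire sales, and demand much above half the deposit base wipes out the assets entirely -- so a depositor's best move depends on what she expects everyone else to do, which is exactly the self-fulfilling logic of a run.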
A panic, when it occurs, can do far more than destroy a single bank. Like the Panic of 1873--or the similar panics of 1893, 1907, 1920, and 1931, that mother of all bank runs (which, much more than the 1929 stock crash, caused the Great Depression)--it can spread to engulf the whole economy. Nor is strong long-term economic performance any guarantee against such crises. As the list suggests, the United States was not only subject to panics but also unusually crisis-prone compared with other advanced countries during the very years that it was establishing its economic and technological dominance.
Why, then, did the Asian crisis catch everyone by surprise? Because there was a half-century, from the '30s to the '80s, when they just didn't seem to make panics the way they used to. In fact, we--by which I mean economists, politicians, business leaders, and everyone else I can think of--had pretty much forgotten what a good old-fashioned panic was like. Well, now we remember.
I'm not saying that Asia's economies were "fundamentally sound," that this was a completely unnecessary crisis. There are some smart people--most notably Harvard's Jeffrey Sachs--who believe that, but my view is that Asian economies had gone seriously off the rails well before last summer, and that some kind of unpleasant comeuppance was inevitable. That said, it is also true that Asia's experience is not unique; it follows the quite similar Latin American "tequila" crisis of 1995, and bears at least some resemblance to the earlier Latin American debt crisis of the 1980s. In each case there were some serious policy mistakes made that helped make the economies vulnerable. Yet governments are no more stupid or irresponsible now than they used to be; how come the punishment has become so much more severe?
Part of the answer may be that our financial system has become dangerously efficient. In response to the Great Depression, the United States and just about everyone else imposed elaborate regulations on their banking systems. Like most regulatory regimes, this one ended up working largely for the benefit of the regulatees--restricting competition and making ownership of a bank a more or less guaranteed sinecure. But while the regulations may have made banks fat and sluggish, they also made them safe. Nowadays banks are by no means guaranteed to make money: To turn a profit they must work hard, innovate--and take big risks.
Another part of the answer--one that Kindleberger suggested two decades ago--is that to introduce global financial markets into a world of merely national monetary authorities is, in a very real sense, to walk a tightrope without a net. As long as finance is a mainly domestic affair, what people want in a bank run is local money--and, guess what, the government is able to print as much as it wants. But when Indonesians started running from their banks a few months ago, what they wanted was dollars--and neither the Indonesian government nor the IMF can give them enough of what they want.
I am not one of those people who believes that the Asian crisis will or even can cause a world depression. In fact, I think that the United States is still, despite Asia, more at risk from inflation than deflation. But what worries me--aside from the small matter that Indonesia, with a mere 200 million people, seems at the time of writing to be sliding toward the abyss--is the thought that we may have to get used to such crises. Welcome to the New World Order.

http://www.slate.com/id/1931

Who's afraid of a bear market?
Almost everyone, but don't expect a crash to scare off day traders. In fact, it might turn you into one.
Editor's note: Though this tale is told in the voice of a single person, it is the product of the experiences and reporting of both authors.
By Joey Anuff and Gary Wolf
May 30, 2000 | People have been asking me whether the recent NASDAQ hiccup has reduced the number of day traders. My answer? Probably not. Terror is the day trader's best friend. Day traders say you become an investor when your trade goes very wrong. You buy 1,000 shares, the price plummets and suddenly you start telling yourself that UBID is a "great value" that is sure to rise significantly "over time." But the opposite is also true. An investor can become a day trader when his or her trade goes unexpectedly right.
You buy some obscure stock that your cousin told you about, it quadruples and you can't stand to hang on to it any more, so you take your profits. Now what are you going to do with all that cash? It's already sitting in your trading account. Scores of "bargains" are flashing before your eyes every day. Come on. Just make one trade. One little purchase, just for the heck of it. There you go. Now how do you feel? Not so good, right? A little nervous? In fact, kind of panicky? Look, your stock is already up a quarter point. Little beads of sweat are breaking out on your upper lip. You are mouthing words, but no sound is coming out. That's OK, because now it's time to sell. Go ahead, sell it. Sell! SELL!
Very long-term investors, those who plan to hold their stocks for 20 years or more, can afford to ignore the movement of the market. But for people who might need their money in the next decade, the threat of a sudden collapse is terrifying. Given how much you are going to resent it if half your money suddenly disappears, perhaps you would actually be safer not holding any of those stocks you own overnight. When things head south, you don't want to be the last one out the door. On the other hand, you don't want to get left out when the market bounces. So you buy a few promising candidates and keep one nervous eye on the exits.
Congratulations, you're halfway there.
The following excerpt was adapted from "Dumb Money: Adventures of a Day Trader," by Joey Anuff and Gary Wolf
I started trading during the glory days of the Internet bull market. I quickly doubled my account size due to an extremely fortunate speculation on EBAY, and I entertained myself with visions of growing my low-six-figure trading account into a mid-six-figure trading account. Since I was trading the sum total of all the money I'd ever earned, borrowed, swindled or inherited, this felt like quite an accomplishment. I gave myself extra points for having made a small fortune while the business page of the newspaper (which for the first time in my life I read religiously) was warning every day about the dangerous and unsupportable run-up in the prices of Internet stocks.
Sometimes I wondered if it was easier for me than for most people. I had a lot of tolerance for risk and seemed to have an uncanny knack for being at the right place at the right time. There's no denying that I had benefited from luck; but, on the other hand, perhaps luck is a talent like any other. You have to make the best of what you are given, I mused, and I seemed to have been given a gift for brilliant speculation.
During this time, a friend was regaling me with stories about the great money manager John Templeton, whom he had met while researching a magazine story. Templeton, now in his late eighties, had almost single-handedly invented the foreign-stock mutual fund. One of his earliest investors, Leroy Paslay, gave $65,500 to Templeton to invest in 1954, and by 1996 Paslay's shares were worth $37 million. The Templeton Growth Fund had an average annual return of 14.3 percent. "Templeton thinks stocks are overpriced," my friend told me, as if he were imparting the secret code that unlocked the treasure chest of unlimited wealth.
I laughed out loud. Fourteen percent? You've got to be kidding. In my first half-year of trading I'd made more than 100 percent. The only times I went wrong were times I stayed away from the trading screen.
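As an arithmetic aside (this check is mine, not the authors'), the Paslay anecdote above implies a compound annual growth rate that is easy to verify:

```python
# Compound annual growth implied by the Paslay anecdote in the text:
# $65,500 invested in 1954 grows to $37 million by 1996.
start, end, years = 65_500, 37_000_000, 1996 - 1954

cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")   # about 16.3% a year
```

That per-account figure runs a bit above the fund's quoted 14.3 percent average annual return, plausibly because a long-held account compounds reinvested distributions -- and either number is dwarfed, as the next paragraph makes clear, by what a lucky half-year of day trading can look like.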
That conversation about Templeton, however, bore poisoned fruit, in the form of a 1932 reprint of an 1841 classic about bull markets and their victims: "Extraordinary Popular Delusions and the Madness of Crowds," by Charles MacKay. MacKay's book was one of Templeton's favorites. Since I was just beginning a reading binge about great stock market speculators (in an effort to better understand my heretofore undiscovered talent), I instantly found a copy and started reading.
That day marked the end of my innocent triumphs. I don't know what I expected: No, I do know. I expected a friendly romp through the history of stock market dopes whose bankruptcy would provide a suitably dark background against which the brighter stars of speculative adventure could be seen and appreciated. Today's deluded losers, I was sure, were the dummies who kept saying the Internet was a fad and the Internet stocks were a joke. My bank account told me who was having the last laugh about that.
I looked forward to a double handful of anecdotes of others who'd played the wrong runs, been plagued by the wrong timing, or been doomed by the wrong attitude, the better to torture my friends with. Although Templeton's returns were pretty weak by today's standards, he had been a player in his time, and I took it for granted that he would feel a kinship with me and my little victories over popular misunderstanding. If he was a fan of Charles MacKay, then I was prepared to be a fan, too.
The unexpected horror MacKay's tales inspired in me is hard to describe. Everybody knows about "Tulipomania." The Dutch went mad about tulips a few hundred years ago, and bid the price up outrageously, after which they came to their senses and the price fell. Big deal. Several hundred years later, their folly still lingers as the most trite of metaphors to be hauled out anytime someone thinks someone else is paying too much for something. Tulips were regularly mentioned on the stock discussion boards by desperate characters who were shorting a technology stock that continued, despite their evilest prayers, to go up and up. When I hear "tulip" I translate it this way: "I don't understand why this stock keeps rising but if it goes up another three points I'm going to be ruined." My response? "Tulip, you say? Time to buy more!"
So it wasn't MacKay's story of tulipomania that knocked me off my game, but his tale of the numerous London bankruptcies in the wake of the South-Sea bubble of 1720. Nearly everybody who touched the crappy shares of the South Sea company, with the exception of a few insiders, was damaged by the experience. Since MacKay tells it so well, there's no point in telling it again here. But the similarities to the run up in the Internet stocks were terrifying:
The advent of profitable trade with the Americas, sometime in the future, which would transform the economy of England; why, that was e-commerce. The South Sea directors and their cronies, who sold their shares into successive waves of buying; these were the Internet stock-option millionaires. The manipulation of the price to ludicrous heights using publicity; this was the laughable "we're now an Internet company" press release. The proliferation of meaningless business ventures by London sharpies in the wake of the first great success, including "a company for carrying on an undertaking of great advantage, but nobody to know what it is;" why, this was UBID, and KTEL, and Perfumania, and scores of others.
Explained MacKay: "It did not follow that all these people believed in the feasibility of the schemes to which they subscribed; it was enough for their purpose that their shares would ... be soon raised to a premium, when they got rid of them with all expedition to the really credulous." Hey, wait a second: that was me. In the end, everybody went broke.
One dose of pessimism led to another. Soon, I was devouring books about economic catastrophe. I read Charles Kindleberger's "Manias, Panics, and Crashes," from which I learned that one sign of impending doom is the emergence of financial swindles that capitalize on the general belief that it is easy to become rich. Each day, my e-mail queue was peppered with announcements of new stock-picking services guaranteed to drown me with cash. In the past, I'd ignored them. Now, I saw them as omens of the rout to come.
Kindleberger even predicted the proximate cause of the latest phase of the bull run, which was the lowering of interest rates in late summer 1998, in response to the Asian crisis. The chairman of the Federal Reserve, Alan Greenspan, had long been critical of the uncorrected boom in share prices, but when the Asian economies faltered he cut interest rates anyway, though this was likely to set off another manic stock market run to record highs -- as it promptly did.
Kindleberger, in his dry prose, writing long before the fact, reveals why even conservative bankers can't head off a dangerous bubble under these circumstances: "When commodity and asset markets move together, up or down, the direction of monetary policy is clear. But when a threatening boom in share prices or real estate or both rears up when commodity prices are stable or falling, the authorities face a dilemma." Indeed, they do. Greenspan chose a nice, easy rate cut, and I had a hundred thousand dollars to show for it. For the first time, however, I began to reflect on the possibility that I would have all of this money stuck in a promising Internet stock on the very day that the market went south in a panic.
My fate was sealed when I settled down for the weekend with the ultimate stock market snuff movie: John Kenneth Galbraith's "The Great Crash, 1929." "The only reward to ownership in which the boomtime owner has an interest is the increase in values," Galbraith wrote. "Could the right to the increased values be somehow divorced from the other and now unimportant fruits of possession and also from as many as possible of the other burdens of ownership, this would be much welcomed by the speculator. Such an arrangement would enable him to concentrate on speculation which, after all, is the business of the speculator."
This was an exact description of my heart's desire, but Galbraith's sympathy ended there. He seemed to take a dry delight in the fact that after the big drop in 1929 came the long, murderous compaction of the early '30s, by which point most of the amateur speculators had gotten all their funds pummeled out of them.
I had always assumed that even if the market dropped radically, all the traders in the world would simultaneously scream "Buying opportunity!" and up we'd race again. I believed that after a big dip there would be lots of cash available to go back into the market. This, I learned, is a common misconception based on a failure to recognize that after a long, relentless bull run, a run that forces all the lingering skeptics to give in, all the cash is already in the market. And then it is gone. Gone where? Gone, as a great speculator once said, "where the woodbine twineth." He meant, I'm pretty sure, up the spout. Gone, in the words of another grim accountant, to join "the silent majority of vanished savings." In the great crash of 1929, stocks fell by 50 percent. During the next three years, they fell by another 80 percent. How were you supposed to make your money back under those circumstances?
Then and there, I concluded that my trading method was insane. Yes, I'd made a bundle of cash. But a circus chimp could have doubled his money buying the stocks I was playing during the winter of 1998-1999. It was simple. Dow hits 7,500 in August. Buy anything. Dow hits 10,000 by year end. Sell everything. Luckily, I was too dumb to know better than to get in at the bottom. But after dosing myself with MacKay and Kindleberger and Galbraith, I wasn't dumb enough to stay in at the top. I was finished trusting the market to reward me for not trying. It was time to trade risk for security.
When I'd started out, cashing in all my mutual funds and wiring the proceeds, along with all my savings, into my E-Trade account, I had made a conscious decision to trade it as if I were going to blow it. After all, I had just turned 27. I had a life of savings ahead of me. Might as well pull off all the risky investment stunts now, while I could still chalk it all up to youthful folly.
On the other hand, I don't think I really believed I'd lose it all. Perhaps "I don't care" was just a mantra to soothe the nausea and to quiet the blind panic that came late at night. In any case, now that I had had my pay day -- nay, my pay season -- and was ready to double my stake again, I meditated on how I would feel if it were gone. Every few days, I tried on my old casual point of view to see if it still fit. "Easy come, easy go," I tried to say one night to my brother, but my voice trailed off into an anxious squeak. I'd done it. I no longer could convince myself I didn't care. If I lost it all now, I knew it would shape my identity for the rest of my life. I couldn't live with that. Okay, the hard part was over. I'd admitted that I cared. Next, I needed a sensible trading strategy. I wanted rules. I wanted science.
In Edwin Lefevre's classic trader's travelogue, "Reminiscences of a Stock Operator," the author recounts a conversation with a friend so unnerved by his investments that it literally keeps him awake at night. "I am carrying so much cotton that I can't sleep thinking about it," he told Lefevre. "It is wearing me out. What can I do?" Lefevre's reply is simplicity itself: "Sell down to the sleeping point." After spending the uncorrection of late 1998 bumbling into absurd profits, I think I knew what my sleeping point was. It was zero percent stocks, 100 percent U.S. dollars.
After all, who gets hurt when the music stops? The people who are still standing. The market equivalent of looking around in the sudden silence and realizing that all the chairs are taken is opening your trading account and staring at a list of securities whose value has fallen by 50 percent and is still dropping. If I was going to stay in the market, I wanted no risk of an overnight decimation. I was going to expose myself to the vicissitudes of the market for the briefest possible stretches of time. My old mantra, "I don't care," was going to be replaced by a new one: "End the day in cash." That meant serious day trading.
I knew that my plan to reduce my risk by trading more often flew in the face of every investment commonplace. Nonetheless, the way I figured it, day trading was the safest possible strategy. After all, my favored Internet stocks had already risen 400 percent, 500 percent even 1,000 percent. At this point, conventional wisdom was out the window. Buy and hold? The only person who held eBay all the way up was the guy who bought it at the IPO, went out for a bike ride, got hit by a car, and lay comatose in the hospital for the next three months.
No, now that the Dow had crossed 10,000, "buy and hold" seemed laughable. I could just picture the big stock market managers, the traders for the mutual funds and the pension plans and the municipalities, for whom these new profitless Internet stocks had always seemed excessively risky, finally shaking their heads and capitulating. Every day I saw it on CNBC; the buy and hold crowd was getting in. It was time for me to get out.
That was the beginning of my day-trading career. By the end, I'd learned some rather surprising things about the mechanisms of the stock market. "Dumb Money," the story of my education as a day trader, was published on April 18, a week after last month's NASDAQ correction. Nearly every day since, I've heard somebody say that a bear market would make day trading go away. This is wrong. Things are just getting started.
salon.com | May 30, 2000 http://archive.salon.com/tech/feature/2000/05/30/dumb_money/index.html

The Star, the Born-Again Sinner, and the Gangster
Updating Constance Rourke's famous American archetypes.
By Adam Kirsch, March 31, 2004
Americans may explain themselves to themselves more than any people on earth. Ever since Emerson and Whitman, our native writers have come back to the old questions: What is an American? How are we different from our ancestors in Europe or Africa or Asia? And why can't we come to a conclusive answer after centuries of asking?
One of the best answers was offered 73 years ago in Constance Rourke's American Humor: A Study of the National Character. Rourke's is one of those books that is always being rediscovered. First published in 1931, it was issued in paperback in 1971 and reissued in 1986; and now it is available to another generation of readers, in a new edition introduced by Greil Marcus. Rourke was a pioneer of what was not yet called "cultural studies" and enjoys a cult status among critics and writers, but she deserves a much wider audience—especially now, when endless books and op-eds are being written to explain why our "national character" inspires such envy and mistrust around the world. For American Humor shows, like no other book, how much of that character has remained the same for the last 200 years, and, equally important, the ways we have changed.
Rourke, born in 1885, was part of a generation of critics—including Edmund Wilson and Van Wyck Brooks—that taught Americans to look at their culture in a new way. Instead of the genteel literary heritage of New England, which provided the official, schoolroom version of American culture, Rourke sought the essence of Americanness in folk culture and especially in popular comedy. Much like George Orwell, who in the 1930s searched boys' stories and seaside postcards for clues to the English character, Rourke studied what Marcus' introduction calls "old almanacs, newspaper files, forgotten biographies, songbooks, joke manuals, penny dreadfuls, the unreliable leavings of nineteenth-century American culture."
What she found there were three archetypal figures, emerging from popular comedy: the Yankee, the backwoodsman, and the minstrel. Each member of "the trio," as Rourke often called them, took recognizable form in the 1820s and flourished until the Civil War. More important, she wrote, they remained at the heart of "a consistent native tradition," which she traced through the classic American writers—Whitman, Hawthorne, Henry James—and up to the modernists of her own day, including T.S. Eliot. "Humor has been a fashioning instrument in America," Rourke concluded. "Its objective … has seemed to be that of creating fresh bonds, a new unity ... and the rounded completion of an American type."
Each member of the trio contributed to that type. The Yankee, Rourke wrote, was "astute and simple, gross and rambling, rural to the core," hiding his sharp intelligence under a taciturn mask. He loved whittling, swapping, and practical jokes, and he always parried a question with another question. On the stage, where he was given outlandish New England names like "Jedediah Homebred" and "Jerusalem Dutiful," the Yankee was shown thwarting his enemies—especially the snobbish Briton—thanks to his sly rustic wit.
If the Yankee turned silence into advantage, the backwoodsman triumphed through sheer volume: "He shouted as though he were intoxicated by shouting." Born in the wilds of Kentucky and Tennessee, the backwoodsman—faced with hostile Indians and unforgiving soil—met adversity with comic self-inflation. Davy Crockett, the classic backwoodsman of legend, was "shaggy as a bear, wolfish about the head, and could grin like a hyena until the bark would curl off a gum log"; he could "whip his weight in wild cats" and "put a rifle-ball through the moon." The tall tale, with its deadpan exaggeration, was the natural idiom of the backwoodsman.
Third, and most intriguing, was the minstrel: the white performer in blackface, of whom "Jim Crow" Rice was the first and most famous. Rourke acknowledges that "blackface minstrelsy has long been considered a travesty in which the Negro was only a comic medium." But she honors it nonetheless, for providing a picture, however distorted, of genuine African-American folk culture: "[T]he songs and to a large extent the dances [in minstrel performances] show Negro origins," Rourke insists, "though they were often claimed by white composers." "The Negro," in this strangely mediated form, communicated African music and dance to America; a century before the Jazz Age, Stephen Foster took the tune for "Camptown Races" from a black folk melody. The minstrel's "humor" combined energetic nonsense-verse—what Rourke calls "unreasonable headlong triumph launching into the realm of the preposterous"—with the "tragic undertone" found in work songs and spirituals.
Rourke's achievement in bringing "the trio" to life is remarkable, and the quotations and anecdotes she gathers from her 19th-century sources remain startlingly fresh. But reading American Humor in 2004, one can't help but wonder: Do these three figures still "induce an irresistible response," as they did for Rourke in 1931? Do the Yankee, the backwoodsman, and the minstrel still offer "emblems for a pioneer people" when the people aren't such pioneers anymore?
The answers, I think, are "no," "yes," and "sort of," in that order. Of the trio, the Yankee is certainly the least visible in today's popular culture. Partly this is because New England has lost its distinctive rural character, which could still be recognized as late as Robert Frost's North of Boston in 1914. But the vanishing of the Yankee is also due to our diminished taste for his virtues: self-deprecation and a poker face. Far more to our taste is the outrageous boastfulness of the backwoodsman, who finds descendants in the action hero and the rap star. In the superhuman feats of the first and the braggadocio of the second, we see the strutting of the figure Rourke called "the gamecock of the wilderness." And, of course, the baleful tradition of the minstrel can be seen in the relentless appropriation of black popular culture by white performers, from Elvis to the present. But the qualities Rourke admired in minstrel performances—the triumphant energy, the tragic undertone—are still very much a part of the American aesthetic. The difference is that now we can experience them in genuine African-American culture—from the jazz of Louis Armstrong to the prose of Ralph Ellison—as well as in hybrids and imitations.
Most interesting of all, however, is to speculate about what a contemporary version of Rourke's book might include. If a Rourke of 2031 were to use popular culture to identify our most common archetypes, what would she find? First of all, I think, would be the Star, a type unknown in 1830 but absolutely central today. The Star is our secular, consumerist version of the Greek god: The pinnacle of aspiration and the focus of fantasy, he or she gets to enjoy what the rest of us only dream about. The Star—whether he is an actor or singer or sports figure—is not simply admired for what he has done; he is worshipped for who he is, gratuitously. The intensity of our worship and need also gives rise to the subcategory of the Fallen Star, from Marilyn Monroe to Kurt Cobain. The Fallen Star allows us to mix pity with our envy, reassuring us that, while we may dream of becoming one, the Star is best seen from a distance.
If the Star is the American triumphant, the Born-Again Sinner is the American repentant. The Sinner can be born again in the literal, Christian sense—this has been a common American experience ever since the 1820s, though Rourke only touches on religion in American Humor. But the posture of repentance, with the corresponding expectation of forgiveness, has transcended its evangelical origin, and today it shows up just about every time an American does something wrong. Bill Clinton's lip-quivering apology for the Monica Lewinsky affair is the most famous recent example. On the other hand, Martha Stewart was widely blamed, after her conviction, for not giving a better performance as the Sinner—for failing to break down and ask forgiveness, as the archetype demands. Whether such contrition is genuine hardly matters; the archetype is so powerful that simply to act like a Born-Again Sinner is almost a guarantee of absolution.
Finally, there is the latest incarnation of an ancient American trope: the Gangster, whose ancestors are the backwoodsman, the cowboy, and the pirate. What defines him is not just his criminality or his violence, but the way he puts these things at the service of his own defiant moral code. The Gangster exalts personal loyalty and masculine power, in opposition to what he sees as an inhumane and hypocritical mainstream culture. Americans like to see the Gangster punished, in the end. But we want him to be killed, not imprisoned—his ending should be as outsized as his life. The Star, the Born-Again Sinner, and the Gangster account for a great deal of today's American culture. But they are notably less comic than the archetypes Rourke found in our national psyche; after 200 years, perhaps America's youthful high spirits have turned into something darker and more resigned.
http://www.slate.com/id/2098065

The United States Business Cycle, 1890-1940
http://econ161.berkeley.edu/TCEH/Slouch_Crash14.html

Stealing FDR's dime
Dismissing FDR as simply a "liberal icon" who must be replaced by Reagan on the dime diminishes both presidents -- as Nancy Reagan clearly knows.
By Robert Scheer
Dec. 10, 2003 | You've got to love Nancy Reagan for the steadfast way she guards her husband's legacy against opportunistic political poachers -- the most recent example being her quick rejection of the boneheaded partisan move by nearly 90 congressional Republicans who signed on to a bill to have Reagan replace Franklin Delano Roosevelt on the dime. "I do not support this proposal, and I'm certain Ronnie would not," was her no-nonsense reply.
Of course her husband would agree. His father had a job in Roosevelt's New Deal, the program that saved his family and millions of others from starvation during the Great Depression. That's why Ronald Reagan voted for Roosevelt and became an active Democrat. Even after his conservative transformation, Reagan often insisted that he never left the party of Roosevelt, but rather that the Democratic Party changed over the years and left him.
Nancy Reagan, the daughter of a successful physician, did not suffer through the Depression years, but this is a classy lady not given to fads. "When our country chooses to honor a great president such as Franklin Roosevelt by placing his likeness on our currency, it would be wrong to remove him and replace him with another," she said. "It is my hope that the proposed legislation will be withdrawn."
What made this right-wing political ploy particularly objectionable was that FDR's commemoration on the dime, a year after his death, honored the March of Dimes, the fund-raising campaign of the National Foundation for Infantile Paralysis, which Roosevelt founded in 1938. In the foundation's first year, more than 2.6 million dimes were mailed to the White House in what was to become one of the great private charity efforts. It led to the eventual eradication of polio, the disease that had paralyzed Roosevelt himself.
Roosevelt picked the dime as the fund-raising device because he felt that everyone could afford to make at least that contribution. Like the AIDS epidemic, polio was a pervasive plague throughout the world. In a message now echoed in AIDS fundraising, Roosevelt viewed the fight against polio as a means of cultivating a greater awareness of our common humanity.
A Kansas City Star article reviewing this history quoted Roosevelt about the annual dances held on his birthday to raise funds and about the dime collections: "In sending a dime ... and in dancing that others may walk, we the people are striking a powerful blow in defense of American freedom and human decency. For the answer to class hatred, race hatred [and] ... religious hatred is the free expression of our love of our fellow man."
Not only did Roosevelt lead us against Hitler's fascism, but he also invigorated the populace, during the Depression and war, with the notion that responsibility for the commonweal be met by both the private and public sectors.
It is sad that Rep. Mark Souder, R-Ind., who initiated the campaign to get FDR off the dime, should dismiss him as simply a "liberal icon" who must be replaced with Reagan, "the conservative icon." That diminishes both presidents, who had leadership styles more complex than Souder's simplistic labels can hold. Fortunately, there are still some Republicans who can think outside of that tiny partisan box. An example is Secretary of State Colin Powell, who in his autobiography makes clear that he endorses much of what has come to be known as the Reagan revolution. But Powell cautions:
"Because I express these beliefs, some people have rushed to hang a Republican label around my neck. I am not, however, knee-jerk anti-government. I was born a New Deal, Depression-era kid. Franklin Roosevelt was a hero in my boyhood home. Government helped my parents by providing cheap public subway systems so that they could get to work, and public schools for their children, and protection under the law to make sure that labor was not exploited .... I received a free college education because New York taxed its citizens to make this investment in the sons and daughters of immigrants and the working class." I was in Powell's class at the City College of New York and can attest that Roosevelt was a hero in all our classmates' homes, just as he had been in Reagan's. That shared memory of Roosevelt's immense contribution to this nation ought to be reason enough for not trying to steal FDR's dime.
http://www.salon.com/opinion/scheer/2003/12/10/dime/

Seed Money
Henry Wallace's company and how it grew.
By Daniel Gross, Posted Thursday, Jan. 8, 2004, at 8:42 AM PT
Monsanto and Pioneer Hi-Bred, the two largest seed companies in the world, are busily denying reports that they conspired in the 1990s to fix prices on genetically engineered corn and soybean seeds. The allegations sound straight out of the darkest fantasies of populist left-wingers like William Jennings Bryan—or Ralph Nader. Two giant companies—Pioneer Hi-Bred is a unit of Dow component DuPont—stand accused of plotting to squeeze profits out of unsuspecting farmers in the heartland by fixing the price of the genetically mutated seeds they need in order to survive.
But history throws some mean curveballs. If the charges against Pioneer Hi-Bred turn out to be true and if collusion boosted the company's profits, the largest single beneficiaries may have been the descendants of one of the most lionized left-wing politicians of the 20th century. What's more, by engaging in serious intervention in the agricultural markets, the company would be holding true to the socialist legacy of its founder.
If you thought the most successful commercial enterprise founded by a former New Republic editor was Slate or Andrewsullivan.com, think again. Henry Wallace—farmers' advocate, agriculture secretary and vice president to Franklin Delano Roosevelt, New Republic editor, and Progressive candidate for president in 1948—founded the Hi-Bred Corn Co. in 1926. In fact, he was perhaps the greatest socialist capitalist of the 20th century.
Wallace had a remarkable career. (Here's a good online biography of Wallace, and here's a good dead-tree biography.) Born in Iowa, he edited the magazine founded by his family, Wallaces' Farmer, and started the company that became among the first to genetically engineer hybrid corn strains to produce greater-yielding crops. As FDR's agriculture secretary, he developed the Agricultural Adjustment Administration, which installed a system of price supports. In 1940, Wallace was elevated to vice president, but in 1944 was bumped in favor of Harry S. Truman and became secretary of commerce.
In 1946, Wallace broke with Truman and the Democratic Party when he attacked Truman's hard-line anti- Communist posture in a famous Madison Square Garden speech. He edited the New Republic briefly, and in 1948 ran for president on the Progressive ticket. Against the Cold War, in favor of nuclear disarmament, and willing to accept the support of Communists, Wallace was a sideshow in the turbulent 1948 campaign. He won about a million votes, but his stances earned him the enmity of former allies like Truman—"A vote for Wallace is a vote for all the things for which Stalin and Molotov stand"—and of thinking leftists. The sainted socialist intellectual Irving Howe referred to Wallace as a "completely contrived creature of Stalin."
Defeated and discredited, Wallace left politics in the 1950s and focused on his business. His company, which changed its name to Pioneer Hi-Bred, continued to grow and thrive after Wallace's death in 1965. It provided the basis not just for a family fortune but for a legacy of philanthropy and advocacy. In 1959, Wallace gave shares of Pioneer to start the Wallace Genetic Foundation, which supports agricultural research. Today it counts assets of about $80 million and is run by his daughter, Jean Wallace Douglas. His son, Robert B. Wallace, who died in 2002, founded the Wallace Global Fund, which supports sustainable development and now has assets of about $110 million. Grandson H. Scott Wallace, who sat on Pioneer's board throughout the 1990s, was a longtime executive at the National Legal Aid and Defender Association.
In 1997, DuPont acquired a 20 percent stake in Pioneer Hi-Bred. And in 1999, it moved to acquire the remaining 80 percent for $40 a share. At the time, Jean Wallace Douglas controlled an 8.1 percent stake valued at $770 million, while Robert B. Wallace controlled a 6 percent stake worth $573 million.
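Those two stake figures are internally consistent: each one, divided by its percentage, implies roughly the same total valuation for the company, which is what you would expect if both holdings were priced off the same $40-a-share offer. A quick back-of-the-envelope check, using only the numbers quoted above:

```python
# Implied total valuation of Pioneer Hi-Bred from each family stake.
douglas_value = 770e6    # Jean Wallace Douglas's stake, in dollars
douglas_pct = 0.081      # her 8.1 percent holding
wallace_value = 573e6    # Robert B. Wallace's stake, in dollars
wallace_pct = 0.06       # his 6 percent holding

implied_from_douglas = douglas_value / douglas_pct  # roughly $9.5 billion
implied_from_wallace = wallace_value / wallace_pct  # roughly $9.55 billion

print(f"Implied from Douglas stake: ${implied_from_douglas / 1e9:.2f} billion")
print(f"Implied from Wallace stake: ${implied_from_wallace / 1e9:.2f} billion")
```

Both holdings point to a company worth about nine and a half billion dollars, so the reported figures hang together.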
Henry Wallace has long been a polarizing, even contradictory figure. And it seems that history has neatly compartmentalized his business and political activities. Today, Wallace's business career is conveniently left out of the version of his biography posted on left-leaning sites, like this one, just as his now- embarrassing politics are left out of the brief biography that DuPont has on its Web site. A man who went down in political history marked as a Communist dupe was a hard-nosed and brilliant businessman. And if it turns out that a little meddling in the market helped his successors continue to keep the company thriving, perhaps you could say Wallace's socialist lineage lives on. http://www.slate.com/id/2093577 The beauty of alcohol, A Dedicated Drinker Sings Sweet Praises Of The Bottle. BY DAVID BOWMAN
Name your poison. Vodka is ethereal. Wine is biblical. Mashed malt is liquor at its earthy best. There are drinking clubs in Manhattan where guys sip tumblers of Scotch so rare it costs a hundred bucks a swallow. Can you imagine paying that much to wet your whistle? I can. Scotch is beautiful. I love Scotch so much I don't drink it. Otherwise I'd be a fall-on-your-face drunk. My wife would leave me. The dog, too. I'd never write again, just slump at a bar purring with pleasure.
So I drink hops—dependable, working-class beer. Yuppie microbreweries may rise and fall, but beer will always possess that Old Milwaukee stigma—despite the fact that it was first brewed in ancient Egypt. Women would mix bread dough with yeast in a large tub and give it a week to ferment. Then they'd filter the mash and season it with spices and dates. The resulting beer was so thick it had to be strained. Their men would drink it for breakfast, lunch and dinner. Those pharaoh's boys walked around permanently skunked yet still managed to write hieroglyphs and build pyramids.
The feminine origins of Egyptian Bud interest me because American femininity has a dark history with alcohol. In the 19th century, women used temperance as a wedge to get the right to vote. For 50 years they sang songs like "No Hope for the Drunkard" and "Breaking Mother's Heart" and "Crape on the Door of the Licensed Saloon," until the Volstead Act was passed in 1919. America went dry, but women could vote the next year. One wrong made a right.
So we can never take the legality of alcohol for granted. Cigarettes will soon be banned. Kegs could be next. Prohibition Part II might be instigated by insurance companies, or Mothers Against Drunk Driving. Or maybe by readers of the sobriety genre, those memoirs that begin by describing the romance of the bottle. The look of it. The color of light shining through dark liquids. The perfect shape of the martini glass. The moisture on the coaster. The simple swizzle stick. But each book contains a righteous but. The memoirist describes stumbling down the stairs bone drunk. Waking up in a puddle of pee. I picture readers giving prim smiles and bursting into some dour temperance hymn.
It is against this ghost of temperance that I offer up my account of the pursuit of liquid bread. I know that drinking six, seven, eight beers a night will eventually kill me, but it will have been worth it. If you told me that enjoying F. Scott Fitzgerald or Merle Haggard would kill me, I wouldn't toss my books or albums. But art doesn't kill. They say love can. That's why beauty and alcohol are considered sublime.
Most social drinkers aren't running for their lives, however. I was a bartender for a year in the early 1980s at a dingy little art bar in Manhattan, where I still live. The bar was the first to drape its tables with drawing paper and give customers crayons. As I watched those men and women coloring and drinking, it seemed to me that most social drinkers enjoyed the buzz more than they did the taste of the waters they sipped. All those rum and Cokes, Seven and Sevens—soda-pop drinks. But the truth is that liquor is an acquired taste. Remember your first drink, when you were 17? You had it right: yuck! It took me a decade to educate my tongue to appreciate that first taste of a chilly English ale in summer, a room-temperature stout when it's cold outside.
But that first beer of the day—at sundown in summer or in the winter dark—is just a start. My tongue prefers variety. I drink at least four different types of beer per session. I can go on and on about beer. I can tell you how temperature affects the flavor. Even how the shape of the glass does.
Is my life that hard? Sure is. Maybe it's harder than yours. Maybe easier. It doesn't matter. Our pain is no credential here. We're all pilgrims. None of us were intended to have an easy time of it.
Drinking isn't for everyone, of course—but why on earth would anyone want to face a day completely straight or sober? I suppose because in the end alcohol kills the body. I recently saw a doctor because I was applying for life insurance, and I made the mistake of telling that sawbones how much I drink. He looked at me with the disgust of an ax-wielding temperance matron gazing into the interior of a saloon….
SALON | March 1, 1999 http://archive.salon.com/feature/1999/03/cov_01featureb2.html

Built on the buzz
By Maria Russo
May 03, 2001 | "Nature is parsimonious with pleasure," writes historian David Courtwright in "Forces of Habit: Drugs and the Making of the Modern World." But human ingenuity has stepped in to lessen the miseries and add to the delights of earthly existence. Courtwright calls it "the psychoactive revolution": Compared with 500 years ago, people across the planet now have easy access to a variety of consciousness-altering substances. The menu of options differs from culture to culture but the drive to take a temporary vacation from our normal waking state has made some drugs into perhaps the only truly global commodities.
Alcohol joins caffeine and tobacco to round out what Courtwright calls the "big three" of currently legal psychoactive drugs. As he sees it, modern civilization is practically unthinkable without this trio. But why have they fared so well while equally intoxicating substances—like, say, marijuana—are banned and stigmatized, and others—like kava, khat and betel—are popular only in distinct geographic areas? And why is tobacco currently falling in popularity, while alcohol and caffeine are holding their grip on us?
Once the big three caught on among European elites, they became crucial components in the ocean-crossing commerce and empire building that shaped modern economies. Tobacco, coffee, tea and spirits were lucrative; users quickly grew dependent on them, guaranteeing a steady demand for commodities that could be heavily taxed. These drugs have also always been an ideal way to control and pacify laborers, providing them with temporary relief from the fatigue and boredom of agricultural and, later, industrial life. Some of these workers—such as those Eastern European peasants paid in vodka for the potatoes and grain they delivered to distilleries or West Indian distillery workers paid in rum—found themselves caught in devastating economic traps.
As habits, the big three also work with and reinforce one another nicely. Had too much to drink last night? You'll be especially eager for that morning cup of coffee to clear your head. Feeling too wired now? Time for a cocktail!
Over time, as you ingest more of these substances, your body's tolerance for each of them increases, so you need more and more to get the same result. These endless cycles not only tap into a vulnerability in the human psyche—we're hard-wired to seek to mitigate pain and increase pleasure—but are also the essential building blocks of capitalism:
The peculiar genius of capitalism is its ability to betray our senses with one class of products or services and then sell us another to cope with the damage so that we can go back to consuming more of what caused the problem in the first place.
The economic impact of legal drugs extends from barley farmers to bartenders to the social workers who run drug rehab clinics to the lawyers who defend drunken drivers to the scholars who study the history of drugs.
Before the late 19th century, governments were aware of the destructive properties of drugs, but the revenues to be had from taxing the drug trade were more compelling than the moral imperative to outlaw it. Gradually, as the booming print media made the downside of certain drugs more visible, public demand mounted to ban some drugs outright.
The losers during this time were the "little three"—opium, cannabis and coca. With narcotics like opium and cocaine, governments justified bans by pointing to extreme and visible health problems and social costs. Marijuana, however, appears to have been the victim of historical bad luck: Its health effects are no more dire than those of alcohol or tobacco, but over the centuries the plant lacked "international corporate backing or fiscal influence." Unlike tobacco and alcohol (the latter banned temporarily in the US during Prohibition but quickly restored to legal status), marijuana never became part of "the personal habits of influential leaders and celebrities." (That makes sense, of course, given the go-go nature of capitalism and the traits required for success—anyone who did make marijuana part of his personal habits would most likely not have become an influential leader or celebrity in the first place.) Instead, the mellow plant lodged itself first in peasant and then in youth subcultures, where it gained a reputation as a "gateway drug" to harder, harsher illegal substances like heroin and cocaine. (It is a gateway drug, but then so are nicotine and alcohol.)
Alcohol has the most fascinating and contradictory history. Alcohol is as lethal as they come; its score on pharmacologist Maurice Seevers' famous 1957 "addiction liability rating"—the degree to which a drug produces tolerance, emotional and physical dependence, physical deterioration and antisocial behavior in those under its influence or those withdrawing from it—is 24, much higher than that of heroin (16) and cocaine (14), and ridiculously higher than that of marijuana (8). Yet alcohol is not just tolerated in many cultures, it's frequently exalted.
Courtwright points to the alcohol industry's "size and fiscal importance" to account for the drug's unassailable legality and social cachet. He does note that scientists have found moderate drinking to be healthful, and he observes that "humanity ...has long experience of alcohol, and has evolved all manner of rules and taboos to reduce the harmfulness of drinking." But human beings turned the making and drinking of alcohol into a venerable tradition, one that may depend on the high that alcohol delivers but one that also can't be reduced to a mere buzz.
In contrast to the array of alcoholic beverages that Western civilization has developed and matched to particular moments of life, the distinctions between brands of cigarettes are mainly a matter of marketing. Perhaps that's partly why, of the big three, tobacco is faring the worst these days; its cultural roots are just not deep enough. With many municipalities banning smoking, cigarettes have been largely exiled from the civilized table, where caffeine and alcohol have been prettied up by the gourmet delivery systems of wine and coffee. Connoisseurship disguises well the irresistible craving for a buzz. As Courtwright puts it, "Tobacco...is becoming a loser's drug." While their caffeine-addicted friends can now find a Starbucks on every corner beckoning them in for a quick hit in a cozy environment, smokers are reduced to hopping around on freezing sidewalks outside office buildings.
No longer do cigarettes serve as "the small change of sociability," in Courtwright's phrase, or help a woman appear more independent and sophisticated, as they once did for countless Hollywood actresses.
Stripped of its social trappings as it increasingly is, smoking is beginning to appear as nothing more interesting than a smelly, breath-fouling, teeth-staining, illness-causing personal addiction. That doesn't mean that the cigarette has gasped its last—there still is, for many people, a surge of pleasure that comes from lighting up. That's not going away soon, but it certainly will become more difficult to get as regulation spreads from restaurants to outdoor spaces such as parks. What Courtwright calls the growing "lower-class concentration" of tobacco also makes it more politically vulnerable.
Caffeine, in Courtwright's book, emerges triumphant among its mind-altering brethren as the least harmful, most life-enhancing drug yet discovered. It's the earth's most widely used drug, with a per capita consumption of 70 milligrams a day. Caffeine alters brain chemistry in notable ways, producing euphoric effects such as a rush of energy and an elevation of mood. It's addictive in the sense that tolerance increases the more you use it and withdrawal can lead to symptoms such as headaches and lethargy. Nonetheless, the drug's negative side effects, however troublesome, are not dire; too much caffeine causes nothing worse than insomnia or tremors.
Precious few lives are ruined by caffeine. But doctors like to warn that the jury is still out on other potential health problems that may be caused by regular caffeine consumption.
Still, all in all, the social costs of our love affair with caffeine are remarkably low, and its role as a spur to productivity and an aid to coping with the more difficult, sad and painful aspects of life gives the plucky little molecule a strongly positive aura. Even more than the other players in Courtwright's "psychoactive revolution," it's hard to imagine the modern world without it….

http://dir.salon.com/books/feature/2001/05/03/drugs/index.html?sid=1027938

The "Tijuana Bibles," America's original X-rated underground comics, evoke a time when sex was dirty, innocent and handmade. By Susie Bright
------
Your Momma don't dance, and your Daddy don't rock 'n' roll, but your grandparents sure knew how to make pornography!
I just finished turning every big prurient page of "Tijuana Bibles: Art and Wit in America's Forbidden Funnies, 1930s-1950s." And even though the book is filled with top-quality scholarly research and analysis from the finest minds in funnies, with beautiful reproductions on smooth creamy paper, the end result of perusing this bodacious collection of old-timer erotica is that I feel mischievous and irrepressibly rude.
What are Tijuana Bibles? They have nothing to do with Mexico, and they're certainly not godly—although the people who collect the 700 or so TBs that remain in existence might be called fanatics. They're little underground comic books that were illegally published during the '30s, '40s and '50s, filled with cartoons of famous characters of the day—from Cary Grant to Rita Hayworth, from Gandhi to Betty Boop— [fornicating] their bloomers off. They were the original "dirty comic books," sold in barber shops and schoolyards, passed from one sweaty hand to the next.
Nowadays, when I ask people born in the 1950s or later to tell me the story of their "first dirty picture," they inevitably tell me about Playboy magazine or one of its imitators. What they remember was a photograph. It's a fantasy photo suggesting what MIGHT happen with a beautiful girl; it's her luscious body as a promise of something to come.
But if you grew up before the men's magazine era, your first glimpse of something really filthy is likely to have been one of these Tijuana funnies—aka "[fornication] books." In these books, of course, everything is hand-drawn. In quality, [fornication] books range from abysmal—second-grade-level dirty jokes, hideous drawings—to quite high—elegant caricatures, witty stories. The women's bodies are as comely as any Miss May's, but instead of simpering unapproachably on a bearskin rug, they are actively [fornicating]. The male partners, by comparison, are inevitably drawn to look like buffoons.
As pornography, the Tijuana Bibles fall somewhere between X-rated drawings on toilet-stall walls and early stag films. There's a certain wacky innocence to them, despite their furtive prurience. As Art Spiegelman points out in his stimulating introduction, "Though there are bound to be those who loudly declaim that the Tijuana Bibles demean women, I think it important to note that they demean everyone, regardless of gender, ethnic origin or even species. It's what cartoons do best, in fact."
A glance at the "Index of Subjects Parodied" at the back of this wonderfully entertaining book makes clear how wide a cultural swath the Tijuana Bibles cut: Aunt Jemima, Barney Google, bellhops, Ingrid Bergman, Bringing Up Father, Al Capone, Chiang Kai-shek, farmer's daughters, Lou Gehrig. In fact, just about any stock figure from popular culture is liable to drop his or her pants in a Tijuana Bible—the amorous adventures of the Fuller Brush Man are a favorite, while Benito Mussolini and Joe Stalin are also depicted enjoying the fleshly perquisites of dictatorship.
If comics like this existed today we'd probably all be either dying with laughter or adding our names to a libel suit. Just look at what happened to Larry Flynt (an heir to the Tijuana Bible throne if there ever was one) when he published his notorious cartoon of Jerry Falwell having sex with his mother in an outhouse! But now that the targets' names have faded from public memory, the Tijuana Bibles just seem like a good-natured roll in nostalgia hay.
Funny how having a little historical perspective makes the raunchiest porno look like darling little illustrations, socially perceptive parodies and evocative clues to our past. I can hardly wait for our great-great-grandchildren to look at a collection of Internet pornography and say to each other, "Look honey, weren't they just the darnedest little cutups!"

Aug. 19, 1997
http://archive.salon.com/aug97/tijuana970819.html

The Hangover Theory: Are recessions the inevitable payback for good times? By Paul Krugman. Posted Friday, Dec. 4, 1998, at 12:30 AM PT
…Call it the over-investment theory of recessions, or "liquidationism," or just call it the "hangover theory." It is the idea that slumps are the price we pay for booms, that the suffering the economy experiences during a recession is a necessary punishment for the excesses of the previous expansion. The hangover theory is perversely seductive because it turns economics into a morality play, a tale of hubris and downfall. And it offers adherents the special pleasure of dispensing painful advice with a clear conscience, secure in the belief that they are not heartless but merely practicing tough love.
Powerful as these seductions may be, they must be resisted—for the hangover theory is disastrously wrongheaded. Recessions are not necessary consequences of booms. They should be fought, not with austerity but with policies that encourage people to spend more, not less. The hangover theory can do real harm. Liquidationist views played an important role in the spread of the Great Depression—with Austrian theorists strenuously arguing, in the very depths of that depression, against any attempt to restore "sham" prosperity by expanding credit and the money supply.
The hangover theory goes something like this: an investment boom gets out of hand. Maybe reckless bank lending drives it, maybe it is simply a matter of irrational exuberance by entrepreneurs. All that investment leads to excess capacity—of factories that cannot find markets, of office buildings that cannot find tenants. Eventually, investors go bust and investment spending collapses. The result is a slump whose depth is in proportion to the previous excesses. Moreover, that slump is part of the necessary healing process: The excess capacity gets worked off, prices and wages fall from their excessive boom levels, and only then is the economy ready to recover.
But let's ask a seemingly silly question: Why should the ups and downs of investment demand lead to ups and downs in the economy as a whole? Don't say that it's obvious—although investment cycles clearly are associated with economy-wide recessions and recoveries in practice, a theory is supposed to explain observed correlations, not just assume them. And in fact the key to the Keynesian revolution in economic thought—a revolution that made hangover theory obsolete—was John Maynard Keynes' realization that the crucial question was not why investment demand sometimes declines, but why such declines cause the whole economy to slump.
Here's the problem: As a matter of simple arithmetic, total spending in the economy is necessarily equal to total income (every sale is also a purchase, and vice versa). So if people decide to spend less on investment goods, doesn't that mean that they must be deciding to spend more on consumption goods, implying that an investment slump should always be accompanied by a corresponding consumption boom? And if so, why should there be a rise in unemployment?
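Krugman's "simple arithmetic" is the standard national-income accounting identity; a minimal sketch, using conventional textbook symbols (Y for income, C for consumption, I for investment, none of which appear in the article itself):

```latex
% National-income identity for a closed economy with no government sector
% (standard notation, assumed for illustration):
Y = C + I
% If Y were fixed, a fall in investment would have to be offset
% one-for-one by extra consumption:
\Delta C = -\Delta I \qquad \text{(holding } Y \text{ constant)}
% Keynes' answer, as Krugman summarizes it, is that Y is not fixed:
% when people try to hold more cash instead of spending, total income
% Y itself falls, and the slump spreads beyond the investment sector.
```

The identity shows why the hangover theory needs an extra assumption it never supplies: something must prevent the resources freed from investment from flowing into consumption.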
Most modern hangover theorists don't answer the riddle. Some vaguely suggest that unemployment is a frictional problem created as the economy transfers workers from a bloated investment-goods sector back to the production of consumer goods. But in that case, why doesn't the investment boom—which presumably requires a transfer of workers in the opposite direction—also generate mass unemployment? And anyway, this story bears little resemblance to what actually happens in a recession, when every industry—not just the investment sector—normally contracts.
Why recessions happen is simple. A recession occurs when a large part of the private sector tries to increase its cash reserves simultaneously. Yet, for all its simplicity, the insight that a slump is about an excess demand for money makes nonsense of the whole hangover theory. If the problem is that people want to hold more money than there is in circulation, why not simply increase the supply of money? You may tell me that it's not that simple, that during the previous boom businessmen made bad investments and banks made bad loans. Well, fine. Junk the bad investments and write off the bad loans. Why should this require that perfectly good productive capacity be left idle?
The hangover theory, then, turns out to be intellectually incoherent; nobody has explained why bad investments in the past require the unemployment of good workers in the present. Yet the theory has powerful emotional appeal. Usually that appeal is strongest for conservatives, who can't stand the thought that positive action by governments (let alone printing money) can ever be a good idea. But moderates and liberals are not immune to the theory's seductive charms—especially when it gives them a chance to lecture others on their failings.
The Great Depression happened largely because policy-makers imagined that austerity was the way to fight a recession. http://slate.msn.com/id/9593/