Friday

An Observation From the Future


The complete article, from Paul Krugman's book "The Accidental Theorist," appears below.




White Collars Turn Blue
by Paul Krugman

A note to readers: This was written for a special centennial issue of the New York Times Magazine. The instructions were to write it as if it were in an issue 100 years in the future, looking back at the past century.

When looking backward, one must always be prepared to make allowances: it is unfair to blame late 20th-century observers for their failure to foresee everything about the century to come. Long-term social forecasting is an inexact science even now, and in 1996 the founders of modern nonlinear socioeconomics were still obscure graduate students. Still, even then many people understood that the major forces driving economic change would be the continuing advance of digital technology, on one side, and the spread of economic development to previously backward nations, on the other; in that sense there were no big surprises. The puzzle is why the pundits of the time completely misjudged the consequences of those changes.

Perhaps the best way to describe the flawed vision of fin-de-siècle futurists is to say that, with few exceptions, they expected the coming of an "immaculate" economy -- an economy in which people would be largely emancipated from any grubby involvement with the physical world. The future, everyone insisted, would bring an "information economy", which would mainly produce intangibles; the good jobs would go to "symbolic analysts", who would push icons around on computer screens; and knowledge rather than traditionally important resources like oil or land would become the main source of wealth and power.

But even in 1996 it should have been obvious that this was silly. First, for all the talk of an information economy, ultimately an economy must serve consumers -- and consumers don't want information, they want tangible goods. In particular, the billions of Third World families who finally began to have some purchasing power as the 20th century ended did not want to watch pretty graphics on the Internet -- they wanted to live in nice houses, drive cars, and eat meat. Second, the Information Revolution of the late 20th century was -- as everyone should have realized -- a spectacular but only partial success. Simple information processing became faster and cheaper than anyone had imagined possible; but the once confident Artificial Intelligence movement went from defeat to defeat. As Marvin Minsky, one of the movement's founders, despairingly remarked, "What people vaguely call common sense is actually more intricate than most of the technical expertise we admire". And it takes common sense to deal with the physical world -- which is why, even at the end of the 21st century, there are still no robot plumbers.

Most important of all, the prophets of an "information economy" seem to have forgotten basic economics. When something becomes abundant, it also becomes cheap. A world awash in information will be a world in which information per se has very little market value. And in general when the economy becomes extremely good at doing something, that activity becomes less rather than more important. Late 20th-century America was supremely efficient at growing food; that was why it had hardly any farmers. Late 21st-century America is supremely efficient at processing routine information; that is why the traditional white-collar worker has virtually disappeared from the scene.

With these observations as background, then, let us turn to the five great economic trends that observers in 1996 should have expected but didn't.

.............

Soaring resource prices. The first half of the 1990s was an era of extraordinarily low raw material prices. Yet it is hard to see why anyone thought this situation would continue. The Earth is, as a few lonely voices continued to insist, a finite planet; when 2 billion Asians began to aspire to Western levels of consumption, it was inevitable that they would set off a scramble for limited supplies of minerals, fossil fuels, and even food.

In fact, there were some warning signs as early as 1996. There was a temporary surge in gasoline prices during the spring of that year, due to an unusually cold winter and miscalculations about Middle East oil supplies. Although prices soon subsided, the episode should have reminded people that by the mid-90s the world's industrial nations were once again as vulnerable to disruptions of oil supply as they had been in the early 1970s; but the warning was ignored.

Quite soon, however, it became clear that natural resources, far from becoming irrelevant, had become more crucial than ever before. In the 19th century great fortunes were made in industry; in the late 20th they were made in technology; but today's super-rich are, more often than not, those who own prime land or mineral rights.

The environment as property. In the 20th century people used some quaint expressions -- "free as air", "spending money like water" -- as if such things as air and water were available in unlimited supply. But in a world where billions of people have enough money to buy cars, take vacations, and buy food in plastic packages, the limited carrying capacity of the environment has become perhaps the single most important constraint on the average standard of living.

By 1996 it was already clear that one way to cope with environmental limits was to use the market mechanism -- in effect to convert those limits into new forms of property rights. A first step in this direction was taken in the early 1990s, when the U.S. government began allowing electric utilities to buy and sell rights to emit certain kinds of pollution; the principle was extended in 1995 when the government began auctioning off rights to use the electromagnetic spectrum. Today, of course, practically every activity with an adverse impact on the environment carries a hefty price tag. It is hard to believe that as late as 1995 an ordinary family could fill up a Winnebago with dollar-a-gallon gasoline, then pay only $5 to drive it into Yosemite. Today such a trip would cost about 15 times as much even after adjusting for inflation.

The economic consequences of the conversion of environmental limits into property were unexpected. Once governments got serious about making people pay for the pollution and congestion they caused, the cost of environmental licenses became a major part of the cost of doing business. Today license fees account for more than 30 percent of GDP. And such fees have become the main source of government revenue; after repeated reductions, the Federal income tax was finally abolished in 2043.

The rebirth of the big city. During the second half of the 20th century, the traditional densely populated, high-rise city seemed to be in unstoppable decline. Modern telecommunications had eliminated much of the need for close physical proximity between routine office workers, leading more and more companies to shift their back-office operations from lower Manhattan and other central business districts to suburban office parks. It began to seem as if cities as we knew them would vanish, replaced with an endless low-rise sprawl punctuated by an occasional cluster of 10-story office towers.

But this proved to be a transitory phase. For one thing, high gasoline prices and the cost of environmental permits made a one-person, one-car commuting pattern impractical. Today the roads belong mainly to hordes of share-a-ride minivans, efficiently routed by a web of intercommunicating computers. However, although this semi-mass-transit system works better than 20th-century commuters could have imagined -- and employs more than 4 million drivers -- suburban door-to-door transportation still takes considerably longer than it did when ordinary commuters and shoppers could afford to drive their own cars. Moreover, the jobs that had temporarily flourished in the suburbs -- mainly relatively routine office work -- were precisely the jobs that were eliminated in vast numbers beginning in the mid-90s. Some white-collar jobs migrated to low-wage countries; others were taken over by computers. The jobs that could not be shipped abroad or handled by machines were those that required the human touch -- that required face-to-face interaction, or close physical proximity between people working directly with physical materials. In short, they were jobs best done in the middle of dense urban areas, areas served by what is still the most effective mass-transit system yet devised: the elevator.

Here again, there were straws in the wind. At the beginning of the 1990s, there was much speculation about which region would become the center of the burgeoning multimedia industry. Would it be Silicon Valley? Los Angeles? By 1996 the answer was clear; the winner was ... Manhattan, whose urban density favored the kind of close, face-to-face interaction that turned out to be essential. Today, of course, Manhattan boasts almost as many 200-story buildings as St. Petersburg or Bangalore.

The devaluation of higher education. In the 1990s everyone believed that education was the key to economic success, for both individuals and nations. A college degree, maybe even a postgraduate degree, was essential for anyone who wanted a good job as one of those "symbolic analysts".

But computers are very good at analyzing symbols; it's the messiness of the real world they have trouble with. Furthermore, symbols can be quite easily transmitted to Asmara or La Paz and analyzed there for a fraction of the cost of doing it in Boston. So over the course of this century many of the jobs that used to require a college degree have been eliminated, while many of the rest can, it turns out, be done quite well by an intelligent person whether or not she has studied world literature.

This trend should have been obvious even in 1996. After all, even then America's richest man was Bill Gates, a college dropout who didn't seem to need a lot of formal education to build the world's most powerful information technology company.

Or consider the panic over "downsizing" that gripped America in 1996. As economists quickly pointed out, the rate at which Americans were losing jobs in the 90s was not especially high by historical standards. Why, then, did downsizing suddenly become news? Because for the first time white-collar, college-educated workers were being fired in large numbers, even while skilled machinists and other blue-collar workers were in high demand. This should have been a clear signal that the days of ever-rising wage premia for people with higher education were over, but somehow nobody noticed.

Eventually, of course, the eroding payoff to higher education created a crisis in the education industry itself. Why should a student put herself through four years of college and several years of postgraduate work in order to acquire academic credentials with hardly any monetary value? These days jobs that require only six or twelve months of vocational training -- paranursing, carpentry, household maintenance (a profession that has taken over much of the housework that used to be done by unpaid spouses), and so on -- pay nearly as much as one can expect to earn with a master's degree, and more than one can expect to earn with a Ph.D. And so enrollment in colleges and universities has dropped almost two-thirds since its turn-of-the-century peak. Many institutions of higher education could not survive this harsher environment. The famous universities mostly did manage to cope, but only by changing their character and reverting to an older role. Today a place like Harvard is, as it was in the 19th century, more of a social institution than a scholarly one -- a place for the children of the wealthy to refine their social graces and make friends with others of the same class.

The celebrity economy. The last of this century's great trends was noted by acute observers in 1996, yet somehow most people failed to appreciate it. Although business gurus were proclaiming the predominance of creativity and innovation over mere routine production, in fact the growing ease with which information could be transmitted and reproduced was making it ever harder for creators to profit from their creations. Today, if you develop a marvelous piece of software, by tomorrow everyone will have downloaded a free copy from the Net. If you record a magnificent concert, next week bootleg CDs will be selling in Shanghai. If you produce a wonderful film, next month high-quality videos will be available in Mexico City.

How, then, can creativity be made to pay? The answer was already becoming apparent a century ago: creations must make money indirectly, by promoting sales of something else. Just as auto companies used to sponsor Grand Prix racers to spice up the image of their cars, computer manufacturers now sponsor hotshot software designers to build brand recognition for their hardware. And the same is true for individuals. The royalties the Four Sopranos earn from their recordings are surprisingly small; mainly the recordings serve as advertisements for their arena concerts. The fans, of course, go to these concerts not to appreciate the music (they can do that far better at home) but for the experience of seeing their idols in person. Technology forecaster Esther Dyson got it precisely right in 1996: "Free copies of content are going to be what you use to establish your fame. Then you go out and milk it". In short, instead of becoming a Knowledge Economy we have become a Celebrity Economy.

Luckily, the same technology that has made it impossible to capitalize directly on knowledge has also created many more opportunities for celebrity. The 500-channel world is a place of many subcultures, each with its own culture heroes; there are people who will pay for the thrill of live encounters not only with divas but with journalists, poets, mathematicians, and even economists. When Andy Warhol predicted a world in which everyone would be famous for 15 minutes, he was wrong: if there are indeed an astonishing number of people who have experienced celebrity, it is not because fame is fleeting but because there are many ways to be famous in a society that has become incredibly diverse.

Still, the celebrity economy has been hard on some people -- especially those of us with a scholarly bent. A century ago it was actually possible to make a living as a more or less pure scholar: someone like myself would probably have earned a pretty good salary as a college professor, and been able to supplement that income with textbook royalties. Today, however, teaching jobs are hard to find and pay a pittance in any case; and nobody makes money by selling books. If you want to devote yourself to scholarship, there are now only three options (the same options that were available in the 19th century, before the rise of institutionalized academic research). Like Charles Darwin, you can be born rich, and live off your inheritance. Like Alfred Wallace, the less fortunate co-discoverer of evolution, you can make your living doing something else, and pursue research as a hobby. Or, like many 19th-century scientists, you can try to cash in on scholarly reputation by going on the paid lecture circuit.

But celebrity, though more common than ever before, still does not come easily. And that is why writing this article is such an opportunity. I actually don't mind my day job in the veterinary clinic, but I have always wanted to be a full-time economist; an article like this might be just what I need to make my dream come true.

Paul Krugman is a (full-time) Professor of Economics at the Massachusetts Institute of Technology.

The article was included in Paul Krugman's book, "The Accidental Theorist (and Other Dispatches from the Dismal Science)."
Paperback edition: 1998


(Paul is currently a New York Times columnist)



Saturday

Pagan Sexology

Excerpt from interview with Carol Queen

From Modern Pagans

RE/SEARCH: You're a sexologist; can you discuss sex in a Pagan context--

CAROL QUEEN: Whether it's in a Pagan, Wiccan or sex-positive "alternative" context, people are hungry to hear sex spoken about positively. People don't want to be screwed up. People desire sexual comfort and joy; if their sex life is unsatisfactory, they aren't happy.

I'm deeply grounded in Pagan beliefs about who we are as humans and who we can be; I believe in the notion of divinity as a part of each one of us. People really want the message that sex can be a spiritually powerful and connective force. In this day and age, most sex is not about procreation--it's about connecting, and entering an altered state through sex. Even people who disdain anything "spiritual" desire that.

Women were always the ones who were supposed to say "No" to men; they were always supposed to be the upholders of the relationship. And now, a lot of people are looking to women leaders for better attitudes about sex. There has been a lot of discussion about male sexuality as problematic. I know that I was deeply entrenched in a feminism that didn't respect or care much where men were coming from sexually. Then I came to Paganism, with its image of the Goddess and the God uniting in equality--each bringing something equally important to the union. That actually kicked a lot of the sex-negative and male-phobic struts out from under me.

When I was still a lesbian, Paganism was what allowed me to "come out" as bisexual. The big image of the Goddess and the God in union--sacred, to be respected and honored--got me to the place where I could say, "You know what? I want balance in my life--I want erotic balance in my life."

Not everybody is bisexual, but a lot of people might be open to this kind of erotic balance if it weren't for social forces: homophobia and "heterophobia." I always put quotation marks around heterophobia because it's not the kind of virulent, hate-based condition that homophobia is--but it still affects people's individual lives so they're afraid to explore what they might do and who they might be. The ways that people are stopped from being Who They Might Be in this culture are legion. It's not just about sex; it's about assuming our full power.

That's another appealing aspect of Paganism: it allows us to take our power seriously. If we're all expressions of Goddess and God, then we have to take ourselves a little more seriously in the world! Discovering Paganism was like discovering sex-positivity: "Click!--this is how I want the world to be! This explains things, and it doesn't explain them in a way that casts people as victims, with a negative, overarching understanding of who we are as people..." (continued)