Wednesday, July 30, 2008

"Monsters, monsters from the id!"

While rummaging through movie classics recently, I ran across one of the greatest science fiction films ever, Forbidden Planet. Released in 1956, it starred Leslie Nielsen as spaceship commander Adams, Walter Pidgeon as the mysterious Dr. Morbius*, and Anne Francis as his lovely, innocently wise daughter, Alta. Loosely based on Shakespeare's Tempest, it did more than any other movie, in one stroke, to make science fiction a respectable genre.

Today, Forbidden Planet seems somewhat awkward. The original trailer didn't quite know what to do: proclaim it just another B.E.M. ("bug-eyed monster") movie, or pitch it "highbrow"? But its innovations make up a long list copied in obvious ways by almost all later movie and television science fiction. Looking for the origin of the Star Trek transporters and warp drive, or the suspended animation of Lost in Space and 2001? Looking for the origin of the whole Star Trek paradigm -- an Earth ship encountering humans stranded on an alien world, humans needing but not wanting rescue; or the familiar trio of captain, executive officer, and doctor? Looking for the origins of a sophisticated visual science fiction with literary roots? The origin of Star Wars' charming, superhuman robots? The sinister potential of advanced technology? Far-out electronic "space" music for a score? Here it is.

MGM knew what it was doing and spared no expense or care for detail. The production values were astonishing for the time. The robot later known as Robby was introduced in Forbidden Planet and went on to make multiple movie and television appearances. Here, reminiscent of his Shakespearean avatar Caliban, he is an aloof but ever-helpful machine that (who?) can, from a small sample, make apparently endless supplies of anything desired. Also on the planet are the remains of a race of superintelligent beings called the Krell, whose technology Morbius reconstructs and uses to enhance his own mind to superhuman levels. Glimpses of the Krell's technical achievements are thrown out to whet the audience's appetite. The only mystery is why the Krell vanished.

The movie's most brilliant stroke is that the Krell and their likeness are never shown, only hinted at. Also never shown directly is the monster that kills several starship crew members; it is glimpsed only in one scene produced by some Disney personnel "lent" to MGM. A technique borrowed from horror flicks -- never showing the danger directly -- moves Forbidden Planet far beyond the staple sci-fi movies of the Fifties. The Technicolor cinematography and the score, produced entirely with electronics rather than traditional instruments, reinforce these qualities.

The full force of the story doesn't kick you in the head until the last scene. Watching the detonation of the planet from far away, Anne Francis and Leslie Nielsen are left to contemplate the failure of a high civilization, the awesome Krell, whose technical mastery put them under the illusion that they had escaped their own animality.

POSTSCRIPT: The electronically-scored soundtrack is available on a separate CD, which was first released in 1986 for the film's thirtieth anniversary. A special two-disc DVD set was issued for the movie's fiftieth anniversary in 2006.

The score's creators, the husband-and-wife team of Louis and Bebe Barron, were not sure whether they were making sound effects or music -- until John Cage convinced them that it was music.
---
* An interesting merger of "morbid" and "Möbius," as in the non-orientable Möbius strip.


Saturday, July 26, 2008

Is modern art that bad?

Are the Philistines on the march over at the Guardian? Joe Queenan has just published a splenetic outburst there against modern art. He doesn't seem to like much after about 1900. Terry Teachout offers his own examination of Queenan's hostility at the Wall Street Journal. Complaints about modern art -- I don't mean just negative reactions to anything new, but sustained, visceral dislike for the modern -- are as old as modern art itself.

There is a problem with artistic modernism, especially in music. But the problem is more recent than Queenan thinks and dates from the 1950s, not the 1900s. It is true that the "break" that defines modernism happened some time in the decade or so before 1914, and much of modern art's problem with the general public dates from those years. But the literature and visual art of that period have long since been assimilated by both critics and the general public, and the same is true of most of its music. The later art of the 20th century has just been a working out of that moment: the breakdown of inherited realist, classical, and romantic esthetics.

Since 1945, the fates of the arts have diverged. Literature has proven the most conservative, largely abandoning the modernist experiments of, say, Proust, Joyce, and Nabokov. There is widespread admiration for their achievements, but few serious imitators. Literature's close relation to time-bound narrative, reinforced by the ubiquity of movies and television, made that fate difficult to avoid. The visual arts suffered a different fate: widespread appreciation of, and big money for, signature breakthrough works. Still, there were fewer and fewer serious imitators and practitioners in the last century's concluding decades.

Something entirely different happened to music. While undergoing wrenching changes from the end of the nineteenth century -- the rise of commercialized popular music, the influence of non-Western musical cultures, the exhaustion of the classical-romantic paradigm -- Western "art" music was still vigorous down to the Second World War. The distinction between "popular" and "high" music had not yet become a chasm. Contrary to Queenan and other critics,* the twentieth-century repertory is second only to the nineteenth century's in being studied, played, and listened to.**

The true failure of modern art happened after 1945. The sheer destructiveness of the Second World War had multiple, devastating impacts on European centers of art, reinforcing the existing disruptions of war, revolution, and exile. Classical music, especially the core Romantic and Austro-German traditions, has taken its time recovering from the way it was misused by the collectivist movements and totalitarian states of the first half of the twentieth century. By the 1950s, straddling popular and classical musical cultures seemed fatally compromised, either by accusations of "selling out" or by knuckling under to the agit-prop demands of "socialist realism." On the other hand, the unprecedented explosion of techniques and resources for popular music, starting in the late 50s, pulled audiences elsewhere. Then television miniaturized everyone's mind.

And something else went wrong in the decades after 1945: the academicization of once-radical artistic tendencies, especially Expressionism, a movement that started in Germanic countries as a reaction to the popular cheapening of Romanticism. Expressionism was a brief but potent episode of "hyper-romanticism," in the sense of validating the artist's expression of (usually negative) inner feelings, regardless of external form or audience comprehension. It's impossible to systematize such a tendency. The originating works of this movement have still not lost, and probably will never lose, their power to shock. Yet, starting in the 1920s with Schoenberg's twelve-tone technique, attempts have been made to reduce it to formula. Formulas enabled lesser talents to create lifeless imitations of something that can't be mimicked. Appreciation and artistic creation became mired in a familiar Germanic-academic tendency to load esthetic values down with a lot of heavy theorizing.† After 1945, the spread of higher education put this questionable and half-digested theorizing on everyone's dinner table, as it were.††

Esthetics begins and ends with the senses, not Theory.‡ The origins of modern Western art lie in the happy symbiosis of the inward feeling of the northern peoples (Germans, Celts, Slavs, and others) with classical notions of proportion, form, and timing preserved by the Italians and French. Modernism began to sprout in the late nineteenth century when that symbiosis broke down, and the Germanic and the non-Germanic went their separate ways.
---
* Like Henry Pleasants, whose Agony of Modern Music (1955) is entertaining, right in many details, and wrong in the big picture.

** I use "nineteenth century" loosely, running from late classical (late Haydn, mature Mozart and Beethoven) through late romantic, bordering on modern (Mahler and Strauss). The "high" modern period ran from the 1890s until the 1950s, from Debussy through, say, Bartók and Bernstein.

† A complaint made earlier and more effectively by Tom Wolfe in his Painted Word and From Bauhaus to Our House. The Germans themselves have a nifty term, Augenmusik -- "music for the eyes" and not for the ears.

†† Perceptive readers will sense the tortured ghost of Allan Bloom and his prolix, controversial Closing of the American Mind haunting this posting. The trouble with Bloom's book is that he took twice as many words as needed to make his (largely valid) point.

‡ "... this blathering jargon, which so warms the hearts of philosophy professors ..." (Schoenberg himself).


Wednesday, July 23, 2008

Mae I help you?

PRE-POSTSCRIPT: Within the "Mae/Mac" story are wheels within wheels. They're government-backed and subsidized, but they also have their own PACs and spread campaign donations around Congress. Nonlinear feedback government corruption!

Read here for more from the Wall Street Journal, which was all over this long before it became "news."
-----
The non-recession continues, with strengthening economic growth figures. Since we live in an age when the media is its own parody, leave it to the Onion to tell it to us straight.*

But what about rising energy prices? That's due more to the falling dollar than anything else. The falling dollar does boost exports. Among other things, strong exports are keeping us from slipping into recession.

But what about the mortgage/finance crisis? That's real, but it affects only one part of the economy. Its roots lie mostly with bad government policies, although demographics play a role too: the Boomers have exited their prime house-buying years. In fact, I wouldn't be surprised if this thing ends with the government-backed mortgage sector going through a controlled disintegration. It's just like the savings and loan crisis of 15 years ago, except that Fannie Mae, Freddie Mac, and Ginnie Mae are government-created and government-guaranteed. If their loans go bad, the government has to step in and make good on them for their investors. That's what "government-backed" means. Don't expect that fact to stop a lot of whining about "bailing out investors." If we don't want to do that, we should stop the government from backing private-sector loans.

The Wall Street Journal has waged a lonely, decade-plus-long campaign against the reckless credit practices of the government-backed mortgage industry. (See this from a year ago.) Reality has caught up, at last, but -- alas -- not the rest of the media.

Boogie Nights return: Here's a depressing item from Megan McArdle on a recent, ignorant declaration by a bunch of University of Chicago professors protesting the positive influence of two of the school's crown jewels, its Economics and Finance departments. Here is more of the Boomer, New Left "progressive" illiteracy at work. Instead of telling people the truth (which is known in the "global south," by the way) -- that the accelerating integration of economies has been immensely beneficial to poorer countries -- we get stale neo-Marxist blather from the 1970s.

Even trained economists who should know better, but who are also infected with the desire to relive their long-haired youth -- like Krugman -- are swooning for specious arguments against free trade. And make no mistake: such willed ignorance is a foretaste of an Obama administration, wiping out 30 years of economic progress on an altar of Boomer nostalgia.

We really do seem to be slipping back 30 years or so, what with inflation, the disappearance of a conservative alternative in American politics, and the revival of discredited leftism. Similar policies (an explosion of public spending) lead to similar results. The only things missing are the bad drugs, bad sex, and polyester leisure suits.

But I do hear a cheesy ABBA soundtrack in the background ...
---
* Of course, Samuelson does get it right, as he always does. He routinely puts the rest of the media to shame.


Sunday, July 20, 2008

Candidacy or cult?

American politics seems, far more than at any time in living memory, to be falling into an era both silly and dangerous. The most recent sign was the Democratic primaries, a largely empty contest of identity politics in which the most qualified candidates were eliminated early on. The ultimate result was Barack Obama's success in getting the Democratic presidential nomination, backed heavily by the wealthy, white, and ultraliberal wing of the party. But it's also hard to remember an election when the news media were so thought-free and so ready to divert attention from political substance while relentlessly promoting a candidate as the center of a celebrity cult. Obama is probably the most underqualified presidential candidate since the 1920s, and maybe ever. His candidacy is a testimony to the continuing, if declining, influence of the media. More than anything, Obama is their candidate. One of the few good side effects is that what's left of the media's credibility is being hosed away before our eyes.

Obama's candidacy is also a fantasy of ultraliberal wealthy donors who like the fact that he's a blank slate. They're competing with each other to be the first to scribble on it. They want to shape him the same way Bush was "turned" by the neocons after 9/11 -- another sign of a cult, hangers-on competing to manipulate the image of the figurehead. For his supporters, Obama is an exciting Rorschach inkblot. But he's not baggage-free. The notion that Obama is "post-partisan" or all about "change" is the phoniest thing about his candidacy. His political career in Chicago and his voting record demonstrate this. Even more striking is Obama's combination of ignorance and arrogance.* While Obama went in six months from "not black enough" to "the black candidate," his politics has always been white-bicoastal-ultraliberal. The cult tendencies are most obvious and disturbing whenever the media's largely successful attempt to protect Obama from questions or criticism breaks down. The campaign reacts with anger: how outrageous, how racist. Isn't this a preview of an Obama administration, both authoritarian and empty, with a lackey press in tow?

There's only one reason to vote for Obama, and that's if you want a seriously underqualified candidate with all the baggage of the Democratic left: semi-isolationist parochialism, free-trade phobia, high taxes, high inflation, greedy interest-group paralysis. All the other reasons being kicked around are bad ones. What we're electing in November 2008 is the president for the next four years, not the last four, or the four before that. (As for the Iraq war, it's essentially over.) The attraction of some conservatives and libertarians to Obama especially needs a cold shower of this sort. Although a majority of Democratic votes and elected delegates did not go to him, there is also the attraction of the anti-Hillary voter to Obama: how else to explain otherwise rational women falling for him?

My experience with foreigners on this issue continues to be different from what I expected. For the most part, they can't understand why American voters would be attracted to someone so inexperienced, even more inexperienced than Bush in 2000 or Carter in 1976. Obama's politics are a pre-1980 throwback, with the Democrats' post-60s isolationist-protectionist tendencies added. This isn't just idle talk. People keep tearing their hair out about the price of oil. Most of its recent increase is actually due to the decline of the dollar. That decline, over the last six months, has been strongly influenced by a perception outside the US that Americans have entered another period of self-righteous navel-gazing and political weakness. It's true, although the causes are not widely understood outside the US. Without consciously thinking it, I found the words tumbling out of my mouth while explaining this to a foreign friend: certain voters are attracted to Obama because he's an underqualified blank slate.

Since the 1980s, the left wing of the Democratic party has wanted to tear down the two pillars (economic and security) of post-1945 American leadership under the guise of "progressive" politics. The Democrats were the party that built this system, but they've repudiated it. Keep that in mind when you hear the continuing chatter about American "unilateralism" and "restoring America's reputation." Obama's provinciality on these issues, to the extent he knows anything about them, is astounding. (Mostly, he sounds like the last adviser to brief him.) This is not your father's Democratic party, or even Bill Clinton's. Something has gone terribly wrong.

Hillary is the ambitious 18-year-old Tracy Flick, now forced to attend "Kumbaya" exercises with the 12-year-old set. But Hillary and her husband are no longer the issue: it's the voters who voted for her. The not-surprising upshot is a sight familiar over the last forty years, a large group of voters who would like to vote for a Democrat, but not for the party's candidate. A majority of Democratic primary voters failed to determine the nomination, and the non-Obama Democrats are growing firmer in their rejection. The party has a major problem on its hands. What's more amazing is the repudiation by the party's wealthy elite of what the Democrats once stood for as the main creators of the post-1945 international order. Instead, Democratic politicians and activists have ever more completely rejected free trade and foreign entanglements, being now beholden to narrow interest groups and devoted to non-stop pandering to the party's nutty fringe. It's no wonder the dollar is falling, foreigners are worried, and American voters are disoriented.
---
* Like his insistence that Americans learn French before they go to Europe. Really -- Americans should be learning European. Not everyone in Europe speaks French :)


Tuesday, July 15, 2008

Does cultural property make sense?

The postwar concept of "cultural property" is increasingly intruding on museums and their ability to offer a cornucopia of the world's material culture. It sounds like a good, liberal idea: returning things to their countries of origin. The classic case is apparently the Elgin Marbles, parts of the Parthenon and other classical Greek ruins removed by Lord Elgin (the British ambassador to the Ottoman empire) in the early 19th century and placed in the British Museum. But even though Elgin was criticized at the time, it's hard to see how he did anything but good.

Like many such ideas, it's actually more politically correct than liberal. It's based on a weird inversion of liberal values -- the value of diffusing knowledge, especially to the general public that goes to and supports museums -- mixed with bogus history. Ben Macintyre of the London Times takes a look over here and finds the whole movement questionable at best.

The Elgin Marbles themselves, once their full history is understood, are a perfect example. When Elgin removed the marbles from the Parthenon, there was no modern Greek state to claim them. In fact, at the time, few Greeks knew or cared about the leftovers of classical antiquity. Athens was controlled by the Ottoman Turks, and the Parthenon was a military fort. The Greeks later fought a war of independence, heavily supported by the British, and finally won it in 1833. Elgin spent about £75,000 (a couple million dollars today), part of it to pay the Ottoman government, the only government there at the time. And it's clear that the friezes would have been even more damaged than they already were, had they been left on the Parthenon, exposed as it was to rifle and artillery fire. The Parthenon had already been badly damaged in previous wars.

There are many other examples of the same mix of selective and garbled history and chauvinism, like Kennewick Man. Discovered in 1996 in the Pacific Northwest, this skeleton, wrongly claimed by certain American Indian tribes as an ancestor, is of unknown origin. Fortunately, it's still open to scientific study. Only through willful PC ignorance of history -- accepting the cultural chauvinism of whoever is anointed and designated as "oppressed," while rejecting the cultural chauvinism of "white Europeans" or whoever is the latest designated "oppressor" -- can bad ideas like this get a foothold. But wait! Aren't the Greeks white Europeans?

Anyway, as McIntyre argues correctly, the material culture of the past deserves to be shared more, not to be monopolized.* This is another example of our "post-liberal" culture: Enlightenment ideas (here, a shared past "owned" by no one) opposed by superficially "progressive," but in reality, parochial and narrow, agendas.
---
* Meaning that the British Museum, say, should be sharing more and not itself act as a monopolist. In some cases, works could be moved permanently elsewhere, if there is a museum that can care for the objects in the same way. What's bogus are the legal and historical arguments often given and used in court cases.


Friday, July 11, 2008

Academic standards and academic freedom

Faced with the negative effects of "political correctness" and post-modernism on academia, some people feel there is a conflict between academic standards and academic freedom. An example is the recent El-Haj tenure controversy in anthropology at Columbia. But if we look closely at this or any other example, we will see over and over how academic standards and academic freedom stand or fall together.

Academia's only real purpose is knowledge: its discovery, preservation, understanding, and handing on. When academic decisions and values are informed by this principle, there are no such conflicts. Academic freedom and standards are so intertwined that it's impossible to separate them in practice. The freedom of faculty and students to ask questions, to seek and arrive at answers, and to burn away the false through open criticism is essential to the fully engaged enterprise of knowledge. Only when that purpose is no longer valued is there a crisis. The final granting of tenure to El-Haj was the end point of a larger process that began in the late 1980s, when the universities began to be taken over by post-60s post-modernists, who look down on knowledge and reason as bourgeois, patriarchal, and otherwise objectionable. The destruction of academic standards cannot proceed without destroying academic freedom at the same time. Criticism and questions have to be suppressed. Something else -- social engineering, "diversity" -- is at work, and knowledge is no longer the purpose.

Post-modernists took over American academic departments, especially in the humanities, by selectively driving out or not hiring people with ideologically "incorrect" views. The minority of such people who were there in the 80s or 90s got enough of a hassle -- from speech codes and "sensitivity training" -- to make it worth their while simply to leave. The result after 20 years is an academic world overwhelmingly (at least in the humanities) left-liberal and "post-modern" in its politics and view of the world. On the other hand, student bodies, at least at large public schools, are close in their views to the larger society. Hence the eruption of PC censorship and intimidation at universities in the last 20 years.*

The American system of academic self-governance was set up in the 1920s and 30s largely to protect faculty and, to a lesser extent, students from arbitrary interference by trustees, state legislatures, political figures, donors, and so on. The "free knowledge" model certainly needs protection from such people, because they can and have overridden the "knowledge paradigm" with sometimes ill-informed intervention. As long as the larger society charges the university with its purposes concerning knowledge, such protections are necessary.

But this system, copied from the German universities in their classical period (before 1933), always had a fatal weakness. It assumed that the forces impinging on "free knowledge" could only come from outside the university. It was not designed to protect the university from those who wanted to destroy it from the inside. Thus, from the mid-60s on, the universities have been unable to consistently beat back internal threats to both academic freedom and standards. The system could prevent politically motivated firings of faculty by administrators or trustees in politically tense periods, like the 1930s or the early 1950s. It can help to stop what happened in communist countries -- an outside political force trying to "coordinate" all institutions in society and mold them into ideological conformity. But what it couldn't stop was what happened in Germany in the early 1930s, when faculty and students themselves, from within, upended both standards and freedom as part of their participation in a larger, anti-rational political revolution.**

A slow-motion version of the same has been happening on American campuses since the late 1980s, even though there is no larger anti-rational political revolution: the 60s New Left was consistently and decisively rejected by voters. On campus, there are no book-burnings and little violence -- just a lot of slow-acting, but long-lasting, professional and ideological pressure. A few schools, and schools in certain areas of the country, have become refuges as the dominant paradigm at the top bicoastal schools changed and American academia entered its "post-liberal" era.

The cure is not anything as silly as "affirmative action for conservatives." The cure is to stop the application of the underlying anti-knowledge, anti-reason paradigm.
---
* The student bodies at the more expensive and elite schools are closer to the left-liberal or left politics of their faculties. It's been a long time since American liberalism was a political tendency of the "masses." For several decades, it's been a largely elite movement.

** Even now, there is little comprehension of what happened in Germany in that period. The gutting of German universities, including book-burnings, was an inside job.


Monday, July 07, 2008

Walden Pond revisited

A posting a short while ago talked about important distinctions often lost these days: between environmentalism and ecology, between conservation as a human choice and a false worship of nature's non-existent wishes. Henry David Thoreau was an early American prototype of the Romantic movement's anti-social and anti-civilization tendencies. He spent two years (1845-47) living in a hut at Walden Pond, a small, glacially-cut lake in Concord, just outside Boston. The hut is gone, although there's a replica there now, and Walden Pond and the area around it have been conservation land since the late 19th century.

The Pond is not far from the center of Concord and easily walkable, as it was then. Thoreau's two-year sojourn there was not prompted by a yearning to commune with chipmunks or hug trees. Thoreau was not seeking nature, but withdrawing from human society. To use a modern word, he was a drop-out. But human society and modern conveniences are not, and were not, far from the Pond. The railroad ran and still runs on the southwest side, and the roads, now paved, reached Concord and other nearby towns within a few hours' walk, or no more than an hour's horse or carriage ride. By all evidence, Thoreau made regular use of these conveniences.

The essentially anti-social nature of Thoreau's quest was obvious to his friends, particularly Emerson. While Emerson wrote a whole essay in praise of self-reliance, his notion was unambiguously social. He was uninterested in isolation, and did not want others to isolate themselves. His "self-reliance" meant simply standing on one's own feet, while being part of society. Emerson had limited patience for what he saw as self-indulgence on Thoreau's part, a dip into the "insidious ethic of conscience." Somewhat more than a century later, one man's navel-gazing would become a mass phenomenon.

When the State of Massachusetts took over Walden Pond in 1922, it was expressing a wish: to preserve something partly natural, partly manmade, for future generations. It wasn't nature that made that decision; it was the state legislature and, ultimately, the state's citizens. And the site wasn't shut off from access either -- that would have undermined the point. The purpose was not to take humanity out of nature (we can't be, in any case) or to put nature far away out of human reach, but to take care of a selected spot for others to enjoy. In the process, the spot has been inevitably changed in some ways, prevented from changing in others, while itself changing the people who conserve and appreciate it. It's human art and artifice at work here, not wilderness untouched by human hands.


Friday, July 04, 2008

The leisure of the theory class

It's Fourth of July weekend, and we wish you all the best in celebrating American independence. Remember to drive safely and go easy on the liquid fuel.

Over here at chez Kavanna, we'll be doing the same. Since we're bicoastal, there'll be plenty of veggie and vegan choices at the BBQ. It comes with the territory :)

Wednesday, July 02, 2008

The Royal Game of Ur, plus: Mice on drugs

Board games seem simple and obvious enough that you might think people have been playing them for a long time. And you'd be right. The oldest known is the Royal Game of Ur, named for one of the capitals of Sumer in ancient Mesopotamia, and dating from about 2600 BCE. The paraphernalia of the game were discovered decades before the rules were reconstructed. But they have been, and you can even play it online. (Warning: this site requires the Shockwave plug-in.) The British Museum sells a real-space version of the game as well.

The weird thing is that the last known living variant of this game was played until recently by the Cochin Jews of Cochin, India, who mostly live in Israel now. Evidently, they brought it with them when they migrated from Mesopotamia (the Babylonian exile) to India. Some came with Alexander the Great (around 330 BCE); others came later, with the Muslims around the year 1000, or with the British in the 19th century.

Meantime, in Israel, another ancient phenomenon, frankincense, has been investigated, not only as a spice and incense, but as a drug. Inhale enough of it, or drink the resin, and you get very calm and a little confused, but happy-confused. Now mice at the Hebrew University are testing and apparently liking it.

Frankincense might even be used some day as an antidepressant. It was once used as an incense in many ancient and not-so-ancient temples, evidently for good reason.
