Tuesday, September 28, 2004

The Politics of Plagiarism

A tempest is brewing at the Harvard Law School, but before I consign this tempest to the proverbial teapot, I'd like to point out that it's actually a bit more complicated than that. In a fairly nasty article, Weekly Standard writer Joseph Bottum excoriates Lawrence Tribe for relying on Henry Abraham's Justices and Presidents for language, reproducing some very similar sentences and using some of the same words to describe certain situations. As a consequence, Tribe apologized in the Harvard Crimson.

Plagiarism is a serious charge against an academic, and the issue of appropriating the words or ideas of another seems to have special significance to scholars (for the sake of completeness and variety, I'll be using "academic" and "scholar" interchangeably to indicate professionals who teach and produce research, although the two are far from completely overlapping.) It's a different issue from the illegal copying of movies and music, since the college kids still know who makes the materials they bootleg or download; Metallica isn't upset that kids are taking their music and attributing it to someone else. To an academic, plagiarism is failure and hypocrisy. Failure, because academic scholars are supposed to earn their income and status by producing original work, be it creative, analytic, scientific, or whatever. Stealing someone else's words or ideas suggests you can't, or at least won't, produce your own, but more importantly it means stealing credit for someone else's work. Hypocrisy, because an uncomfortable number of the students academics teach think nothing of turning in papers they've cobbled together from website texts or bought from online sources, and most teachers at least talk a tough game on plagiarism. This explains the responses of Harvard students to previous charges against faculty for overreliance on source materials.

Allegations of plagiarism against an academic scholar are, thus, difficult to mitigate or minimize. The Weekly Standard article establishes that Tribe used short passages of text taken directly from Justices and Presidents and substantially paraphrased other portions as well. On the basis of the article, the reliance on Abraham's book does not seem terribly substantial, but it is clearly wrong, and Tribe quite properly issued an apology for his use of Abraham's language.

So, the question is, what does the WS story mean? Although there isn't a particular domain outside of which magazines of news and opinion like the WS shouldn't stray, the subjects and stories that magazines choose to cover reveal their sense of purpose and importance, if we assume that editors, journalists, and commentators prefer to write about important things. Is the Tribe story important to the WS? Important enough, it seems, to devote space in the magazine and the attention of one of its reporters. Now, scholarly ethics are not a typical subject for journals of opinion, but this is Lawrence Tribe, Harvard Law School professor and, perhaps more importantly, one of the leading liberal constitutional scholars in the United States.


From the responses of constitutional scholars that I've read, this incident is an embarrassment for Tribe and another telling example of the problems "celebrity" scholars have relying on the work of graduate assistants and using other shortcuts, especially with their work for popular audiences. It is not, however, perceived as a serious transgression calling into question Tribe's reputation as a leading legal scholar. If it were, it would certainly be a major story, but the people whose opinions of Tribe and his scholarship are pertinent do not, in large part, think this is a career-shattering problem. One can see evidence of this in Bottum's article, where his expert on plagiarism, "pushed to decide" (presumably by Bottum), states: "Constant paraphrasing without at least semi-regular attribution constitutes a form of plagiarism." The MLA Guidelines, which he describes as "a little sterner," aren't; they say basically the same thing (note the qualifier "unacknowledged" in the MLA's language, which certainly applies here.) Also, note that the examples intended to demonstrate substantial, thorough reliance on Abraham get thinner and weaker as the piece goes on, suggesting that this is about as tough a case as can be made, and as the examples get weaker the condemnations get, oddly, stronger.

Does the WS sincerely believe this is a world-changing revelation? I doubt it; the article does not convey the impression that this journalist or the magazine's editors really think that it will substantially change Tribe's standing or reputation among those who can judge Tribe's work on their own. Certainly, the article is not pitched to an audience at all familiar with either Tribe's or Abraham's work. Of course, law professors are naturally inclined to think that their assessments are the most important responses to any current event, but they're usually not. In this instance, the purpose of the story is to suggest a connection between Tribe's misdemeanor and his political positions. Put another way, the story is intended to imply that just as Tribe's practice as a scholar is unprincipled, his constitutional scholarship is unprincipled and the policies he favors as a constitutional scholar are unprincipled. As evidence, consider the awkward effort Bottum makes mid-article to tie the book to the Bork nomination defeat. Even if the relationship between God Save This Honorable Court and the transformation in federal court nominations could be established, what possible relevance does it have to whether Tribe's reliance on Abraham constitutes serious plagiarism?

Again, just because the Weekly Standard's interest in this story is predicated more on the opportunity it provides to attack a liberal legal scholar and impugn by association the positions he favors than on scholarly ethics, that doesn't mean that Tribe didn't do anything wrong. What it does mean, however, is that the academic community cannot accept the monitoring of journalists and opinion-writers uncritically. What is particularly troubling about plagiarism scandals like those that have affected Harvard Law School and other Ivy League institutions recently is that they have focused inordinately on the scholarly commitments of the perpetrators and less on the more general scholarly practices that lead to them. Journalists, and other scholars, have been using plagiarism and other infractions of academic standards to tar scholars who achieve fame, wealth, or influence. For other scholars, a lot of the animus is based on jealousy or personal animosity, but for people in the opinion-mills, it's all about politics. Legal scholars tend to be especially reluctant or unable to recognize and acknowledge the politics that surrounds them and their objects of study. Unfortunately, failing to acknowledge politics in law doesn't make it impossible for legal scholars to practice politics; in fact, it can make it easier, much as the refusal of judges to comprehend the political nature of their rulings permits so much politics to saturate the legal system.

There's a more immediate and practical consequence of the focus of stories like this for the academic community. The informally common practices that lead to scandals like Tribe's need to be addressed in a more sober light. Scholars, particularly highly regarded, prolific, "celebrity" scholars like Tribe, have come to rely on graduate students to do at least some of the work that permits their extensive production. A central irony of journalists and opinion-writers busting scholars for infractions like this is that theirs are also the voices criticizing academic scholars for their lack of contribution to public discourse on subjects of importance. If leading academics are expected to produce high-quality scholarship for a specialized audience while maintaining an engaged, practical career providing service or advice to public officials, and possibly writing books for a general audience as well, something is going to give. One of the consequences has been the overreliance on graduate assistants and other laborsaving technologies that make these kinds of things more likely. We'd be better off with a careful examination of what leads to these problems, rather than more opportunistic sniping at scholars for low ideological purposes, especially when the scholarly community doesn't even recognize the stakes. Rather than taking up "the problem of borrowed scholarship" promised by Bottum's subtitle, the focus throughout remains on demonstrating that the infractions Bottum strains to make damning are a function of Tribe's psychology, "The Big Mahatma" syndrome.

Incidentally, I've never read God Save This Honorable Court, but I have read Justices, Presidents, and Senators, the recent reissue of Abraham's book. While Abraham's book is a standard text on Supreme Court appointment history, it's not without source material itself. Presidential biographies note many of the same relationships and facts, especially regarding the FDR and Truman appointments, in language similar to Abraham's. As no one is interested in discrediting Henry Abraham, I'm sure no one will put forth the effort to demonstrate that Abraham himself is a "plagiarist," and such an effort wouldn't be worth it anyway, since you would end up, at best, with the kind of weak borrowings and suggestive language Bottum collects (although better sourced, I'd wager.) But another consequence of substituting ideological bloodsport for a reasonable accounting of the problems with high-profile scholarship is that serious problems become more difficult to distinguish from political gamesmanship.


Monday, September 27, 2004

The "Entertainment Value" of Voting

One of my morning rituals is to check out the latest posts to Arts & Letters Daily, a page sponsored by the Chronicle of Higher Education that brings to light interesting articles and comments on the web from various sources. ALDaily used to be connected to Lingua Franca, the highly entertaining magazine devoted to academia and the humanities, which collapsed a few years ago. Since the shift, I've detected a tendency not to dig quite as deeply as it used to for interesting items to post. I've often wondered, for instance, how many people would really never have discovered interesting stories from the Sunday New York Times magazine if ALDaily didn't post to it early in the week. A lot of content on ALDaily now comes directly from the NYT, or the NY Observer, or the Washington Post, or a few other highly fruitful publications. I've also noted increasingly that ALDaily's blurbs, usually a quote or paraphrase from the article to introduce the link, occasionally give an unreasonably bad sense of the tone or intent of the article itself. Maybe an example will appear soon.

Anyway, another NYT magazine piece appeared this morning, a short essay by Jim Holt on the efficacy of voting in presidential elections. I read it with a bit of professional interest, knowing the political science literature on vote turnout, and was pleased to see a brief review of some rational choice theorizing on why people choose (or choose not) to vote. The basic paradox of rational voting is that, given a benefit for voting that is realized only if one's favorite candidate wins, the rational voter will choose not to vote unless the likelihood of being the deciding vote is high or the actual benefit is high enough to overcome the low probability of causing its realization. Putting aside congressional, state and local elections, people should only turn out to vote if they believe that their vote will decide the election. Of course, since turnout increases during presidential election years, it's implausible that people are drawn to vote by local races where they are more likely to decide the contests.



Due to the number of ballots cast and, as Holt points out, the "unit rule" system that awards all of a state's electoral college votes to the winner of the statewide election in all states but Nebraska and Maine, the chance of actually casting the deciding vote in the election is vanishingly small. In fact, it's about zero in many cases. Economically minded social scientists have pondered this for some time and come up with various explanations of why people, nonetheless, vote. The overestimation of efficacy, or the delusional belief that you really do decide the election, isn't widely credited. There's also a basement-level problem with the rational choice prediction that costly voting will dissuade people from voting: if everyone behaved rationally, then polling places would be vacant and at least one cost of voting (waiting) would be eliminated. Reminiscent of that Yogi Berra line, "Nobody goes there anymore, it's too crowded." Also, universal abstention wouldn't be a stable Nash equilibrium, since if no one else is voting, and people actually gain an outcome benefit, you could determine the outcome by being the only voter. Obviously, something else is going on.

Models of the "calculus of voting," devised by Anthony Downs and built upon by William Riker and Peter Ordeshook, included a benefit term for the act of voting itself, which they referred to variously as the benefit of continuing democracy or the value of performing the civic duty of voting. This "D" term made it worthwhile to incur whatever costs were associated with voting. Of course, the "D" term is awfully convenient, and is often derided as a fudge factor intended to explain otherwise irrational behavior, but the presence of a duty benefit does seem consistent with "rational ignorance," the fact that many citizens who choose to vote do not choose to invest in the information that might make that choice an informed one. Any increase in the cost of voting, like informing yourself, is only worth it if it increases the benefit, which it won't if the benefit is gained merely by showing up.
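The Riker-Ordeshook calculus is compact enough to write down directly. In the sketch below all the numbers are illustrative assumptions of mine; the point is only that a modest duty term D flips the sign of the calculation without requiring any investment in information.

```python
# Riker-Ordeshook calculus of voting: R = p*B - C + D.
# The citizen votes if the reward R is positive. Values are illustrative.
def reward(p, B, C, D):
    """p: probability of being pivotal; B: benefit if your candidate
    wins; C: cost of voting; D: duty/expressive benefit of the act."""
    return p * B - C + D

# With no duty term, the tiny pivot probability swamps even a large B.
print(reward(p=1e-7, B=10_000, C=5, D=0))   # negative: abstain
# A modest D makes voting worthwhile regardless of p and B, which is
# why showing up is compatible with staying uninformed.
print(reward(p=1e-7, B=10_000, C=5, D=6))   # positive: vote
```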

Now, Mo Fiorina and other political scientists have come up with other explanations, like the "entertainment value" of voting, that are at least as consistent with the stylized facts of voting. Voting, according to this approach, has a "consumption value" which can be enjoyed without maximizing the expected benefit of the election outcome. I've come much closer to accepting this theory since the advent of American Idol and its ilk on television. Like the duty value, expressive or consumption value can be enjoyed without the investment necessary to make the vote itself an informed one. In fact, some might argue that the less one knows about the candidates, the more one can enjoy the experience of voting for one.

There are other explanations of voting that are not quite so pessimistic. Voters motivated by a duty heuristic may make their actual vote choices based on "low-information rationality" or an "online tally" of positives and negatives. People, in other words, don't remember a lot about the candidates, but recall how often or with what intensity they found something agreeable or disagreeable about them. Voters may also rely on opinion leaders, people who share basic goals and interests and can inform themselves on the candidates and issues at low cost (because they are college professors, for instance.) As opinion leaders, these people may have additional incentive to invest in making their vote choices informed.

Many of these explanations have some evidence behind them, experimental or quasi-experimental. Still, the "entertainment value of voting" has a special ring to it. Even though I'm more inclined to believe that people get something like a "consumer value" out of voting than direct entertainment value, it's just too much fun to argue that people vote in order to be entertained.

I'm pleased that Holt noted the research (by Steven Brams, one of my favorite game theorists) on the weight given to big states by the Electoral College and unit rule. I get tired of hearing otherwise intelligent people parrot their barely-remembered high school civics teacher saying that the Electoral College provides vital representation to small states in national elections.

As for the moral of his essay, that people should vote out of a sense of duty and presumably incur the costs of becoming informed voters as well, he assumes, like many people who stumble into the thicket of voting behavior, that voting can be thought of as a coordination game or cooperative scenario rather than a non-cooperative, competitive one. The Condorcet Jury Theorem, which he invokes, rests on several assumptions and, in its more general form, only guarantees that a collective decision is more likely to be correct than a decision made by a randomly chosen individual. It would be great if we could collectively work toward improving election outcomes by encouraging more, better-informed people to vote, but that would likely reduce the entertainment value of voting, and maybe that's all we've got holding democracy together.
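The jury theorem's basic mechanics can be checked with a few lines of arithmetic. A sketch of my own, under the theorem's standard assumptions (independent voters, each correct with the same probability p): when p is above one half, majority accuracy climbs with group size.

```python
from math import comb

# Condorcet Jury Theorem sketch (standard assumptions: n independent
# voters, each correct with the same probability p; n odd so there are
# no ties).
def majority_correct(n, p):
    """Probability that a strict majority of n voters is correct."""
    k = n // 2 + 1
    return sum(comb(n, i) * p**i * (1 - p)**(n - i)
               for i in range(k, n + 1))

for n in (1, 11, 101):
    print(n, round(majority_correct(n, 0.55), 3))
```

With p = 0.55 the majority outperforms any single voter and improves as n grows, which is the optimistic reading; the catch is that for p below one half the same arithmetic runs in reverse, which is why the theorem's assumptions matter.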

Friday, September 24, 2004

Media Review - Kool Keith, Diesel Truckers

I picked up the new Kool Keith album Diesel Truckers the other day with some trepidation. I'd skipped Keith's last project, Thee Undatakerz, released just a few months ago, for a few reasons, not the least of which was the poor quality control he's demonstrated lately. Keith releases a lot of material under various pseudonyms and with various collaborators, and I've been dissatisfied with most of the stuff he's done since Masters of Illusion, a collaboration with KutMasta Kurt released in 1999. That year also brought the excellent Dr. Dooom album, First Come, First Served, as well as the occasionally interesting Black Elvis/Lost in Space project. The latter is about the closest Keith has ever come to a crossover attempt, although it's not "street" enough to satisfy the demands of commercial rap. The weird collection of themes (Keith as rock superstar/science fiction figure/corporate mogul) was highly entertaining and lyrically fertile, but it was also a little dull musically.

With Kurt back in the fold after a few years apart, Diesel Truckers is a stronger effort than anything he's done since Black Elvis, and although the title concept is indifferently developed (there's more of it in one track on Masters of Illusion than in this whole album) it does take Keith away from the relentless description of his sexual preferences and scorn for rap and the music industry that characterized his Matthew and Spankmaster releases (like I said, he releases a lot of stuff; if that's not enough, check out last year's The Lost Masters CD for outtakes from these albums. On second thought, don't.)



Several tracks on Diesel Truckers help justify the purchase, unless you've completely overloaded on Keith's oddball lyrics and complex, anarchic rhythms. "The Orchestrators" is standard, if high quality, Keith, while "Break You Off" sounds like a throwback to Sex Style, with some of the same references. He mixes it up some with Southern-style beats and drawl on "Takin' it Back," "MANE" and "I Drop Money," and adopts some of the laid-back flow of 50 Cent on "Can I Buy You a Drink?" much as he mimicked Jay-Z on Black Elvis' "Master of the Game." Although I'd prefer to hear more of the groundbreaking overflow rhyming so prevalent on the Dooom album, the fact that Keith can switch up his flow so easily is still impressive. Still, much of Diesel Truckers sounds like retread, rather than new territory for Keith, especially stuff like "The Legendary" and "Bamboozled" (featuring guest appearances by past collaborators Marc Live and Jacky Jasper.)

Black Elvis continues to haunt Keith at least as much as his underground sensation Dr. Octagon persona, if Diesel Truckers is any indication. I, for one, find the Octagon album overrated; it's not bad as a whole and has several excellent tracks ("Earth People," "Blue Flowers," and "Wild and Crazy" are particular standouts,) but nothing better than some of the stuff on Automator's A Much Better Tomorrow LP, recorded around the same time. Although Keith rejected the acclaim heaped on Octagon (killing him off on the first track of the Dooom album,) he appears to think that the critical and commercial attention garnered by Outkast, especially Andre, is rightfully his. On "Mental Side Effects" he takes a few moments to call out "Benjamin" for copping his Black Elvis style and other career moves. Now, it should be noted that Keith thinks basically everybody has ripped him off (in the liner notes to Matthew he lists artists who haven't yet, which is a pretty manageable list) so it's hard to take very seriously. The whole thing seems like a paranoid fantasy off of another Keith project, the 2002 album Game. Besides, if Keith thinks he's the first person to delve into space age funk imagery, he's gotta be kidding. Outkast, for instance, were explicit about the heavy Parliament influence on their funky sci-fi ATLiens album from 1996, the same year Keith was doing Octagon.

One potential embarrassment on Diesel Truckers is how much Motion Man brings to the proceedings in his guest appearance on "Serve 'em a Sentence." Keith seems to be working a bit above his level on the rest of the album, and Motion nearly waxes Keith on his own track. Motion Man is like the anti-Keith; rather than spreading himself thin over legions of releases with inconsistent quality, he's dropped excellent performances sparingly over the years. His Clearing the Field is a certifiable classic, and his contributions to Masters of Illusion nearly made the record.

Overall, Diesel Truckers isn't a bad album, even if you've already got a few Keith albums on hand. Is it a major step forward for Keith? No, and I'm tempted to say there might not be another major release in Keith, but you never really know with Keith. I was skeptical when I read word that this album was a "return to form," since that often means that an artist (in music, fiction, film, etc.) has finally consented to return to a subject and/or form that fans expect and feel comfortable with after a period of more interesting, idiosyncratic work, but in Keith's case the idiosyncrasies were wearing thin.

Thursday, September 23, 2004

Event Review: DragonCon 2004

Quite a bit late, as some others of my acquaintance have already commented on this event (even with pictures) and it was weeks ago. Well, I'm just getting around to it.

I've been to many DragonCons over the years beginning in the late 80's, I believe. I missed several in the intervening time, especially in the years when I wasn't living in Atlanta, but I imagine I've probably been to eight or nine of them in some capacity. Of course, most of the time I didn't actually register at the con and attend functions; if you only count those (including the ones where I got a badge from a friend who was working the con) it'd be only half as many. At other times, I would just wander around the area with friends, looking for people we knew or wanted to know, almost always at night when the activities shifted from panels on sci-fi and comics to parties and such.

Oddly enough, my wife is more of a participator than I am. I used to go to it and other cons primarily for the role-play gaming, but since then it's been primarily a social activity. Now I participate, but halfheartedly. Last year, my first in a while, I went to a few panels, but often found that I couldn't match the enthusiasm of the other participants. I particularly remember how several of the "Classic Sci-Fi" track panels degenerated into intense bitching sessions about how bad contemporary sci-fi is compared to the Old Masterworks. Now, I like to complain about that kind of stuff as much as the next guy, as long as the next guy and I aren't sitting in a panel at DragonCon.

This year (finally got to it) I attended several interesting panels. The highlight was a morning panel featuring many of the minds and voices behind Cartoon Network's Adult Swim lineup, to which I am devoted. I understand that the anime and comedy programming have separate fan bases, and I belong entirely to the latter, but at least here it seemed that peace was maintained. Since the anime programming is entirely by acquisition and much of the comedy is home-grown, the focus of this panel was more on the comedy and that suited me fine. Although the best Adult Swim stuff is the odd, original stuff like Aqua Teen Hunger Force, I was really happy to see that they're getting Aaron McGruder's Boondocks cartoon. Boondocks has been in something of a rut recently, and cartoon adaptations of comics are often bad, but the exceptions (like Peanuts and Dilbert, which is better than the strip) are standouts.

Other than that, I attended interesting panels about The Young Ones, which covered lots of British comedy series from the last twenty years as well, and an informative laid-back panel about The Prisoner, which I've been catching again on BBC America recently. I also got some good reading suggestions at a panel about horror comics, which I haven't read in about fifteen years.

Out of professional curiosity, I attended a panel about a course offered at Kent State about Star Trek. It's a workshop rather than a class, co-taught by the panelist and her husband. The panelist (whose name escapes me) got her MA at Bowling Green's Department of Popular Culture, a credential that made me uneasy. One reason for attending the panel was that I don't know what academics studying popular culture really do (I know what academics in film programs do, and it's not an encouraging model) so I thought this might be a chance to see what a degree in Popular Culture prepares one to teach.

The syllabus for the course demonstrated that it was not merely a "blow off" course where the students aren't asked to do work. They prepare papers regularly and are admonished not to pad their work with plot synopses. I don't doubt that the intentions of the course are serious, but I left the panel with many of my concerns substantiated. The material of the course was entirely Star Trek episodes (and a screening of Forbidden Planet.) Now, it makes sense that a class about Star Trek would include some episodes, but despite the statement that the course was intended to reveal the ways in which Star Trek was shaped by and shaped the social conflicts of its time and the period since, I don't see how that can be established without some material directly addressing what those conflicts are. It may be true to assert that science fiction and other products of popular culture reflect and influence social phenomena and the experiences of people, but if all that the students learn about the social turmoil of the late 60's is drawn from popular culture, then the claim is nothing but a tautology. Popular culture studies, it seems to me, attempts to demonstrate its importance by defining important social and cultural phenomena entirely in terms of pop culture products. It may be, however, that pop culture's reflection of the experiences and concerns of people is inaccurate.

Another thing that came across in the presentation is how enthusiastic the instructor was to be able to teach a group of students too young to really remember (or maybe even have seen) the original series how to appreciate it as something other than a corny space opera with rickety sets and inflated dramatics. I'm sure that's a lot of fun, for students as well as faculty. I'd love to teach a class about how David Cronenberg movies demonstrate the problems with social and technological efforts to alter human relations, or how The X-Files reveals the pervasive failures of public and private institutions to cultivate or deserve trust, but if I did, I'd want to include materials demonstrating these phenomena in the real world. Movies and TV can, after all, distort reality as well as illuminate it.

On that point, it would also improve this kind of study, I think, to address such possibilities from a theoretical point of view. A reading from Plato or even somebody more contemporary (Foucault, Baudrillard, etc.) about the distinctions between the artificial and natural worlds (although these authors have different ideas about what those are) would allow students to do more than merely make connections between TV episodes and what they believe they already know about the world more generally. I didn't see any evidence in the course description that the students would receive any skills that could be applied outside the context of Star Trek appreciation.

Anyway, I thought about saying something, or at least asking a question to see if my concerns were addressed in any way, but I got the distinct impression from other members of the audience that my comments along these lines would not be welcome. So, I chickened out. I could say that I declined to comment for the sake of politeness, but I really just wanted to avoid having to argue with anyone, especially about something people can be so defensive about.

I also saw the first two Ju-on movies, although the second is about half a movie with a bunch of stuff from the first thrown in to fill time. The guy who put together the screening helpfully edited out the repeat footage, so the two movies together ran less than two hours. I've seen Ju-on: The Grudge on DVD, and found that the original movies (at least one of which was made for video) are conceptually a bit stronger, though The Grudge version has an effective, uneasy pace. Of course, this series could be interpreted as the crash of the J-horror new wave, since Ju-on: The Grudge is little more than repeated variations on many of the strongest images and themes of the most successful horror films from Japan of the last 6-8 years. Now, an English-language remake starring Sarah Michelle Gellar is on the way, previewed briefly before the Japanese features started. Gellar looks very much like an actress trying rather desperately to leverage her Buffy popularity into more work without looking desperate or like she's falling into the horror heroine rut. I'd suggest she study the career of Jamie Lee Curtis, who was content to ride a wave of horror popularity early in her career and let other things happen slowly, without trying to jump immediately into starring roles in other genres. I've always appreciated the willingness of Curtis to acknowledge and appreciate her past as a horror "Last Girl," when so many other actors do everything they can to hide their early horror film appearances.

The last day of a science fiction convention can be depressing, as for many of the participants it's the end of a long-anticipated vacation. Many of DragonCon's attendees travel from outside Atlanta, bringing their families and indulging fully in the pageantry and all-out geekiness of shared obsession with cultural marginalia. Of course, science fiction and comics as subjects aren't marginalized; they seem to dominate popular entertainment at times. But by focusing their love on specific products, fans of Babylon 5 or Hellboy probably feel a little strange at times. Not here.

Overall, I enjoyed this year's con more than I have in the recent past. It's a different experience than it was when I was younger, but it is possible to enjoy a sci-fi convention even if you're not the kind who collects Doctor Who paraphernalia or is inclined to dress up as Boba Fett.

Friday, September 17, 2004

Media Review - Resident Evil: Apocalypse

I have several other planned or partially written posts about other films (Ju-on 1 & 2, the new Manchurian Candidate, Reconstruction) seen earlier than this one, but I feel compelled to dash off something about the new Resident Evil while the movie is still fresh in my head. In part, this is because if I wait too long the experience will disappear entirely (and that tells you something about the movie itself) but also because the film's bizarre, alternately cliched and provocative virtues are particularly prone to decay from memory. On a similar note, a warning: I spoil.

I was a fan (guardedly) of the first Resident Evil after skipping it in the theaters and catching it on video. I don't play video games, so I have no idea what the "plot" is supposed to be (I assume there must be a reason why the city is named "Raccoon" but don't know what it is.) The first RE benefited, perhaps, from my lack of background as well as the pitiful comparison offered by the other video game-based zombie flick, House of the Dead, released around the same time. RE begins with Milla Jovovich's character (Alice, the first of several Lewis Carroll references that seemed superfluous at first) introduced as if newly born: naked and innocent, knowing nothing, thanks to drug-induced amnesia, about who she is, her relationship to the other characters, or her role in the precipitating disaster. Over the course of the movie we have cause to suspect and fear the military personnel who lead her back down into the underground corporate laboratory (The Hive) that she must subsequently fight her way out of, the other survivors of the man-made plague, the Hive's electronic overseer (called the Red Queen, of course), the ominous Umbrella Corporation, the cannibalistic walking dead, and the zombie-producing T-virus itself.



Night of the Living Dead (and its sequel) casts a long and deep shadow over almost all the zombie movies made since the late 60s, and this one is no exception. From Romero's models, this movie takes grotesque, slow-moving, flesh-eating corpses in large packs that have to be shot in the head, an uneasy alliance of civilian and military characters, the infection that follows being bitten by one of the zombies, and the "survival politics" that follow from it. Intense mistrust of the military and corporations is not solely a Romero contribution, but he does offer something of a gold standard for it. Unlike House of the Dead, which had almost nothing to demonstrate horror credentials other than gore and violence, Resident Evil has the intelligence to create horror through inference, to remind you of something in life (or in the world at large) that is unnerving. Combined with taut pacing of action and revelation and an effectively claustrophobic setting, it's a well-done piece of paranoid horror/sci-fi/action moviemaking in the tradition of not just Romero, but Cameron, Carpenter, and Verhoeven.

Seeing the movie again recently, I was struck by how, like Carroll's Alice, the primary character in RE tries repeatedly to make what she believes are the morally right, socially responsible decisions and finds either that those choices are not available to her or that she does not understand the relationships between her actions and their consequences. Fundamentally, Alice is complicit in this disaster without knowing it and her basic socialization, something the audience undoubtedly shares, is totally inadequate to prepare her to deal with her complicity or the situation. Not a difficult connection to make, but interesting in a way that RE:A is not.

The first movie ends, and the second begins, with Alice reborn and evolved into a superhuman killing machine, all the better to justify endless explosive combat for the next 90 minutes. Raccoon City, dominated by the Umbrella Corporation, whose malevolence is now no less clear than that of the undead, is quarantined due to the escaped virus and being used as an arena for an experimental battle between Alice and another T-virus creation called Nemesis. This is probably the silliest element of a movie teeming with silly elements, but the inevitable showdown competes with so many other conflicts boiling up repeatedly that it doesn't matter that much. You have to hand it to Umbrella: in the middle of a potential public relations disaster of apocalyptic proportions (unintentionally justifying the title) they decide to stage another bio-weapons test. I guess when life gives you lemons...

At times (but not very often,) RE:A recalls The Crazies, Robocop, and even some of the early bio-horror films of Cronenberg, especially Rabid and the underrated Scanners. At the end of RE:A, Alice is reborn once again, cloned this time as in Alien: Resurrection or Species II, and demonstrates even more destructive potential, near omnipotence really, while simultaneously setting up yet another sequel and suggesting that the Umbrella Corporation is still in control. In a chilling gambit last used with effect in the closing moments of John Carpenter's Escape from LA, Alice gazes unmistakably into the camera, at the audience, suddenly aware of our heretofore unperceived observation.

So far, I've discussed very little of what goes on between the opening and the conclusion. This isn't an oversight, as little of this material is worth commenting on. In fact, I was disappointed with the competent but unremarkable battle and zombie attack setpieces that pad out the underwritten plot. An Umbrella scientist who initially cultivated the T-virus to restore his young daughter's health enlists the motley collection of roaming survivors in Raccoon City to find his daughter, hiding in her school from hordes of zombie children while an impending nuclear explosion looms to cover up the mess. These survivors include a couple of abandoned Raccoon City elite cops (one of whom is the always-reliable Oded Fehr,) a comic relief street-hustler (Mike Epps, in one of his first movies that doesn't star a rapper,) and a video game heroine (there's really no better description) who is apparently some kind of rogue cop. None of these "characters" really deserve the title, but Jill Valentine (the video game heroine, played by Sienna Guillory) is a welcome addition for those who, like me, find that Milla Jovovich lacks voom.

So, the middle is a disappointment for anyone who isn't happy with visually satisfactory, but basically unmotivated explosions and zombie chomping. Unlike in the first movie, there's no uncertainty about who can be trusted or who is going to make it (for instance, the reporter who's never fired a gun before will NOT survive) and no point of access to real terrors. The ending goes a long way toward saving the movie by tapping into the vein of dread touched upon by the teaser trailer for the movie, released earlier this year, that was mostly indistinguishable from a moisturizer commercial (with some side effects.) As in Scanners, order appears to be restored, but what kind of order? Who is in charge? Who are these people?

I've likely put more thought into this movie than its makers did, but I haven't come away completely empty-handed. Unintentionally, I've delayed writing this long enough to catch Cliff Doerksen's review for the Chicago Reader, which adopts the typical, lazy attitude toward horror, especially zombie horror, found in many of its unfortunate enthusiasts. He opens with the ignorant claim that the zombie genre was "born" with Night of the Living Dead, and two of his three choices for superior imitators (28 Days Later and Cabin Fever, the latter of which has only a tangential relationship to Romero's films) were made in 2002. Although not as flatly stupid as his review of the new Manchurian Candidate, Doerksen's review dismisses Romero's social criticism (which only occasionally dips into satire) and locates the basis of zombie-horror effectiveness in the creation of "you are there" verisimilitude that nobody above the age of 10 really feels. The power of linking horror and social criticism, something that Romero shares with Cronenberg, Tobe Hooper (who sometimes lets his political impulses overwhelm his filmmaking judgment,) Wes Craven (to some extent,) and Bob Clark (see the marvelous Deathdream) is that metaphorical horrors are real, and closer than you imagine. To someone who can't distinguish politics from entertainment (as his Demme review makes clear) this is perhaps too demanding on the overtaxed imagination of a viewer trying to figure out what kind of oatmeal is clinging to the zombies' faces.



Wednesday, September 15, 2004

Restaurateur reaches the tipping point...

Interesting story from AP wire. In question is the legal status of gratuities required of "large" groups, in this case an 18% tip that the restaurant expects of parties larger than 5.

A party of nine attempted to leave a 10% tip, and the ringleader was subsequently arrested for theft of service. Apparently, this issue (whether required tips are enforceable debt) has not been litigated before in New York.

It seems to me that since laws generally treat tips as part of wages (the federal minimum wage for employees receiving tips is $2.13, well below the $5.15 set for other covered, non-exempt employees), restaurants are well within their rights to set a price for certain services payable directly to the employee as part of that wage. The question, as the article makes clear, is whether the customers received sufficient notice of that price. If not, it would be similar to being charged a "surprise" fee for the glasses of water brought to your table.
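For concreteness, here's the arithmetic in miniature (a minimal sketch; the dollar figures are the 2004 federal rates cited above, and the $200 check is purely hypothetical, since the article doesn't report the party's actual bill):

```python
# Federal tip-credit arithmetic, using the 2004 rates cited above.
TIPPED_MIN_WAGE = 2.13    # hourly cash minimum for tipped employees
STANDARD_MIN_WAGE = 5.15  # hourly minimum for other covered, non-exempt employees

def tip_credit() -> float:
    """Hourly tips a server must receive to make up the lower cash wage."""
    return round(STANDARD_MIN_WAGE - TIPPED_MIN_WAGE, 2)

def gratuity_shortfall(bill: float, required=0.18, left=0.10) -> float:
    """Gap between a required 18% gratuity and the 10% this party actually left."""
    return round(bill * (required - left), 2)

print(tip_credit())              # 3.02 per hour made up in tips
print(gratuity_shortfall(200.0)) # 16.0 short on a hypothetical $200 check
```

In other words, the tip isn't a bonus from the restaurant's perspective; it's the part of the server's wage the customer is expected to cover.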

We'll see how this works out, but for the time being, tip your waitresses, I'll be here all week.

UPDATE: The link above now reflects the resolution of the case. No charge against the delinquent tipper; the DA's office contends that if the restaurant wants to mandate a fee for the server, it must be called a "service charge." I mildly disagree with this conclusion. Federal law and much state law treat tip income as wage income for the employee, so treating it differently with regard to patrons is inconsistent. Still, the association of tips with "gratuity," a discretionary payment intended to reflect satisfaction with the service, does reflect a widely held assumption that tips are voluntary. Thus, it is reasonable to question whether the notice here was adequate.

Monday, September 13, 2004

Media Review - Film School

I almost wish now that I hadn't even mentioned it. I watched the first episode of "Film School" this weekend and was just as bored by it as I typically am by reality shows, so I'm not sure I'll be seeing any more of it. I did make it to the end of the episode, where "scenes from future episodes" promise yelling, fighting, and other reality show money shots, but the first episode doesn't live up to the review.

Admittedly, the review refers mostly to scenes that don't appear in the first episode, but I don't think I care enough to follow through. The first episode spent a good deal of time examining the personal situations of these students, things that may very well be relevant to their lives as students but are hardly particular to the experience of film school.

Several of the aspiring filmmaker/students speak of the "calling" that they feel, but I can't tell whether their calling is to be filmmakers or to make a certain film, to tell a specific story. Given the collaborative nature of most filmmaking and the expenses of production, making movies always creates more explicit and unavoidable conflicts between commercial and artistic motivations/justifications than do other art media. One of the students (Barbara?) admits that she's put aside her preference for experimental filmmaking (which she refers to in an ambiguous way that seems to be a synonym for non-narrative) in order to perhaps make a living at it. I don't have a problem with this at all, but her subsequent choice of project suggests that she has not entirely made peace with the commercial/artistic tension. The best filmmakers, it seems to me, are distinguished primarily by the way in which they deal with this tension.

The series, and these students' careers, are young, so it's probably too early to expect much from either at this point. The question is, do I really care enough to continue watching it? I guess I was expecting the series to be of particular interest as a document of beginning independent filmmakers, but it's too much like a typical reality show. I shouldn't have expected otherwise.

Friday, September 10, 2004

Perversely interesting

A review of a new reality series from the Independent Film Channel, Film School, makes the series sound irresistible.

I'm not a fan of reality television. I admit that I can watch a few minutes of some reality shows, but whatever appeal the Real World or Blind Date may have for me is burned off in about two minutes. I did enjoy about twenty minutes of the first episode of the second season of "The Surreal Life" recently, just watching Flava Flav fail to recognize any of his fellow celebrity roommates. One of the highlights of living in NY for the last few years was getting to hear Flav give the "Geographic Traffic" reports in the mornings on 105.1 FM. He gave a memorable shout out to Robert Blake shortly after his legal troubles began, saying (as I recall): "To Robert Blake, who shot his wife with a Beretta... it's gonna be ahh-ite!"

Anyway, back to "Film School," which follows four NYU film students through their efforts to make student films. From the review, I'm delighted at the promise of watching vacuous, inarticulate, annoying cinema artistes flounder around hopelessly trying to negotiate the hazards of marginal film production and the even more treacherous terrain of their own pretensions. I never watched "Project Greenlight" in part because it was on HBO (which I don't get) and in part due to the "dream come true" frame of the series, as Miramax darlings (still?) Ben and Matt lift some dedicated visionary into the "real world" of film production. "Film School" strikes me as a work about what happens before the visionary is discovered, combined with an inside look at how college students work. I'll know more when I see it, of course.

Think what you will about jealousy or bitterness motivating my enjoyment of these would-be auteurs' humiliations (and they will be humiliated, as this is a reality show) but you'll be wrong. Despite my interest in film criticism, theory, and watching, I never for an instant entertained the notion of being a filmmaker. I want to make a movie about as much as I want to make a living as a pizza-chef: not at all. I appreciate that some of them exist, but I don't wish to join them. I'm a hater, and there's no room for jealousy in hating.

"Film School" premieres tonight at 10:30 and a review will be forthcoming.

Wednesday, September 08, 2004

Stark reminder of inevitable mortality

I have 249 movies on my Greencine queue. I know that this is not a lot, compared with several denizens of the public discussion boards, but it's a bracing figure nonetheless. Greencine (for those who don't know) is one of those services that mail DVDs to your home with return envelopes for a monthly fee... yes, like Netflix. I had a Netflix account for nearly a year, but Greencine has more of the cult horror and foreign flicks I want to watch, so I switched, even though their lack of distribution centers means that I have to wait about three days for transit (I keep stats, so when I say "about three days" I mean an average of 3.73 days for arrival and 3.15 days for return.)

My plan allows three discs out at once, and since the beginning of the year I've averaged 12.94 days between the point a disc goes out to me and the point it is received back by Greencine. This includes two discs that we held onto for over a month (42 and 41 days) for some reason. We've averaged 6.63 discs per month so far this year, which means that even if you round up and grant us seven discs per month, it will take me over 35 months to watch all the items on my list right now, let alone things I haven't put on yet. So, I've got about three years of DVDs listed there.
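Since I keep stats anyway, the queue arithmetic above can be sketched in a few lines (a toy calculation using only the figures quoted in this post; the function name is mine):

```python
# Rough queue math from the stats above: 249 titles in the queue,
# about 7 discs per month after rounding the observed 6.63 up.

def months_to_clear(queue_size: int, discs_per_month: float) -> float:
    """Months needed to watch every title currently in the queue."""
    return queue_size / discs_per_month

print(round(months_to_clear(249, 7), 1))     # 35.6 -- "over 35 months"
print(round(months_to_clear(249, 6.63), 1))  # 37.6 at the actual observed rate
```

Either way, the answer rounds to roughly three years of viewing already committed.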

The result is that every time I put a new disc on the queue, I must resign myself to the reality that I may not see the movie for years. Actually, it's worse than that; I may never see some of the movies on that list. I move things up the list if I really want to see them, but I've clicked any number of intriguing titles that I know little about or that I've always "wanted to" (read "thought I should") see, and they languish still at 222 (The Big Clock, if you must know) or 163 (Time Regained.) Not only are these items far from the top of the queue, they are regularly leapfrogged by other movies, meaning that they may stay in the ungainly and unloved middle of the queue for long spells. Many items at the top are high-priority, while those at the bottom are recent additions and thus more likely to be favored with a move to the top. Those in the middle are likely forgotten and must, like the snail in that math puzzle, creep up slowly only to slide back down in idle moments.

I may die with The Big Clock still hovering in the second quartile of my Greencine queue. Any insult to Charles Laughton is entirely unintentional.

Thursday, September 02, 2004

So far...

... not so good. From the pattern of conscious neglect I've shown over the last week and a half, it does not appear that I will have the problem of spending too much time on the blog. I have been busy lately, but not entirely without moments here and there to patch on a link or post that I've let slip past.

Oh, well. Content is what I require, although I have not yet settled the boundaries of acceptable disclosure here on the blog. Certain things need to be known, otherwise I couldn't really write anything. I am an educator (hence my recent busyness) and teach about law and courts. Thus, I follow politics and what's going on in the judicial branches (state and federal.) Much of my spare time is devoted to reading and other media, typical stuff, really.

I have no guilty pleasures. I'm always a bit puzzled when someone refers to something as a "guilty pleasure," although I think I know what the phrase is intended to mean. There are all sorts of things I like (movies, music, books, whatever), things that I enjoy and that give me pleasure, that I don't actually think have great value. To the extent that these things are "guilty pleasures," I have many, but they don't make me feel guilty. It seems to me that you should only feel guilty if you allow your peculiar, idiosyncratic tastes to govern your assessments of value. I try to make clear distinctions between things I like and things that I think are "good" or have significant value of some form.

For instance, I like horror. Written, filmed, whatever... I like horror. I watch a lot of bad horror movies and enjoy some of them. That doesn't mean I'd recommend them to others or claim that they are good, just that I like them. Some horror, I believe, actually is good and should be appreciated by people even if they don't have an affection for the genre, but that's a separate assessment.

One thing that's worth noting, though, is that I don't like many things. I dislike many of my friends' favorite movies, books, music, etc. and often find that very popular or critically acclaimed works leave me cold. I'm also not terribly enthusiastic about things I like. Now, I can be enthusiastic about things I don't like. I actually find I'm often quite worked up about things that suck, yet are widely praised. I'm sort of a hater, dedicated to heaping disparagement and scorn on things other people cherish. It's a calling.

Well, if that isn't a dedication of principles, I don't know what is.