Saturday, December 31, 2011

"War Horse" and World War I

I strongly recommend “War Horse,” which is one of the best movies of the year.  Let me warn readers in advance that it is very sentimental and if anyone but Steven Spielberg had directed the film, it would not have worked.  The movie calls attention to World War I, a conflict that most Americans have relatively little knowledge of and that has long been overshadowed in popular culture by World War II.

In June 1914, a Bosnian Serb nationalist assassinated Archduke Franz Ferdinand of Austria in Sarajevo, activating a web of military alliances that drew Europe into World War I, the first major European war in a century.  Crowds across the continent cheered the coming of the conflict, and both sides expected a short war.  Instead, the struggle between the Allied Powers of Britain, France, and Russia and the Central Powers of Germany, Austria-Hungary, and the Ottoman Empire dragged on for over four years, killing 9 million soldiers and devastating Europe.  When it finally ended on November 11, 1918, it was called the Great War or “the war to end all wars.”

World War I played an instrumental role in shaping the rest of the 20th century.  By its end, the German, Austro-Hungarian, Russian, and Ottoman Empires had collapsed.  The harsh peace imposed by the victorious Allied Powers in the Treaty of Versailles weakened the European economy, paving the way for the Great Depression of the 1930s.  In particular, the settlement imposed huge reparations on the defeated Germans and demanded they accept guilt for starting the war, virtually strangling the democratic Weimar Republic at its birth.  Adolf Hitler and the Nazi Party flourished in this climate, building political support by claiming that Jews and others had “stabbed the country in the back.”  The war weakened the czarist regime in Russia, resulting in the Bolshevik Revolution in 1917 and the creation of the Soviet Union.  Finally, after promising the Arab world independence in exchange for joining the fight against its colonial overlord, the Ottoman Empire, Britain and France instead carved up the region between themselves.  They created countries like Iraq and Jordan, which had not previously existed, drawing national borders to suit their own interests.  These artificial boundaries are responsible for many of the problems in today’s Middle East.

Of course, “War Horse” does not focus on such weighty geopolitical issues.  (SPOILERS) It is the story of a young British boy who trains a horse that his family must sell to the British military in order to keep their farm.  In the course of the war, the horse is used by the British, German, and French militaries on the Western front in France.  Joey, as the horse is called, is never used by the U.S. Army, which only arrived in 1918.  Under the leadership of President Woodrow Wilson, the United States tried to avoid entering the conflict, only declaring war in April 1917 following Germany’s resumption of unrestricted submarine warfare.  Given our small peacetime army at the time, it took almost another year to train units to send across the Atlantic, but our fresh troops eventually helped bring about an earlier end to the war.

The film illustrates how combat during the First World War differed from previous wars.  New industrial technologies facilitated a different kind of battle than that seen in the biggest Western conflict of the 19th century, the American Civil War.  While armies still used the cavalry charges of the last century, weaponry like machine guns and tanks led to higher casualties, as the two sides got bogged down in trench warfare for four years in France.  Weapons of mass destruction like poison gas were used for the first time.  Spielberg’s depiction of the fighting in No Man’s Land, the region between the trenches, is every bit as impressive as the now-famous portrayal of D-Day in the opening 30 minutes of his “Saving Private Ryan.”

Many historians consider World War I the first “total war,” in which civilians experienced the full impact of a conflict.  (Minor spoilers)  The section of the film where the French Army repeatedly confiscates goods from a local farmer and his granddaughter illustrates this phenomenon.  Little ideology or politics is discussed in the movie, which is largely about soldiers and civilians trying to maintain their humanity during the tragic conflict.

When the war was finally over, the Allies, including the United States, emerged victorious, but at a huge cost that would echo for generations.  Yet most Americans know little about the conflict, even though 53,000 American soldiers died in combat.  The U.S. was only involved in the war for 18 months and did not suffer the heavy losses the Europeans did, while our home front was not nearly as affected.  Furthermore, World War II, commencing a mere 20 years later and causing even greater destruction, is much more dominant in American memory.  Few major films have depicted World War I, and none has achieved commercial success comparable to that of movies about the Civil War, World War II, or Vietnam.

“War Horse” will likely be a nominee for best picture and provide yet another best director nomination for Spielberg.  It almost makes me forget how much I disliked “Adventures of Tintin.”  More importantly, when I discuss World War I with my students, I will now be able to cite a film to illustrate some of my points.

"Dark Knight Rises" trailer

As memory of the 9/11 terror attacks recedes, more and more films are influenced by the next major event of the last decade, the Great Recession.  “Wall Street: Money Never Sleeps” (2010) and “Margin Call” (2011) were both quasi-documentary examinations of the global financial crisis.  And just as there were films and television shows dealing with 9/11 in metaphorical fashion, it appears this is beginning to occur with the economy.

“Batman Begins” (2005) and its sequel, “The Dark Knight” (2008), both featured heavy echoes of the war on terror.  In particular, “The Dark Knight” could be claimed by both sides in the debate over Bush-era surveillance policies.  From the trailer, it appears the conclusion to Christopher Nolan’s trilogy, “The Dark Knight Rises” (2012), may comment heavily on income inequality and other issues that have emerged as part of the national debate since 2008.  Selina Kyle, a.k.a. Catwoman, tells the billionaire Bruce Wayne, a.k.a. Batman and a charter member of the 1 percent: “You think this will last. There's a storm coming Mr. Wayne. You and your friends better batten down the hatches. Because when it hits, you'll wonder how you ever lived so large and left so little for the rest of us.”

Occupy Wall Street, anyone?

Thursday, December 29, 2011

Mission Impossible: Ghost Protocol

I highly recommend “Mission Impossible: Ghost Protocol,” which I believe is the best action film of the year.  When I wrote about the more serious tone of post-9/11 action films in September, I discussed several movie franchises, but not the Mission Impossible series.  I did so in large part because the films have not been particularly memorable.  Indeed, although it is his only film franchise, it is safe to assume that none of the first three movies will be remembered as among the most important of Tom Cruise’s long career.
Upon further review, however, Mission Impossible underwent the same transition to greater seriousness that Batman and James Bond did between the 1990s and the post-2001 era.  The first two films, released in 1996 and 2000, respectively, were crowd-pleasers and little more.  The third film, which was directed by J.J. Abrams (“Lost,” “Star Trek”) and released in 2006, featured a much darker story in which the villain’s scheme involves a pseudo-neoconservative plot to create a pretext for a pre-emptive strike against terrorists.  At one point, it appears that Ethan Hunt’s (Cruise) wife has been murdered (it turns out to be someone else wearing a mask).  Premiering shortly after Cruise’s well-publicized engagement to Katie Holmes and his couch-jumping exploits on Oprah, the film underachieved at the box office.
“Ghost Protocol,” the fourth film in the series, may signal the emergence of a lighter touch in action films, as the memory of the attacks on the World Trade Center and Pentagon begins to fade.  Like this past summer’s “X-Men First Class,” it bears a strong resemblance to a pre-Daniel Craig Bond film.  (SPOILERS)  Like many Bond baddies, the film’s megalomaniacal villain manipulates U.S./Russian tensions to his own nefarious ends.  In this case, the antagonist steals Russian nuclear codes in the hopes of precipitating a global nuclear war.  While “Ghost Protocol” features a spectacularly filmed 21st century terrorist attack on the Kremlin, the film is more reminiscent of the over-the-top, Roger Moore-era Bond movies like “The Spy Who Loved Me” (1977) and “Moonraker” (1979).  As I said in the September post, it was too early to tell if the serious strain would last and by the same logic, “Ghost Protocol” is not sufficient proof that campiness is returning.  Indeed, the “Dark Knight Rises” trailer appeared before the film and it looks deadly serious (more on that in the next post).
The film may also herald a comeback for Cruise, who has been the biggest movie star of his generation, but has not had a hit in some time.   Hollywood star power has clearly faded in recent years as Cruise, Harrison Ford, and to a lesser extent, Tom Hanks have appeared in a number of commercial bombs.  Only Will Smith can guarantee a huge gross these days, a fact that will be severely tested by the release of the highly unnecessary “Men in Black III” next year.  Time will tell if Cruise can regain the star status he maintained from the mid-1980s to mid-2000s and if “Ghost Protocol” represents significant cultural change.

Sunday, December 25, 2011

Terra Nova

In the years since 9/11, television has presented the dilemmas of the war on terror in a number of different ways.  “24” did so directly, through a series of relatively realistic nuclear, biological, and chemical crises.  “Battlestar Galactica" presented a sci-fi metaphor, as the human race tried to survive and maintain its values after it is nearly destroyed by the Cylons.  “Lost” showed them indirectly and occasionally, as when the Losties decided how to obtain information from captured members of the mysterious “Others” who also lived on their island.
“Terra Nova” is yet another example of post-9/11 culture and it combines elements of all three shows.  In its “Galactica”-like premise, man has made the planet unlivable in the 22nd century through greed and environmental destruction.  Fortunately, humanity discovers a time fracture that allows people to make a one-way pilgrimage back to the pristine era of the dinosaurs.  The formula is basically “Jurassic Park” meets “Lost.”  Indeed, the Terra Nova colony looks suspiciously like “Lost’s” Dharma Initiative while the colony is routinely threatened by a subversive group of colonists called the “Sixers,” who seem suspiciously similar to the “Others.”  Humanity must survive amidst dangerous predators as well as a fifth column that wants to make the time fracture go both ways so they can plunder the past for their own financial gain in the future.
In this environment, the show depicts the challenges of the post-9/11 period.  Like “Galactica,” the colonists struggle with how to govern while under constant threat.  As on “Lost,” a potential subversive is tortured and held in inhumane conditions to get him to talk.  As in virtually every season of “24,” the colony leader, Colonel Taylor (whose name is likely an homage to Charlton Heston’s character in “Planet of the Apes”), must ferret out a mole within the community’s ranks.
Despite the intriguing premise, the show is extremely formulaic.  Crises are neatly resolved and there is little drama.  While Taylor, played by Stephen Lang (the villain in “Avatar”), is interesting, the show focuses on the Shannon family, who are extremely boring.  Much attention is paid to the adolescent angst of their teenage children, whose portrayal falls well short of “Buffy” or even “Smallville” standards.  It reminds us that great science fiction is not merely about impressive special effects, but about characters that people care about.  Frankly, I’m so annoyed by these characters that I’m rooting for the dinosaurs to eat some of them.
As I’ve said previously, the failure of serialized shows like “V” and “The Event,” along with the success of procedurals like “Person of Interest,” seems to indicate that we are moving past the post-9/11 era in popular culture.  On the other hand, the former programs were also weaker than their predecessors.  Perhaps “Terra Nova” will improve if Fox renews it for a second season, but I’m not terribly optimistic.

Monday, December 19, 2011

The Tim Tebow Phenomenon

As a Florida Gators fan, I certainly hoped Tim Tebow, who was one of the greatest college quarterbacks ever, would have a successful NFL career.  But I never would have believed he would attain a rock star status even greater than he had in college football.  In the last week, Tebow, now the starting QB for the Denver Broncos, appeared on the cover of Sports Illustrated and was satirized on “Saturday Night Live.”  Furthermore, in the last Republican debate of 2011, Rick Perry declared that he would like to be the “Tim Tebow of the Iowa Caucus,” referring to Tebow’s patented fourth-quarter comebacks as well as the quarterback’s evangelical Christianity, which is very popular with social conservatives in the Hawkeye State.  Though Sports Illustrated and “Saturday Night Live” are central institutions of an earlier era that have faded in relevance, few athletes could have pulled off this pop culture trifecta.  The Tebow phenomenon culminated with Sunday’s Denver Broncos-New England Patriots game, which earned the second-highest ratings for an afternoon game on CBS since the network bought the rights to the AFC package in 1998.  In a matter of weeks, Tebow has gone from the backup quarterback on a losing team to perhaps the biggest sports phenomenon in the country, crossing boundaries into politics and entertainment.

Wednesday, December 14, 2011

Colonel Potter and the Evolution of M.A.S.H.

When Harry Morgan joined the cast of M.A.S.H. for the start of its fourth season in 1975, the show still largely followed the slapstick formula of the 1970 film.  Morgan himself had played a minor role in this regard, with a guest appearance as a crazy general in a third-season episode.  Such portrayals were typical of the show, which often depicted the military leadership as inept and out of touch.  These representations grew out of the cynical spirit of the anti-Vietnam War protests of the 1960s and early 1970s.  As the program continued through the late ’70s and early ’80s, however, it took on a more serious tone, a shift that began when the respectable Colonel Potter replaced the hapless Colonel Blake, who had run the 4077th for the first three years of the program.
The film depicted the Korean War of the early 1950s, but M.A.S.H. was clearly intended to serve as an allegory for the Vietnam War, which was still underway in 1970. In fact, the studio asked the filmmakers to add references to Korea to the movie, because director Robert Altman and others had tried to make the backdrop look as much like Vietnam as possible. The television show continued to follow this formula.   It is ironic that the most famous pop culture representation of the Korean War, often called the “Forgotten War,” is thought of as the portrayal of another conflict.
In its early years, M.A.S.H. was a traditional comedy, as the irreverent Captain Hawkeye Pierce (Alan Alda) and Captain “Trapper John” McIntyre (Wayne Rogers) played practical jokes on Major Frank Burns (Larry Linville) and Major Margaret “Hot Lips” Houlihan (Loretta Swit), who represented traditional military values as well as the conventional American patriotism that had come under attack in some circles during the 1960s.  The unit’s commanding officer, Colonel Henry Blake, was a well-meaning but bumbling leader who was manipulated by Pierce and McIntyre with some assistance from Blake’s right-hand man, Corporal Radar O’Reilly (Gary Burghoff).
With the departure of Blake and McIntyre and the arrival of Potter and B.J. Hunnicutt (Mike Farrell), the show began to take on a more dramatic tone.  This change accelerated when the arrogant but competent Major Charles Winchester (David Ogden Stiers) replaced the incompetent Burns.  Furthermore, the Margaret Houlihan character evolved from being the butt of jokes to a Mary Richards-like character, mirroring the prominence of the women’s movement at the time.  Even Corporal Klinger (Jamie Farr) stopped trying to get thrown out of the military, put away his dresses, and became an effective company clerk.
The evolution of the program came about not only from cast changes, but from the growing role of Alan Alda in the writing and producing of the show.  Alda was increasingly involved in liberal causes, becoming a leading advocate for the passage of the Equal Rights Amendment (ERA).  As some scholars have noted, male roles changed due to the emergence of 1970s feminism, moving away from the machismo of John Wayne to the sensitivity of Alda.  Reflecting this sensibility, Pierce, Alda’s on-screen alter ego, grew from an inveterate womanizer into a character who was frequently seen crying in episodes during the later years of M.A.S.H.  This culminated in Hawkeye’s nervous breakdown during the series finale, “Goodbye, Farewell, and Amen.”
Fans of M.A.S.H., like fans of Woody Allen, frequently debate: which was better, the early, funny years or the later, dramatic period?  M.A.S.H. was one of my favorite shows as a kid and I used to prefer the dramatic era, but I’m no longer sure.  The antiwar message becomes a little tired and I don’t need a TV show to repeatedly tell me that “war is bad.”  It also saddens me to say the funny period isn’t as funny as I remember it being.  Still, whichever time frame you like most, the shift began when Colonel Potter arrived at the 4077th.  Cue the theme music.

Thursday, December 8, 2011

Colonel Potter's Death and the Fracturing of American Culture

The death of Harry Morgan, who played Colonel Sherman Potter on “M.A.S.H.” from 1975 to the show’s conclusion in 1983, reveals the decline of mass culture, which has been one of the major themes of this blog.  At various times yesterday, Morgan’s obituary was the most viewed article on NewYorkTimes.com, which is incredible for the death of an actor who played a supporting role in a show that went off the air nearly thirty years ago. Of course, “M.A.S.H.” has lived on in reruns since, but it demonstrates the incredible followings that television programs could achieve before cable and how they provided a unifying culture for much of the nation.
With only three networks, hit shows such as “M.A.S.H.” drew ratings that are inconceivable in the 500+ channel universe of today.  I saw an article a few years ago showing that “American Idol,” the biggest hit of the last decade, has an audience comparable to that of “Scarecrow and Mrs. King,” a middling show which aired for four years in the 1980s.  Most famously, the last episode of “M.A.S.H.,” “Goodbye, Farewell, and Amen,” which aired in 1983, remains the most watched non-sports program of all time in terms of total viewers, a record that is likely to last for some time, even with the considerable growth in the population.  Water usage in some cities increased dramatically during commercials for the final episode, as the nation collectively went to the bathroom (few VCRs and no DVRs in 1983!).  Indeed, “Goodbye, Farewell, and Amen’s” overall record for total viewership lasted until 2010, when it was broken by Super Bowl XLIV between the New Orleans Saints and Indianapolis Colts.
The long reign and domination of ABC, NBC, and CBS meant that large swaths of the country watched the same or similar programs.  There were fewer differences in viewership based on race, age, or ethnicity.  Most Americans watched Lucy Ricardo have her baby, saw Richard Kimble finally catch the one-armed man, heard Walter Cronkite narrate the moon landing, and discovered that J.R. Ewing had been shot by his secretary.  Gradually, though, cable networks emerged to cover specific subjects, like CNN for news and ESPN for sports.  This specialization evolved into news networks for liberals (MSNBC) and conservatives (FOX News) and sports networks for football (NFL Network) and golf (Golf Channel).  Today, even the broadcast networks tend to target niche markets, with Fox pursuing the 18-49 age group while CBS focuses on older viewers.  This has led to some of the fracturing of the culture I discussed in my earlier entry on the music industry; people no longer listen to the same artists or watch the same television programs.
Of course, one would not want to get too nostalgic, as anyone who has tried to watch 10 minutes of “CHiPs” or “Knight Rider” in recent years can attest.  Cable has brought about a flowering of quality programs as HBO, FX, and AMC have produced innovative fare like “The Wire,” “Nip/Tuck,” and “Mad Men.”  The broadcast networks responded with shows like “The West Wing,” “Lost,” and “30 Rock,” programs that likely would not have lasted long a generation ago.  Colonel Potter’s death is a reminder of what was lost as well as what we have gained.

Sunday, December 4, 2011

J. Edgar

I strongly recommend J. Edgar, which is a very interesting look at the life and career of FBI Director J. Edgar Hoover.  Directed by the ageless Clint Eastwood and starring Leonardo DiCaprio, the film focuses heavily on Hoover’s close relationship with his right-hand man, Clyde Tolson, and suggests that the relationship went beyond friendship to a largely unrequited romance.  While many have speculated on Hoover’s sexuality and while the true nature of the Hoover/Tolson relationship can never truly be known, J. Edgar provides a fairly accurate look at the public aspects of his career.
The film starts with Hoover’s pre-FBI role as a young government agent involved in the 1919-20 Palmer Raids, an effort to root out domestic communism after the end of World War I.  As the movie shows, it was prompted by a bombing campaign against several public officials that was blamed on American Communists.  The Justice Department, led by Attorney General A. Mitchell Palmer, engaged in extreme and legally questionable attempts to stop what they saw as a conspiracy against the country, deporting many radicals despite the fact that they had no criminal record.  The Palmer Raids, part of what is often called the First Red Scare, have been largely overshadowed by the Second Red Scare, led by Joe McCarthy in the 1950s.
The movie then focuses on Hoover’s attempts to build the FBI into a modern crime-solving agency, pioneering techniques, like fingerprinting and early expert testimony, that echo today’s CSI.  He used the gangster activity of the 1930s to leverage a greater federal role in crime policy, an issue which had traditionally been left to state and local governments.  Some of these events, like the FBI’s response to the bank robbery campaign of John Dillinger, were previously depicted in the 2009 film Public Enemies.  The hysteria over the kidnapping of the Lindbergh baby and the eventual trial of Bruno Richard Hauptmann for the crime also led to greater power and prestige for the FBI.
During the film, these events are juxtaposed with an older Hoover’s obsession with Martin Luther King, Jr. during the 1960s.  J. Edgar shows Hoover obtaining Attorney General Robert Kennedy’s acquiescence to the wiretapping of Dr. King by implicitly threatening RFK with material documenting President Kennedy’s affair with a woman from behind the Iron Curtain.  It also shows Hoover dictating a letter to accompany a tape recording of one of King’s affairs; the combination seemed intended to compel King to commit suicide.  In reality, William Sullivan, an Assistant FBI Director, composed the letter.  Still, the basic thrust of this section of the film is accurate, even if all the details are not.
At the end of the film, Hoover meets with a recently elected Richard Nixon and tells Tolson that the new president wants greater control over the FBI and that Nixon will create his own apparatus if he does not cooperate.  Though I don’t believe there is evidence of such awareness on the part of Hoover, it dovetails with the historical record.  Nixon wanted the FBI to do his bidding with regard to monitoring his political enemies, and when Hoover refused, he moved to create his own “Plumbers,” who would work to investigate leaks and gather intelligence against his political opponents.  The break-in at the Watergate, which, of course, led to Nixon’s downfall, was the most famous act of the Plumbers.
Of course, time limitations forced Eastwood and the writers to neglect aspects of Hoover’s career.  Largely omitted was Hoover’s central role in the Second Red Scare of the 1940s and 50s.  By 1960, because of his infiltration campaigns, a majority of the members of the American Communist Party were actually FBI informants!  As the film shows, Hoover remained obsessed with domestic communists long after they were a significant social and political force.
In addition to his persecution of King, Hoover actively opposed the civil rights movement for a half-century, harassing a series of black leaders and organizations, beginning with Marcus Garvey in the 1920s and continuing with the FBI’s COINTELPRO program to disrupt the Black Panthers and other black nationalist groups during the late 1960s.  Though Hoover was often eager to extend the FBI’s influence, he refused to provide any protection to civil rights workers operating in the Deep South in the 1960s.  Indeed, the FBI did not even open an office in Mississippi until after the “Mississippi Burning” killings of Michael Schwerner, Andrew Goodman, and James Chaney in the summer of 1964.  A greater FBI presence might have prevented their murders as well as some of the other acts of white terrorism in the Deep South.  Hoover, though, did eventually use the same tactics he had used against the Communist Party to weaken the Klan.
All in all, J. Edgar is an interesting look at a complex and important historical figure and is a relatively accurate film.  For a change of pace, I am now going to see the Muppets!

Saturday, November 12, 2011

"Margin Call" and the Financial Crisis

I modestly recommend Margin Call, a new film that dramatizes the 2008 financial crisis.  It features a number of fine actors who give excellent performances, including Kevin Spacey and Jeremy Irons (in the best roles either has had in a while), as well as Zachary Quinto (Spock in the Star Trek reboot).  The movie is a bit slow, but it is still clearly superior to Oliver Stone’s Wall Street: Money Never Sleeps (2010) in its depiction of the meltdown.
The film is about an investment bank that is clearly an allegory for Lehman Brothers.  Early in the movie, Quinto’s character discovers that the mortgage-backed securities around which the firm has built its business model have brought the company to the verge of bankruptcy.  Reflecting the round-the-clock machinations that surrounded the sale of Bear Stearns, the bankruptcy of Lehman, and more recently, the demise of Jon Corzine’s MF Global, the film then follows the characters over the next 24 hours as they try to save the firm.
Margin Call reflects a number of the issues which surrounded the financial crisis.  A number of commentators have observed that the extraordinary compensation which emerged on Wall Street distorted the American economy by draining talent from other sectors.  Indeed, Quinto’s character turns out to be an MIT-educated physicist who went to work on Wall Street because of the higher pay. Another character was once an engineer.  Some have also observed that it was these types of people, with their mathematical and technical acumen, who conceived of the complex financial instruments which helped precipitate the crisis.
The film also discusses the extravagant lifestyles enjoyed by Wall Street traders.  Various characters seem to have extended themselves financially despite their immense wealth (or perhaps because of it).  Spacey plays a burned-out trader who wants to get out of the business, but decides to stay at the end of the film because he still needs the money after a lifetime working in the industry.
Irons, who plays the unnamed firm’s CEO, tries to rationalize the disaster at the end of the film, telling Spacey that the crisis was unavoidable because it is just another in a series of bubbles that have occurred throughout the history of capitalism.  This seems to be an echo of the explanations, or one might argue, rationalizations, provided by the heads of major investment banks since 2008. In their minds, new regulations like Dodd-Frank are unnecessary because the debacle of the last few years was not due to their irresponsible and unethical behavior, but due to forces beyond their control.


Wednesday, November 9, 2011

Joe Frazier's Death and the Decline of Boxing

Joe Frazier’s death and the ensuing recollections of his three battles with Muhammad Ali remind us how far boxing has fallen in American culture.  One of the three most popular sports during the first half of the 20th century, along with baseball and horse racing, it now barely gets a mention on SportsCenter.  While baseball may no longer be the national pastime, boxing is simply irrelevant.
In the late 19th and early 20th centuries, boxing was primarily a working-class sport, fought largely by immigrants in major cities.  By the 1920s, with the decline of Victorian values and changing social mores, boxing became more respectable and popular among middle-class Americans.  Furthermore, the emergence of radio allowed Jack Dempsey, the heavyweight champion of the era, to become a national figure, like Babe Ruth and other sports heroes of the time.
Boxing differed from baseball in that it was somewhat integrated.  Malcolm X once noted that the boxing ring was the one place a black man could beat up a white man without getting killed.  Jack Johnson became the first African American heavyweight champion in 1908, but the racism of the time eventually resulted in his criminal prosecution over his relationships with white women.  It would be another generation before a very different black fighter, Joe Louis, got a chance to fight for the crown.  His defeat of the German champion Max Schmeling in a title defense in 1938, at a time of tremendous tension between the U.S. and the Nazi regime, was one of the biggest sporting events of the 20th century.  This triumph, as well as his humble manner, endeared the “Brown Bomber” to blacks and whites alike, making him the first crossover sports star.
The sweet science, as some called it, remained popular into the postwar period.  Champions like Rocky Marciano and Sugar Ray Robinson were among the most prominent athletes.  And it was not just the championship battles that were important.  Boxing remained a spectator sport at the local level as well; I once showed an episode of I Love Lucy in class where Fred and Ricky go on a boys’ night out to the fights, the way one might go to a basketball game today.
Though some date the beginning of the sport’s decline to the 1960s and 1970s, the heavyweight champion of the world was still one of the best-known people in the nation, if not the world.  Ali, Frazier, and George Foreman fought each other in battles that have still-legendary names like the “Rumble in the Jungle” and the “Thrilla in Manila.”  The “Rocky” film franchise began in 1976, helping to maintain the sport’s popularity.
Even with the charismatic and controversial Ali no longer on the scene, boxing still had some prominence in the 1980s.  Sugar Ray Leonard, Marvin Hagler, and Thomas Hearns fought major fights in the welterweight and middleweight divisions that garnered national attention.  Sportswriters cared enough to label heavyweight champion Larry Holmes an unworthy successor to Ali and to laud Mike Tyson when he unified the division in the late 1980s.
Over the last two decades, boxing fell off a cliff for a number of reasons.  After Tyson went to prison, no fighter emerged who engaged casual fans.  The Olympic Games had launched the careers of a number of boxers, including Ali, Frazier, and Leonard, but network coverage of boxing declined as NBC pursued the female demographic.
Finally, there is no doubt that public revulsion at the corruption and physical costs of boxing caught up with the sport.  Reminiscences of Frazier all recall that neither he nor Ali was the same after their third fight.  Indeed, there is nothing sadder than the sight of the once-loquacious Ali, now silenced by Parkinson’s disease.  Many other boxers have had long-term health problems; a few have even died in the ring.
Today, when I teach about Louis or Ali in class, I have to remind my students that the heavyweight champ was once a very important person.  I ask them who the current champion is and there is usually a deafening silence.  And not just from the students.  The professor doesn’t know either.


Thursday, November 3, 2011

How Football Continues to Dominate America

In a previous post, I discussed how pro football has come to dominate American sports and supplant baseball as the national pastime.  Ironically, the compelling seven-game World Series between the St. Louis Cardinals and Texas Rangers only made this clearer.
On the one hand, this year’s World Series garnered television ratings 19 percent higher than last year’s five-game set between the Rangers and San Francisco Giants.  While a seven-game series should get a higher rating than a five-game series, any time the ratings increase for a sporting event these days it must be considered a success, given the continuing growth of other entertainment options.
A closer examination of the data, though, shows the relative strength of football.  Though World Series Games 3 and 4 beat the Sunday and Monday night football games head-to-head in viewership, this past week’s Sunday night game between the Cowboys and the Eagles got a higher rating than Game 6, a dramatic affair that will go down as one of the greatest World Series games ever.  While Game 7 of the World Series got higher overall viewership than Cowboys-Eagles, it performed worse among the 18-49 demographic coveted by advertisers.  Unbelievably, more young people watched a regular season football game than the first Game 7 of a World Series since 2002.
Indeed, the World Series lagged the NBA Finals for the second straight year, providing a cautionary note regarding the potential costs of the current basketball lockout.  The tremendous interest generated by LeBron James’ move to the Miami Heat, culminating in their loss to the Dallas Mavericks in 2011, boosted the sport’s fan base.  Those gains could be squandered if significant portions of the 2011-2012 season are lost.
It will be interesting to see how the ratings for this weekend’s college football “Game of the Century” between LSU and Alabama measure up against the World Series.

Wednesday, November 2, 2011

20th Anniversary Albums and the Changing Nature of the Music Industry

The 20th anniversary re-releases of Nirvana’s Nevermind and U2’s Achtung Baby, along with the breakup of R.E.M., made me think about the changing nature of the music industry. One of the major trends of the last 30 years has been the relative decline of mass culture and the concomitant rise of niche culture.  For example, as recently as the mid-1980s, the three broadcast networks still dominated the ratings and there were only 2-3 blockbuster movies per summer.  Nowhere has this change been more dramatic than in the music business.
After rock n’ roll emerged in the mid-1950s, singles were the dominant way people bought music.  By the late 1960s, following the success of works like the Beatles’ landmark Sgt. Pepper’s Lonely Hearts Club Band in 1967, albums became the dominant medium.  Indeed, album sales outpaced singles for the first time in 1968.
The dominance of albums continued into the 70s and 80s.  Indeed, in some ways, the 1980s were the peak of the mass culture era in pop music.  Artists released albums, singles from those albums were played on FM radio, and videos of the singles went into heavy rotation on MTV (yes, it’s true, they did once show videos on MTV).  As a result, albums like Michael Jackson’s Thriller (1982), Bruce Springsteen’s Born in the USA (1984), and U2’s The Joshua Tree (1987) reached an extraordinary audience.  Due to the combination of radio and MTV, some songs got massively overexposed.  To this day, I change the station when one of the hits off The Joshua Tree comes on the radio; I got tired of those songs in the year they came out.  Of course, regional, racial, and ethnic differences in taste remained, but these 1980s stars reached a far wider audience than artists today.  Nevermind and Achtung Baby, along with R.E.M.’s most commercially successful albums, Out of Time (1991) and Automatic for the People (1992), were released at the peak of this period.

In the early 1990s, MTV pioneered reality TV with The Real World, and such shows gradually became more lucrative for the network.  As a result, they eventually supplanted videos, a market that had diminished by the early 21st century.  With the emergence of the Internet, downloading became the way most people experienced music, as iTunes put record stores out of business across the country.  The ease of buying individual songs on iTunes reduced the centrality of albums, and FM radio does not have the audience among young people it had a generation ago.  Consequently, the music industry is much more decentralized, and it is harder for an artist to gain traction outside a certain niche.
This is perhaps best exemplified by the two iterations of the charity anthem “We Are the World,” written by Michael Jackson and Lionel Richie.  The first, recorded in 1985 to support famine relief in Africa, featured a who’s who of the Rock and Roll Hall of Fame, including Jackson, Bruce Springsteen, Paul Simon, Bob Dylan, Tina Turner, and Stevie Wonder, among many others.  By contrast, I barely recognized the younger artists in the 25th anniversary edition, made to back relief efforts in Haiti after the earthquake in 2010.  I first thought this meant that I was getting old and out of touch, until Saturday Night Live satirized the lack of star power in the new version shortly thereafter.
Due to these trends, it is unlikely there will be 20th anniversary issues of albums from 2011 in 2031. I will have more to say about the decline of mass culture in television and film in future posts.

Monday, October 31, 2011

"Person of Interest" and post 9/11 culture

In a post before the summer, I suggested that the post 9/11 era in popular culture was coming to a close.  At first glance, CBS’s new procedural Person of Interest seems to undermine this theory.  In the show, Mr. Finch, played by one of my favorite actors, Lost’s Michael Emerson (the creepy Ben Linus), developed a surveillance system after the attacks that monitors all email, phone calls, and cameras to predict future terrorist acts as well as conventional crimes.  Reminiscent of Steven Spielberg’s 2002 film Minority Report, this seems to reflect post 9/11 concerns about the growth of the state and the potential loss of civil liberties.  The government, however, was so focused on terrorism that it didn’t pay attention to predictions of traditional crimes.  Finch has found a way to tap into the government’s intelligence, which gives him the Social Security numbers of individuals who are either potential victims or perpetrators of crimes.
To stop these crimes, Finch hires an ex-CIA operative, Reese (Jim Caviezel), who had planned to leave the agency but stayed on after 9/11 out of a sense of duty to his country.  While he was fighting the war on terror, someone murdered the love of his life.  At the beginning of the pilot, he is homeless and riding the New York City subway.  He has clearly paid a high psychological price for his actions defending the nation, somewhat like Jason Bourne.
In the show, Finch and Reese work to stop murders and other non-terrorism related crimes.  At the end of the pilot, Reese tells one villain that he went abroad to hunt bad guys, but now realizes “there were plenty of you right here all along.”  In this sense, the show seems to reflect a shift away from the post 9/11 fear of terrorism back to concerns about traditional malfeasance.  In fact, it reminds me a little of a now-forgotten 1980s show, The Equalizer, which featured another ex-CIA agent who stopped crimes the police couldn’t prevent.
I’ll keep watching the show because the premise is interesting and Michael Emerson is one of the best actors on television today.

Thursday, October 20, 2011

The Ides of March

I didn’t like The Ides of March that much, although my parents really enjoyed it.  It seems to be an attempt by George Clooney, who directs and co-stars, to do an updated (and darker) version of Robert Redford’s 1972 film The Candidate.  There are a number of contemporary allegories in the movie, which was based on the 2008 play Farragut North.  In an echo of Barack Obama, Clooney plays an inspiring progressive presidential candidate.  Ryan Gosling plays a political consultant who is in thrall to Clooney and believes he is a man who will change the country.  The Obama connection seems complete when a New York Times reporter accuses Gosling of having drunk the Kool-Aid on Clooney and tells him that Clooney is a politician who will eventually let him down.  Perhaps an echo of the liberal disaffection with Obama?  Without giving away too much, let’s just say that Clooney’s character also has a little bit of Bill Clinton in him and gets in trouble with a sexual indiscretion.
In the film, the Democrats are in the middle of a heated primary that appears to be more important than the general election because the Republicans are in such bad shape.  This seems similar to 2008, when the Hillary-Obama race seemed like the main event given how Bush’s unpopularity was going to hinder the GOP nominee.  Another echo of ’08 is how Republicans are encouraging their flock to vote in the Democratic primary to help Clooney’s opponent, who is perceived to be weaker, and to extend the primary campaign.  This seems eerily similar to Rush Limbaugh’s Operation Chaos, which asked conservatives to support Hillary in open primaries when it appeared Obama was going to wrap up the nomination.  That effort actually helped Obama in the end, forcing him to build a campaign infrastructure across the country and paving the way for his wins in GOP-leaning states like North Carolina and Indiana.
In the end, I would recommend Clooney’s 2005 Good Night, and Good Luck instead for a political film.

Monday, October 10, 2011

"Ich Bin Ein Berliner"

Last night’s Pan Am episode revolved around a real historical event, JFK’s 1963 trip to West Berlin.  On this trip, JFK made one of his most famous speeches, the “Ich Bin Ein Berliner” speech, which doubled as the title of the episode.  In this address, Kennedy identified with West Germany’s struggle against communism by declaring that he was a Berliner (although in German he actually said he was a jelly doughnut).  In order to prevent the continued exodus of educated people to the West, the Soviet Union and East Germany had erected the Berlin Wall, which split the city in two, in 1961.  The wall quickly became the main symbol of the Cold War and the division of Europe.  The show accurately depicts the tremendous excitement inspired by JFK’s visit.

A review of the speech reveals how much it expressed Kennedy’s Cold War liberalism.  While we remember JFK as a liberal icon, he was a hawk who campaigned in 1960 on a platform of being tougher on the Soviet Union.  In the “Ich Bin Ein Berliner” speech, Kennedy declared:
There are many people in the world who really don't understand, or say they don't, what is the great issue between the free world and the Communist world. Let them come to Berlin. There are some who say that communism is the wave of the future. Let them come to Berlin. And there are some who say in Europe and elsewhere we can work with the Communists. Let them come to Berlin. And there are even a few who say that it is true that communism is an evil system, but it permits us to make economic progress. Lass' sie nach Berlin kommen. Let them come to Berlin.
The Wall would remain the most visible symbol of the Cold War throughout the 1960s, 70s, and 80s.  Its fall in 1989 provided the clearest sign of the end of the U.S./Soviet conflict.  West and East Germany would reunify shortly thereafter.
There were a number of other interesting historical notes in the episode.   Some of the reporters covering the speech made oblique references to JFK’s affairs, which were not known to the general public at the time.  In this time before feminism fostered a greater stigma toward adultery and Vietnam and Watergate brought about a more aggressive media, the press did not examine the private lives of politicians.  In fact, the episode probably exaggerates the press’ knowledge of these affairs.
The trip revives the wartime memories of Colette, a French-born stewardess who grew up during the Nazi occupation of France from 1940-44.  She has an angry exchange with German officials over how Kennedy’s visit seemingly gives Germany a pass for its wartime behavior. This part of the episode accurately reveals how the Cold War limited discussion of Nazi crimes because the U.S./West German alliance gave the West an interest in constructing an image of West Germans as “good” guys in the struggle against communism.  Greater discussion of the Holocaust would only emerge in West Germany after a wave of youth protests in 1968 and, if you can believe it, the showing of NBC’s Holocaust mini-series in the late 1970s. 
With the exception of a silly subplot where Christina Ricci’s character goes out of her way to try to meet President Kennedy, I enjoyed the episode.  Indeed, I’ve actually been surprised by the relative quality of the show and will continue to watch and blog.

Tuesday, October 4, 2011

The Return of the Mid-1980s

The imminent release of the Footloose remake made me realize that the last three years have witnessed the revival of a number of television shows/films from the mid-1980s.  Since 2009, studios have produced new versions of: The Karate Kid (original: 1984, remake: 2010), A-Team (1983, 2010), V (1983, 2009), Conan the Barbarian (1982, 2011), G.I. Joe (1982, 2009), and Footloose (1984, 2011).  I found it remarkable that the remakes occurred in a roughly 26-29 year cycle after the original.
A cursory review reveals that this pattern is a familiar one.  The early 2000s witnessed a similar dynamic with: Charlie’s Angels (1976, 2000), The Incredible Hulk (1977-78, 2003), Battlestar Galactica (1978, 2003), and The Dukes of Hazzard (1979, 2005).  One could also add Doctor Who, revived in 2005, to this list.  Though it originally premiered in Great Britain in 1963, the show reached unparalleled popularity with Tom Baker starring as the Doctor in the late 70s.
I think this pattern reveals the (ugh) maturation of Generation X.  Once a generation reaches a certain age and attains a certain level of power within the entertainment industry, it seeks to revive the treasured programs of its childhood.  While this is understandable and welcome in some cases, it does pose some frightening possibilities.  In all likelihood, we are about to see a revival of the late 80s.  There have been persistent rumors of a Quantum Leap film, which I would like to see.  More disturbing prospects include revivals of Full House and Family Matters.

Sunday, October 2, 2011

40th Anniversary of Walt Disney World

This past weekend marked the 40th anniversary of Walt Disney World in Lake Buena Vista, FL.  The park, which is one of the most popular tourist attractions in the world, has made a profound impact on Florida and the United States.
The roots of the Central Florida park lie in the evolution of Disneyland in Anaheim, CA, which opened in 1955.  The park, which capitalized on the affluence of post-World War II America, quickly became a very popular attraction.  In the 1950s, more and more Americans had disposable income and paid vacations and could make the trip to Southern California.  Despite this success, though, Walt Disney quickly became disturbed by the seedy hotels and other businesses opening just outside the Magic Kingdom.
As he contemplated a second park, Disney wanted a larger area to develop his ambitions.  He quietly purchased a large amount of land in Central Florida and arranged a favorable deal with the state of Florida regarding the region’s governance.  This led to his owning nearly 28,000 acres of virgin property near Orlando, giving him the autonomy to build any idea of his dreams.
But Walt Disney, who died in 1966, would not live to see the success of the new park.  Walt Disney World had a star-studded opening in 1971 and the differences with Disneyland were notable.  The park could not be seen from the surrounding roads, making Walt Disney World a self-contained community unlike Anaheim. At the outset, though, Disney World was a shadow of what it is today, featuring the Magic Kingdom, which was nearly a replica of Disneyland, and a few hotels.
Before his death, Walt Disney contemplated building a model city of the future.  This town would have, among other things, a dome to protect it from inclement weather and all types of modern transportation.  As the company floundered in the 1970s, Walt’s successors remade this concept of the Experimental Prototype Community of Tomorrow (EPCOT) into a World’s Fair-type park with a Future World of technology pavilions and a World Showcase of country pavilions.  Opening in 1982, EPCOT became the second theme park at Disney World.
The arrival of Disney World changed Orlando and its surrounding areas from a sleepy, rural community to a modern city with massive suburban sprawl.  The area from Tampa to Daytona Beach, known as the “I-4 Corridor,” is filled with people who have moved from other parts of the country and the world.  This area, with its large number of registered independents, is considered the swing vote in the state of Florida and, because of the state’s Electoral College clout, the country.  The growth of Florida, now the fourth-most populous state, would not have been as dramatic without the arrival of Mickey Mouse.
With the company near bankruptcy in the mid-1980s, Walt’s nephew Roy brought in Michael Eisner as CEO, and he led a dramatic expansion of Disney World.  Under his leadership, Disney/MGM Studios opened in 1989, followed by Animal Kingdom in 1998.  Furthermore, the number of park hotels expanded from a mere three in the early 1980s to 24 by 2011.  Walt’s dream of a planned community became a reality with the opening of Celebration, FL in the late 1990s (minus the dome, though).
Disney’s growth reflects the globalization of the economy.  Today, Disney has parks in Tokyo, Paris, and Hong Kong and is planning one in Shanghai.  None of them would be imaginable without the success of Walt Disney World.
Of course, Disney has its critics who believe it has too much power in its arrangement with the state of Florida.  Others believe it is partially responsible for the homogenization of American culture.  And some, including me, think it is way too expensive.  Still, none of these critiques diminishes the tremendous influence the parks have had on Florida, the US, and the world.

Tuesday, September 27, 2011

Pan Am TV Show and the Changing Nature of Air Travel

At the most basic level, Pan Am is a paean to the airline industry before deregulation. With government-controlled routes that excluded competition, Pan Am dominated international travel.  As a result, it charged high fares and provided services that one would never see today, at least on domestic routes.  Passengers were treated to a number of benefits, such as spacious seating and food service that you would only find in first class today.  Several films have portrayed this in a limited way.  Think of Indiana Jones flying Pan Am in Raiders of the Lost Ark and Leonardo DiCaprio posing as a pilot and recruiting stewardesses in Catch Me If You Can.
Taking place in 1963, the show portrays the stewardesses’ lives as emblematic of the coming feminist movement, which was only in the early stages at the time.  One woman sees her job as a way to assert her independence; her sister joins Pan Am after fleeing her wedding in Graduate-style fashion to avoid a life of 1950s domesticity.  Christina Ricci plays a stewardess who lives as a bohemian in Greenwich Village.
Though Pan Am largely romanticizes their lives, the show does depict some of the downsides of working as a stewardess in that era.  The airline routinely checked stewardesses’ weight, and they had to quit when they got married.  Indeed, after the passage of the Civil Rights Act of 1964, stewardesses were one of the first groups to challenge gender discrimination under Title VII of the landmark law.
With the passage of airline deregulation during the Carter Administration in the late 1970s, new competitors emerged which undercut the domination of Pan Am, TWA, Eastern, and other older carriers.  The growth of discount airlines made flying accessible to many more people, but at the cost of the services that made air travel luxurious.  When was the last time someone offered you a meal on a domestic flight, let alone playing cards?  In the aftermath of the Libyan-sponsored bombing of Pan Am 103 over Lockerbie, Scotland in 1988, the airline was forced to declare bankruptcy.  Eastern and TWA suffered the same fate in the 80s and 90s.  If one were to make a 21st century version of Pan Am, it would be called Southwest.  It doesn’t sound nearly as romantic, though it is certainly more affordable.

Monday, September 12, 2011

How Football Came to Dominate America

The start of the football season provides an interesting window into American culture.  The incredible hype surrounding the beginning of the NFL, following all of the fears there wouldn’t even be a season, only reinforces how football has become the most dominant sport in the country by a large margin.
It wasn’t always this way.  For years, baseball was the “national pastime” and the most popular sport in the nation.  Opening day used to attract the kind of attention that the first Sunday of football now receives.  This past summer, however, talk about a possible NFL lockout subsumed discussion of the actual baseball season.  What happened?
Throughout the first six decades of the 20th century, the three most important sports in the country were baseball, boxing, and horse racing.  The World Series was the most important annual sporting event and the Super Bowl did not even exist.  College football was actually more popular than pro football until at least the 1950s.
Pro football’s coming-out party was the “Greatest Game Ever Played,” the 1958 NFL Championship game between Johnny Unitas’ Baltimore Colts and Frank Gifford’s New York Giants.  One of the early games broadcast on national TV, it ended in dramatic fashion as the Colts’ Alan Ameche scored on a one-yard run in overtime.  Many credit the exciting contest with raising the NFL’s profile.
The popularity of the sport grew during the 1960s as the rivalry between the newly-formed AFL and NFL eventually resulted in the merger that created the modern NFL at the end of the decade.  The first Super Bowl, held in 1967 as a contest between the AFL and NFL champions, was not a major event, but quickly grew in the following years.  The famous Super Bowl III victory of Joe Namath’s New York Jets, indicating the competitiveness of the AFL, was another marker in the sport’s rise.  By the early 1970s, polls showed pro football ahead of baseball in popularity.  The Super Bowl became the biggest sporting event in the nation, a virtual national holiday that even non-fans feel obliged to watch.
What else accounted for the rise?  No doubt television was instrumental.  While baseball has made a tremendous amount of money from TV, football is more suited to the medium.  NFL commissioner Pete Rozelle, who was probably the greatest pro sports commissioner, developed relationships with the networks in the 1960s that helped grow the sport.  Moreover, the wealthy owners embraced a kind of socialism, equally distributing the television money so that Green Bay could be as competitive as New York.  This helped to bring about parity between large-market and small-market teams, giving every fan hope at the start of each new season.
Still, as recently as the mid-1980s, football was still barely ahead of baseball in popularity.  In 1985, a Harris Poll showed 24 percent of fans choosing pro football as their favorite sport while 23 percent chose baseball.  By 2010, 35 percent picked the NFL while only 16 percent picked major league baseball.
I think a number of factors account for the growth in the gap.  Clearly, baseball’s labor strife during this time, including multiple strikes and the cancellation of the 1994 World Series, hurt the game.  At the same time, the NFL had labor peace from 1987 to 2011, with no games lost to labor stoppages in that period.
Furthermore, football is a game more suited to the shorter attention spans of Generation X, raised on MTV and USA Today, and Generation Y, used to downloading music or receiving information immediately.  The languid pace of baseball, which may account for declining Little League participation, doesn’t seem to suit those 40 and under.
On a personal note, I grew up a bigger baseball fan than football fan, but in recent years my allegiances have changed.  I still love baseball, but it is a more difficult sport to follow as an adult.  I enjoyed following the batting races and memorizing statistics as a kid, but I don’t have the time anymore.  Part of the genius of football is that we can follow it by watching one day a week during the fall and winter, when the weather in most of the country precludes other activities.

Thursday, September 8, 2011

Post 9/11 Popular Culture

As we approach the 10th anniversary of 9/11, we are about to see a great deal of commentary about how the attacks altered the country.  Over the last few months, I have procrastinated by watching a number of films that deal with terrorism and related issues.  They provide a window into how much the culture has changed because of the attacks.
When you look at films that deal with terrorism from the 1980s and 90s, the humorous tone of the movies is noteworthy.  Both Die Hard (1988) and Die Hard 2 (1990), for instance, are full of Bruce Willis’ wisecracks and public officials who don’t take the situations seriously.  The lack of airport security is notable in Die Hard 2, as John McClane engages in full-scale firefights within the airport itself while security seems to consist solely of glorified rent-a-cops.  I realize that some of these elements exist for dramatic effect, but it would be inconceivable today for a film to depict the head of airport security ignoring a shooting in his own airport, as occurs in Die Hard 2.
Similarly, 1994’s True Lies, one of the first films to deal with the possibility of terrorists getting nuclear weapons, features a similar comedic tone.  The film is a complete farce with cartoonish terrorists and includes a scene with Jamie Lee Curtis and Arnold Schwarzenegger kissing as a loose nuke explodes in the background. 
Some film franchises provide clear demarcations between pre- and post-9/11 culture.  For example, the 90s Batman films, particularly Batman Forever (1995) and Batman and Robin (1997), feature over-the-top villains and cartoonish plots reminiscent of the Adam West TV show from the 1960s.  On the other hand, the Christopher Nolan-directed Batman Begins (2005) and The Dark Knight (2008) have depicted relatively realistic threats similar to terrorist plots, such as Ra’s al Ghul’s attempt to poison the water in Gotham and the Joker’s multiple attacks.
Another clear contrast can be seen in the difference between the James Bond films of the 1990s and the post 9-11 007 movies.  While Pierce Brosnan revived the franchise, the films are notable for ludicrous plots that couldn’t possibly be taken seriously, culminating in 2002’s Die Another Day, where Bond uses an invisible car and drives through an ice palace.  It makes Moonraker look positively believable!
On the other hand, the Daniel Craig films feature a Bond who is ultra-serious and doesn’t even bother with the usual puns and wisecracks.  The plots of Casino Royale (2006) and Quantum of Solace (2008) are relatively believable, and it seems as if genuine issues are actually at stake.  Bond doesn’t even use the usual gadgets that have been such a trademark of the franchise, as Q doesn’t appear in either movie.  One couldn’t imagine Roger Moore, the Bond of my childhood, starring in these films.
Indeed, the Craig films seem inspired by the Bourne movies.  Though based on the Robert Ludlum novels of the 1970s, the movies update the plots for the post 9/11 era.  Bourne is a somber CIA-trained assassin trying to figure out his own identity.  The third film, The Bourne Ultimatum (2007), is replete with commentary on the Bush era.  When Joan Allen’s ethical CIA officer questions the agency’s extreme tactics, asking “When does this end?,” her counterpart played by David Strathairn says, “It ends when we win!”  When Bourne later asks Allen’s character, Pam Landy, why she is helping him, she responds, “This isn’t what I signed on for. This isn’t us.”
Generally speaking, action adventure films have taken a more serious tone in the years since 9/11.  We will see if this continues or if this development fades as we gain more distance from the attacks.

Tuesday, August 9, 2011

Return to the Planet of the Apes

With the release of Rise of the Planet of the Apes, it seems a perfect time to revisit the original Planet of the Apes, released at a pivotal moment in contemporary US history in 1968.  By this time, the optimism of the early 1960s had given way to cynicism as race riots, divisions over the Vietnam War, and assassinations roiled the country.  In the film, which is conceived as an allegory for the civil rights movement, three astronauts, led by Charlton Heston, arrive on a world where apes rule over humans.  The astronauts left Earth thousands of years ago on a journey to find alien life.
The film reflects a number of the key issues of the 1960s.  Opening a year before the Apollo 11 moon landing, the use of astronauts as protagonists demonstrates the centrality of the space program during this period.  I discussed this in my earlier post on the end of the shuttle program.
The movie also reflects the racial tumult of the period.  In a scene that is repeated in Rise, the apes use a fire hose on Heston while they hold him in prison.  This scene was inspired by a pivotal moment in the civil rights movement, when the Birmingham, AL police did the same to children protesting in 1963.  This incident, which was replayed on television, led JFK to propose legislation that would eventually become the Civil Rights Act of 1964.
Echoes of the youth revolution and the generation gap of the period can be seen as well.  One of the apes helping Heston tells him not to act like another adult giving orders.  “Never trust anyone over 30,” jokes Heston to the younger ape, echoing a common refrain of the New Left at the time.
Though the civil rights movement had achieved many legal goals by 1968, the women’s movement was still only in its infancy.  Betty Friedan and other second-wave feminists had only formed the National Organization for Women (NOW) in 1966.  Heston is accompanied by three other astronauts, including one African American man.  However, the crew includes only one female astronaut, who was brought along for little other purpose than procreation and dies during the flight.  Though the Soviet Union had already sent up a female cosmonaut, Sally Ride only became the first American woman in space in 1983.
Of course, the most famous scene in the movie is still the ending (spoilers), when Heston discovers that the planet he landed on is actually the Earth of the future.  When he sees the remains of the Statue of Liberty, we are left to believe that there has been a full-scale nuclear war, one of the central fears of the Cold War.  This had been a theme of a number of films of the period, including Dr. Strangelove and Fail-Safe, both released in 1964.  Of course, the country had come perilously close to that outcome in the Cuban Missile Crisis a few years earlier, in 1962.  It’s possible that younger viewers don’t make that connection.
Finally, in a preview of his later role as head of the NRA, Heston carries a gun throughout much of the film.

Sunday, July 24, 2011

Captain America and World War II

I enjoyed “Captain America;” it was a fun movie that shows how summer movies can be a nice diversion.  Unlike “X-Men: First Class,” however, it did not hold up well to historical scrutiny.  While I know it is only a Hollywood film, I found it annoying that the movie shows black, white, and Japanese American soldiers serving together in World War II.  We fought World War II with a segregated army, with blacks and Japanese Americans fighting in separate units.  The all-Japanese American 442nd Regimental Combat Team was one of the most decorated units of the conflict.  President Truman did not issue an order integrating the armed forces until 1948, three years after the end of World War II.  In reality, the military did not fight a war as a fully integrated force until the Korean War in the early 1950s.  While some have commented that the inclusion of minorities highlights their contributions, I don’t think it is helpful for Hollywood to sanitize our past.
A couple of other interesting items.  The German émigré scientist, Dr. Erskine, who develops the formula that transforms Steve Rogers into Captain America, seems to represent the German intellectuals who fled the Nazis in the 1930s.  Most of them were Jewish, though the film never makes it explicit that Erskine is Jewish.  In the comic book, the character’s name is Reinstein, which is likely a reference to Albert Einstein.
Another interesting nod is the appearance of Tony Stark’s (Iron Man) father, Howard Stark, in this film.  The use of the name “Howard” seems to be a nod to Howard Hughes, whom some have theorized was the original inspiration for the Tony Stark character when the Iron Man comic premiered in the early 1960s.

Wednesday, July 13, 2011

Harry Potter and Diversity

In recent years, the United States has experienced a huge wave of immigration which has transformed the nation’s demography.  At the same time, many European countries have undergone a similar transformation, including Great Britain.  Western Europe experienced a significant labor shortage after World War II and welcomed “guest workers” from North Africa and Asia.  Many people from India, Pakistan, and other former British colonies immigrated to Great Britain.  These “guests” and their families stayed and have become part of the fabric of these countries.   I have been to London twice in the last ten years, and it feels a lot like New York City, with a tremendous diversity of cultures.
The “Potter” books and films reflect this phenomenon.  Though none of the main characters come from minority groups, a number of supporting characters do.  Perhaps the most important is Cho Chang, Harry’s first crush.  Ron and Harry took the Patil twins to the Yule Ball in Goblet of Fire.
Other British science fiction productions have reflected this change as well.  For instance, the new “Doctor Who,” which has featured several black cast members, has a much more diverse feel than the original program, which aired from 1963 to 1989.