Tuesday, January 31, 2012

The Super Bowl Halftime Show and the History of Rock n' Roll

Since the Janet Jackson “wardrobe malfunction” debacle of 2004, the National Football League (NFL) has turned to “safe” musicians with traditional star power for its Super Bowl halftime shows.  ’60s icons such as Paul McCartney, the Rolling Stones and the Who have performed, as well as ’80s stars like Bruce Springsteen, Tom Petty and Prince.  The NFL’s choices reflect the decline of the culture wars of the 1960s as well as the enduring power of artists from the mass culture era of the 1980s.

When rock n’ roll emerged out of a fusion of rhythm and blues, country, and gospel in the 1950s, it was immediately controversial.  The backlash against artists like Elvis, Jerry Lee Lewis, and Little Richard led to a domestication of the music by the early ’60s.  Performers like Frankie Avalon and the Four Seasons, whose story has been revived in recent years by the musical “Jersey Boys,” promoted a brand of music that lacked the aggressive sexuality of early rock n’ roll.  Some observers even suggested that rock was going to fade away.

The emergence of the Beatles and the British Invasion bands of the mid-1960s dispelled such thoughts.  Rock music reached new heights and these groups were often controversial given the social battles of the period.  The Beatles drew the ire of many Christians when John Lennon suggested that the Fab Four were “bigger than Jesus.”  The Rolling Stones, who sought to be more confrontational than the Beatles, were particularly galling to traditionally minded people in the era of the “generation gap” as young and old clashed over the counterculture, civil rights, and Vietnam.  In 1966, a poll showed that rock was the most unpopular music in the nation, disliked by almost half of adults (“Forty Years After Woodstock, a Gentler Generation Gap,” p. 2).

As time passed and the baby boomers aged, the music of the 1960s became “classic rock” and Generations X and Y embraced it with enthusiasm.  A recent survey revealed that the Beatles are the favorite band of both people between ages 50 and 64 and those between ages 16 and 29.  Rock n’ roll is now the most popular music of every age group except those 65 and older (“Forty Years After Woodstock,” pp. 18, 3).  As a result, the ’60s counterculture is now prime material for the family-friendly halftime show.

Similarly, a once-polarizing ’80s performer like Madonna, who courted controversy with her songs and videos during the Reagan years, is now acceptable.  The Material Girl, like Springsteen and Prince, was a star of the last era of mass culture in music.  As I noted in my post on the evolution of the music industry (http://popculturemeetshistory.blogspot.com/2011/11/20th-anniversary-albums-and-changing.html), the MTV/FM radio era of the ’80s and early ’90s that gave musicians a wide audience has faded, leaving fewer acts that can garner the attention of the nation.  Last year’s disappointing performance by a more contemporary group, the Black Eyed Peas, demonstrated this phenomenon, as the Peas had to cover other artists’ songs during their show.  They simply did not have a broad enough catalog of their own recognizable hits.

But there is a problem the NFL will have to face soon, as the league is fast running out of entertainment options.  Few ’60s artists remain who are able to perform, and the supply of ’80s and ’90s stars is also rapidly diminishing.  I predict we will be watching Green Day at a Super Bowl halftime in the near future.

Sources:  “Forty Years After Woodstock, a Gentler Generation Gap,” Pew Research Center, August 12, 2009

Monday, January 30, 2012

Kiefer Sutherland's "Touch" and 9/11

As I’ve noted previously, the intersections between 9/11 and popular culture provided the primary inspiration for starting this blog.  In the years following the attacks, I gravitated to programs such as “24” and “Heroes,” which were directly influenced by the war on terror and the fears and debates it spawned.  These shows have concluded and been replaced by programs that seem to be moving away from 9/11 allegories.  Fox’s new show “Touch,” developed by “Heroes” creator Tim Kring and starring “24’s” Kiefer Sutherland, reveals a continued shift away from direct fears of terrorism to an emphasis on the legacy of the attacks and a return to concerns about traditional crimes.

In “Touch,” Sutherland’s character, Martin Bohm, is a single father who has been caring for his autistic son since his wife was killed in the North Tower of the World Trade Center.  His son can’t speak, but is able to see numerical patterns that help Bohm prevent future crimes and accidents.  Through a series of hard-to-explain mechanisms in the pilot, Bohm and his son facilitate the conditions by which a New York City firefighter can rescue children from a burning school bus.  It turns out this firefighter is consumed with feelings of guilt because of his inability to save Bohm’s wife on 9/11. Like the Tom Hanks/Sandra Bullock film “Extremely Loud and Incredibly Close,” “Touch” depicts how the deaths in the World Trade Center impacted those left behind.

From the previews of future episodes, it also seems that, unlike Jack Bauer, Sutherland’s “24” character, Bohm will be trying to stop conventional crimes rather than terrorist attacks.  Similar to CBS’s “Person of Interest,” “Touch” reveals a gradual move away from the post-9/11 fears of spectacular nuclear, chemical, and biological events seen in “Heroes” and “24” to anxieties about common malfeasance like murders, robberies, and car accidents.

Saturday, January 28, 2012

"Midnight in Paris" and the "Lost Generation" of the 1920s

The Oscar-nominated “Midnight in Paris” is Woody Allen’s best film since “Match Point” (2005), if not “Crimes and Misdemeanors” (1989).  Owen Wilson plays a writer who wanders the City of Light each evening looking for inspiration.  When the clock strikes midnight one night, a car takes him back in time to the Paris of the 1920s, where he hobnobs with literary greats like Ernest Hemingway and F. Scott Fitzgerald.  They were among many American intellectuals, often referred to as the “Lost Generation,” who left the country during the Roaring Twenties.  Disillusioned by World War I and frustrated by the dominance of conservative values in their home country, they made their way to France.

In the years before the First World War, many intellectuals held optimistic beliefs about the inevitability of human progress.  Such views were shattered by the devastation of the trenches and the Western Front (see my post on Spielberg’s “War Horse,” http://popculturemeetshistory.blogspot.com/2011/12/war-horse-and-world-war-i.html).  Hemingway himself was wounded in the conflict.

The 1920 census showed, for the first time, that a majority of Americans lived in cities.  During the subsequent decade, rural and urban Americans clashed over cultural issues as many in the heartland feared that traditional Victorian values were fading in the face of the rise of the cities and secular ideas.  The divisions of this period were akin to the contemporary divide between “red” and “blue” states.

The strength of rural conservatism disturbed some of the urban intellectuals, and if abortion, gay rights, and gun control represent the cultural divides of the 1990s and early 21st century, then alcohol, immigration, and evolution were the analogous splits of the 1920s.  Throughout the decade, Prohibition reigned as the law of the land, though it was routinely flouted.  The Ku Klux Klan expanded beyond its traditional rural Southern base and garnered support across the country, controlling the politics of states such as Indiana and Colorado.  In this period, the Klan’s agenda went beyond a desire to maintain racial supremacy and extended to concerns about rising immigration from eastern and southern Europe as well as the power of the Catholic Church.  Its influence grew to the point that the Klan helped prevent the Democratic Party from nominating Al Smith, the governor of New York and a Catholic, for president in 1924.  Finally, religious modernists and fundamentalists battled over evolution laws in a number of states, culminating in the famous clash between Clarence Darrow and William Jennings Bryan in the Scopes Trial of 1925 in Dayton, Tennessee.

The emerging consumer culture of the period also disturbed the American expatriates.  The US economy boomed in the 1920s, and while not all shared in the prosperity, the middle class grew and new and exciting products such as radios became available.  From the perspective of the “Lost Generation,” too many Americans of this time were simply concerned about making money and “keeping up with the Joneses.”

Not all fled.  Some stayed and critiqued American culture from home.  But others went to Paris where they ran into Woody Allen—or at least Owen Wilson playing Woody Allen.

Wednesday, January 25, 2012

The History of the Super Bowl

As the media commences another two weeks of unrelenting hype in advance of Super Bowl XLVI, it is the perfect time to review the history of the biggest sporting event in the country.  Far younger than the World Series, the Masters, or the Kentucky Derby, the NFL championship game has become a virtual national holiday in which the life of the country comes to a full and complete stop.
The Super Bowl’s origins lie in the creation of the American Football League (AFL) in 1960. Started by a group of businessmen who wanted pro football teams but were frustrated by the NFL’s unwillingness to expand, the AFL forged ahead as an alternative that would play a more wide-open brand of football.  So began a rivalry that would help propel pro football ahead of baseball as the most popular spectator sport in the country by the end of the decade.

In 1966, after several years of competition, NFL Commissioner Pete Rozelle and Lamar Hunt, owner of the AFL’s Kansas City Chiefs, negotiated a merger agreement in which the two leagues would formally join together in 1970.  In the meantime, the AFL and NFL champions would play each other at the end of the season and Hunt suggested calling the new game the “Super Bowl.” Though both he and Rozelle thought a better title could be found, sportswriters started using the moniker in advance of the inaugural game in January 1967 and it stuck (MacCambridge, 236-237).
Though there was anticipation before Super Bowl I between the Green Bay Packers and Kansas City Chiefs, the hype did not remotely approach what we will see over the next ten days.  The game, which was held in the Los Angeles Coliseum, did not even sell out.  As Michael MacCambridge, author of a history of pro football, observed, “fans simply weren’t used to traveling to neutral sites” (MacCambridge, 240).  Though the Vince Lombardi-era Packers routed the Chiefs, ratifying notions of NFL superiority, the game drew 65 million television viewers, the largest audience ever for an American sporting event at the time (MacCambridge, 240).
The game’s popularity took off from there as the New York Jets’ shocking upset of the Baltimore Colts in Super Bowl III gave the AFL credibility.  After the merger, the NFL split into the American Football Conference (AFC) and the National Football Conference (NFC) and the victors of those conferences fought it out at the end of each season.  The two-week gap between the conference championship games allowed suspense to build, as the media presence grew dramatically. By 1974, the event had grown to such proportions that Norman Vincent Peale declared that if Christ were alive “he’d be at the Super Bowl.” (MacCambridge, 312).
As the NFC’s domination of the AFC produced a series of Super Bowl routs in the 1980s, Madison Avenue swooped in to create a different kind of interest in the game.  In 1984, Apple commissioned a Ridley Scott-directed commercial promoting its new Macintosh computer.  The ad, based on George Orwell’s dystopian novel 1984, showed a woman tossing a sledgehammer into a gigantic TV screen of Big Brother’s propaganda, thereby destroying it.  Shown during Super Bowl XVIII, the commercial caused a sensation, and from that point forward, corporate America debuted its best ads during the game.  After all, there was no better place to unveil them than before the biggest national television audience of the year.  And ranking the spots became another part of watching the game.
While viewership for the World Series and NBA Finals is highly dependent on whether large-market teams or major stars participate, the Super Bowl’s ratings are almost unaffected by these factors.  The NFL’s revenue-sharing arrangement allows small-market teams to remain competitive and even become national brands.  While a playoff matchup between Milwaukee and Pittsburgh would strike fear into the hearts of baseball officials, last year’s Super Bowl between Green Bay and Pittsburgh drew an American television record of 111 million viewers.
With the rise of cable TV, the Internet, and other entertainment options, the country rarely pauses to watch or follow the same event, except in cases of national tragedy.   When the game between the New York Giants and New England Patriots kicks off at 6:30 p.m. EST on February 5, virtually the entire nation will be watching, producing a collective experience that is rare in today’s niche culture world.

Sources: Michael MacCambridge, America's Game: The Epic Story of How Pro Football Captured a Nation (New York, 2004)

Sunday, January 22, 2012

"Red Tails"--George Lucas on the Tuskegee Airmen

When I heard George Lucas was making a film about the Tuskegee Airmen, my first thought was “I hope it’s better than the Star Wars prequels, because the Tuskegee Airmen deserve better.”  Unfortunately, “Red Tails” was not better than the prequels—in fact, it may have been worse.  Lucas wanted to produce an old-fashioned war film, with one of his colleagues saying black soldiers had never received the “John Wayne treatment.” (NYT, January 22, 2012)  “Red Tails” accomplishes that goal, but adds a heavy dose of corny dialogue and dull characters.  “Glory” (1989), which gave long-overdue attention to African-American military service in the Civil War, showed you could make a heroic war movie that is also sophisticated.  “Red Tails” falls well short in this regard.

The film depicts the true story of black pilots in the Army Air Force during World War II (the Air Force did not become an independent service until 1947).  Overcoming opposition from within the military as well as segregationist politicians, the Tuskegee Airmen fought to serve their country.  Though not portrayed in the film, First Lady Eleanor Roosevelt pushed hard for them to have the opportunity, bringing media attention to their cause.  Once they received a chance, the airmen played a central role in the air war in Europe during the last two years of the conflict.  The 332nd Fighter Group, one of the two units that composed the airmen, had a “record [that] was unmatched by any single escort group,” wrote Gail Buckley in her history of blacks in the military, adding that “in two hundred missions they never lost a single bomber” (American Patriots, 277).  Though “Red Tails’” approach is wooden, the movie gives well-earned credit to the pilots, showing them as traditional military heroes.

The two main black flyers, however, are portrayed in cookie-cutter fashion; one is cautious and careful while the other, Joe Little, is hot-headed and adventurous.  Little (whose name may be a reference to Malcolm Little, who became Malcolm X) gets into a bar fight with white soldiers at an officers’ club in Italy after he is called the n-word.  This scene reflects the fact that conflicts between black and white soldiers were common during the war, deeply worrying military leaders.

Black airmen noted that they faced little prejudice from Italians.  Indeed, Little has a relationship with a local woman, as some black soldiers did.  As in World War I, stories about these interracial relationships deeply concerned many in the South, where Jim Crow had always centered on a fear of sex between black men and white women.

Race often came to the forefront during the Second World War, and some historians see the conflict as the start of the modern civil rights movement.  Even before the war started, many in the government were concerned about black morale.  A. Philip Randolph, a key black union leader, threatened to lead a March on Washington in 1941 if the Roosevelt Administration did not take action to prevent discrimination in defense employment.  When Randolph wouldn’t budge, FDR issued Executive Order 8802, which created the Fair Employment Practices Commission (FEPC).  Though the FEPC would be underfunded and largely ineffective, it represented the first federal measure to protect the rights of African Americans since Reconstruction.  While Randolph called off the march, it provided the inspiration for the 1963 March on Washington where Martin Luther King, Jr. gave his “I Have a Dream” speech.

Once the war began, the fight for democracy against racist regimes in Germany and Japan exposed the contradiction between American rhetoric and American practice.  Noting that African-American soldiers were fighting for freedom abroad when they didn’t have equal rights at home, black newspapers like the Pittsburgh Courier promoted the “Double V” campaign of defeating both Jim Crow and the Axis powers.   White liberals began to focus on the need to address the problem of segregation, albeit gradually. 

Some 500,000 blacks served in the American military during the war (American Patriots, 261).  The Tuskegee Airmen, like all African-American soldiers, faced Jim Crow conditions while off base during their training in the South.  While the pilots and others saw combat in segregated units, traditional stereotypes about the weak capabilities of black soldiers (briefly referenced in “Red Tails”) relegated most to supporting roles.  One notable exception to both rules came during the Battle of the Bulge in 1945, when manpower shortages forced the integration of a few infantry units.

In interviews publicizing the movie, Lucas has mentioned the possibility of making another film that dramatizes the reaction to the airmen when they return home.  While most black soldiers came back to the United States in 1945 hoping to find jobs and resume their lives, some sought to challenge the racial caste system. A young Mississippian named Medgar Evers tried to vote in the Democratic primary in his home state in 1946, only to be met with violence. He later became head of the NAACP in Mississippi.  Jackie Robinson, who served in the Army during World War II, integrated major league baseball when he debuted for the Brooklyn Dodgers in 1947.  At the same time, many in the South wanted to ensure that returning black soldiers did not take the democratic rhetoric of the war too seriously.  Lynchings, which had declined significantly over the previous generation, increased dramatically in 1946-47 and several veterans were among the victims, spurring President Harry Truman to take a stronger stance on civil rights. 

The heroic service of the Tuskegee Airmen as well as other black soldiers paved the way for President Truman’s order integrating the military in 1948.  Many military leaders, including five-star generals such as Dwight Eisenhower and Omar Bradley, opposed integration, with Bradley declaring that the military was not a venue for social engineering.  It is remarkable to see the similarities between this thinking and contemporary arguments against allowing gays to serve openly in the military.

Lucas said that he wanted to make a heroic film for black teenagers because African Americans have never received that kind of portrayal in a World War II film.  Indeed, dating back to the movies made during the war, World War II films have rarely highlighted black participation.  Even in recent movies like “Saving Private Ryan” (1998), “The Thin Red Line” (1998), and “Flags of our Fathers” (2006), African Americans barely appear.  Despite disdain from critics, “Red Tails” is getting a strong response from audiences, doubling the studio’s expectations with its opening weekend gross.  Perhaps it will simply be a crowd-pleaser that calls attention to an important and relatively unknown part of American history.  Many movies don’t even accomplish that much, though I would recommend the HBO movie “The Tuskegee Airmen” (1995) for a more sophisticated look at the subject.

Gail Buckley, American Patriots: The Story of Blacks in the Military from the Revolution to Desert Storm (New York, 2001)

New York Times Magazine, January 22, 2012

Saturday, January 21, 2012

Tinker, Tailor, Soldier, Spy

I very much enjoyed the extremely intricate “Tinker, Tailor, Soldier, Spy,” which harkens back to the 1970s and the era of Cold War espionage.  It focuses on the British MI-6 agency and is about as far in spirit from the James Bond movies as possible.   George Smiley, the film’s protagonist, is no suave 007, but a sober analyst whose life has been broken by the spy business.  Played expertly by Gary Oldman, he comes out of retirement and methodically seeks out a mole within the ranks of British intelligence.

Based on a 1974 novel by John le Carré, the story seems inspired by the Cambridge Five, a famous British espionage ring that was exposed in the 1950s.  They were a group of British agents who became communists while students during the Great Depression.  During the 1930s, many in the West were attracted to Marxism because the chronically high unemployment of the period made it seem as if capitalism had failed.  At the same time, the growth of the Soviet Union under Stalin’s Five-Year Plans of forced industrialization led many to believe it was the land of the future.  Others were attracted to communism because they thought that the USSR was standing up to fascism while the Western democracies dithered as Hitler’s power grew.  After the revelation of the Stalinist terror campaigns of the ’30s, along with the signing of the Nazi-Soviet non-aggression pact in 1939, many leftists soured on communism.  Nevertheless, when the Cambridge Five joined British intelligence, they passed along state secrets to the KGB during World War II and into the early years of the Cold War.

The film takes place during the Cold War and reveals some similarities between that struggle and the war on terror.  Both eras saw concerns about government secrecy and about how far to go in interrogating suspects.  Furthermore, both featured fears about fifth columns undermining national security.

Friday, January 20, 2012

Oscar thoughts

There have been plenty of Oscar prognostications, so I’m not going to do a full Academy Awards preview, but here are a few random thoughts concerning Tuesday morning’s announcements.

Andy Serkis, “Rise of the Planet of the Apes”- I very much hope that he receives a nomination for playing Caesar, the ape who is the central character of the film.  Beginning with his performance as Gollum in the Lord of the Rings movies, Serkis has been the master of the new motion capture technique with CGI.  While he was a key supporting character in LOTR, his acting carries “Rise,” which was my favorite summer film.  I’m guessing, however, that his nomination will have to wait until he reprises his role as Gollum in this fall’s “The Hobbit.”

Alan Rickman, “Harry Potter and the Deathly Hallows, Part II”- The Potter series has been the most consistently strong of any film franchise, yet has received little Oscar love.  Rickman helped make Severus Snape the most interesting character in the whole series and his acting stood out in the movies, which was quite impressive given that he was performing opposite half of the Royal Shakespeare Company.  Furthermore, Rickman may be the best active actor not to have received a nomination, despite a series of excellent performances starting with his memorable turn as Hans Gruber, the villain in the original “Die Hard” (1988).

Saturday, January 14, 2012

Oliver Stone's "Wall Street," 25 Years Later

In recent weeks, pundits ranging from Paul Krugman to Ross Douthat have made analogies to Oliver Stone's 1987 film "Wall Street," with some going so far as to compare Mitt Romney with Gordon Gekko.  In light of this dynamic as well as the spirited contemporary debate over economic inequality, I decided to reexamine the movie. Though it was released during another era of economic turbulence in the Reagan years, it appears as relevant as ever today.
It is important to remember the events surrounding the film, as the 1980s witnessed an historic bull market with stocks taking off as the economy recovered from the 1981-82 recession.  While Wall Street brokers made a great deal of money in the 1960s and 70s, the rewards grew dramatically during the 1980s.  At the same time, income inequality began to grow as the rich got richer, while the middle class barely gained, and the poor stagnated or fell behind.  Finally, a number of major insider trading scandals broke in the late 1980s, famously ensnaring Ivan Boesky and Michael Milken.
Leveraged buyouts (LBOs), in which investors used borrowed money to purchase companies and often broke them up and sold off the pieces rather than rebuilding them, also emerged in the 1980s.  The advocates of LBOs claimed they were bringing new competitive pressures to American business, forcing companies to become more efficient or face takeovers.  To their critics, firms like Romney’s Bain Capital were merely corporate raiders that earned money by destroying businesses rather than creating anything.
This tension plays out in “Wall Street,” as a young trader named Bud Fox (Charlie Sheen) works his way up the career ladder.  The son of an airline union leader (played by real-life dad Martin Sheen), Fox is not satisfied with conventional success in the financial industry and wants to make the big time by working with a major player, Gordon Gekko (Michael Douglas).  Once he ingratiates himself into Gekko’s inner circle, Fox engages in all sorts of illegal activity, notably insider trading.  He makes a great deal of money in the process, and the film is replete with examples of the conspicuous consumption of the 1980s.
Though the film revolves around Fox, the movie’s star is clearly Gekko and Douglas’ performance earned him an Oscar for best actor.  Equally charismatic and villainous, Gekko tempts Fox into his orbit, eventually using the relationship to put himself in a position to purchase and liquidate Fox’s father’s airline, Blue Star Airways, laying off all the workers in the process.
While Gekko’s “greed is good” speech is the most famous clip from the film, I think Gekko’s response when Fox confronts him about Blue Star’s fate is far more instructive.  “The richest one percent of this country owns half our country's wealth, five trillion dollars,” Gekko declares, “One third of that comes from hard work, two thirds comes from inheritance, interest on interest accumulating to widows and idiot sons and what I do, stock and real estate speculation. It's bullshit.  You got ninety percent of the American public out there with little or no net worth. I create nothing. I own.”  In that sense, Gekko was dramatically different from morally ambiguous businessmen of the late 19th and early 20th centuries, such as John D. Rockefeller and Andrew Carnegie, who created massive new industries and businesses.  It also differentiated him from a real-life contemporary, Steve Jobs, who was leading the personal computing revolution at Apple.
Gekko’s speech proved prescient, as the nation’s economy was in the early stages of an economic transition that has continued to the present day.  According to Alan Krueger, Obama’s Chairman of the Council of Economic Advisers, “The proportion of people in the top 1% who were from the finance and real estate sector nearly doubled from 1979 to 2005.  And in 2005, executives from the finance and real estate sector made one quarter of the income in the top .1 percent.”  While the earnings of different income quintiles grew at roughly the same rate from the late 1940s to the mid-1970s, Congressional Budget Office data shows that the real after-tax incomes of the top 1% of families grew by 278 percent between 1979 and 2007, while those of the middle 60 percent grew by less than 40 percent (Krueger speech to Center for American Progress, Jan. 12, 2012).  Today, income inequality has reached its highest level since 1929, the start of the Great Depression.
At the end of the film, Fox maneuvers to save Blue Star Airlines, but is arrested for insider trading.  He wears a wire to obtain evidence against Gekko before going to jail.  His father tells him it may be the best thing for him to go to prison and “Stop going for the easy buck and produce something with your life. Create instead of living on the buying and selling of others.”  With the insider trading scandals and the October 1987 stock market crash, change on Wall Street appeared to be on the horizon.  “I thought the ’80s would have been an end to a cycle,” reflected Stone in 2008, “I thought there would be a bust. But that’s not what happened” (New York Times, October 5, 2008).
Quite the opposite occurred.  Bizarrely, many people who went to work on Wall Street over the next two decades cited Gekko as their inspiration.  The 1990s witnessed another stock market boom, as high-technology stocks soared even though many of them never made a profit.  Though inequality did narrow some due to the low unemployment of the Clinton years, the bubble burst as the century closed.  The 2000s saw relatively weak growth fueled by the housing bubble, which also exploded, leaving the Great Recession in its wake.  In the aftermath, Stone revisited the subject in the much weaker “Wall Street: Money Never Sleeps” (2010).  The sequel revolves around Gekko, but Fox makes a brief cameo.  Rather than learning a lesson from his comeuppance, Fox has become a morally questionable entrepreneur himself.  Perhaps a sad metaphor for the country not learning anything either.

Monday, January 9, 2012

The SEC's Dominance and the New South

With Alabama winning the SEC’s sixth consecutive national title, the conference’s dominance in college football has been established beyond a doubt.  Many sportswriters have discussed the reasons for the conference’s supremacy, ranging from the popularity of the sport in the region to the massive television contract that gives its members the power to pay top dollar to hire the best coaches.  Beyond sports, however, the strength of the Southeastern Conference illuminates key shifts in the country with implications beyond the football field.
It may come as a surprise to those living south of the Mason-Dixon Line that college football started in the Northeast during the late 19th century, conceived in part as a way for the children of the Eastern Establishment to prove their manhood.  Theodore Roosevelt and others saw it as a way to toughen a generation that had not experienced combat in the Civil War.  By the 1920s, college football had established itself in the South, where it eventually became the most popular sport and the area’s passion.
The rise of the SEC reflects, among other things, the shift in population away from the Northeast and toward the Sunbelt since the Second World War.   Weapons production and the expansion of military bases in the region during the war brought new people to the area and this trend continued as Cold War defense spending created a peacetime military establishment.  Strong chairmen of the Senate Armed Services Committee, such as Richard Russell of Georgia and John Stennis of Mississippi, used their clout to place bases in the region and funnel defense contracts to local plants.  The Eisenhower Administration started the interstate highway system during the 1950s, which strengthened transportation in the relatively poor region, paving the way for population growth. 
With an improved infrastructure and lower labor costs as an attraction, manufacturing and other businesses began leaving the unionized Northeast and Midwest for the nonunion South.   Foreign companies followed, with BMW, Mercedes, and Nissan building plants in South Carolina, Alabama, and Mississippi, respectively.  Over the last half-century, economic differences between the South and the rest of the country have narrowed considerably, with per capita incomes nearly reaching parity. 
Improved race relations were key to the South’s renaissance, as it was impossible for the region to move forward economically under Jim Crow, which restrained the potential of its black citizens.  It is no coincidence that much of the economic growth in the area has come since the passage of the Civil Rights Act of 1964 and the Voting Rights Act of 1965, which ended legal segregation.  Indeed, business leaders in the area were often instrumental in pushing for change, not because of a humanitarian concern, but because of an understanding that racial disputes discouraged national and international investment.
Before the civil rights era, Southern schools did not recruit black players and were often unwilling to even play against integrated teams.  Top black players went to Northern conferences like the Big Ten or to historically black colleges (HBCUs).  Throughout the Jim Crow era and in its immediate aftermath, Grambling, under head coach Eddie Robinson, was an HBCU powerhouse with players like NFL Hall of Fame receiver Charlie Joiner and Doug Williams, the first black quarterback to win a Super Bowl.    SEC teams squandered hometown talent, as the University of Mississippi eschewed recruiting Walter Payton, who later became the NFL’s all-time leading rusher.  Payton stayed in state to play at Jackson State, another historically black college.
After the passage of the civil rights laws, the SEC gradually embraced recruiting black players.  Vanderbilt and Kentucky became the first schools to do so in 1966 and eventually the legendary Alabama coach Bear Bryant revived the Crimson Tide in the 1970s by aggressively pursuing African American athletes.  In the early years of the post-Jim Crow era, some black players were likely reluctant to go South because of strong memories of the violence of the civil rights era. Indeed, I believe the dominance of the conference today is in part due to the fading memories of the upheavals of the 1950s and 60s, as African American athletes are now more enthusiastic about playing in the Old Confederacy.  In particular, the success of many black quarterbacks in the SEC over the last decade would have seemed unlikely a generation ago.
Today, the SEC is the top football conference and the South is the fastest-growing region of the country, both economically and in terms of population.  Of course, the gains have been uneven, as Virginia, North Carolina, and Georgia are more prosperous than Mississippi, Alabama, and Arkansas.  While race relations have improved and black players dominate the field, some barriers still remain.  As of 2011, there have been only three black head coaches in the history of the conference.