The Orphan Blockbuster: How We Stopped Caring and Learned to Love Unlovable Movies
Soda is delicious. But to the ancient Callatians, so was the flesh of dead relatives, and nowadays no one outside of gourmand serial killers would salivate over a dish of foie gras d’humain. That soda and junk food have followed in the footsteps of flesh and cigarettes to become the consumptive Voldemorts of the 21st century presents a great challenge for corporate confectionerians: how to produce products with addictive deliciousness without fattening the populace into lumbering Elephant Men.
PepsiCo’s quest to attain this snacktopia was chronicled recently in a fascinating New Yorker article written by John Seabrook. In the article, Pepsi’s strategies for creating healthier food (developing a brand-new type of salt with the atomic-age name “15 Micron Salt” and building a “taste testing” robot hardwired with cultured cells featuring the genetic sequences of the four known taste receptors) seemed more like excerpts from a science-fiction novel than the evolutionary next step for Cool Ranch Doritos.
Instead of spending hundreds of millions of dollars on cutting-edge scientific research, all the folks at Pepsi really needed to do was look west toward Los Angeles. During the past fifteen years, the marketing, distribution, and accounting departments inside Hollywood studios (the real imaginative forces of the dream machine) have discovered a can’t-miss business algorithm: making movies that no one likes but everyone goes to see.
Or, to be more precise, the Orphan Blockbuster.
Like the pod people in Invasion of the Body Snatchers, the Hollywood hive mind has now confused audiences to the point where they can’t tell good from bad. Let’s look at two recent releases, both on their way to making more money than the GDP of Guinea-Bissau: Fast Five and Pirates of the Caribbean: On Stranger Tides.
Both films may be built with near-identical DNA (sequels to long-running franchises with giant budgets, giant set pieces, and giant stars), but that’s where the comparisons end. Fast Five earned its money honestly. Despite being the fifth in a series and breaking many of the Seven Deadly Sequel Sins, the film was made with a sort of crazed, genuine passion that bled through the screen. Audiences who saw it loved it, and then told others to see it.
Unlike the rich and diverse talking points Fast Five provided to audience members exiting the theater (the amazing car chases, unintentional high-camp performances, hidden homoerotic subtext), the topic of conversation for those leaving Pirates 4 was more likely focused on whether one should go to Applebee’s or Chili’s for the 2 for $20 meal deal.
With unemployment hovering at 9%, many of you reading this blogumn probably don’t have a job. If so, let me offer you a (sadly, non-paying) assignment: watch the top ten highest-grossing films of each year from 1980 through 1999. Aside from a few clunkers, you will at least enjoy the movies. And rightfully so, as the success of those films hinged on their popularity.
So when did a movie’s quality stop being an integral component of whether or not someone would buy a ticket? One can point to several cases in the late 1990s where audience appeal was but a small factor in box office success (I’m talking to you, Armageddon), but if you had to name one film, it would be 2000’s Mission: Impossible 2, which made $215 million domestically to become the third highest-grossing film of that year: this epidemic’s patient zero.
I have never, ever, ever met anyone who even remotely enjoyed Mission: Impossible 2. And those who have actually taken in a second viewing or speak of the sequel without using adjectives like “incomprehensible” or “coma-inducing” are so rare as to be legitimately considered crypto-creatures.
And no, we’re not talking about coastal critics and cineastes who love to pan the latest Bruckheimer behemoth in favor of some Icelandic incest drama; we’re talking about all of America, which collectively gave M:I2 a dreaded “B” grade when polled by CinemaScore during its opening weekend.
And while a “B” might be cause for celebration as the final grade in a grueling semester of Combinatorial Topology at Brown, considering that the average CinemaScore grade is a “B+,” M:I2’s “B” is nothing more than an abbreviation for “below average.” Hollywood, however, greeted this news with cheers: M:I2 became the Henrietta Lacks for every future Orphan Blockbuster.
The first step Hollywood took in adopting the Orphan Blockbuster was to radically decrease production on medium-sized movies. These small-scale dramas, character comedies, and down-and-dirty action films had mid-level budgets and could be counted on for mid-level grosses. But as any freshman marketing major knows, “choice depresses response,” and this past Christmas season has proven that, if given the chance, audience members would much rather see a good movie like The King’s Speech or True Grit than an Orphan Blockbuster like Tron: Legacy. Aside from massaging star egos, paying back filmmaker favors, and the outside chance of winning an Oscar, the not-so-secret secret is that studios don’t actually want to make True Grits. They would much rather populate multiplexes with as many Tron: Legacys (and their accompanying higher budgets and higher ceilings for profit) as possible.
But in the 1990s and early 2000s there was a problem: big-budget, four-quadrant “tentpole” films could only be released when the majority of Americans went to the movies, namely summer (which accounts for 40% of all box office revenue) and Christmastime. To adopt as many Orphan Blockbusters as possible, the studios needed more release dates. So, like the Catholic Church in the 16th century, they changed the calendar. Long-accepted seasonal beginning and end points (Memorial Day to Labor Day, and Thanksgiving weekend to New Year’s) were fattened up for maximum movie capacity.
The inaugural blockbuster of the extended summer season was 1996’s Twister, which was released the second week of May. In 2002, Spider-Man swung in a week earlier (May 3) with a record-breaking opening weekend, and this year Fast Five reversed summer into April. The Christmas movie season has followed suit: many of the highest-profile holiday hopefuls are released during the first two weeks of November, and even the last weekend of October is now being flirted with.
Last May it seemed that the inglorious reign of the Orphan Blockbuster had come to an end as audience members rejected a string of odoriferous offenders (Prince of Persia, Shrek Forever After, Sex and the City 2, Robin Hood), but like the slasher in countless horror films, its demise was nothing but a fake ending, as the monster grosses of Thor and Pirates have shown.
By extending our favorite periods of the year and populating them with the sort of supposedly “fun” (but really “bad-for-you”) product we expect during those times, the studios have waged a decade-long reign of terror on our taste, and there is no end in sight. Who would have guessed that Hollywood, and not climate change, would be the first to bring about snowy summers and Indian Christmases?
Ryan Dixon is the Associate Product Manager of ScriptShark and Screenline. A graduate of Carnegie Mellon, Ryan is the co-author of the graphic novel Hell House: The Awakening and a contributing blogumnist to FierceAndNerdy.com. He has worked in development at such companies as Paramount, MGM/UA, World Wrestling Entertainment, IMAX, and Good in a Room.
This blog post was also published as part of The Ryan Dixon Line on FierceAndNerdy.com.