[1962] To Kill A Mockingbird

There’s a handful of roles made for a single actor. It’s rare that an audience will remember an actor by a single role; it takes a confluence of happenstance, timing being the big one. The right actor in the right circumstance, with the right personality and experience, meets the right writer, who writes for the right projection of self; the plot is timely and impactful, the characterization is meaningful and riddled with emotional cues, and the director and supporting cast have the right combination of empathy to let the role breathe or constrict, as written.

This is rare. It’s rare to get a handful of these circumstances in the same state, and even more unlikely to have them coalesce on the same set. George C. Scott as General Patton in Patton is one. Daniel Day-Lewis bucks this trend and seemingly rearranges spacetime to force the pieces together as Christy Brown in My Left Foot, William Cutting in Gangs of New York, Daniel Plainview in There Will Be Blood, President Lincoln in Lincoln, and about half a dozen others. One more to add to this list is Gregory Peck as Atticus Finch in To Kill a Mockingbird.

We’ve got to consider context, too. The external factors modern audiences have access to were off the radar in 1962: it was an age of simmering indifference and false innocence. Americans were lulled by great times of growth. Post-war America ushered in a generation of prosperity and security, mildly plagued by simmering tensions in the East. Fathers and brothers who served their country in Europe or Asia and came home were rewarded with access to education, credit, and stable jobs. It was never this way for black Americans, though. It wasn’t even a secret.

Harper Lee wrote To Kill a Mockingbird from within an era of piercing failure of justice. Her words, with the benefit of experience, said the quiet part deceptively loud. Through her tightly constructed characters, the reader sought idealism and the aforementioned justice for humanity. She defended the different and championed compassion for the men and women she made. What American idealism had done for 300 years, dehumanize the black experience, Harper Lee, herself white, tried to tackle over 200 or so pages. Whatever looming threat lurked overseas, the internal war we’d been fighting in America raged on, nearly invisible to the naked eye. We’d fought to free the slaves a hundred years earlier, but the lives of black Americans remained only nominally affected. Never forget Emmett Till.

Lee’s book, and Robert Mulligan’s movie, gives those who would otherwise ignore the civil rights of others standing to fight for them, for all Americans, and especially for black Americans.


[1972] Cries and Whispers

There’s enough copy out there about how Ingmar Bergman drenched his feature, Cries and Whispers, in red (röd). He chose red (especially after the majority of his earlier work was monochrome) for its striking visuals, its color theory, and its connection to Swedish history. If nothing else, in Sweden a specific red, Falu Rödfärg, colors a significant number of buildings, especially in the countryside. Red, across cultures, symbolizes passion, lust, desire, et cetera. It also sits at the far edge of the visible spectrum, so I’ll offer that Bergman chose to oversaturate the film with it to make it stand out. It’s the first thing you’ll notice about this movie.

That’s really all there is to say about the color red and Cries and Whispers. There’s lots of scholarship about it. That’s enough.

The not-so-secret fact about the Oscars is their Americanness: of the 566 films ever to have been nominated, 12 have been filmed in a language other than English. This isn’t surprising: the Academy voters are mostly native English speakers, American audiences overwhelmingly see movies performed in English, and despite the premise that the Oscar winner should best represent the gestalt of the year, what we really mean is the zeitgeist of American culture. The road to global reach is likely unpaved, but we’d like to think that language is the one barrier we can overcome; there are subtitles.

But alas, Cries and Whispers was the last Swedish movie to earn an Oscar nomination for Best Picture (but not the first! The Emigrants was nominated just a year earlier. The Emigrants, however, wasn’t wholly Swedish, either; it’s a Swedish-rooted story about America).

Swedish, as a language, is sort of bonkers to decipher, and nagging to the native English speaker because of its Germanic roots and bouncy gait. Its structure, like that of any language spoken by millions of people, follows a pattern, and there are more than a few cognates. But there’s still great distance between Swedish and English, and it’s an effective barrier between the two cultures. Cries and Whispers is incredibly Swedish, incredibly Bergman, but also full of universal tones ahead of their time: the quiet tragedy of women.


[1940] Rebecca

Since evolved from romantic horror into a more complex emotional battleground, Gothic art takes pleasure in allowing audiences to take part in its characters’ suffering; it’s the defining feature. The Germans have a word for the extreme version: schadenfreude, taking pleasure in someone else’s pain. It’s a strange quirk of the human condition to relish this emotion: a private one, always better left hushed. Hitchcock was a master of the Gothic, perhaps nowhere more mesmerizing than in Rebecca.

Alfred Hitchcock is known for his archetype-defining tropes, many of which involve manipulating an audience to suffer, however slightly, for his own pleasure. Hitchcock’s use of schadenfreude remains classic, if overlooked. His sound and visual cues were likely the first to signal incoming psychological trauma (Psycho, The Birds), and he was among the first to use first-person perspective to treat the audience as a voyeuristic character (Rear Window). But these tropes came from somewhere, and they were likely first fully formed in Rebecca.

Rebecca‘s strongest feature is its pacing, which seems to turn on a dime, starting and stuttering, purposefully designed to keep the audience off-balance. It’s written in such a way — likely in the source material, too — that we’re not supposed to know whom to root for or against at any given time. The living de Winters oscillate between pitiable and crass. We want this man, Maxim de Winter, to find love again; then he’s a rube, and then he’s a murderer. His second wife, never given a name, is cloyingly Pollyannaish and bright-eyed, until she’s nearly convinced to jump to her death. Rebecca, Maxim’s first wife, is revered, until it’s revealed she’s the smarmiest of the lot, conniving as ever. These people are all terrible, and it’s Hitchcock’s pacing that lets his audience figure this out on our own, without needing to be told. Hitchcock was a master of show.

But Mrs. Danvers is the most Gothic character, and she sets the stage for Norman Bates in Hitchcock’s Psycho twenty years later. She has an obsession with Rebecca de Winter bordering on violent delusion, and takes offense at Maxim remarrying, whether soon after Rebecca’s death or ever. She relishes leading Maxim’s second wife into dark corners, stirring trouble. We’re supposed to hate her, and to empathize with her prey as the object of evil affection. Mrs. Danvers is plainly mentally ill, but 1940s America saw her as evil and crazy. If Hitchcock understood this about his audience, he made a perfect character. If he didn’t, he shot a great character accidentally.


[1987] Moonstruck

A day is both a discrete event and part of a string of days that hopefully make up a full, expectant life. In the moment, each day seems insignificant, and to evaluate one requires a perspective unavailable to us until much later: we don’t pre-write memoirs for this reason, and our elders are often wise because of their age and because of their particular string of days; an existence in 88 keys. To short-circuit the learning-experience curve, maybe bisecting it, we use half-baked heuristics. As an example: daily, maybe more often, our brains subjugate and dissect our interactions into lists and charts. We process by simplification, and we exchange nuance and context for quick understanding. It might take years to undo, or to double down on, this type of life, and it is almost impressively difficult to do.

Even more often we don’t read a heuristic in a book or article: it is defined for us on screen and stage. And we accept it as true, even subconsciously, because we want to believe it. This specific bias is called confirmation bias, and together with our peers we engage in groupthink. Almost every mass movement, good and bad, has been a combination of biased heuristics and groupthink. When we talk about race and creed we almost always rely on heuristics — stereotypes — to frame our interactions.

Think of a person of Italian descent; now think of a new person walking into a room who looks Italian. What are the first traits that come to mind? Pasta fazhool? Mafia memes? Catholicism? Moonstruck, 1987’s Italian-American melodrama, can tell its story because of the biases baked into our collective culture. The jokes and jabs Moonstruck uses are shorthand for exposition. Loretta (played by a surprisingly nimble Cher) is unlucky in love; her family is unflinchingly large and tightly woven; her new boyfriend needs to tend to his mamma in Sicily; the Church fosters character development almost as a wink and nod to the audience (because of course the Italian family credits the Church with its success and relies on it for strength through strife). Moonstruck tracks the family through love as a heuristic for character development.

{Second Take} [1987] Moonstruck

Sometimes a song is just meant for the screen. Even without any visuals to accompany it, it’s cinematic on its own; it tells a story, conveys a meaning, and conjures a time and place. “That’s Amore” by Dean Martin was born for film, quite literally, in that it was commissioned specifically to be performed by Martin in 1953’s The Caddy, but it’s somehow more at home when soundtracking a montage of people on dates eating pasta. It almost does a filmmaker’s work for them, as when director Norman Jewison pairs it with glittering imagery of Brooklyn Heights and the Metropolitan Opera House in the opening sequence of Moonstruck. The audience knows exactly what will happen next: high passions, red wine, and Italian accents.

Few songs have had the reach “That’s Amore” has commanded over the decades, and despite its overtness and obviousness, we still associate it with success and accolades. It garnered an Academy Award nomination for Best Original Song upon its debut in The Caddy, losing to “Secret Love” from Calamity Jane, as one might expect: serious fare often earns more respect than lighter material. Later, “That’s Amore” set the stage for Cher, Olympia Dukakis, and writer John Patrick Shanley all to win their sole Oscars. At a time when the movie-going public was buying up tickets to action movies and thrillers, the Academy slowed down, took a breath, and recognized the ambitious acting and crystal-clear characterization the cast of Moonstruck was able to deliver. In 1987, “That’s Amore” primed audiences for a comedically fraught romance with bombastic performances that mix Old World realism with stereotypes informed by the observational eye of a playwright, and they were ecstatic when that’s what they got.

Only a year after the debut of “That’s Amore,” Alfred Hitchcock borrowed it to great and jarring effect in Rear Window (also nominated for awards from the Academy). The song already had such immediacy and cultural weight that The Master of Suspense was able to subvert its romantic message by removing the lyrics and using it to score a scene where a man spies on a newlywed couple from his window through theirs. It’s impossible to avoid mentally singing the lyrics to yourself, crossing the signals in your brain between the romantic and the voyeuristic. You know intellectually that the newlyweds are in the throes of love like the song describes, but you’re not with them: you’re with the peeping tom. Hitchcock relied on the original music’s instant popularity and inherent romantic meaning to create an uncomfortable dissonance for his viewers to sit with.

[1944] Double Indemnity

Here is a short rundown of film noir:

…[b]ut the vivid co-mingling of lost innocence, doomed romanticism, hard-edged cynicism, desperate desire, and shadowy sexuality that was unleashed in those immediate post-war years proved hugely influential, both among industry peers in the original era, and to future generation of storytellers, both literary and cinematic.

This site continues to cite Double Indemnity as patient zero for the revolution of “black film,” and it definitely doesn’t matter. That Double Indemnity sought to be but a visual vehicle for James M. Cain’s novella of the same name (this movie probably belongs on the list of movies that tremendously surpass their book equivalents, e.g. The Godfather) is not significant. This “movement,” depicting seediness with sexuality and anger without avarice, wasn’t a Hollywood plant. The closest film noir came to a deliberate movement, and this can be backed up by contemporary evidence, was as a natural counter to the propagandized optimism bought and funneled through a government machine. For every Going My Way there was room for another Big Sleep.

Eventually noir contracted in the United States as audiences rejected the bleak for the bulbous (see: Marty, a rightful anti-noir, which won in 1955). Europe’s studios, surely replete with their own hot-take Humanism, sought to redefine art as it predicts reality, and branded their films “new wave” and “realism,” but these really were noir reincarnate, with better sound editing and sometimes, C O L O U R. Every few decades noir pops up as a counter-culture movement. Smart filmmakers understand that for every warm-hearted The Blind Side there is room for the non-casual grit and grimace of a Winter’s Bone. Recently counter-epochal film has sprung up as “neo-noir,” whose best take, LA Confidential, is as embedded in its own phone-booth legacy as Double Indemnity is in its paper trail. Postmodern noir will comment on the fight against Twitter and Facebook. 2018’s Searching tried this concept, smartly setting its audience inside a computer screen. The film itself was a called strike three. Last comment on film noir: brand it however you like; the “genre” is much more anti-modernism than it is pro-anything.

[1976] All The President’s Men

There’s a film (not nominated for Best Picture, probably incorrectly) called The Thin Blue Line, which doesn’t really distinguish between narrative fiction and fictional narrative, but asks the audience to follow incredibly closely and decide for themselves what happened. Errol Morris took this film in a brilliant direction, as each person watching the movie (documentary?) was asked to examine their own biases in the name of fairness, correctness, and real-life tragedy. His work is important and groundbreaking in that before The Thin Blue Line, film was very obviously either true or false; a director took license only where absolutely necessary. A few hypotheses why this was the case, in order from probably the truth to certainly not the truth:

  • Technical limitations set the parameters for what could be staged, shot, edited, and pressed. Until the advent of more advanced cameras and computers and software to handle the ambition, storytellers limited their ideas to plausible narratives and the naturally insane.
  • Film was expensive, and filming too much more, in the wayward sense of exposition and exploration, would have driven budgets beyond what a financier would consider “acceptable” overruns.
  • Inventing a whole new type of storytelling takes a bold visionary, and they had not yet come along.
  • Audiences cared much more and were entirely more naive about what was truth and what was not. Critical narratives were not readily accessible and without them audiences could not fathom a distinction between manipulative intent and honesty.
  • There was no incentive or market to bust up inertia and jump-start creativity [Ed. – This might be true in the 2010s, somewhat]

This last point is not true, though film in the mid-to-late 1980s had lost some of the ferocity brought forth starting in the late 1960s, and The Thin Blue Line had started to shake up some of the storytelling techniques that would carry forward, especially into Oliver Stone’s JFK in 1991 and lots of neo-noir works like LA Confidential in 1997 and Mystic River in 2003. There was a cascading acceptance of newness toward the late 1980s.

[1989] My Left Foot

A conscious creature develops a personality over time, though differently than it grows physically. An individual human, say, is governed by genetic code hardwired into every bit of its body; its height and skin color are determined and unchangeable save an external change. In this way the body is determined, fraught with nature. In other ways, aspects of an individual human are subject to environment: a socio-economic standing, a gender, a fight for survival. A personality, though, is complicated. The manner by which a human presents themselves outwardly is governed by a genetic mix, et cetera. Each person may spend their whole life becoming themselves, as if there were a post erected to pass. Humans are all actors, no?

So, how long does it take to become someone else?

For pouring himself into Christy Brown, Daniel Day-Lewis earned the first of his three Best Actor awards out of six nominations. By any account this actor’s actor has found himself in extreme fortune and generous timing; each of his roles triumphs as a character vehicle, whose environment and/or plot places second and/or third. His work is always a conscious choice of character, too. Since 1982, he has only twenty credits to his name, and only 13 roles since (and including) My Left Foot, his first victory, and the one for which he most carefully considered the role. To play a man with cerebral palsy and not mock it or make light of it is a tremendous feat of mind-meld. The condition is itself a parable of the physical: a body, as it turns out, not broken, but different. Here, Day-Lewis shines on all fronts. Looking back, 30 years, six nominations, and three wins recognize the actor’s actor as a chameleon of sorts. For this actor, himself is his other selves.

Other actors can claim method, so Daniel Day-Lewis is not alone atop his triumphant mountaintop. The most decorated actor of all is Katharine Hepburn with four wins, followed by Meryl Streep, Ingrid Bergman, Jack Nicholson, and Walter Brennan with three each. With a fifty-percent win rate, Day-Lewis isn’t even the most efficient: Brennan did it with a 3-out-of-4 career (though all his nominations were for Best Supporting Actor, amazing in its own right, but not dominant like an Actor award). All of these actors, and the dozens of others with multiple nominations, are tremendous at becoming someone else for art, and for money. So why do we talk about Day-Lewis in a separate breath? The conscious conversation steers this way. The first hypothesis is his gender; for worse, the Actor award is deemed more prestigious in the narrative. In 50 years, assuming the awards last that long, the conversation will drift away from this narrative. It is unfortunate, but true, that many of the stories told so far have been about men, whose funding sources are men, who have decided that these are the stories worth telling.

[1951] A Place in the Sun

The Jazz Age in the American experience subsists as worthwhile to study because of its uninterrupted, demonstrated prosperity (curiously corresponding to a legal ban on drink) immediately followed by superficially mitigated disaster and calamity. The Depression certainly carved space for the creation of great works; jazz and photography each had hallmark decades and increased the breadth and depth of their crafts. Advances in telecommunications, regardless of who could afford them, allowed this art to democratize and to offer at least a distraction and at most a joy to millions of people who had nothing now but drink and unsalable assets. Authors who write about this transitory time ex post facto get the benefit of knowing in advance what came next. What makes The Great Gatsby brilliant makes its later Contemporary American Novels not so much: perspective, of which we know Scott Fitzgerald had little.

Fitzgerald’s contemporary, at least in epoch, Theodore Dreiser, wrote a book called An American Tragedy, which would eventually bastardize its way into A Place in the Sun, a 1951 film that showcased Montgomery Clift and Elizabeth Taylor. Unlike the book, whose plot developed slowly and canonically, the movie saw its lead characters smushed together into a love triangle that convinced no member of its audience of its emotional heft. The key to Clift’s character, a naive and unassuming nephew type, is believing that plot points happen to him and that he is in control of nothing. Only after he falls in too far does an audience understand that the avariciousness is born of self-preservation, not of circumstance. The character study is in trying to piece together how much of the behavior is nature versus nurture. When, as A Place in the Sun insists, the “love” between leads is forced for the sake of time or convenience, our character palette becomes not a band of misfits, but contemptible mallards. Forget the antihero trope that Gatsby pulls off with aplomb (that each character is a self-serving product of nature); this trope, the speedy drive-thru love, is a film killer and should have died on the cutting table.

[1973] American Graffiti

Nostalgia is a hell of a marketing technique. It, as a concept, can be sufficiently disaggregated so that each person’s experience is both universal and personal. Lots of new media relies on the unreachable past. There exists, as I’ve written about before, a coined term, anemoia, which means nostalgia for a time not one’s own. Midnight in Paris captures this feeling to the letter, and commodifies it so that its message can be bought and sold by the very people it aims to placate with dreams of subservience to the artistes de La Belle Époque. Nostalgia is also overwhelming. Instead of inspiration, nostalgic media inspires selective memory, further confusing past narratives. Drowning in nostalgia is akin to a drug-induced coma. Here’s the trick for those who insist on capitalizing on it: rinse and repeat. People will become nostalgic for their own nostalgia.

That’s where, in 1973, George Lucas sold middle-aged Boomers on a “better” time, some ten years earlier, before fake war and realpolitik consigned generations of Americans to dirt futures. The concept is bizarre, because presumably these very Boomers lived this era, perhaps not as wantonly as the four underdeveloped kids, but they very much existed and had formed their own memories of 1962. Remember, 1962 was the apex year of postwar prosperity for the average American kid. The question for Lucas and his producer, somehow Francis Ford Coppola, is not what they should write the movie about; it’s who, exactly, is this movie for? Was it a dopamine insult for Americans who couldn’t stand having family and friends napalming Cambodians and being systematically picked off near Hanoi? Was a movie going to suddenly placate the hippies? The answer, in short, is totally, absolutely, and exactly. Here’s a mind-blowing number: in 2019 dollars, American Graffiti would have made $800 million on a budget of just over $4 million. This movie made an overwhelming amount of money selling a truly empty version of American Life.