[1987] Moonstruck

A day is both a discrete event and part of a string of days that hopefully make up a full, expectant life. While we are within it, each day seems insignificant, and to evaluate it requires a perspective unavailable to us until much later: we don’t pre-write memoirs for this reason, and often our elders are wise both because of their age and because of their particular string of days; an existence in 88 keys. To short-circuit the learning-experience curve, maybe bisecting it, we use half-baked heuristics. For example: daily, maybe more often, our brains subjugate and dissect our interactions into lists and charts. We process by simplification, and we trade nuance and context for the feeling of understanding. It might take years to undo, or to double down on, this type of life, and it is almost impressively difficult to do.

Even more often we don’t even read a heuristic in a book or article: it is defined for us on screen and stage. And we accept it as true, even subconsciously, because we want to believe it. This specific bias is called “confirmation bias,” and together with our peers we engage in groupthink. Almost every mass movement, good and bad, has been a combination of biased heuristics and groupthink. When we talk about race and creed we almost always rely on heuristics — stereotypes — to frame our interactions. Think of a person of Italian descent; now think of a new person walking into a room who looks Italian. What are the first traits that come to mind? Pasta fazool? Mafia memes? Catholicism? Moonstruck, 1987’s Italian-American melodrama, can tell its story because of the biases baked into our collective culture. The jokes and jabs Moonstruck uses are shorthand for exposition. Loretta (played by a surprisingly nimble Cher) is unlucky in love; her family is unflinchingly large and tightly woven; her new boyfriend needs to tend to his mamma in Sicily; the Church fosters character development almost as a wink and nod to its audience (because of course the Italian family credits the Church with its success and relies on it for strength through strife). Moonstruck tracks the family through love as heuristic for character development.


{Second Take} [1987] Moonstruck

Sometimes a song is just meant for the screen. Even without any visuals to accompany it, it’s cinematic on its own; it tells a story, conveys a meaning, and conjures a time and place. “That’s Amore” by Dean Martin was born for film, quite literally, in that it was commissioned specifically to be performed by Martin in 1953’s The Caddy, but it’s somehow more at home when soundtracking a montage of people on dates eating pasta. It almost does a filmmaker’s work for them, as when director Norman Jewison pairs it with glittering imagery of Brooklyn Heights and the Metropolitan Opera House in the opening sequence of Moonstruck. The audience knows exactly what will happen next: high passions, red wine, and Italian accents.

Few songs have had the reach “That’s Amore” has commanded over the decades, and despite its overtness and obviousness, we still associate it with success and accolades. It garnered an Academy Award nomination for Best Original Song upon its debut in The Caddy, losing to “Secret Love” from Calamity Jane, as one might expect: serious fare often earns more respect than lighter material. Later, “That’s Amore” set the stage for Cher, Olympia Dukakis, and writer John Patrick Shanley to all win their sole Oscars. At a time when the movie-going public was buying up tickets to action movies and thrillers, the Academy slowed down, took a breath, and recognized the ambitious acting and crystal-clear characterization the cast of Moonstruck was able to deliver. In 1987, “That’s Amore” primed audiences for a comedically fraught romance with bombastic performances that mix Old World realism with stereotypes informed by the observational eye of a playwright, and they were ecstatic when that’s what they got.

Only a year after the debut of “That’s Amore,” Alfred Hitchcock borrowed it to great and jarring effect in Rear Window (also nominated for awards from the Academy). The song already had such immediacy and cultural weight that The Master of Suspense was able to subvert its romantic message by removing the lyrics and using it to score a scene where a man spies on a newlywed couple from his window through theirs. It’s impossible to avoid mentally singing the lyrics to yourself, crossing the signals in your brain between the romantic and the voyeuristic. You know intellectually that the newlyweds are in the throes of love like the song describes, but you’re not with them: you’re with the peeping tom. Hitchcock relied on the original music’s instant popularity and inherent romantic meaning to create an uncomfortable dissonance for his viewers to sit with.

[1944] Double Indemnity

Here is a short rundown of film noir:

…[b]ut the vivid co-mingling of lost innocence, doomed romanticism, hard-edged cynicism, desperate desire, and shadowy sexuality that was unleashed in those immediate post-war years proved hugely influential, both among industry peers in the original era, and to future generations of storytellers, both literary and cinematic.

This site continues to cite Double Indemnity as patient zero for the revolution of “black film,” and it definitely doesn’t matter. That Double Indemnity sought only to be a visual vehicle for James M. Cain’s novella of the same name (this movie is probably on the list of movies that tremendously surpass their book equivalents, e.g. The Godfather) is not significant. This “movement,” depicting seediness with sexuality and anger without avarice, wasn’t a Hollywood plant. The closest film noir came to a movement, and this can be backed up by contemporary evidence, was as a natural counter to the propaganda optimism bought and funneled through a government machine. For every Going My Way there was room for another Big Sleep.

Eventually noir contracted in the United States as audiences rejected the bleak for the bulbous (see: Marty, a rightful anti-noir, which won in 1955). Europe’s studios, surely replete with their own hot-take Humanism, sought to redefine art as it predicts reality and branded their films “new wave” and “realism,” but these were really noir reincarnate, with better sound editing and sometimes, C O L O U R. Every few decades noir pops up as a counter-culture movement. Smart filmmakers understand that for every The Blind Side warm heart there is room for a Winter’s Bone non-casual grit and grimace. Recently counter-epochal film has sprung up as “neo-noir,” whose best take, LA Confidential, is as embedded in its own phone-booth legacy as Double Indemnity is in its paper trail. Postmodern noir will comment on the fight against Twitter and Facebook. 2018’s Searching tried this concept, smartly setting its audience inside a computer screen. The film itself was a called strike three. Last comment on film noir: brand it however; what the “genre” is, is much more anti-modernism than it is pro-anything.

[1976] All The President’s Men

There’s a film (not nominated for Best Picture, probably incorrectly) called The Thin Blue Line, which doesn’t really distinguish between narrative fiction and fictional narrative, but asks the audience to follow incredibly closely and decide for themselves what happened. Errol Morris took this film in a brilliant direction: each person watching the movie (documentary?) was asked to examine their own biases in the name of fairness, correctness, and real-life tragedy. His work draws an important distinction and is groundbreaking in that before The Thin Blue Line, film was very obviously either true or false; a director took license only where absolutely necessary. A few hypotheses why this was the case, in order from probably the truth to certainly not the truth:

  • Technical limitations set the parameters for what could be staged, shot, edited, and pressed. Until the advent of more advanced cameras and computers and software to handle the ambition, storytellers limited their ideas to plausible narratives and the naturally insane.
  • Film was expensive, and filming too much, in the wayward sense of exposition and exploration, would have driven budgets beyond what a financier would consider “acceptable” overruns.
  • Inventing a whole new type of storytelling takes a bold visionary, and they had not yet come along.
  • Audiences cared much more and were entirely more naive about what was truth and what was not. Critical narratives were not readily accessible and without them audiences could not fathom a distinction between manipulative intent and honesty.
  • There was no incentive or market to bust up inertia and jump-start creativity [Ed. – This might be true in the 2010s, somewhat]

This last point is not true, though film in the mid-to-late 1980s had lost some of the ferocity brought forth starting in the late 1960s, and The Thin Blue Line had started to shake up some of the storytelling techniques that would carry forward, especially into Oliver Stone’s JFK in 1991 and lots of neo-noir works like LA Confidential in 1997 and Mystic River in 2003. There was a cascading acceptance of newness toward the late 1980s.

[1989] My Left Foot

A conscious creature develops a personality over time, though differently than it grows physically. An individual human, say, is governed by genetic code hardwired into every bit of its body; its height and skin color are determined and unchangeable save an external change. In this way the body is determined and fraught with nature. In other ways, aspects of an individual human are subject to their environment—a socio-economic standing, a gender, a fight for survival. A personality, though, is complicated. The manner by which a human presents themselves outwardly is governed by a genetic mix, et cetera. Each person may spend their whole life becoming themselves, as if there were a post erected to pass. Humans are all actors, no?

So, how long does it take to become someone else?

For pouring himself into Christy Brown, Daniel Day-Lewis earned the first of his three Best Actor awards out of six nominations. By any account this actor’s actor has found himself in extreme fortune and generous timing; each of his roles triumphs as a character vehicle, whose environment and/or plot places second and/or third. His work is always a conscious choice of character, too. Since 1982, he has only twenty credits to his name, and only 13 roles since (and including) My Left Foot, his first victory, and the one for which he most carefully considered the role. To play a man with cerebral palsy and not mock it or make light of it is a tremendous feat of mind-meld. The condition is itself a parable of the physical; a body, as it turns out, not broken, but different. Here, Day-Lewis shines on all fronts. Looking back across 30 years, six nominations, and three wins, we recognize the actor’s actor as a chameleon of sorts. For this actor, himself is his other selves.

Other actors can claim method, so Daniel Day-Lewis is not alone atop his triumphant mountaintop. The most decorated actor of all is Katharine Hepburn with four, followed by Meryl Streep, Ingrid Bergman, Jack Nicholson, and Walter Brennan with three each. With a fifty-percent win rate, Day-Lewis isn’t even the most efficient—Brennan did it with a 3-out-of-4 career (though all his nominations were for Best Supporting Actor, amazing in its own right, but not dominant like an Actor award). All of these actors, and the dozens of others with multiple nominations, are tremendous at becoming someone else for art, and for money. So why do we talk about Day-Lewis in a separate breath? The conscious conversation steers this way. The first hypothesis is his gender; for worse, the Actor award is deemed more prestigious in the narrative. In 50 years, assuming the awards last that long, the conversation will drift away from this narrative. It is unfortunate, but true, that many of the stories told so far have been about men, whose funding sources are men, who have decided that these are the stories worth telling.

[1951] A Place in the Sun

The Jazz Age in the American experience endures as worthwhile to study because of its uninterrupted, demonstrated prosperity (curiously corresponding to a legal ban on drink) immediately followed by superficially mitigated disaster and calamity. The Depression certainly carved space for the creation of great works; jazz and photography each had hallmark decades and increased the breadth and depth of their craft. Advances in telecommunications, regardless of who could afford them, allowed this art to democratize and to offer at least a distraction and at most a joy to millions of people who now had nothing but drink and unsalable assets. Authors who write about this transitory time ex post facto get the benefit of knowing in advance what came next. What makes The Great Gatsby brilliant is what makes its later Contemporary American Novels not so much: perspective, of which we know Scott Fitzgerald had little.

Fitzgerald’s contemporary, at least in epoch, Theodore Dreiser wrote a book called An American Tragedy, which would eventually bastardize its way into A Place in the Sun, a 1951 film that showcased Montgomery Clift and Elizabeth Taylor. Unlike the book, whose plot developed slowly and canonically, the movie saw its lead characters smush together into a love triangle that convinced no member of its audience of its emotional heft. The key to Clift’s character, a naive and unassuming nephew type, is believing that plot points happen to him and that he is in control of nothing. Only after he falls in too far does an audience understand that the avariciousness is born of self-preservation, not of circumstance. The character study is trying to piece together how much of the behavior is nature versus nurture. When, as A Place in the Sun insists, the “love” between leads is forced for the sake of time or convenience, our character palate becomes not a band of misfits, but contemptible mallards. Forget the antihero trope that Gatsby pulls off with aplomb (that each character is a self-serving product of nature); this trope, the speedy drive-thru love, is a film killer and should have died on the cutting table.

[1973] American Graffiti

Nostalgia is a hell of a marketing technique. It, as a concept, can be sufficiently disaggregated so that each person’s experience is both universal and personal. Lots of new media relies on the unreachable past. There exists, as I’ve written about before, a coined term, anemoia, which means nostalgia for a time not one’s own. Midnight in Paris captures this feeling to the letter, and commodifies it so that its message can be bought and sold by the very people it aims to placate with dreams of subservience to the artistes de La Belle Époque. Nostalgia is also overwhelming. Instead of inspiration, nostalgic media inspires selective memory, further confusing past narratives. Drowning in nostalgia is akin to a drug-induced coma. Here’s the trick for those who insist on capitalizing on it: rinse and repeat. People will become nostalgic for their own nostalgia.

That’s where, in 1973, George Lucas sold young Boomers on a “better” time, some ten years earlier, before fake war and realpolitik took generations of Americans to dirt futures. The concept is bizarre, because presumably these very Boomers lived this era, perhaps not as wantonly as the four underdeveloped kids, but they very much existed and had formed their own memories of 1962. Remember, 1962 was the apex year of postwar prosperity for an average American kid. The question for Lucas and his producer, somehow Francis Ford Coppola, is not what they should write the movie about; it’s who, exactly, is this movie for. Was it a dopamine insult for Americans who couldn’t stand having family and friends napalming Cambodians and being systematically picked off near Hanoi? Was a movie going to suddenly placate the hippies? The answer, in short, is totally, absolutely, and exactly. Here’s a mind-blowing number: in 2019 dollars American Graffiti would have made $800 million on a budget of just over $4 million. This movie made an overwhelming amount of money selling a truly empty version of American Life.