Spoiler Alert: These essays are ideally to be read after viewing the respective films.

Sunday, May 24, 2020

Hail, Caesar!

For anyone interested in filmmaking, a film that features the internal operations of a film studio—especially one during the “Golden Age” of Hollywood—is likely to be captivating. After all, as Eddie Mannix, the studio executive in Hail, Caesar! (2016), says, the “vast masses of humanity look to pictures for information and uplift and, yes, entertainment.” This film provides all three for its audience regarding what filmmaking was like under the studio system. With regard to Christian theology, however, the result is mixed. The film makes the point that theological information best comes out indirectly through dramatic dialogue rather than through discussion of theology itself. In other words, inserting a theological lecture into a film’s narrative is less effective than an impassioned speech by which entertainment and uplift can carry the information.


Mannix’s meeting scene with the clerics is harried and thus difficult for the viewer (and Mannix) to digest, but Baird Whitlock’s emotional speech on a studio set as the Crucifixion scene is being filmed conveys religious ideas in an entertaining manner. The speech centers on what is so special about the person being crucified. The information is carried on Whitlock’s emotive warmth, and thus on the acting of George Clooney, who plays the character. In contrast, emotion is sparing in Mannix’s meeting with a rabbi, a Catholic priest, a Greek Orthodox priest, and a Protestant minister. Instead, the scene is energized by fast-moving theological points, but these are unfortunately of little use to the viewers, as demonstrated by Mannix’s confused reactions as the clerics debate. This is ironic because the scene’s role in the film’s narrative is to make the point that Capitol Pictures’ management takes the informational role of the film being made seriously. Whereas Mannix just wants to know whether any of the clerics are disturbed by the Jesus portrayed in his studio’s film, the inclusion of the clerics’ discussion of theology raises the question: can’t film do better at expressly handling theological concepts through dialogue? After watching Mannix’s meeting, the viewer likely answers, not well—even though the scene of Whitlock’s emotional speech at the Crucifixion, which the viewer has not yet seen, does not address whether theological dialogue is viable in film. The example may not be a good one, however.

How good is the medium of film at portraying Jesus Christ and the story that encapsulates him? I contend that this is precisely the question that Hail, Caesar! (2016) attempts to answer, but falls short. The scene of Mannix’s meeting not only dismisses theological dialogue as beyond the reach of viewers, but also assumes quite explicitly that the best portrayal of Jesus is the least controversial one. Because Whitlock’s reverential articulation of Jesus is appreciated universally on the movie set on which the film within the film is being shot, the message is that impassioned meaning itself is enduring; it is also the least likely to offend. Does not the strategy of coming up with a portrayal that offends no one run the risk of being drab? Is such a portrayal merely a copy of the default, which may contain problems? Moreover, does the inclusion of something controversial take away from the uplift and entertainment value?

Even though avoiding anything controversial fit the 1950s—the period in which the film takes place—especially in American society, and thus Hollywood, viewers watching the film in 2016 likely perceived the strategy as antiquated and even suboptimal. Some viewers may have faulted controversial films on Jesus such as Jesus Christ Superstar (1973) for trivializing the story with pop music, Jesus of Nazareth (1977) for emphasizing Jesus’ human characteristics at the expense of his divine Sonship, The Last Temptation of Christ (1988) for its moral stances and conflicted Jesus, and The Passion of the Christ (2004) for taking Jesus’ suffering beyond that in the New Testament. By 2016, the assumptions that explicitly theological dialogue is inherently beyond the grasp of viewers, that such dialogue is itself too controversial, and that films should instead rely on impassioned speeches could be reckoned as nonsense. Surely the controversy over the decision in King of Kings (1961) to show Jesus’ face would be deemed anything but controversial in retrospect. Hail, Caesar! may be making the same point regarding the conformist era of the 1950s.

Perhaps the disappointment of Hippie idealism and the ensuing criticism of American government and society beginning in the late 1960s had accustomed Americans to viewing controversy as acceptable, and even to finding it entertaining and uplifting in terms of ideational freedom (i.e., thinking outside the box). Studios may have absorbed the cultural criticism in producing films like Jesus Christ Superstar that were certainly outside the box relative to earlier films such as King of Kings. It could even be said that the medium was ushering in a new wave of historical theological criticism after that of the nineteenth-century Germans such as Feuerbach and Nietzsche. Put another way, perhaps their thought had finally percolated through or resonated with American society after 1968 such that studios could take precisely the chances that are anathema to Mannix in Hail, Caesar! and therefore to 1950s Americana.
Of course, entertainment and uplift could not suffer; they were no longer assumed to be mutually exclusive with religious controversy. Entertainment had been a mainstay of film since even before the medium partook of narrative. Fifty seconds of an oncoming train in Arrival of a Train at La Ciotat (1896), for example, thrilled audiences. Sound would only have added to the fright. Both uplift and sadness or fear can be entertaining. In its de facto insistence on happy endings, Hollywood has neglected this point. Relatedly, an insistence on avoiding controversy out of fear that it would detract from the entertainment value of a film neglects the possibility that controversy could add entertainment value while providing thought-provoking information. Even thinking about abstract ideas after viewing a film can be entertaining for some people because those ideas came out of a narrative.

Generally speaking, the information/knowledge element is most salient in documentaries, but even fictional narrative is capable of carrying heavy weight in this regard. In regard to the religious content of Judaism and Christianity, Mannix says, “The Bible of course is terrific, but for millions of people, pictures will be their reference point to the story.” He predicts that film would even become the story’s embodiment. In other words, he predicts that, from his vantage point in the 1950s, film would come to supersede even the Bible itself because of the medium’s greater potential to provide information, uplift, and entertainment. One of my reasons for studying film is indeed the medium’s hegemony and thus its role in transmitting abstract ideas and even theories.

While I do not doubt the medium’s tremendous potential to present an experience of a story-world by means of visuals and sound, whereas a book is only text that must be read, Mannix omits the pleasure that can be afforded only by the human imagination, unconstrained by the visuals and sound that tie it to the story world a film presents. Especially in the multi-layered genre of mythology (i.e., religious narrative), imagination can be stretched in myriad ways and on many levels, given the scope for interpretation in myth.

On the other hand, even though film constrains imagination to within the contours of a story world, the mind’s ability to suspend disbelief allows for immersion into such a world, resulting in greater understanding as well as uplift and entertainment. A viewer can “enter” a film’s audio-visual story world cognitively, perceptually, and emotionally such that a sense of experiencing can be had. Experiencing the Biblical world can enable a viewer to better understand Jesus’ dialogues because they are presented in their contexts. To the extent that the historical ancient world can inform our understanding of the Biblical world, film can make use of historians and anthropologists to improve how that world is portrayed. To be sure, the Biblical world is distinct from history, and our knowledge of the ancient past is limited. Film carries with it the risk that viewers might take a portrayal as the world a historical Jesus would have known rather than as that of a faith narrative. The use of abstract dialogue does not suffer from this problem because the ideas being exchanged transcend the dialogue’s context. So the assumption that narrative-specific impassioned speeches are superior to such dialogue is flawed. Of course, this assumption in Hail, Caesar! supports the problematic assumption that controversy must be avoided at all costs in order to maximize entertainment value and uplift, which in turn relate to profitability.

I turn now to a scene analysis of Mannix’s meeting with the clerics in order to make several points, including that the scene is a bad example of how a religious film can effectively use abstract dialogue. The studio executive wants expert feedback both from within Christianity and outside of it to make sure that no viewers whatsoever will take offense at the Jesus being portrayed in Mannix’s movie—the film within the film. When Mannix first asks his guests whether they have any theological objections to the movie being made, the Greek Orthodox priest complains that the chariots in one scene go too fast. Even a cleric has difficulty turning to religious dialogue! The message to the viewer can only be that such dialogue is neither natural nor befitting a film viewing. This point supports the film’s solution by means of an impassioned speech, even if the implications regarding the use of abstract dialogue in film are wrong.
At the studio executive’s urging, the clerics finally focus on the task at hand. “The nature of Jesus is not as simplistic as your picture would have it,” the Catholic priest says. He is speaking theologically. “It is not as simple as God is Christ or Christ is God,” he explains. The portrayal should go further. It should show that Jesus is “the Son of God who takes the sins of the world upon himself so we may enter the Kingdom of God.” Indeed, the Jesus of the Gospels announces that his mission is to preach the mysteries of the Kingdom. Unfortunately, the screenwriter did not have the priest say anything about that kingdom (e.g., how to get into it). Instead, the priest’s focus, consistent with the history of theology, is left at Christ’s identity (i.e., Christology) in salvation (i.e., soteriology), even though the less abstract teachings of Jesus on how to enter his Father’s kingdom, such as benevolence even toward detractors and enemies, would be more easily comprehended by viewers.

After the priest’s abstract theological point, the clerics fire off contending points so fast, and with so little explanation to Mannix (who seems clueless even though he goes to confession daily), that viewers are clearly not deemed able to follow a theological discussion. Yet the film makes a straw-man argument by presenting the dialogue at such a fast pace that little could be gained from the ideas expressed.

The Protestant minister says that Jesus is part God. The rabbi counters that the historical Jesus was a man. Mannix, a Roman Catholic, asks, “So God is split?” to which the Catholic priest answers, “Yes and no.” The Greek priest says, “Unity in division” and the Protestant minister adds, “And division in unity.” Such word games do not advance a viewer’s comprehension of the dialogue. As if standing in for the viewer, Mannix loses his concentration and admits, “I don’t follow that.” The best line of the movie comes when the rabbi replies, “You don’t follow it for a very simple reason; these men are screwballs.”

From the Jewish standpoint, the Christian clerics have gotten themselves tied up in knots because they are claiming that a human being is both fully human and fully divine. Aside from the question of a historical Jesus, the god-man character in faith narratives goes against the Jewish belief that a chasm separates human beings from God. The belief that God has an incarnate human form (i.e., a human body) strikes Jews as a case of self-idolatry. Christian theology, as formalized at the Council of Chalcedon (451 C.E.), upholds that Jesus has two natures in himself—the divine and the human (the latter without sin). The two natures stay distinct in Jesus, so the divine is of the same substance (consubstantial, as affirmed at Nicaea in 325 C.E.) with the other two persons of the Trinity, and the human nature is unaffected by the divine except in being without sin. This is necessary so that Jesus’ self-sacrifice on the Cross can be for other people rather than to pay the price of his own sin.

For the viewers, an analogy would have served better than the abstractions in the dialogue. Oil and water in a cup, for example, would have been more easily understood. The screenwriters fare better when the theological discussion turns to God (i.e., the Godhead). The Catholic priest claims that the Jews worship a god who has no love. “God loves Jews,” the rabbi retorts. Reacting to the unloving way in which Yahweh treats other peoples, the Protestant minister insists that God loves everyone. Yahweh’s statement that vengeance is His does not square with God being love. In his writings, Nietzsche argues that this incongruity discredits the conception of Yahweh in the Bible. It is this discredited conception that Nietzsche refers to in writing, “God is dead.” Fortunately, as St. Denis points out in his writings, God transcends human conceptions of God. The screenwriter could have had the rabbi make this point, and, moreover, point out that the Christian clerics are too obsessed with theological distinctions that presuppose the validity of a conception in which a vice is attributed to a God who is perfect goodness (omnibenevolent).

As if channeling Augustine to refute the rabbi, the Catholic priest says, “God is love.” Calvin’s writings contain the same point, which can be construed as the core of Christianity. Whereas Augustine’s theological love (caritas) is human love raised to the highest good (i.e., God), Calvin’s is the divine self-emptying (agape) love. Whether or not human nature, even Eros, is part of Christian theological love, it manifests as universal benevolence (benevolentia universalis). In the film, the rabbi could have asked the other clerics whether humans are capable of self-emptying divine love (i.e., agape), and how the God of love handles evil people, given that God is all-powerful (omnipotent). The clerics could have pointed very concretely to how a person can enter the Kingdom of God.

Instead, the Greek priest gets existential, insisting that the basis of “God is love” is that “God is who He is.” The screenwriter missed an opportunity for the rabbi to say that God is “I Am.” The implication is that theological love is divine existence, which transcends existence within Creation. God’s nature and very existence as love may thus be wholly other than human conceptions and experiences of love and existence. St. Denis made this point in the sixth century, and yet, as David Hume pointed out in the eighteenth century, the human brain is naturally inclined to view the unknown by attributing human characteristics to it.

The theological dialogue in the meeting scene could have brought the viewers to the point of appreciating God’s wholly otherness as transcending even the polished theological distinctions that we make. However, Mannix, who goes to confession daily, personifies the assumption that even religious viewers would get lost in theological dialogue in a film, even though the rushed dialogue is rigged to support this assumption. The studio executive, for whom profitability is important, states up front in the meeting that he just wants to know whether the portrayal of Jesus in the film being made offends “any reasonable American regardless of faith or creed. I want to know if the theological elements are up to snuff.” Given the rabbi’s statements, however, the portrayal of Jesus as a god-man would be controversial at least to Jews, so Mannix really means to Christians. That is all Mannix wants from the meeting, so to him even the theological banter is a distraction. In fact, it could invite controversy for the film Hail, Caesar! itself, even though the film within the film is not controversial. On this meta-level, the religious dialogue may be written as comedic for this reason, though by 2016 avoiding controversy would not likely be a concern. To be sure, even then, for a cleric to suggest that divine mystery goes beyond the Christian understanding of Jesus as having two distinct natures would invite controversy. St. Denis’ claim that God transcends even our conception of the Trinity would certainly be controversial even in the early twenty-first century.

Regarding the 1950s film within the film, Mannix asks at the end of the meeting scene, “Is our depiction of Jesus fair?” Without questioning Mannix's underlying assumption that fairness means non-controversial, the Protestant minister answers, “There’s nothing to offend a reasonable man.” By implication, to present anything that offends a reasonable person would be unjust, even if controversy would likely result from presenting advances in theological understanding, including alternative views that alter or question the default. A reasonable person is almost defined as one who holds the orthodox (i.e., doctrinal) belief on Jesus’ identity (i.e., Christology). By implication, it is fair if an unreasonable man—a person who holds a “deviant” Christological belief—is offended. Such fairness, it turns out, is not so fair; it is at the very least biased in favor of the tyranny of the status quo, both as it applied to theological interpretation and as it applied to the wider, heavily conformist American society of the 1950s.

Mannix represents the position that theology can and should be filtered through the lens of business. What of the sacred reaches the viewers must survive the cutting board of the profane. Because the Catholic priest says that the portrayal of Jesus in the film being made in Hail, Caesar! is too simplistic, perhaps the message is that only simplified theology survives. While this point applies well to 1950s Hollywood cinema, the plethora of controversial films on Jesus since the utopian, convention-defying days of the late 1960s in America suggests that controversial films can indeed be profitable, at least if the wider society is no longer so conformist. Indeed, societal judgments on what is controversial have varied over time.

Even theologians’ views of profit-seeking have changed through the centuries of Christianity. Until the Commercial Revolution, the dominant view was that salvation and money are mutually exclusive.[1] The rich man cannot enter the Kingdom of Heaven. Winnowing down theology to suit profitability would have been deemed anathema. With greater importance being attributed to Christian virtues actualized in profit-seeking, followed by the belief that God rewards Christians monetarily for having true belief (i.e., that Jesus saves souls), Christian clerics in the twentieth century could be more accommodating of studio executives. The end of reaching a large audience, for instance, could have been believed to justify leaving unprofitable scraps of theology on the cutting-room floor. The historical uncoupling of greed from wealth and profit-seeking, accomplished by the end of the Italian Renaissance, made such an accommodation permissible. Indeed, if God is believed to reward faithful Christians monetarily, as is held in the Prosperity Gospel, then a profit-seeking studio executive would be seen as favored by God in using profit as the litmus test for theology.

Although in the film's period of the 1950s any explicit questioning or criticism of the operative assumptions in Hail, Caesar! would likely have been squashed like a bug, the screenwriter could have included such material (even the squashing) so that the viewers of 2016 could better understand just how narrow, and even arbitrary, the film's historical assumptions are. Both in terms of theology and the related societal context, therefore, the screenwriter could have delivered more to inform and entertain, with the uplift including what naturally comes from putting a theology and a social reality (i.e., that of the 1950s) in a broader, contextual macro- or meta-perspective.

Sunday, September 23, 2018

De-Lovely

Cole Porter (1891-1964), an American composer and songwriter, is the centerpiece of the film De-Lovely (2004). The film begins when he meets Linda, who would become his wife. Their relationship is at the center of the story, as are Porter’s love songs sung throughout the film. Although the complicated nature of the relationship takes center stage, the film can be viewed as a moving snapshot of the first half of the twentieth century, when film made inroads that would dwarf the stage.


The message is clear: quality (e.g., clever humor) was to be sacrificed, or “dumbed down,” to be attractive to the much larger movie market. In other words, entertainment would have to become virtual eye-candy to appeal to the ordinary American. In the film, Porter’s “Be a Clown” is meant as a swipe at L.B. Mayer, even as the eye-candy visuals were deployed to tickle his ribs so he would be oblivious to the insult being leveled at his industry, and thus at himself.
Linda Porter is more direct at the film’s party on the set, likening Hollywood to being deep down in an ocean—suggestive of Cole’s bisexual activities having found ready outlets in L.A.—rather than to being in warm and sunny southern California. No opening after-party on Broadway would take place literally on a stage. How low-class that would be!
Lastly, after watching a private screening of Night and Day (1946), in which Cary Grant is implausibly cast to play Cole Porter, neither Cole nor Linda is impressed. Looking at the attempt to capture Cole’s life in film, the couple could be concluding, moreover, that the ascendancy of film would result in a new decadence—a new low—in American entertainment.
Had they been around, the Porters would have stayed home rather than see the slew of disaster films sans narrative, such as Earthquake (1974), The Day After Tomorrow (2004), and San Andreas (2015). An interesting question is how Cole Porter viewed the decline in the number of Hollywood musicals beginning in the 1950s as the studio system started to come apart. He likely did not appreciate the dollar argument wherein what is produced should be what will maximize revenue, even if Porter benefitted financially from higher ticket sales of his films. It seems to me that the film medium is not to blame, for the film Amadeus (1984) shows the existence of low theatre in the eighteenth century, before cinema came into being.
Both theatre and film can stoop to the basest humor and narrative to titillate certain market segments, while also producing work of truly astonishing quality. Hence, films like Dumb and Dumber (1994) have not been made with an eye to getting an Academy Award, whereas films like The Iron Lady (2011) and Lincoln (2012) likely were. Astonishingly, actors like Meryl Streep can play in both camps, starring in films like The Iron Lady and The Devil Wears Prada (2006) and yet also in films like Mamma Mia! (2008) and Mamma Mia! Here We Go Again (2018). I am not saying that the latter films cannot or should not be taken to be entertaining; rather, I am pointing to the sheer distance between such films and those that receive best-picture and acting nominations from the Academy of Motion Picture Arts and Sciences. The existence of films such as Dumb and Dumber does not negate the high art of the films that are nominated for (and win) Academy Awards.

Monday, April 7, 2014

So Ends an Era: Classic Hollywood Cinema (1930-1950)

With the deaths of Shirley Temple on February 10, 2014, and of Mickey Rooney (Joe Yule) two months later, the world lost its last two major (on-screen) living connections to the classic Hollywood cinema of the 1930s and 1940s. The similarly clustered deaths of Ed McMahon, Farrah Fawcett, and Michael Jackson during the summer of 2009 may have given people the impression that the celebrity world of the 1970s had become history in the new century.

"Mickey Rooney" as "Andrew Hardy," flanked by his on-screen parents. Together, these three characters give us a glimpse of family life in a bygone era. Even in the 1940s, Andy Hardy's father may have been viewed as representing still another era, further back and on its way out. (Image Source: Wikipedia)


Lest we lament too much the loss of these worlds—as per the dictum of historians that history is a world lost to us—we can find solace in the actors’ immortality (and perhaps immorality) on screen. However, in the fullness of time, by which is not meant eternity (i.e., the absence of time as a factor or element), even films as illustrious or cinematically significant as Citizen Kane, Gone with the Wind, His Girl Friday, The Wizard of Oz, The Philadelphia Story, Dracula, and even Mickey Rooney’s Andy Hardy series of films will find themselves representing a decreasing percentage of films of note—assuming cinema, or some continued evolution thereof, goes on. As great as some ancient plays like Antigone are, the vast majority of Westerners today have never heard of the work (not to mention having seen it). Even more recent plays, such as Shakespeare’s, are not exactly blockbusters at movie theatres.

To be sure, cinema (and “the lower house,” television) has eclipsed plays as a form of story-telling. However, another technological innovation may displace our privileged mode sometime in the future. Virtual reality, for example, may completely transform not only how we watch movies, but also film-making itself (e.g., reversing the tendency toward shorter shots and scenes so as not to disorient the immersed viewer). Although the old “black and whites” can be colorized and even restored, adapting them so the viewer is in a scene would hardly be possible without materially altering the original.

Aside from the decreasing-proportion phenomenon relegating classic Hollywood gems, who is to say how much play they will get even two hundred years from 2014, not to mention in 2,500 years? Even the artifacts that we reckon will “live on” forever (even if global warming has rid the planet of any humans to view the classics) will very likely come to their own “clustered deaths.” We humans have much difficulty coming to terms with the finiteness of our own world and of ourselves within a mere slice of history. As Mickey Rooney (born Joe Yule) remarked in 2001, “Mickey Rooney is not great. Mickey Rooney was fortunate to have been an infinitesimal part of motion pictures and show business.”[1] Indeed, motion pictures themselves can be viewed as an infinitesimal phenomenon from the standpoint of the totality of history.




[1] Donna Freydkin, “Mickey Rooney Dead at 93,” USA Today, April 7, 2014.

Tuesday, February 4, 2014

The Oscars: Beyond the Eye-Candy

Writing on the night of the 84th Oscars in 2012, Michael Cieply and Brooks Barnes of The New York Times seemed to wonder "aloud" as they analyzed the 5,800-member Academy’s cultural relevance. They had found most members to be “overwhelmingly white, male and 60ish.” Such a rarefied persona is presumably enough to relegate the Academy to oblivion. Coming during “Black History Month,” Billy Crystal’s portrayal of Sammy Davis, Jr.—a character sketch that had gone unscathed many times in the 1980s—functioned as a lightning rod for people otherwise bored with the lack of surprises in the announced winners (or the host). Lest “let’s go kill Hitler” had become too politically incorrect for Crystal’s Sammy Davis character to say at the Oscars (like Crystal, Davis was Jewish), one might take a gander at the excellent film Inglourious Basterds. This brings me to the main point. According to the New York Times, the Academy may not be relevant because the award-winners did not do well at the box office. I respectfully disagree.

The New York Times points to the “generally weak box-office performance among the year’s nine best-picture contenders—only one of which, [The Help], amassed more than $100 million in domestic ticket sales.” The best picture, The Artist, had amassed only about $32 million. Cieply and Barnes contend that the film’s win underscores “the Oscars’ growing detachment from the movie-going public at large.” Indeed, only about one in ten of the Oscars’ viewers had seen the film.

The classic cinema look must have reminded some Oscar viewers and attendees of the grandeur of the big screen.    (source: The New York Times)

Providing another perspective, I submit that the Oscars is not a popularity contest. The awards are not about telling the public what most titillated it over the past year at the movies. The existence of the technical categories, such as art direction and sound mixing, points to something else—a chance for the experts to award talent. Whether we like it or not, the general public is not the best judge of the talent of a sound editor; we go to the movies to become absorbed into a world, rather than to resist this by critiquing each technical function that went into the making of the film.

So while many people saw the last Harry Potter movie and may have enjoyed it (I passed on the last three in the series), art direction and cinematography went instead to Hugo, a film that far fewer people had seen. Whereas every member of the Academy can vote for best picture, the other categories are voted on only by their respective practitioners. While this allows for politics and bias (e.g., James Cameron not getting best director for Avatar due to his personality), the method also enables people in a position to recognize skill to be decisive in the selections. A director watching another director’s film, for instance, can pick up on good directing much better than we can as viewers. I suspect the general viewer’s opinion becomes more valid when a particular technical function is bad (e.g., a scene is out of focus, or certain sounds can’t be heard). I suspect that practitioners in a given field are necessary to discern between five cases, each of which looks good to the rest of us.

Therefore, I think there is great value in having something more than The People’s Choice Awards. Moreover, the ancient Greeks were on to something when they defined virtue as excellence, and modern society only shows its banality in viewing such a conception of virtue as irrelevant simply because few members of the public have seen specific instances of it.

While I found the storyline of The Artist to be formulaic (the “punch-line” is actually in some classic films about silent-era stars), the selective use of sound was interesting, as was the decision to have the film mostly silent in a sound era. I make this observation from the standpoint of the art and science of film (as well as from the vantage-point of film history), which is not necessarily that of the general public. The film’s technical functions were fine; that those awards were spread around to other films may testify to the imprint of practitioners of the respective functions. In other words, one film might have had the best sound editing while another had the best art direction (though in 2012 Hugo got both of these).

Hugo deserved to win for its art and cinematography, as much of “that world” of the film is essentially art. The screenplay is also notable—even if many more people went to see Harry Potter. Hugo is not only a movie for kids; it contains, or is, a commentary on functionalism (and machines, and indirectly, technology). The screenwriter backs this up with a theology that is historically associated with functionalism (i.e., Deism, or God as clock-maker).

Even so—and this is where having experienced screenwriters vote matters—the screenplay of The Descendants may have been even more deserving (i.e., excellent), because the protagonist’s (not the actor’s!) choices having to do with character (i.e., virtue) are key to the film itself and especially to its narrative. Specifically, how far the protagonist decides to go against one of his antagonists is vital, because almost anything would have been justified. In crafting the nuances of the protagonist’s choices, the screenwriting is vital to what the film says about being human. The general life-and-death theme means the story is ultimately about being human. Relative to the screenplay of The Descendants, that of The Artist is rather formulaic—even predictable. Watching the ending, I thought to myself, “I’ve seen that before.”

I do not believe that the general public is in a position to judge between the best of the art and science of film-making. Indeed, film-making itself, including its various technical functions, is not like cooking—something that most people can do (or at least judge). Whereas having taste buds makes anyone a potential expert on whether a dish is “good,” we cannot assess sound mixing or art direction, or even period costumes, simply by watching the finished product. Even in regard to acting, although bad acting is rather obvious to the viewer, discerning between good actors must surely be difficult for a viewer who has not studied and practiced the skills of acting. Knowing the “tricks of the trade,” only an experienced actor could discern the nuances that distinguish good actors from the best. We, the general public, already knew before Oscar night which films had been popular and thus “good” in terms of popular opinion. Left unknown until "the envelope, please" was which films were the best as judged by practitioners according to standards forged out of specialized training and years or even decades of experience.

If standards sourced in expertise are indeed irrelevant in modern society, Hollywood's output might be reduced to what we think we want: more meaningless but tasty eye-candy. As the old adage goes, be careful what you ask for; you might get it. Lest we get what we think we want, we might want to view the Oscars as something more than a rubber stamp of the People's Choice Awards; we might deign to acknowledge, without feeling humiliated, that the typical viewer is not the best judge. This lesson is lost on Hollywood itself to the extent that producers and directors chase the "top grosser" prize for the first weekend.

Arthur Abbott, the renowned retired screenwriter in The Holiday, has the obsession that has come to grip Hollywood in his cross-hairs as he addresses the crowd at the WGA event held to honor him with a lifetime achievement award. "I came to Hollywood over 60 years ago," he says. "When I first arrived in Tinseltown . . . there were no cineplexes or multiplexes. No such thing as a Blockbuster or DVD. I was here before conglomerates owned the studios. Before pictures had special effects teams. And definitely before box office results were reported . . . like baseball scores on the nightly news." As subtext, Nancy Meyers, the film's screenwriter, was undoubtedly sending Hollywood a message: things have gotten out of hand, and the quality of films has suffered as a consequence.

Carl Jung would say Arthur Abbott instantiates the "Wise Old Man" archetype of our collective unconscious. The viewers are thus inclined to respect Arthur's points. (Youtube: Jacky Huang Szu Han)

Film-making need not be led by polls and focus groups like a dog chasing its tail. There is still such a thing as talent, which comes from intuitive aptitude, training, and experience. Such expertise is not always reflected in the first-weekend numbers, or, moreover, readily observable at a distance. Chasing that distance is an exercise in futility (or self-destructiveness) for any aspiring or veteran film-maker who values outstanding quality in the craft.

Source:
Michael Cieply and Brooks Barnes, “’Hugo’ Wins 2 Early Awards at the Oscars,” The New York Times, February 27, 2012.




Wednesday, January 18, 2012

“The Great Gatsby” in 3D

It is difficult for us mere mortals to take a step back and view the wider trajectory that we are on. It is much easier to relate today’s innovation back to the status quo and pat ourselves on the back amid all the excitement over the new toy. I contend that this is the case in cinema.

I was enthralled viewing Avatar, the film in which James Cameron pushed the envelope on 3D technology on Pandora even as he added the rather down-to-earth element of a biologist who smokes cigarettes. Three years later, his other epic film, Titanic, would be re-released in 3D a century to the month after the actual sinking. As if in a publicity stunt choreographed by Cameron himself, the Costa Concordia had conveniently hit a reef about twenty feet from an island off the coast of Tuscany three months before the re-release. “It was like a scene out of Titanic,” one passenger said once on dry land—perhaps a stone’s throw from the ship.

The question of whether a serious drama without a fictional planet or a huge accident can support an audience’s tolerance for 3D glasses was very much on the mind of Baz Luhrmann as he was filming his 3D rendition of F. Scott Fitzgerald’s “The Great Gatsby” in 2011. According to Michael Cieply, Luhrmann’s film “will tell whether 3-D can actually serve actors as they struggle through a complex story set squarely inside the natural world.”[1] According to Cieply, the director spoke to him of using 3D to find a new intimacy in film. “How do you make it feel like you’re inside the room?” Luhrmann asked.[2] This is indeed 3D coming into a state of maturity, past the rush of thrilling vistas and coming-at-you threats. Indeed, for the viewer to feel more like he or she is “inside the room” places the technology on a longer trajectory.

“The Great Gatsby,” for instance, was first on the screen as “Gatsby,” a silent film released in 1926—just a year after the novel had been published. Being in black and white and without even spoken dialogue, the film could hardly give viewers the sense of being “inside the room.” Then came the 1949 version directed by Elliott Nugent. A review in the New York Times referred to Alan Ladd’s reversion to “that stock character he usually plays” and to the “completely artificial and stiff” direction. So much for being “inside the room.” Even the 1974 version starring Robert Redford left Luhrmann wondering just who the Gatsby character is. More than 3D would presumably be needed for viewers to feel as if they are “inside the room.” Even so, 3D could help as long as the other factors, such as good screenwriting, acting, and directing, are in line.

So Luhrmann and his troupe viewed Hitchcock’s 3D version of “Dial M for Murder” (1954)—the date itself hinting that 3D is not as novel as viewers of “Avatar” might have thought. Watching “Dial M” was, according to Luhrmann, “like theater”—that is, like really being there. Ironically, 3D may proffer “realism” most where films are staged like (i.e., could be) plays. Polanski’s “Carnage” is another case in point, being set almost entirely in an apartment and a hallway. With such a set, a film could even be made to be viewed as virtual reality (i.e., by wearing gaming headsets). In contrast, moving from an apartment living room one minute to the top of a skyscraper the next might be a bit awkward when viewed in virtual reality. In that new medium, the viewer could establish his or her own perspective on the action and even select from alternative endings (assuming repeat viewings).

In short, 3D can be viewed as “one step closer” to being “inside the room.” As such, the technology can be viewed as a temporary stop on the larger trajectory that potentially includes virtual reality—really having the sense of being inside the room, save for direct involvement with the characters and the ability to move things. Contrasting “Avatar” with “Gatsby” is mere child’s play compared to this. The most significant obstacle, which may be leapt over eventually as newer technology arrives, is perhaps the price point for 3D. In my view, it is artificially high, and too uniform.

Luhrmann’s budget of $125 million before government rebates is hardly more than that of conventional releases. Even if theatres charge $3 more for 3D films because of the cheap glasses and special projectors, it might be in the distributors’ interest to see to it that such films wind up costing consumers the same as a conventional one shown at a theatre. As an aside, it is odd that films with vastly different budgets have the same ticket price, which suggests windfalls for some productions and belies claims of a competitive market. In other words, a film of $125 million distributed widely could be treated as a conventional film in terms of final pricing, and it need not be assumed that theatres would be taking a hit. Adding more to already-high ticket prices is a model that does not bode well for 3D as a way-station on the road to virtual reality. Of course, technology could leap over 3D if greed artificially chokes off demand for 3D glasses. I, for one, am looking forward to virtual reality. Interestingly, the filmmakers shooting on the cheap with digital cameras and then distributing via the internet may tell us more about how films in virtual reality might be distributed and viewed than how 3D films are being distributed and priced. People have a way of voting with their wallets (and purses), and other candidates have a way of popping up unless kept out by a pushy oligarch. So perhaps it can be said that, assuming a competitive marketplace, 3D may become a viable way-station on our way to virtual reality on Pandora.


1. Michael Cieply, “The Rich Are Different: They’re in 3-D,” The New York Times, January 17, 2012. 
2. Ibid.