Spoiler Alert: These essays are ideally to be read after viewing the respective films.

Sunday, May 24, 2020

Hail, Caesar!

For anyone interested in filmmaking, a film that features the internal operations of a film studio—especially one during the “Golden Age” of Hollywood—is likely to be captivating. After all, as Eddie Mannix, the studio executive in Hail, Caesar! (2016), says, the “vast masses of humanity look to pictures for information and uplift and, yes, entertainment.” This film provides all three for its audience in showing what filmmaking was like under the studio system. With regard to Christian theology, however, the result is mixed. The film makes the point that theological information comes across best indirectly, through dramatic speech, rather than through discussion of theology itself. In other words, inserting a theological lecture into a film’s narrative is less effective than an impassioned speech by which entertainment and uplift can carry the information.


Mannix’s meeting scene with clerics is harried and thus difficult for the viewer (and Mannix) to digest, but Baird Whitlock’s emotional speech on a studio set as the Crucifixion scene is being filmed conveys religious ideas in an entertaining manner. The speech centers on what is so special about the person being crucified. The information is carried on Whitlock’s emotive warmth, and thus on the acting of George Clooney, who plays the character. In contrast, emotion is sparing in Mannix’s meeting with a rabbi, a Catholic priest, a Greek Orthodox priest, and a Protestant minister. Instead, the scene is energized by fast-moving theological points, but this is unfortunately of little use to the viewers, as demonstrated by Mannix’s confused reactions as the clerics debate. This is ironic because the scene’s role in the film narrative is to make the point that Capitol Pictures’ management takes seriously the informational role of the film being made. Whereas Mannix just wants to know if any of the clerics are disturbed by the Jesus portrayed in Mannix’s film, the inclusion of the clerics’ discussion of theology raises the question: can’t film do any better in expressly handling theological concepts through dialogue? The viewer has not yet seen the scene of Whitlock’s emotional speech at the Crucifixion, but that scene does not address whether theological dialogue is viable in film. After watching Mannix’s meeting, the viewer likely answers: not well. The example may not be a good one, however.

How good is the medium of film at portraying Jesus Christ and the story that encapsulates him? I contend that this is precisely the question that Hail, Caesar! (2016) attempts to answer, but falls short. The scene of Mannix’s meeting not only dismisses theological dialogue as beyond the reach of viewers, but also assumes quite explicitly that the best portrayal of Jesus is the one that is least controversial. Because Whitlock’s reverential articulation of Jesus is appreciated universally on the movie set on which the film within the film is being shot, the message is that impassioned meaning itself is enduring; it is also the least likely to offend. Does not the strategy of coming up with a portrayal that offends no one run the risk of being drab? Is such a portrayal merely a copy of the default, which may contain problems? Moreover, does the inclusion of something controversial take away from the uplift and entertainment value?

Even though avoiding anything controversial fit the 1950s—the period in which the film takes place—especially in American society, and thus Hollywood, viewers watching the film in 2016 likely perceived the strategy to be antiquated and even suboptimal. Some viewers may have regarded earlier films on Jesus as controversial: Jesus Christ Superstar (1973) for trivializing the story with pop music, Jesus of Nazareth (1977) for emphasizing Jesus’ human characteristics at the expense of his divine Sonship, The Last Temptation of Christ (1988) for its moral stances and conflicted Jesus, and The Passion of the Christ (2004) for taking Jesus’s suffering beyond that in the New Testament. By 2016, the assumptions that explicitly theological dialogue is inherently beyond the grasp of viewers, that such dialogue is itself too controversial, and that films should rely instead on impassioned speeches could be reckoned as nonsense. Surely the controversy over the decision in King of Kings (1961) to show Jesus’ face would, in retrospect, be deemed anything but controversial. Hail, Caesar! may be making the same point regarding the conformist era of the 1950s.

Perhaps the disappointment of Hippie idealism and the ensuing criticism of American government and society beginning in the late 1960s had accustomed Americans to viewing controversy as acceptable, and even to finding it entertaining and uplifting in terms of ideational freedom (i.e., thinking outside the box). Studios may have absorbed the cultural criticism in producing films like Jesus Christ Superstar that were certainly outside the box relative to earlier films such as King of Kings. It could even be said that the medium was ushering in a new wave of historical theological criticism after that of the nineteenth-century Germans such as Feuerbach and Nietzsche. Put another way, perhaps their thought had finally percolated through or resonated with American society after 1968 such that studios could take precisely the chances that are anathema to Mannix in Hail, Caesar! and therefore to 1950s Americana.
Of course, entertainment and uplift could not suffer; they were no longer assumed to be mutually exclusive with religious controversy. Entertainment had been a mainstay of film since even before the medium partook of narrative. Fifty seconds of an oncoming train in Arrival of a Train at La Ciotat (1896), for example, thrilled audiences. Sound would only have added to the fright. Both uplift and sadness or fear can be entertaining. In its de facto insistence on happy endings, Hollywood has neglected this point. Relatedly, an insistence on avoiding controversy out of fear that it would detract from a film’s entertainment value neglects the possibility that controversy could add entertainment value while providing thought-provoking information. Even thinking about abstract ideas after viewing a film can be entertaining for some people because those ideas came out of a narrative.

Generally speaking, the information/knowledge element is most salient in documentaries, but even fictional narrative is capable of carrying heavy weight in this regard. In regard to the religious content of Judaism and Christianity, Mannix says, “The Bible of course is terrific, but for millions of people, pictures will be their reference point to the story.” He predicts that film would even become the story’s embodiment. In other words, he predicts from his vantage point in the 1950s that film would come to supersede even the Bible itself because of the film medium’s greater potential to provide information, uplift, and entertainment. One of my reasons for studying film is indeed the medium’s hegemony and thus its role in transmitting abstract ideas and even theories.

While I do not doubt the medium’s tremendous potential to present an experience of the story-world by means of visuals and sound, whereas a book is only text that must be read, Mannix omits the pleasure that can be afforded only by the human imagination when there are no visuals and sound to constrict it to a story world presented by film. Especially in the multi-layered genre of mythology (i.e., religious narrative), imagination can be stretched in myriad ways and on many levels, given the scope for interpretation in myth.

On the other hand, even though film constrains imagination to within the contours of a story world, the mind’s ability to suspend disbelief allows for immersion into such a world, resulting in greater understanding as well as uplift and entertainment. A viewer can “enter” a film’s audio-visual story world cognitively, perceptually, and emotionally such that a sense of experiencing can be had. Experiencing the Biblical world can enable a viewer to better understand Jesus’ dialogues because they are seen in their contexts. To the extent that the ancient world can historically inform our understanding of the Biblical world, film can make use of historians and anthropologists in order to improve on how that world is portrayed. To be sure, the Biblical world is distinct from history, and our knowledge of the ancient past is limited. Film carries with it the risk that viewers might take a portrayal as the world that a historical Jesus would have known rather than that of a faith narrative. The use of abstract dialogue does not suffer from this problem because the ideas being exchanged transcend the dialogue’s context. So the assumption that narrative-specific impassioned speeches are superior to such dialogue is flawed. Of course, this assumption in Hail, Caesar! supports the problematic assumption that controversy must at all costs be avoided in order to maximize the entertainment value and uplift, which in turn relate to profitability.

I turn now to a scene analysis of Mannix’s meeting with the clerics in order to make several points, including that the scene is a bad example of how a religious film can effectively use abstract dialogue. The studio executive wants expert feedback both from within Christianity and outside of it to make sure that no viewers whatsoever will take offense at the Jesus being portrayed in Mannix’s movie—the film within the film. When Mannix first asks his guests whether they have any theological objections to the movie being made, the Greek Orthodox priest complains that the chariots in one scene go too fast. Even a cleric has difficulty turning to religious dialogue! The message to the viewer can only be that such dialogue is neither natural nor befitting a film-viewing. This point supports the film’s solution by means of an impassioned speech, even if the implications regarding the use of abstract dialogue in film are wrong.
At the studio executive’s urging, the clerics finally focus on the task at hand. “The nature of Jesus is not as simplistic as your picture would have it,” the Catholic priest says. He is speaking theologically. “It is not as simple as God is Christ or Christ is God,” he explains. The portrayal should go further. It should show that Jesus is “the Son of God who takes the sins of the world upon himself so we may enter the Kingdom of God.” Indeed, the Jesus of the Gospels announces that his mission is to preach the mysteries of the Kingdom. Unfortunately, the screenwriter did not have the priest say anything about that kingdom (e.g., how to get into it). Instead, the priest’s focus, consistent with the history of theology, is left at Christ’s identity (i.e., Christology) in salvation (i.e., soteriology), even though the less abstract teachings of Jesus on how to enter his Father’s kingdom, such as benevolence even to detractors and enemies, would be more easily comprehended by viewers.

After the priest’s abstract theological point, the clerics rapid-fire contending points so fast and with so little explanation to Mannix (who seems clueless even though he goes to confession daily) that the viewers are clearly not deemed able to follow a theological discussion. Yet the film makes a straw-man argument by presenting the dialogue at such a fast pace that little could be gained from the ideas expressed.

The Protestant minister says that Jesus is part God. The rabbi counters that the historical Jesus was a man. Mannix, a Roman Catholic, asks, “So God is split?” to which the Catholic priest answers, “Yes and no.” The Greek priest says, “Unity in division” and the Protestant minister adds, “And division in unity.” Such word games do not advance a viewer’s comprehension of the dialogue. As if standing in for the viewer, Mannix loses his concentration and admits, “I don’t follow that.” The best line of the movie comes when the rabbi replies, “You don’t follow it for a very simple reason; these men are screwballs.”

From the Jewish standpoint, the Christian clerics have gotten themselves tied up in knots because they are claiming that a human being is both fully human and fully divine. Aside from a historical Jesus, the god-man character in faith narratives goes against the Jewish belief that a chasm separates human beings from God. The belief that God has an incarnate human form (i.e., a human body) strikes Jews as a case of self-idolatry. As affirmed at the Council of Nicaea (325 C.E.) and later at Chalcedon (451 C.E.), Christian theology upholds that Jesus has two natures in himself—the divine and the human (except for sin). The two natures stay distinct in Jesus, so the divine is of the same substance (consubstantial) as the other two manifestations (or “persons”) of the Trinity; the human nature is unaffected by the divine except in being without sin. This is necessary so that Jesus’ self-sacrifice on the Cross can be for other people rather than to pay the price of his own sin.

For the viewers, an analogy would have served better than the abstractions in the dialogue. Oil and water in a cup, for example, would have been more easily understood. The screenwriters fare better when the theological discussion turns to God (i.e., the Godhead). The Catholic priest claims that the Jews worship a god who has no love. “God loves Jews,” the rabbi retorts. Reacting to the unloving way in which Yahweh treats other people, the Protestant minister insists that God loves everyone. Yahweh’s statement that vengeance is His does not square with God being love. In his writings, Nietzsche argues that this incongruity discredits the conception of Yahweh in the Bible. It is this discredited conception that Nietzsche refers to in writing, “God is dead.” Fortunately, as St. Denis points out in his writings, God transcends human conceptions of God. The screenwriter could have had the rabbi make this point, and, moreover, the point that the Christian clerics are too obsessed with theological distinctions that assume the validity of an operative conception in which a vice is attributed to God, who is perfect goodness (omnibenevolent).

As if channeling Augustine to refute the rabbi, the Catholic priest says, “God is love.” Calvin’s writings contain the same point, which can be construed as the core of Christianity. Whereas Augustine’s theological love (caritas) is human love raised to the highest good (i.e., God), Calvin’s is the divine self-emptying (agape) love. Whether or not human nature, even Eros, is part of Christian theological love, it manifests as universal benevolence (benevolentia universalis). In the film, the rabbi could have asked the other clerics whether humans are capable of self-emptying divine love (i.e., agape), and how the god of love handles evil people, given that God is all-powerful (omnipotent). The clerics could have pointed very concretely to how a person can enter the Kingdom of God.

Instead, the Greek priest gets existential, insisting that the basis of “God is love” is “God is who He is.” The screenwriter missed an opportunity for the rabbi to say that God is “I Am.” The implication is that theological love is divine existence, which transcends existence within Creation. God’s nature and very existence as love may thus be wholly other than human conceptions and experiences of love and existence. St. Denis made this point in the sixth century, and yet, as David Hume pointed out in the eighteenth century, the human brain is naturally inclined to view the unknown by attributing human characteristics to it.

The theological dialogue in the meeting scene could have brought the viewers to the point of appreciating God’s wholly otherness as transcending even the polished theological distinctions that we make. However, Mannix, who goes to confession daily, personifies the assumption that even religious viewers would get lost in theological dialogue in a film, even though the rushed dialogue is rigged to support this assumption. The studio executive, for whom profitability is important, states up front in the meeting that he just wants to know whether the portrayal of Jesus in the film being made offends “any reasonable American regardless of faith or creed. I want to know if the theological elements are up to snuff.” Given the rabbi’s statements, however, the portrayal of Jesus as a god-man would be controversial at least to Jews. So Mannix really means any reasonable Christian. That is all Mannix wants from the meeting, so to him even the theological bantering is a distraction. In fact, it could invite controversy for the film, Hail, Caesar!, even though the film within the film is not controversial. On this meta-level, the religious dialogue is perhaps written as comedic for this reason, though by 2016 avoiding controversy would not likely be a concern. To be sure, even then, for a cleric to suggest that divine mystery goes beyond the Christian understanding of Jesus being of two distinct natures would invite controversy. St. Denis’ claim that God transcends even our conception of the Trinity would certainly be controversial even in the early twenty-first century.

Regarding the 1950s film within the film, Mannix asks at the end of the meeting scene, “Is our depiction of Jesus fair?” Without questioning Mannix's underlying assumption that fairness means non-controversial, the Protestant minister answers, “There’s nothing to offend a reasonable man.” By implication, to present anything that offends a reasonable person would be unjust, even if controversy would likely arise from presenting advances in theological understanding, including alternative views, which alter or question the default. A reasonable person is almost defined as one who holds the orthodox (i.e., doctrinal) belief on Jesus’ identity (i.e., Christology). By implication, it is fair if an unreasonable man—a person who has a “deviant” Christological belief—is offended. Such fairness, it turns out, is not so fair; it is at the very least biased in favor of the tyranny of the status quo, both as it applied to theological interpretation and as it applied to the wider, heavily conformist American society of the 1950s.

Mannix represents the position that theology can and should be filtered through the lens of business. That of the sacred which reaches the viewers must survive the cutting board of the profane. Because the Catholic priest says that the portrayal of Jesus in the film being made in Hail, Caesar! is too simplistic, perhaps the message is that only simplified theology survives. While this point applies well to 1950s Hollywood cinema, the plethora of controversial films on Jesus since the utopian, convention-defying days of the late 1960s in America suggests that controversial films can indeed be profitable, at least if the wider society is no longer so conformist. Indeed, societal judgments on what is controversial have varied over time.

Even theologians’ views of profit-seeking have changed through the centuries of Christianity. Until the Commercial Revolution, the dominant view was that salvation and money are mutually exclusive.[1] The rich man cannot enter the Kingdom of Heaven. Winnowing down theology to suit profitability would have been deemed anathema. With greater importance being attributed to Christian virtues actualized by profit-seeking, followed by the belief that God rewards Christians monetarily for having true belief (i.e., that Jesus saves souls), Christian clerics in the twentieth century could be more accommodating of studio executives. The end of reaching a large audience, for instance, could have been believed to justify leaving unprofitable scraps of theology on the cutting-room floor. The historical uncoupling of greed from wealth and profit-seeking, accomplished by the end of the Italian Renaissance, made such an accommodation permissible. Indeed, if God is believed to reward faithful Christians monetarily, as is held in the Prosperity Gospel, then a profit-seeking studio executive would be seen as being favored by God in using profit as the litmus test for theology.

Although in the film's period of the 1950s any explicit questioning and criticism of the operative assumptions in Hail, Caesar! would likely have been squashed like bugs, the screenwriter could have included such material (even the squashing) so the viewers in 2016 could have a better understanding of just how narrow, and even arbitrary, the film's historical assumptions are. Therefore, both in terms of theology and the related societal context, the screenwriter could have delivered more to both inform and entertain, with the uplift including what naturally comes from putting a theology and social reality (i.e., of the 1950s) in a broader, contextual macro- or meta-perspective. 

Thursday, September 28, 2017

Writing an Original Screenplay

Jay Fernandez of The Hollywood Reporter asks, “Who’s to blame for the lack of original movie projects being submitted to film studios these days?” He points to vertical integration and a bottom-line reliance on pre-branded franchises, plus diminished film slates, producer deals, and writing jobs. Indeed, in early 2011, spec submissions were down by more than half.

Given the increased competition and the pressure from the studios, writers and agents “looking to maintain careers and commissions” have been “abandoning original screenplays to deliver template-fitting material.” As one lit agent said, “It’s the system that’s at fault, not the writer.” Of course, it could also be argued that studios have been going for known commodities, such as multiple sequels, because the writers have run out of material. According to one studio head, writers “can’t get themselves up to write something original.”

I must admit I have looked at all the formulaic films and wondered whether narrative itself had been exhausted. The rigidity of a screenplay’s structure and format, for instance, must surely narrow the sort of narrative that can come through the pipeline.

For example, having an inciting event 10 to 12 pages in and a critical event about 10 pages from the end means that the narrative’s tension runs from 10 to 12 pages in until 10 pages from the end. Having a regularity akin to Joseph Campbell’s journey of the hero, a screenplay’s protagonist is bound to be seeking to restore equilibrium over that same stretch. Would it kill a narrative if the protagonist were seen in his or her new world for more than ten pages? Might the viewers enjoy seeing the protagonist in his or her original world for more than 10 to 12 pages? Furthermore, how might film narratives differ if the inciting event were to happen up front?

As tempting as it might be to loosen up the screenplay format (assuming it is arbitrary from the standpoint of what makes good narrative), it is worth asking whether original narratives are still possible even within the screenplay box. If they are possible, it is worth investigating how writers can come up with original plots. I suspect the answer lies in the writer becoming aware of the assumptions in his or her extant stories so as to be able to relax or change paradigms or frameworks and thus come up with novel narratives.

Also, a writer could do worse than study classic myth so as to get a deeper sense of basic themes that could be woven into new fabric for today. By this I do not mean that modern writers should simply pour old wine into new bottles; rather, the ancient ingredients—once known—can be interwoven in new ways to create new plot structures. Simply engaging in thought-experiments in coming up with innovative short stories can be like weight-lifting for the writer interested in going out and playing in the game of screenwriting.

Of course, context matters, and studios having allowed themselves to become more dependent on remakes and reinventions has translated into “creative stagnation,” according to Fernandez. Working within the confines of prefab projects, writers are handed the house merely to decorate it. As one writer observed, “You can’t build your own house, and you can’t change the house.” That is hardly the sort of context in which the narratives that can generate real interest in cinema are likely to be purchased, let alone written.

As the field of writers narrows and the studios become increasingly risk-averse as the costs of producing a film increase, creativity must be reckoned as collateral damage. Yet even in this eye of the needle, even just those few writers who have gained entry can think outside the box and make alliances with the talent to lobby producers for relatively small-budget projects. More ideally, actors and even producers could use social media and explore blogs in order to look beyond the usual suspects, if only to get an inkling of the alternative stories out there in those small electronic ponds (perhaps one all-too-imaginative writer will write a screenplay on the blog-pond monster that eats up the radiation in Japan and saves the day--the antithesis of Godzilla).

In short, there are indeed fruitful alternatives to deconstruction (e.g., the New Wave, Neo-Realism). We need not eclipse narrative, as if the human race has outlived story-telling. We need not give up on the possibility of rich, new stories that have not hitherto been thought and told.


Source:

Jay A. Fernandez, “Crisis at the Movies: No New Ideas,” The Hollywood Reporter, May 20, 2011, pp. 8-9.

Wednesday, March 11, 2015

Disney Re-Making Stories: The End of Creativity?

Politicians running for re-election may “remake themselves.” Companies “reinvent themselves.” If the company happens to make films, are the stories necessarily reinvented—essentially being retold—too? If this becomes the norm, is the implication that storytellers have exhausted the story plotlines that the human mind can conceive? Perhaps retelling old stories is simply laziness and corporate expediency at the expense of substance.

In March 2015, Walt Disney Pictures announced that it would concentrate on live-action versions of classic fairy tales.[1] That the animated originals had come from Walt Disney Animation Studios renders the strategy synergistic, which is to say, financially convenient. For one thing, the same market segment is “carried along.” In the case of “Cinderella,” the operative demographic is girls. Other “reinvented” stories on Disney’s radar screen at the time included “Alice in Wonderland,” “The Jungle Book,” “Beauty and the Beast,” and “Dumbo.” Novelty in storytelling seems to be lost in the same old, same old—albeit in new packaging.

To be sure, some narrative creation goes into even such re-tellings. For example, the re-made “Cinderella” includes a back story explaining the step-mother’s cruelty, new material on the prince’s relationship with his father, the king, and a reason why Cinderella “doesn’t run from home or fight back.”[2] These additions take the basic story as a given, however, and this tendency may imply that we have squeezed out all the great plot-lines that we can possibly imagine. Even if the well of plot-types has not gone dry from the types having pretty much already been lifted out into the light of day, the habit of “re-inventing” existing stories can orient energy away from narrative creativity to the extent that the empty well becomes a self-fulfilling prophecy. Audiences used to being spoon-fed the same story over and over may come to expect nothing more and reward the film companies that take the road most travelled. That is to say, the status quo can become a virtual black hole from which potential creative energy cannot escape. As real as the limitations facing creative storytellers may seem, at least some of the constraints may be contrived, and thus artificial.



[1] Ben Fritz, “Disney Recycles Fairy Tales, Minus Cartoons,” The Wall Street Journal, March 11, 2015.
[2] Ibid.

Friday, April 27, 2012

Hollywood Bribes China

The Foreign Corrupt Practices Act, known as F.C.P.A., “forbids American companies from making illegal payments to government officials or others to ease the way for operations in foreign countries.”[1] The practical difficulty facing American companies doing business around the world is that in some cultures bribes are so ubiquitous they are simply a part of doing business.  For American companies to refuse to participate in what is generally expected can be a competitive disadvantage, particularly if substitutes exist and the practice is widespread.


The full essay is in Cases of Unethical Business: A Malignant Mentality of Mendacity, available in print and as an ebook at Amazon.


1. Edward Wyatt, Michael Cieply, and Brooks Barnes, “S.E.C. Asks if Hollywood Paid Bribes in China,” The New York Times, April 25, 2012.

Wednesday, January 18, 2012

“The Great Gatsby” in 3D

It is difficult for us mere mortals to take a step back and view the wider trajectory that we are on. It is much easier to relate today’s innovation back to the status quo and pat ourselves on the back amid all the excitement over the new toy. I contend that this is the case in cinema.

I was enthralled in viewing Avatar, the film in which James Cameron pushed the envelope on 3D technology on Pandora even as he added the rather down-to-earth element of a biologist who smokes cigarettes. Three years later, his other epic film, Titanic, would be re-released in 3D a century to the month after the actual sinking. As if a publicity stunt choreographed by Cameron himself, the Costa Concordia had conveniently hit a reef about twenty feet from an island off the coast of Tuscany three months before the re-release. “It was like a scene out of Titanic,” one passenger said once on dry land—perhaps a stone’s throw from the boat.

The question of whether a serious drama without a fictional planet or a huge accident can sustain an audience’s tolerance for 3D glasses was very much on the mind of Baz Luhrmann as he was filming his 3D rendition of F. Scott Fitzgerald’s “The Great Gatsby” in 2011. According to Michael Cieply, Luhrmann’s film “will tell whether 3-D can actually serve actors as they struggle through a complex story set squarely inside the natural world.”[1] Cieply reports that the director spoke to him of using 3D to find a new intimacy in film. “How do you make it feel like you’re inside the room?” Luhrmann asked.[2] This is indeed 3D coming into a state of maturity, past the rush of thrilling vistas and coming-at-you threats. Indeed, for the viewer to feel more like he or she is “inside the room” places the technology on a longer trajectory.

“The Great Gatsby,” for instance, was first on the screen as “Gatsby,” a silent film released in 1926—just a year after the novel had been published. Being in black and white and without even sound, the film could hardly give the viewers the sense of being “inside the room.” Then came the 1949 version directed by Elliott Nugent. A review in the New York Times referred to Alan Ladd’s reversion to “that stock character he usually plays” and to the “completely artificial and stiff” direction. So much for being “inside the room.” Even the 1974 version starring Robert Redford left Luhrmann wondering just who the Gatsby character is. More than 3D would presumably be needed for the viewers to feel like they are “inside the room.” Even so, 3D could help as long as the other factors, such as good screenwriting, acting, and directing, are in line.

So Luhrmann and his troupe viewed Hitchcock’s 3D version of “Dial M for Murder” (1954)—this date itself hinting that 3D is not as novel as viewers of “Avatar” might have thought. Watching “Dial M” was, according to Luhrmann, “like theater”—that is, like really being there. Ironically, 3D may proffer “realism” most where films are staged like (i.e., could be) plays. Polanski’s “Carnage” is another case in point, being set almost entirely in an apartment and a hallway. With such a set, a film could even be made to be viewed in virtual reality (i.e., by wearing those game head-sets). In contrast, moving from an apartment living room one minute to the top of a skyscraper the next might be a bit awkward when viewed in virtual reality. In that new medium, the viewer could establish his or her own perspective on the action and even select from alternative endings (assuming repeat viewings).

In short, 3D can be viewed as “one step closer” to being “inside the room.” As such, the technology can be viewed as a temporary stop on a larger trajectory that potentially includes virtual reality—really having the sense of being inside the room, short of direct involvement with the characters and the ability to move things. Contrasting “Avatar” with “Gatsby” is mere child’s play compared to this. The most significant obstacle, which may be leapt over eventually as newer technology arrives, is perhaps the price-point for 3D. In my view, it is artificially high, and too uniform.

Luhrmann’s budget of $125 million before government rebates is hardly more than that of conventional releases. Even if theatres charge $3 more for 3D films because of the cheap glasses and special projectors, it might be in the distributors’ interest to see to it that such films wind up costing consumers the same as a conventional one shown at a theatre. As an aside, it is odd that films with vastly different budgets have the same ticket price, which suggests windfalls for some productions and belies claims of a competitive market. In other words, a film of $125 million distributed widely could be treated as a conventional film in terms of the final pricing, and it need not be assumed that theatres would be taking a hit. Adding more to already-high ticket prices is a model that does not bode well for 3D as a way-station on the road to virtual reality. Of course, technology could leap over 3D if greed artificially chokes off demand for 3D glasses. I for one am looking forward to virtual reality. Interestingly, the filmmakers shooting on the cheap with digital cameras and then distributing via the internet may tell us more about how films in virtual reality might be distributed and viewed than how 3D films are being distributed and priced. People have a way of voting with their wallets (and purses), and other candidates have a way of popping up unless kept out by a pushy oligarch. So perhaps it can be said that, assuming a competitive marketplace, 3D may become a viable way-station on our way to virtual reality on Pandora.


1. Michael Cieply, “The Rich Are Different: They’re in 3-D,” The New York Times, January 17, 2012. 
2. Ibid.

Tuesday, January 17, 2012

Hollywood on Risk: Snubbing Lucas’s “Red Tails”

When George Lucas showed Red Tails to executives from all the Hollywood studios, every one of the execs said no. One studio’s executives did not even show up for the screening. “Isn’t this their job?” Lucas said, astonished. “Isn’t their job at least to see movies? It’s not like some Sundance kid coming in there and saying, ‘I’ve got this little movie — would you see it?’ If Steven (Spielberg) or I or Jim Cameron or Bob Zemeckis comes in there, and they say, ‘We don’t even want to bother to see it.’”[1] According to one newspaper, the snub implied that “Lucas’s pop-culture collateral — six ‘Star Wars’ movies, four ‘Indiana Jones’ movies, the effects shop Industrial Light and Magic and toy licenses that were selling (at least) four different light sabers . . .  — was basically worthless.”[2] As a result, Lucas paid for everything, including the prints, to enable the film’s opening. What can explain this bizarre snub?

Lucas was “battling former acolytes who [had] become his sworn enemies.”[3] These would be Star Wars fans, or “fanboys,” who have been upset because Lucas has made some changes to the films in new editions. “‘On the Internet, all those same guys that are complaining I made a change are completely changing the movie,’ Lucas says, referring to fans who, like the dreaded studios, have done their own forcible re-edits.”[4] However, in being aimed at black teenagers, “Red Tails” may not be aimed at “Star Wars” fans. The snub could simply reflect the way business is done in Hollywood—meaning its tendency to be conservative, or hesitant, toward new ideas.

Regardless of a director’s past filmography, if the film being proposed does not fit with the current tastes of the targeted market segment, there is not going to be much studio interest. Lucas readily admits there is not really much swearing in Red Tails. Nor is there a huge amount of blood in it; nobody’s head is going to get blown off. Rather, the stress is on patriotism, and this is supposed to work for black teenagers. The fact that Lucas made Star Wars and Indiana Jones does not mean that he is right about Red Tails. At the same time, it was not as if he were an unknown. Studio execs could have given the filmmaker’s past accomplishments some weight, if only as proffering seasoned judgment from experience.

Moreover, marketing technicians are not always right in anticipating how word might spread concerning a film that could change tastes. Confined to current tastes, filmmakers could never lead. Cuba Gooding Jr., one of the stars of Red Tails, points out that even a blockbuster can be unanticipated by the studios’ gatekeepers. “I like to say James Cameron made a movie just like this,” he said excitedly. “Instead of black people, there were blue people being held down by white people. It was called ‘Avatar!’ And the studios said the same thing to him: ‘We can’t do a movie with blue people!’”[5] Particularly where new technology and a different narrative are involved, the studios could be far too timid even for their own financial good. Lucas could have been reacting to this more than to childish fans.

“I’m retiring,” Lucas said. “I’m moving away from the business, from the company, from all this kind of stuff.”[6] Bryan Curtis, a reporter, concludes of Lucas’s decision, “He can hardly be blamed.” Rick McCallum, who had been producing Lucas’s films for more than 20 years, said, “Once this is finished, he’s done everything he’s ever wanted to do. He will have completed his task as a man and a filmmaker.” According to Curtis, “Lucas has decided to devote the rest of his life to what cineastes in the 1970s used to call personal films. They’ll be small in scope, esoteric in subject and screened mostly in art houses.” Besides understandably being tired of ahistoric, short-term, financially-oriented studio executives and childish fans, Lucas had accomplished his task “as a man and a filmmaker.”[6] He could literally afford to spend the rest of his working life playing in pure creativity without regard to commercial roadblocks.

It will be others’ task to try to narrow the distance between that realm and that of the bottom-line-oriented studios. This is perhaps the challenge—the true bottom line: namely, how to tweak the studios’ business model so creativity has enough room to breathe. Part of the solution could involve the increasing ease of filmmaking on the cheap, enabled by technological advances in equipment such as digital cameras and in distribution (e.g., the internet rather than theatres), as well as by an over-supply of actors. Young people in particular have taken to watching movies on a laptop or iPad. Any resulting downward pressure on price could affect the costs of even the blockbusters, such that actors making $20 million or more per film could become a thing of the past. As of the end of the first decade of the twenty-first century, the cost structure in Hollywood had all the distortions of an oligopoly (even a monopoly), with the result that movie ticket prices were too high for two hours of movie experience. From the constriction that naturally comes with high prices, the industry itself could expand in terms of viewers and financially viable genres of film, were the underlying cost structure deflated by competition from the low end.

In retiring to make films “on the fly,” Lucas was once again ahead of the curve in orienting himself to the more fluid, less risk-averse “art house” world of filmmaking. While traditional studios and theatres will not contort themselves to fit it, the industry itself should look more diverse in 2020—running from high-priced, “Avatar”-like 3D IMAX “experiences” to lower-priced films downloadable on an iPad. Looking even further out, I would not be surprised if “films” in virtual reality make traditional movie theatres obsolete. I would not expect the studio executives who were not even willing to hear Lucas out to be among the trailblazers. In an industry like cinema, good far-sighted vision should be, and ultimately is, rewarded even if today’s bottom line is in the driver’s seat.


1. Bryan Curtis, “George Lucas Is Ready to Roll the Credits,” The New York Times, January 17, 2012. 
2. Ibid.
3. Ibid.
4. Ibid.
5. Ibid.
6. Ibid.

Wednesday, April 27, 2011

Computer Technology Revolutionizing Industries: Books and Films

Crude oil was first drilled in 1859 in northwestern Pennsylvania (not in the desert of the Middle East). It was not long before oil lamps became ubiquitous, lengthening the productive day for millions beyond daylight hours. Just fifty or sixty years later, as electricity was beginning to replace the lamps, Ford’s mass-produced automobile was taking off, providing an alternative use of crude oil. For those of us alive in the early decades of the twenty-first century, electric lighting indoors and cars on paved roads have been around as long as we can remember. As a result, we tend to assume that things will go on pretty much as they “always” have. Other than for computer technology, the end of the first decade of the 21st century looks nearly indistinguishable from the last thirty or forty years of the last century. As the second decade of the 21st century began, applications based on computer technology were reaching a critical mass in terms of triggering shifts in some industries that had seemingly “always” been there.  Books, music and movies were certainly among the fastest moving, perhaps like the dramatic change in lighting and cars beginning a century and a half before with the discovery of crude oil.


The full essay is at "Computer Technology Revolutionizing Industries."

Tuesday, April 26, 2011

Organizational Bureaucracy at Odds with Creativity in Film

Art through corporate bureaucracy can be likened to oil and water. The rise of the studio system to produce film as an art form thus evinces a necessary evil. To be sure, organization is necessary to literally organize the various facets involved in the production of a film. However, managerial levels have multiplied beyond what is needed for coordination, particularly in television, and have stifled good narrative in the process.

Ken Loach, a feature and television film director, declared, “Television kills creativity; work is produced beneath a pyramid of producers, executive producers, commissioning editors, heads of department, assistant heads of department, and so on, that sit on top of the group of people doing the work, and stifle the life out of them.”[1] These suits are told to control the creativity even though the latter cannot be controlled without dying out in the process. According to Loach, “if you’ve got ten people sitting on your shoulder you can’t be good, you can’t be creative.”[2] For example, directors say they are told that they are not allowed to work with the writers. Instead, the directors work with managers, who somehow view themselves as qualified to write narrative because they are oriented to business factors. The result has been artificially constructed television programming, akin to politicians running solely off polls. Although financial concerns have a legitimate place, they are of such import to the layers of managers that cheap reality shows have trumped serious drama with a coherent, thought-out plot.

According to Loach, television, which “began with such high hopes,” has become “a grotesque reality show.”[3] To be sure, Loach admits that “some good work gets through.”[4] Even so, it is much too hard for it to survive the inevitable onslaught of the bureaucratic knives unscathed. The editing done by managers is fundamentally different from that which writers would do—and not for the better.

Perhaps rather than tearing up scripts that have been accepted, managers could have confidence in their own decisions in accepting the scripts by letting the writers themselves work out any changes with the directors. In other words, in putting an accepted script through the meat-grinder, are not executives and their staff undercutting their own decision to accept the script?  Of course, a particular acceptance could be to say that a script is only “good enough to get through the door.”  In other words, it would be understood that the script is to be considered as only partially done when it arrives. I would caution against such an “acceptance” because managers oriented to business matters are not likely to function as surrogate writers in finishing the job. A writer is a writer whereas a manager is a manager. Business expertise does not proffer the ability to tell a story.

Therefore, I contend that scripts ought to be accepted that can stand on their own as scripts. That is to say, the accepting executive ought to believe that the scripts he or she pays for are good already, and thus that the respective writers can be trusted to accommodate changes that the director believes are necessary.  

A producer ought to be on the look-out for the following: “What writers need to write are original stories, original characters, plot, conflict, things that dig into our current experience. Things that really show us how we’re living, give us a perspective on what is happening”[5] (p. 41). Sometimes in watching a movie, I can sense what will come next because the formula has already become hackneyed.  I have even thought that nearly a century of films has perhaps exhausted good narrative.

The screenplay’s structure is so “scripted” that the exactitude of the uniform structure may itself winnow away originality and creativity. It is perhaps like trying to fit lots of different shapes through a very small hole. The defining structure, such as there being three acts—the first running twelve to fifteen pages and ending in a triggering event that in turn leads in act two to a critical event that is resolved in the last act—seems needlessly confining. Are there not other possible structures compatible with film narrative?

On the other hand, I suspect that creativity can still be applied through the existing structure if there are original stories and characters out there in someone’s imagination. However, the standard structure ought not be allowed to exclude any stories that are original yet not conducive to that particular structure. Perhaps a new structure could naturally come out of such an original story. I suspect that the specificity of the formatting and length is primarily a means of standardizing incoming scripts so they can be more easily compared. While convenient, the guidelines may be contributing to movie-goers viewing films as too formulaic. For example: boy meets girl, girl pushes boy away, boy wins back girl, and the two embrace. “Girl goes with the other boy” is scarcely off the formula.

In any case, creativity is urgently needed among screenwriters, and the protection (and respect) of creativity is urgently needed among the managers having control over the art. Just because a person can control something does not mean he or she should hold it so tightly—squeezing the air out of it.


1. Ken Loach, “Between Commodity and Communication: Has Film Fulfilled Its Potential?” International Socialist Review, 76 (March-April 2011), 28-44, p. 40.
2. Ibid., p. 41.
3. Ibid.
4. Ibid.