Spoiler Alert: These essays are best read after viewing the respective films.

Friday, April 27, 2012

Hollywood Bribes China

The Foreign Corrupt Practices Act, known as the F.C.P.A., “forbids American companies from making illegal payments to government officials or others to ease the way for operations in foreign countries.”[1] The practical difficulty facing American companies doing business around the world is that in some cultures bribes are so ubiquitous that they are simply a part of doing business. For American companies, refusing to participate in what is generally expected can be a competitive disadvantage, particularly if substitutes exist and the practice is widespread.


The full essay is in Cases of Unethical Business: A Malignant Mentality of Mendacity, available in print and as an ebook at Amazon.


1. Edward Wyatt, Michael Cieply, and Brooks Barnes, “S.E.C. Asks if Hollywood Paid Bribes in China,” The New York Times, April 25, 2012.

Wednesday, January 18, 2012

“The Great Gatsby” in 3D

It is difficult for us mere mortals to take a step back and view the wider trajectory that we are on. It is much easier to relate today’s innovation back to the status quo and pat ourselves on the back amid all the excitement over the new toy. I contend that this is the case in cinema.

I was enthralled viewing Avatar, the film in which James Cameron pushed the envelope on 3D technology on Pandora even as he added the rather down-to-earth element of a biologist who smokes cigarettes. Three years later, his other epic film, Titanic, would be re-released in 3D a century to the month after the actual sinking. As if in a publicity stunt choreographed by Cameron himself, the Costa Concordia had conveniently hit a reef about twenty feet from an island off the coast of Tuscany three months before the re-release. “It was like a scene out of Titanic,” one passenger said once on dry land, perhaps a stone’s throw from the boat.

The question of whether a serious drama without a fictional planet or a huge accident can sustain an audience’s tolerance for 3D glasses was very much on the mind of Baz Luhrmann as he filmed his 3D rendition of F. Scott Fitzgerald’s “The Great Gatsby” in 2011. According to Michael Cieply, Luhrmann’s film “will tell whether 3-D can actually serve actors as they struggle through a complex story set squarely inside the natural world.”[1] According to Cieply, the director spoke to him of using 3D to find a new intimacy in film. “How do you make it feel like you’re inside the room?” Luhrmann asked.[2] This is indeed 3D coming into a state of maturity, past the rush of thrilling vistas and coming-at-you threats. Indeed, for the viewer to feel more like he or she is “inside the room” places the technology on a longer trajectory.

“The Great Gatsby,” for instance, first reached the screen as “Gatsby,” a silent film released in 1926, just a year after the novel had been published. In black and white and without even sound, the film could hardly give viewers the sense of being “inside the room.” Then came the 1949 version directed by Elliott Nugent. A review in the New York Times referred to Alan Ladd’s reversion to “that stock character he usually plays” and to the “completely artificial and stiff” direction. So much for being “inside the room.” Even the 1974 version starring Robert Redford left Luhrmann wondering just who the Gatsby character is. More than 3D would presumably be needed for viewers to feel like they are “inside the room.” Even so, 3D could help as long as the other factors, such as good screenwriting, acting, and directing, are in line.

So Luhrmann and his troupe viewed Hitchcock’s 3D version of “Dial M for Murder” (1954), a date that itself hints that 3D is not as novel as viewers of “Avatar” might have thought. Watching “Dial M” was, according to Luhrmann, “like theater,” that is, like really being there. Ironically, 3D may proffer “realism” most in films that are staged like plays (i.e., that could be plays). Polanski’s “Carnage” is another case in point, being set almost entirely in an apartment and a hallway. With such a set, a film could even be made to be viewed in virtual reality (i.e., by wearing a gaming headset). In contrast, moving from an apartment living room one minute to the top of a skyscraper the next might be a bit awkward viewed in virtual reality. In that new medium, the viewer could establish his or her own perspective on the action and even select from alternative endings (assuming repeat viewings).

In short, 3D can be viewed as “one step closer” to being “inside the room.” As such, the technology can be viewed as a temporary stop on a larger trajectory that potentially includes virtual reality: really having the sense of being inside the room, short of direct involvement with the characters and the ability to move things. Contrasting “Avatar” with “Gatsby” is mere child’s play compared to this. The most significant obstacle, which may eventually be leapt over as newer technology arrives, is perhaps the price-point for 3D. In my view, it is artificially high, and too uniform.

Luhrmann’s budget of $125 million before government rebates is hardly more than that of a conventional release. Even if theatres charge $3 more for 3D films because of the glasses and special projectors, it might be in the distributors’ interest to see to it that such films wind up costing consumers the same as a conventional one shown at a theatre. As an aside, it is odd that films with vastly different budgets have the same ticket price, which suggests windfalls for some productions and belies claims of a competitive market. In other words, a film of $125 million distributed widely could be treated as a conventional film in terms of final pricing, and it need not be assumed that theatres would be taking a hit. Adding more to already-high ticket prices is a model that does not bode well for 3D as a way-station on the road to virtual reality. Of course, technology could leap over 3D if greed artificially chokes off demand for 3D glasses. I, for one, am looking forward to virtual reality. Interestingly, filmmakers shooting on the cheap with digital cameras and then distributing via the internet may tell us more about how films in virtual reality might be distributed and viewed than how 3D films are being distributed and priced. People have a way of voting with their wallets (and purses), and other candidates have a way of popping up unless kept out by a pushy oligopolist. So perhaps it can be said that, assuming a competitive marketplace, 3D may become a viable way-station on our way to virtual reality on Pandora.


1. Michael Cieply, “The Rich Are Different: They’re in 3-D,” The New York Times, January 17, 2012. 
2. Ibid.

Tuesday, January 17, 2012

Hollywood on Risk: Snubbing Lucas’s “Red Tails”

When George Lucas showed Red Tails to executives from all the Hollywood studios, every one of the execs said no. One studio’s executives did not even show up for the screening. “Isn’t this their job?” Lucas said, astonished. “Isn’t their job at least to see movies? It’s not like some Sundance kid coming in there and saying, ‘I’ve got this little movie — would you see it?’ If Steven (Spielberg) or I or Jim Cameron or Bob Zemeckis comes in there, and they say, ‘We don’t even want to bother to see it.’”[1] According to one newspaper, the snub implied that “Lucas’s pop-culture collateral — six ‘Star Wars’ movies, four ‘Indiana Jones’ movies, the effects shop Industrial Light and Magic and toy licenses that were selling (at least) four different light sabers . . . — was basically worthless.”[2] As a result, Lucas paid for everything, including the prints, to enable the film’s opening. What can explain this bizarre snub?

Lucas was “battling former acolytes who [had] become his sworn enemies.”[3] These would be Star Wars fans, or “fanboys,” who have been upset because Lucas has made changes to the films in new editions. “‘On the Internet, all those same guys that are complaining I made a change are completely changing the movie,’ Lucas says, referring to fans who, like the dreaded studios, have done their own forcible re-edits.”[4] However, in being aimed at black teenagers, “Red Tails” may not be directed to “Star Wars” fans. The snub could simply reflect the way business is done in Hollywood, meaning its tendency to be conservative, or hesitant, toward new ideas.

Regardless of a director’s past filmography, if the film being proposed does not fit the current tastes of the targeted market segment, there is not going to be much studio interest. Lucas readily admits there is not much swearing in Red Tails. Nor is there a huge amount of blood in it; nobody’s head is going to get blown off. Rather, the stress is on patriotism, and this is supposed to work for black teenagers. The fact that Lucas made Star Wars and Indiana Jones does not mean that he is right about Red Tails. At the same time, it was not as if he were an unknown. Studio execs could have given the filmmaker’s past accomplishments some weight, if only as proffering seasoned judgment from experience.

Moreover, marketing technicians are not always right in anticipating how word might spread concerning a film that could change tastes. Were filmmakers confined to current tastes, they could never lead. Cuba Gooding Jr., one of the stars of Red Tails, points out that even a blockbuster can go unanticipated by the studios’ gatekeepers. “I like to say James Cameron made a movie just like this,” he said excitedly. “Instead of black people, there were blue people being held down by white people. It was called ‘Avatar!’ And the studios said the same thing to him: ‘We can’t do a movie with blue people!’”[5] Particularly where new technology and a different narrative are involved, the studios could be far too timid even for their own financial good. Lucas could have been reacting to this more than to childish fans.

“I’m retiring,” Lucas said. “I’m moving away from the business, from the company, from all this kind of stuff.”[6] Bryan Curtis, a reporter, concludes of Lucas’s decision, “He can hardly be blamed.” Rick McCallum, who had been producing Lucas’s films for more than 20 years, said, “Once this is finished, he’s done everything he’s ever wanted to do. He will have completed his task as a man and a filmmaker.” According to Curtis, “Lucas has decided to devote the rest of his life to what cineastes in the 1970s used to call personal films. They’ll be small in scope, esoteric in subject and screened mostly in art houses.” Besides understandably being tired of ahistoric, short-term-financially-oriented studio executives and childish fans, Lucas had accomplished his task “as a man and a filmmaker.” He could literally afford to spend the rest of his working life playing in pure creativity without regard to commercial roadblocks.

It will be others’ task to try to narrow the distance between that realm and that of the bottom-line-oriented studios. This is perhaps the challenge, the true bottom line: namely, how to tweak the studios’ business model so that creativity has enough room to breathe. Part of the solution could involve the increasing ease of filmmaking on the cheap, enabled by technological advances in equipment such as digital cameras and in distribution (e.g., the internet rather than theatres), as well as by an over-supply of actors. Young people in particular have taken to watching movies on a laptop or iPad. Any resulting downward pressure on price could affect the costs of even the blockbusters, such that actors making $20 million or more per film could become a thing of the past. As of the end of the first decade of the twenty-first century, the cost structure in Hollywood had all the distortions of an oligopoly (even a monopoly), with the result that movie tickets were too expensive for two hours of movie experience. Out of the constriction that naturally comes with high prices, the industry itself could expand in terms of viewers and financially viable genres of film, were the underlying cost structure deflated by competition from the low end.

In retiring to make films “on the fly,” Lucas was once again ahead of the curve in orienting himself to the more fluid, less risk-averse “art house” world of filmmaking. While traditional studios and theatres will not contort themselves to fit it, the industry itself should look more diverse by 2020, running from high-priced “Avatar”-like 3D IMAX “experiences” to lower-priced films downloadable onto an iPad. Looking even further out, I would not be surprised if “films” in virtual reality made traditional movie theatres obsolete. I would not expect the studio executives who were not even willing to hear Lucas out to be among the trailblazers. In an industry like cinema, good far-sighted vision should be, and ultimately is, rewarded, even if today’s bottom line is in the driver’s seat.


1. Bryan Curtis, “George Lucas Is Ready to Roll the Credits,” The New York Times, January 17, 2012.
2. Ibid.
3. Ibid.
4. Ibid.
5. Ibid.
6. Ibid.

Monday, January 16, 2012

The Iron Lady

Sometimes a film is worth seeing just to watch an excellent actor capture an interesting character. This applies to Meryl Streep playing Julia Child in Julie and Julia and Margaret Thatcher in The Iron Lady. I am writing on The Iron Lady a day after seeing it and watching Streep accept a Golden Globe for her role in the film. Prior to seeing the film, I had heard critics say that the film itself pales in comparison with Streep’s performance. I concur, though whereas the critics complain of the extent of disjunction between Thatcher as the prime minister and Thatcher as an old woman in dementia, I want to point to the sheer extent of “back and forth” between the two. Typically, there would be a snippet of Thatcher as prime minister, then back to the old woman in the dark, then back again to the past. A viewer could get whiplash. I would have preferred to begin at the beginning, with Thatcher’s start in politics, and work up to the dementia (giving the old Thatcher much, much less screen time). Perhaps the linear chronological approach was presumed too straightforward, or boring, by the screenwriter or director. However, any story naturally has a beginning, a middle, and an end, and too much jumping around can eclipse the natural progression.

A more serious problem may exist, moreover, should the viewer wonder what the conflict in the story is. In other words, what or who is the antagonist? Sadly, if it is dementia itself, there is little suspense in the outcome. Perhaps the only suspense in that regard was whether she would get rid of her dead husband, Denis. Unfortunately, that character had more screen time dead than alive. If the main conflict is Thatcher’s political support while in office, that too could hold little drama. Likewise, it is difficult to view a “war” over a few islands off Argentina as significant enough to justify Thatcher’s urging her fellow Britons that it is a day to feel proud to be British. Perhaps the tension between her ambition and her household could have had potential had it been developed beyond a breakfast scene, though it is doubtful that Denis as antagonist could have given that conflict enough strength. Of course, if watching Streep inhabit Thatcher is the viewer’s aim, then perhaps drama is of secondary importance. Still, a good story would have been a nice cherry on the sundae.

Furthermore, I contend the screenwriter failed to capitalize on some rather obvious opportunities to draw viewers into the story. Most notably, both Queen Elizabeth and President Reagan were alluded to, yet without any real parts in the narrative (i.e., screen time beyond a bizarre brief dance in Reagan’s case). What, for instance, if Helen Mirren had reprised her role as Elizabeth for a few scenes with the Iron Lady? Might there have been any drama there? I suspect so. What did the Queen think of Thatcher herself, her conservatism in the recession, and the Falklands War? Did the Queen play any indirect or subtle role in Thatcher’s fall from power? Concerning Reagan, what if we could have seen a bit of what might have been the real relationship between him and Thatcher? Might the screenwriter have gone native, leaving California for Britain? For that matter, what about showing Thatcher at Reagan’s funeral? These were major opportunities strangely lost in favor of a brief shot of the palace as Thatcher was becoming prime minister and of a brief dance with Reagan (which was strange in the montage). At the very least, the screenwriter missed a major opportunity to capitalize on the Queen’s jubilee and to irritate progressives by delving into two conservative political soulmates doing more than dancing across the screen.

Concerning Streep’s Thatcher, it is difficult to be critical. Besides the uncomfortable “leap” from the young Margaret Roberts to Thatcher as a new member of parliament, Streep herself may put too much stress on certain words in mimicking Thatcher’s sentences. The emphasis itself reminded me a bit of Streep’s Julia Child. To be sure, both characters are strong women, which undoubtedly drew Streep to the two roles. Whereas Streep probably found little not to like in Julia, the conservative politics of Thatcher must have been an obstacle. Yet even here, Streep’s maturity can be seen. “It was interesting to me to look at the human being behind the headlines,” Streep said, “to imagine what it’s like to live a life so huge, controversial, and groundbreaking in the winter of that life, and to have sort of a compassionate view for someone with whom I disagree.”[1] If only more prime ministers had that sort of compassion!

Ironically, at least as depicted in the film, Margaret Thatcher did not have much compassion even for her own partisans, though they were voicing compassion for the unemployed. As a viewer enthralled by Streep’s acting ability, I found it difficult to care about the protagonist, with the lack of drama exacerbating this problem. Perhaps Streep’s acting could be criticized in the end for not having sufficiently communicated her compassionate view of someone with whom she disagrees. In the ending scene itself, it is difficult to feel anything for the old woman wandering in her hallway, regardless of her past politics. The film’s true antagonist may be meaninglessness, or death itself, and I’m not sure the film survives it. Furthermore, I don’t think we find a protagonist doing more than flirt with the inevitable. How much drama can there be in facing certainty? To be sure, we all flirt with the fact that each of us will die: for all the significance each of us thinks is in our daily battles, we barely acknowledge that one day we ourselves won’t exist and that in a few generations (or centuries for some) we will be forgotten. This is part of the human condition that screenwriters attempt to capture. Even so, perhaps the dementia in The Iron Lady is more of a taste of reality than viewers would care to tolerate, least of all for entertainment!