Spoiler Alert: These essays are ideally to be read after viewing the respective films.

Monday, February 27, 2017

Virtual Reality: Not Coming to a Theatre Near You

Virtual reality may be coming your way, and when it hits, it could hit big—as if all at once. The explosion of computers and cell phones provides two precedents. “Technologists say virtual reality could be the next computing platform, revolutionizing the way we play games, work and even socialize.”[1] Anticipating virtual reality as the next computing platform does not do the technology justice. I submit that it could revolutionize “motion pictures.” Even though the impact on screenwriting and filmmaking would be significant, I have in mind here the experience of the viewer.

Whereas augmented reality puts “digital objects on images of the real world,” virtual reality “cuts out the real world entirely.”[2] As a medium for viewing “films”—film itself already being nearly antiquated by 2017—virtual reality could thus cut out everything but a film’s story-world. The suspension of disbelief could be strengthened accordingly. The resulting immersion could dwarf what is possible in a movie theatre. Already as applied to playing video games, “such full immersion can be so intense that users experience motion sickness or fear of falling.”[3] Imagine being virtually in a room in which a man is raping a woman, or in which a tiger is ready to pounce—or is eating its prey, which happens to be a human whom you have virtually watched grow up. The possible physiological impacts on a viewer immersed in stressful content would present producers with ethical questions concerning how far it is reasonable to go—with the matter of legal liability not far behind, or in front. Watching—or rather, participating in—a film such as Jurassic Park could risk a heart attack.

On the bright side, the craft of light and storytelling made virtual could enable amazing experiences that simply cannot be had without virtual reality being applied to film. To be immersed on Pandora in a nighttime scene of Avatar, for example, would relegate even the experience of 3-D in a theatre to second place. The mind would not need to block out everything in its field of view but the large rectangle at a distance in front. In short, the experience of watching a film would be transformed such that what we know as going to a movie would appear prehistoric—like travelling by horse to someone who drives a sports car.



1. Cat Zakrzewski, “Virtual Reality Comes With a Hitch: Real Reality,” The Wall Street Journal, February 24, 2017.
2. Ibid.
3. Ibid.

Monday, January 9, 2017

Passengers

Augustine wrote that Christians are ideally in the world but not of it. The fallen world is not the Christian’s true home. For the 5000 (plus crew) prospective colonists hibernating aboard a mammoth spaceship in the film, Passengers (2016), the planet Earth was presumably not their true home—or maybe that home was becoming climatically rather untenable and the 5000 were lucky souls heading for a new, unspoiled home. In any event, the film’s central paradigm can be characterized as “travel to” and “end-point.” That is to say, means and end characterize this picture at a basic level. The film is particularly interesting at this level in that so much value is found to reside in the means even as the end is still held out as being of great value.
For Aurora Lane, intentionally woken by Jim Preston with 89 more years to go on the trip, Earth had not been home in the sense that home is where love has been found. For her, home was mobile—moving through space at half light-speed—for she found love with Jim in spite of the fact that he had deprived her of living to see the end-point, the colony-planet. In refusing Jim’s new-found way of putting her back to sleep so she could wake again just four months before the end of the voyage, Aurora must have realized that she had found her home with Jim traveling through space. With plentiful food and drink, and no need even of money, Aurora and Jim faced a downside only in the possibility of encroaching loneliness. Headless waiters and a bottomless bartender—all robots—could not be said to give rise to any viable sense of community.

It is strange, therefore, that 89 years later, at the end of the voyage, the awakened crew and passengers do not encounter any offspring born of Jim and Aurora’s love. Having realized that they would not live to see the new world, would the couple not naturally have wanted children who would have a chance of seeing the prospective paradise? It seems to me that the screenwriter did not think through the consequences of the couple’s decision far enough in this respect. The awakened passengers and crew should have come upon both trees and grown children whose entire lives had been spent in space.

In spite of having only each other, perhaps Aurora and Jim relish the peace that can be so compromised in a community (imagine having an apartment complex all to yourself!) and the freedom from the insecurity of want—two assets that could only be found during the journey. The spectacular views of space also count for something (although it is difficult even to imagine a ship of such material that it could withstand such a close pass by a sun). Yet, even so, how difficult it is for us—the audience—to understand why Aurora and Jim could possibly come to prefer a life spent entirely en route, on transportation. We are so used to being goal-oriented, teleological beings that we miss the sheer possibility that the journey itself might constitute a full life worth living.

Abstractly stated, we are so used to relegating means to an end as long as the end is viable that we have great trouble enjoying the means apart from the end. As long as the end stands a chance of being realizable, we cannot ignore it and thus fully rest content along the way.

The ability to reason about means and ends is a virtue.[1] Interestingly, virtuous actions “may be pursued ‘instrumentally’ but must be done ‘for their own sake.’ . . . They must be ends in themselves. . . . Actions truly expressive of the virtues are actions in which the means are prized at least as much as the extrinsic ends to which they are directed. . . . The telos, the best life for human beings to live, is an inclusive end constituted in large part by virtuous activity.”[2] In other words, virtues are both means and ends. A person should value acting virtuously for itself, rather than merely as a means to an end. While not a virtue-ethics guy, Kant uses this characterization in the Groundwork of the Metaphysics of Morals to claim that human beings should be valued as ends in themselves, rather than merely as means to other ends (e.g., manipulated). Can a boss ever push his use of his subordinates for his own ends sufficiently out of his mind to value those people as ends in themselves—as having inherent value?

The space voyage in the film is shown at first as only a means to a distinctly different end, the colony. Yet by the story’s end, the spaceship comes to be an end in itself too. Due to the length of the trip and the appreciably shorter human lifespan, Jim and Aurora find value in the voyage not as a means, but only as an end in itself. Yet as human beings, could they ever come to disconnect the spaceship from awareness of its end? Could Jim and Aurora ever feel a sense of ease on board without the sense that they have lost or given up the spaceship as a means? For the remainder of their lives, the colony lies ahead of them. Is it even possible that two human beings could become oblivious to this fact?

Here on Earth, the Christmas season is so oriented to Christmas Eve and Christmas Day that it is scarcely imaginable that the festive atmosphere during the first three weeks of December could be chosen over Christmas itself. I suspect that more adults like Aurora and Jim, being without family, would prefer the season over the holiday itself—even opting out of the latter. Yet can a person come to enjoy a Christmas show or attend a Christmas party without having in mind the “not yetness” and the “betterness” of Christmas itself? What if the experience with friends at a Christmas party two weeks before the actual holiday is better than the saccharine day itself? Can the experience ever hope to get its due regard and esteem for its own sake even as it is regarded as a means?




1. Joseph J. Kotva, The Christian Case for Virtue Ethics (Washington, D.C.: Georgetown University Press, 1996), p. 24.
2. Ibid., p. 25.



Monday, April 7, 2014

So Ends an Era: Classic Hollywood Cinema (1930-1950)

With the deaths of Shirley Temple on February 10, 2014, and of Mickey Rooney (Joe Yule) two months later, the world lost its last two major (on-screen) living connections to the classic Hollywood cinema of the 1930s and 1940s. The similarly clustered deaths of Ed McMahon, Farrah Fawcett, and Michael Jackson during the summer of 2009 may have given people the impression that the celebrity world of the 1970s had become history in the new century.

"Mickey Rooney" as "Andrew Hardy," flanked by his on-screen parents. Together, these three characters give us a glimpse of family life in a bygone era. Even in the 1940s, Andy Hardy's father may have been viewed as representing still another era, further back and on its way out. (Image Source: Wikipedia)


Lest we lament too much the loss of these worlds, as per the dictum of historians that history is a world lost to us, we can find solace in the actors’ immortality (and perhaps immorality) on screen. However, in the fullness of time, by which is not meant eternity (i.e., the absence of time as a factor or element), even films as illustrious or cinematically significant as Citizen Kane, Gone with the Wind, His Girl Friday, The Wizard of Oz, The Philadelphia Story, Dracula, and even Mickey Rooney’s Andy Hardy series will find themselves representing a decreasing percentage of films of note—assuming cinema, or some continued evolution thereof, goes on. As great as some ancient plays like Antigone are, the vast majority of Westerners today have never heard of the work (not to mention having seen it). Even more recent plays, such as Shakespeare’s, are not exactly blockbusters at movie theatres.

To be sure, cinema (and “the lower house,” television) has eclipsed plays as a form of story-telling. However, another technological innovation may displace our privileged mode sometime in the future. Virtual reality, for example, may completely transform not only how we watch movies, but also film-making itself (e.g., reversing the tendency toward shorter shots and scenes so as not to disorient the immersed viewer). Although the old “black and whites” can be colored and even restored, adapting them so the viewer is in a scene would hardly be possible without materially altering the original.

Aside from the decreasing-proportion phenomenon relegating classic Hollywood gems, who’s to say how much play they will get even two hundred years from 2014, not to mention in 2,500 years. Even the artifacts that we reckon will “live on” forever (even if global warming has rid the planet of any humans to view the classics) will very likely come to their own “clustered deaths.” We humans have much difficulty coming to terms with the finiteness of our own world and ourselves within a mere slice of history. As Mickey Rooney (Joe Yule) remarked in 2001, “Mickey Rooney is not great. Mickey Rooney was fortunate to have been an infinitesimal part of motion pictures and show business.”[1] Indeed, motion pictures can be viewed as an infinitesimal phenomenon from the standpoint of the totality of history.




[1] Donna Freydkin, “Mickey Rooney Dead at 93,” USA Today, April 7, 2014.

Monday, February 10, 2014

Narrative Catching Up to Technological Eye-Candy: The Return of Substance

Even after the century known for its astonishing technological advances, the human inclination to revert to a childlike state—innocently going overboard with the new toys proffered by still more advances—persisted as the twenty-first century gained its own footing. With regard to film, revolutionary special effects based on computer technology far outstripped any directorial investment in depth of story, including the characters. Even before the advent of computerized special effects, way back in the 1970s, Charlton Heston starred in Earthquake, a film worthy of note only for the creation of an “earthquake-like” experience for viewers thanks to surround sound with a lot of bass. The narrative was bland and the characters were mere cut-outs.

Years later, as part of a course at a local public-access cable studio, I concocted a music video out of footage the instructor and I had shot of a salsa band playing in-studio. After too many hours with the computerized editing machine, I proudly emerged with my new Christmas tree, only for the instructor, Carlos, to hand the tapes back to me. “Now make one without going overboard on all the bells and whistles,” he wisely directed. I had indeed put in just about everything I could find. Back in the small editing room, I used the fun fades sparingly, as good writers use adjectives.

For years after that course and some experience shooting and directing public-access programming, I would recall the lesson each time I saw yet another film sporting the newest in film-making technology yet otherwise empty of substance. James Cameron was a notable exception, centering Titanic (1997) not just on the obvious—the sinking (by means of a real ship in-studio)—but also on a romance undergirded by substantial character development. The next film to successfully do justice to both technological development and depth of characterization along with a darn good story was Cameron’s own Avatar (2009). That Cameron accomplished such a technological leap in film-making without sacrificing characterization and narrative says something rather unflattering about all the technological eye-candy that has brought with it huge cavities in narrative and characters.

Although the release date of Avatar 2 is not until 2016, James Cameron has put out a preliminary trailer.

It was not until I saw Gravity (2013) that I discovered a litmus test for determining whether a film-making advance has come at the expense of narrative substance. Sandra Bullock gave such an authentically emotional performance that at one point I found myself oblivious to the stunning visuals of Earth from orbit. In watching Avatar, I had become so taken with Neytiri’s eye-expressions that the technologically achieved visuals of Pandora receded into the background. As a criterion, whether dazzling, technologically derived visuals recede into the background as emotional investment in a character re-assumes its central place in the foreground of the suspension of disbelief can separate the “men from the boys” in terms of film-making.

Tuesday, February 4, 2014

Going to the Max: The IMAX Experience

Despite being more expensive, seeing a movie on an IMAX screen has been leading the industry’s rebound in revenue and attendance. Through the first six weeks of 2012, IMAX ticket sales were $55 million in the U.S., a 45% increase over the same period in 2011. According to USA Today, the “surge outpaces the industry’s overall rebound of about 20 percent.” The key is that the IMAX experience, which is predicated on screens up to 60 feet tall, cannot be reproduced on iPods, laptops, television screens, or even home “theaters.”
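
As a rough check on these figures, the implied 2011 base and the gap between IMAX’s growth and the industry’s can be worked out directly. The short sketch below does only that arithmetic; the 45% and 20% figures come from the article, and nothing else is assumed.

```python
# Back-of-the-envelope check on the reported IMAX figures (illustrative only).
imax_2012 = 55_000_000      # IMAX U.S. ticket sales, first six weeks of 2012 (from the article)
imax_growth = 0.45          # 45% increase over the same period in 2011
industry_growth = 0.20      # industry-wide rebound of about 20%

imax_2011 = imax_2012 / (1 + imax_growth)
print(f"Implied IMAX sales, first six weeks of 2011: ${imax_2011 / 1e6:.1f} million")
print(f"IMAX grew {imax_growth:.0%}, versus roughly {industry_growth:.0%} for the industry overall.")
```

The implied base of roughly $38 million makes the gap concrete: IMAX added about $17 million year over year while the industry as a whole rebounded at less than half that rate.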

The IMAX screen. Anyone considering getting one installed at home might want to consider adding a few more floors and a cathedral ceiling first. (Wikimedia Commons)

IMAX is “fulfilling the promise that 3-D didn’t keep, that it would be unlike anything you’ve seen,” says Jeff Bock of Exhibitor Relations, “(a)nd unlike great sound or 3-D glasses, you can’t replicate IMAX at home unless you have a six-story screen.” It is principally because IMAX is only available in theatres (and museums) that the number of IMAX screens tripled from 2008 to 2012. This has major implications for the sort of films that theatres will want to show, as not all films are equal with respect to the advantages of the large screen.

Films such as Titanic are tailor-made for a huge screen. In general, action films, such as John Carter, and animated films, such as The Lorax, take the most advantage of the big screen. Dramas, on the other hand, are less well suited, with the caveat that cinematography that includes sweeping landscapes can come off as vistas when shown on IMAX. Even so, it might be that as the technological means to watch films proliferate, only screenplays particularly suited to IMAX will be oriented to theatres. Obviously, this is a generality; it would likely still be in the theatre owners’ interest to retain the traditional screens, and thus continue to show dramas. Even so, the demand for them—unlike for the films that play well on IMAX—is likely to be stagnant or decline.

The implication for screenwriters is that the type of venue should be more salient in the writing. If a story is particularly well suited to IMAX screens, this could be reflected in how the characters look as well as what they do. It might be that the elements that play so well on IMAX are such that the narrative itself is diminished. If so, additional attention to the story elements may be advisable. In John Carter, for example, viewers may be so captivated by the large characters that the plot could fall by the wayside. Making the major story elements (e.g., the critical event) more salient could counter this effect of the big screen. Regarding stories not so inclined to IMAX, the screenwriter might want to consider how the writing could take into account the small-screen (e.g., iPod or laptop) format. It might be more difficult, for example, to follow a lot of action on a small display. By implication, directors should also consider the impact of the format (e.g., filming action at a distance in a drama to be viewed mostly on iPods and laptops). In fact, the editing process could even take into account the viewing format by putting out two different versions (sort of like the theatrical and director’s cuts now).

In short, if cinemas are to survive, it could be because they can offer something that no other venue can. IMAX is a case in point. This does not mean that all films are equally well suited to the format. Even for the film genres that take particular advantage of being shown on a large screen, theatre owners should encourage screenwriters, directors, and editors to take the format into account. It could even be that some types of story (and even some elements, such as the climax) are particularly well suited to being shown on IMAX (as well as in 3-D). Moreover, the relationship between technology and narrative warrants more attention. Indeed, the twenty-first century may be known to future historians for how technology told stories.

Source:

Scott Bowles, “IMAX Is Delivering What 3-D Couldn’t,” USA Today, March 22, 2012. 

Titanic: Film Chasing History

James Cameron’s Titanic was released in 1997—twelve years after the wreck had been found in the icy North Atlantic. By 2013, there had been enough empirical study of how the ship actually broke apart and went down that we could look back at the depiction in Cameron’s film as at least in part erroneous. Interestingly, Cameron himself sponsored and was actively involved with the studies that would effectively “semi-fictionalize” his own depiction. Rather than trying to protect his depiction by getting the studies to confirm what his best guess had been at the time of filming, Cameron engaged in a determined effort to get to a definitive answer as to what really happened. This in turn led to some interesting questions.
According to Cameron's documentary in 2012, the back end of Titanic only reached 23 degrees, far less than depicted in this picture (and in the 1997 movie). Source: fxguide.

In any historical piece, the “film world” is not the same as what really happened. The sad truth is that the world of the past is forever lost once it is past. Seeing Daniel Day-Lewis in Lincoln (2012), I could not but think that the former president must really have been as depicted. However, much of my image of what Lincoln must have been like has come from a myriad of stories. As a child, I had seen his log cabin at New Salem, Illinois, and his law office and house in Springfield, and I had watched other portrayals of the man on television. Even as I marveled at Day-Lewis’s depiction of the man, I found the screenplay itself too idealistic. For instance, Lincoln represented large railroads as a lawyer in Illinois, and he overruled his own Secretary of the Interior in agreeing to pay the transcontinental railroads mountain rates for building track on flat land in the West. It is odd, therefore, that in trying to get votes on the anti-slavery amendment, he is depicted in the film as being so concerned that no bribes be paid. In short, Hollywood seems hard-pressed to completely expunge the accumulated mythos even when trying for historical realism.
To take advantage of having access to Titanic’s wreckage before it is completely eaten by bacteria, Cameron did not rest with what had been theorized at the time of his film shoots. He sponsored additional studies, bringing the experts together and turning that meeting into a documentary in 2012. In doing so, he knowingly risked making his own depiction obsolete, or at least partially flawed. As shown in his documentary, he was more interested in getting as close as possible to what really happened than in protecting his depiction. The issue for him was whether to reshoot the ending. Seeing a potential series of such changes as more is learned or theories change, he decided to keep his original ending.
His decision is in line with highlighting the dramatic, even if at the expense of new knowledge. I have in mind the scene in which the back of the ship is standing vertical in the air. The two protagonists “ride” the ship down until it submerges. As of Cameron’s documentary, the studies postulated that the steepest angle was 23 degrees, with the ship splitting in half at that point. The back end sank into the water, turning over as it did so. Of course, this too must be taken as conjecture. There was no video taken at the actual scene, and the eye-witness accounts differ. The frustrating truth is that we will never have a flawless picture of what really happened. This is not to say that progress cannot be made, and Cameron should be applauded for being so determined on this task even though his depiction in the film stood to lose ground. Indeed, after watching the video depiction that Cameron made in his documentary, I view his film differently—at least the now-rather-extreme vertical position of the back end of the ship.
In his documentary, Cameron points to the hubris that went into the ship “that could not sink.” The preoccupation with size got ahead of itself. Put another way, systemic risk was ignored. Similarly, he points out, we did not see the iceberg coming in 2008 as the economy hit an unknown quantity of mortgage-backed derivatives and insurance swaps. Even after that, he goes on, we were headed right for a global-warming “iceberg” with a “rudder” too small to avoid the obstacle in time. Just today, the Huffington Post ran a headline concerning climate change: “It’s Happening.” At the end of his documentary, Cameron suggests that too many people holding power are making money in the status quo for a sufficient amount of change (i.e., rudder) to occur before “it hits.” It is as though Titanic’s captain could see the iceberg far out in front yet was too invested in the ship’s course to deviate.
Even in assuming back in 1997 that Cameron captured on film what really happened (I made that assumption), there is hubris. Even the updated graphic in Cameron’s documentary in 2012 should be taken as provisional. As stated above, we will never be able to know what really happened. Whether in what we think we know about a bygone world, building a ship that cannot sink, leaving new financial instruments unregulated, or putting off legislation that would counter global warming, we as a species presume we know more than we do. We naturally get ahead of ourselves, and thus ironically risk our own progress and indeed even our very future as a species.

Monday, January 13, 2014

The Television and Cinema Revolution: Virtual Reality and Holograms to Sideline UHD Curves

Ultra-high-definition, or UHD, which refers to television screens sporting at least four times the number of pixels of “mere” high-definition, goes beyond the capacity of the human eye. Hence, the potential problem in going beyond ultra is moot; we could not visually discern any difference. Even so, I suspect this inconvenient detail would not stop people in marketing from primping the product as “beyond ultra.” One unfortunate byproduct of such inventive marketing may ironically not be visible at all. Specifically, the illusionary distractions, or marketing mirages, come with an opportunity cost; that is, being suckered into seemingly exciting innovations can distract us from noticing other technological advances whose applications could alter entire industries and transform our daily lives.
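
For concreteness, the “at least four times” claim is easy to verify from the standard resolutions: an HD panel is 1920 by 1080 pixels, while a 4K UHD panel is 3840 by 2160. A minimal sketch of the arithmetic:

```python
# Pixel-count comparison behind the "at least four times" claim.
hd_pixels = 1920 * 1080    # standard 1080p high-definition
uhd_pixels = 3840 * 2160   # 4K ultra-high-definition

print(f"HD:  {hd_pixels:,} pixels")              # 2,073,600
print(f"UHD: {uhd_pixels:,} pixels")             # 8,294,400
print(f"Ratio: {uhd_pixels / hd_pixels:.0f}x")   # exactly 4x
```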

At the International Consumer Electronics Show in January 2014, Sony and Samsung showcased UHD televisions with curved screens. The “more immersive viewing” is possible only on larger screens; John Sciacca, an audio/video installer, reports that at the small-screen level, the curved shape “has no advantage.”[1] Accordingly, the televisions with the option on display at the electronics show in Las Vegas, Nevada, ranged from 55- to 105-inch screens.[2] The verdict there was mixed, the usual suspects playing their expected roles.

The 105-inch UHD television. Paragon of the 21st century?  (Image Source: Forbes)

Kaz Hirai, the CEO of Sony, insisted in an interview that “some people actually like [the curved screen] very much.”[3] He went on to add, “I personally think it’s a great experience because you do have that feeling of being a little bit more surrounded. . . . After you watch curved TV for a while and then you just watch a flat panel, it looks like, ‘What’s wrong with this panel?’”[4] In the midst of this vintage self-promotion, he did admit, as if as an afterthought, that the curved feature is “not for everyone.”[5]

John Sciacca would doubtless call the matter of preference an understatement. “For displays, at least, curved is a ‘marketing gimmick.’ I know that research has gone into finding the ideal curve for the ideal seating distance, but I think that it is still limiting its best results for a much narrower viewing position.”[6] That is, the curved shape can be counterproductive in settings where more than two or three people are sitting close together straining to capture the sweet spot of ideal viewing. To be sure, at that “dot” on a room diagram, Sciacca admits that the curved shape on big screens (i.e., 55-inch and up) “has real benefits for front projection as it has to do with how the light hits the screen at different points, and a curve helps with brightness uniformity and geometry.”[7] Granted, but who wants to do geometry in order to figure out where to sit?

Accordingly, the curved screen “seems like a marketing thing,” rather than anything “substantive,” says Kurt Kennobie, the owner of an audio-video store in Phoenix.[8] Kaz Hirai would doubtless disagree, at least publicly. The usual trajectory of public attention would simply follow this debate back and forth, treating it as though it were a riveting tennis match. The fans getting their adrenaline fix would hardly notice the progress being made in a potentially transformative, rather than incremental, technology applicable to television and cinema. This invisible cost in chasing after minor points lies, in other words, in missing the big picture, which in this case involves a developing technology that leaps over the “curved” controversy entirely.

What exactly is this so-called revolutionary technology? It already exists in gaming—which incidentally already merges movie and video-game features—and, here’s the key, it is applied as “virtual reality.” Lest the philosophers among us catch a whiff of metaphysics, the “reality” to which I refer is simply a novel way of viewing visuals such as video games. The viewing is provided by a headset.

For example, Avegant’s head-mounted Glyph sports a “virtual retina display.” The device “makes use of a couple million tiny mirrors—similar to the technology found in DLP projectors—to project a 720p [pixel] image straight onto your retina. The result is a stereoscopic 3D display that has almost zero crosstalk and no noticeable pixelation.”[9] In other words, suddenly the “curved vs. flat” debate is obsolete, or trumped, as though by an interloper.
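
A quick back-of-the-envelope check, assuming one micromirror per pixel per eye (an assumption on my part, not a spec from the article), shows why “a couple million tiny mirrors” is plausible for a stereoscopic 720p display:

```python
# Rough check: micromirrors needed for a stereoscopic 720p image,
# assuming one mirror per pixel per eye (illustrative assumption only).
width, height = 1280, 720        # 720p resolution
eyes = 2                         # stereoscopic display

pixels_per_eye = width * height  # 921,600
total_mirrors = pixels_per_eye * eyes
print(f"Pixels per eye: {pixels_per_eye:,}")
print(f"Total (both eyes): {total_mirrors:,}")  # ~1.8 million, i.e., "a couple million"
```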

Moreover, the application of Glyph to watching television and movies could marginalize (though I suspect not eliminate) screens, whether in homes or movie theaters. That’s right, the cinemas would face a stiff headwind, and therefore likely be forced to concentrate on their comparative advantage—the “ultra” big screen. I suspect that experience would survive, for people would probably not want to confine all viewing to one mode. Indeed, even the “virtual reality” means of great immersion might have to contend with an even newer mode of viewing—that of the hologram. 


Needless to say, both technologies would mean changes to how films are written, shot, and edited. Perhaps a given film would have screen, virtual-reality, and holographic versions, which at least in the case of virtual reality might involve alternative storylines allowing for viewer choice, as sketched below. A person could ask, for example, “How would the narrative play out were this or that character to die rather than live?”
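
As a purely illustrative aside, viewer-driven branching of this kind is straightforward to represent as a data structure. The sketch below is hypothetical and not tied to any actual production pipeline; the scene names and choices are invented.

```python
# Hypothetical sketch of a branching storyline as a small graph of scenes.
# Scene names and choices are invented for illustration only.
story = {
    "opening":  {"text": "The character faces a mortal threat.",
                 "choices": {"survives": "reunion", "dies": "memorial"}},
    "reunion":  {"text": "The narrative continues with the character alive.", "choices": {}},
    "memorial": {"text": "The narrative plays out without the character.", "choices": {}},
}

def play(scene_key: str, decisions: list[str]) -> None:
    """Walk the story graph, applying the viewer's decisions in order."""
    scene = story[scene_key]
    print(scene["text"])
    for decision in decisions:
        next_key = scene["choices"].get(decision)
        if next_key is None:
            break
        scene = story[next_key]
        print(scene["text"])

play("opening", ["dies"])  # one viewer's cut: the character dies rather than lives
```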

With the increasing modes of viewing, the world as we know it in the 2010s could change in a marked way. In fact, people living in 2100 might look back on 2014 akin to how people in 2000 looked back on the automobile, telephone, and electric light dramatically changing daily life in the 1910s. The new viewing applications, along with the development of the internet and mobile devices, could distinguish the 2010s in retrospect, and, moreover, the twenty-first century from the previous one. Known in its day as the century of technological change, the twentieth century could yet lose its mantle to a century in which mankind is cured of death, barring an accident or natural disaster. Change is relative, which is precisely my point concerning virtual reality and curved screens. Perhaps human perspective is itself curved in a way that only the most transient and titillating image lies in the “sweet spot.”



[1] Mike Snider, “Finding the Sweet Spot of Curved Displays,” USA Today, January 10, 2014.
[2] Ibid.
[3] Ibid.
[4] Ibid.
[5] Ibid.
[6] Ibid.
[7] Ibid.
[8] Ibid.
[9] USA Today, “Editors’ Choice Awards,” January 10, 2014. I should note that I have no financial interest in Avegant or the Glyph. I am merely using this product to make a broader point.

Wednesday, January 18, 2012

“The Great Gatsby” in 3D

It is difficult for us mere mortals to take a step back and view the wider trajectory that we are on. It is much easier to relate today’s innovation back to the status quo and pat ourselves on the back amid all the excitement over the new toy. I contend that this is the case in cinema.

I was enthralled in viewing Avatar, the film in which James Cameron pushed the envelope on 3D technology on Pandora even as he added the rather down-to-earth element of a biologist who smokes cigarettes. Three years later, his other epic film, Titanic, would be re-released in 3D a century to the month after the actual sinking. As if a publicity stunt choreographed by Cameron himself, the Costa Concordia had conveniently hit a reef about twenty feet from an island off the coast of Tuscany three months before the re-release. “It was like a scene out of Titanic,” one passenger said once on dry land—perhaps a stone’s throw from the boat.

The question of whether a serious drama without a fictional planet or a huge accident can support an audience’s tolerance for 3D glasses was very much on the mind of Baz Luhrmann as he was filming his 3D rendition of F. Scott Fitzgerald’s “The Great Gatsby” in 2011. According to Michael Cieply, Luhrmann’s film “will tell whether 3-D can actually serve actors as they struggle through a complex story set squarely inside the natural world.”[1] Cieply adds that the director spoke to him of using 3D to find a new intimacy in film. “How do you make it feel like you’re inside the room?” Luhrmann asked.[2] This is indeed 3D coming into a state of maturity, past the rush of thrilling vistas and coming-at-you threats. Moreover, for the viewer to feel more like he or she is “inside the room” places the technology on a longer trajectory.

“The Great Gatsby,” for instance, was first on the screen as “Gatsby,” a silent film in 1926—just a year after the novel had been published. Being in black and white, and without even spoken dialogue, the film could hardly give viewers the sense of being “inside the room.” Then came the 1949 version directed by Elliott Nugent. A review in the New York Times referred to Alan Ladd’s reversion to “that stock character he usually plays” and to the “completely artificial and stiff” direction. So much for being “inside the room.” Even the 1974 version starring Robert Redford left Luhrmann wondering just who the Gatsby character is. More than 3D would presumably be needed for viewers to feel like they are “inside the room.” Even so, 3D could help as long as the other factors, such as good screenwriting, acting, and directing, are in line.

So Luhrmann and his troupe viewed Hitchcock’s 3D version of “Dial M for Murder” (1954)—this date itself hinting that 3D is not as novel as viewers of “Avatar” might have thought. Watching “Dial M” was, according to Luhrmann, “like theater”—that is, like really being there. Ironically, 3D may proffer “realism” most where films are staged like (i.e., could be) plays. Polanski’s “Carnage” is another case in point, being set almost entirely in an apartment and hallway. With such a set, a film could even be made to be viewed as virtual reality (i.e., by wearing those game headsets). In contrast, moving from an apartment living room one minute to the top of a skyscraper the next might be a bit awkward viewed in virtual reality. In that new medium, the viewer could establish his or her own perspective on the action and even select from alternative endings (assuming repeat viewings).

In short, 3D can be viewed as “one step closer” to being “inside the room.” As such, the technology can be viewed as a temporary stop in the larger trajectory that potentially includes virtual reality—really having the sense of being inside the room, but for direct involvement with the characters and being able to move things. Contrasting “Avatar” with “Gatsby” is mere child’s play compared to this. The most significant obstacle, which may be leapt over eventually as newer technology arrives, is perhaps the price-point for 3D. In my view, it is artificially high, and too uniform.

Luhrmann’s budget of $125 million before government rebates is hardly more than that of conventional releases. Even if theatres charge $3 more for 3D films because of the cheap glasses and special projectors, it might be in the distributors’ interest to see to it that such films wind up costing consumers the same as a conventional one shown at a theatre. As an aside, it is odd that films with vastly different budgets have the same ticket price (which suggests windfalls for some productions and belies claims of a competitive market). In other words, a film of $125 million distributed widely could be treated as a conventional film in terms of the final pricing, and it need not be assumed that theatres would be taking a hit. Adding more to already-high ticket prices is a model that does not bode well for 3D as a way-station on the road to virtual reality. Of course, technology could leap over 3D if greed artificially chokes off demand for 3D glasses. I for one am looking forward to virtual reality. Interestingly, the filmmakers shooting on the cheap with digital cameras and then distributing via the internet may tell us more about how films in virtual reality might be distributed and viewed than how 3D films are being distributed and priced. People have a way of voting with their wallets (and purses), and other candidates have a way of popping up unless kept out by a pushy oligarch. So perhaps it can be said that, assuming a competitive marketplace, 3D may become a viable way-station on our way to virtual reality on Pandora.
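
To make the pricing argument a bit more concrete, a small break-even sketch follows. Only the $125 million budget and the $3 surcharge come from the discussion above; the average ticket price and the share of box office flowing back to the distributor are purely illustrative assumptions, not industry figures.

```python
# Illustrative break-even arithmetic for a $125 million production.
# Only the budget and the $3 surcharge come from the text; the rest are assumptions.
budget = 125_000_000
base_ticket = 8.00       # assumed average conventional ticket price (hypothetical)
surcharge_3d = 3.00      # the 3D premium discussed above
distributor_share = 0.5  # assumed share of box office returned to the distributor (hypothetical)

def tickets_to_break_even(ticket_price: float) -> int:
    """Tickets needed for the distributor's share of box office to cover the budget."""
    return round(budget / (ticket_price * distributor_share))

print(f"At ${base_ticket:.2f} flat: {tickets_to_break_even(base_ticket):,} tickets")
print(f"At ${base_ticket + surcharge_3d:.2f} with the 3D surcharge: "
      f"{tickets_to_break_even(base_ticket + surcharge_3d):,} tickets")
```

Under these assumed numbers, the surcharge shaves roughly a quarter off the tickets needed to break even, which is why distributors like the premium and why, as argued above, it may nonetheless be short-sighted if it suppresses demand.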


1. Michael Cieply, “The Rich Are Different: They’re in 3-D,” The New York Times, January 17, 2012. 
2. Ibid.

Tuesday, January 17, 2012

Hollywood on Risk: Snubbing Lucas’s “Red Tails”

When George Lucas showed Red Tails to executives from all the Hollywood studios, every one of the execs said no. One studio’s executives did not even show up for the screening. “Isn’t this their job?” Lucas said, astonished. “Isn’t their job at least to see movies? It’s not like some Sundance kid coming in there and saying, ‘I’ve got this little movie — would you see it?’ If Steven (Spielberg) or I or Jim Cameron or Bob Zemeckis comes in there, and they say, ‘We don’t even want to bother to see it.’”[1] According to one newspaper, the snub implied that “Lucas’s pop-culture collateral — six ‘Star Wars’ movies, four ‘Indiana Jones’ movies, the effects shop Industrial Light and Magic and toy licenses that were selling (at least) four different light sabers . . .  — was basically worthless.”[2] As a result, Lucas paid for everything, including the prints, to enable the film’s opening. What can explain this bizarre snub?

Lucas was “battling former acolytes who [had] become his sworn enemies.”[3] These would be Star Wars fans, or “fanboys,” who have been upset because Lucas has made changes to the films in new editions. “‘On the Internet, all those same guys that are complaining I made a change are completely changing the movie,’ Lucas says, referring to fans who, like the dreaded studios, have done their own forcible re-edits.”[4] However, in being aimed at black teenagers, “Red Tails” may not be directed to “Star Wars” fans. The snub could simply reflect the way business is done in Hollywood—meaning its tendency to be conservative, or hesitant, toward new ideas.

Regardless of a director’s past filmography, if the film being proposed does not fit with the current tastes of the targeted market segment, there’s not going to be much studio interest. Lucas readily admits there’s not really much swearing in Red Tails. Nor is there a huge amount of blood in it; nobody’s head is going to get blown off. Rather, the stress is on patriotism, and this is supposed to work for black teenagers. The fact that Lucas made Star Wars and Indiana Jones does not mean that he is right on Red Tails. At the same time, it was not as if he were an unknown. Studio execs could have given the filmmaker’s past accomplishments some weight, if only as proffering seasoned judgment from experience.

Moreover, marketing technicians are not always right in anticipating how word might spread concerning a film that could change tastes. If confined to current tastes, filmmakers could never lead. Cuba Gooding Jr., one of the stars of Red Tails, points out that even a blockbuster can be unanticipated by the studios’ gatekeepers. “I like to say James Cameron made a movie just like this,” he said excitedly. “Instead of black people, there were blue people being held down by white people. It was called ‘Avatar!’ And the studios said the same thing to him: ‘We can’t do a movie with blue people!’”[5] Particularly where new technology and a different narrative are involved, the studios could be far too timid even for their own financial good. Lucas could have been reacting to this more than to childish fans.

“I’m retiring,” Lucas said. “I’m moving away from the business, from the company, from all this kind of stuff.”[6] Bryan Curtis, a reporter, concludes of Lucas’s decision, “He can hardly be blamed.” Rick McCallum, who had been producing Lucas’s films for more than 20 years, said, “Once this is finished, he’s done everything he’s ever wanted to do. He will have completed his task as a man and a filmmaker.” According to Curtis, “Lucas has decided to devote the rest of his life to what cineastes in the 1970s used to call personal films. They’ll be small in scope, esoteric in subject and screened mostly in art houses.” Besides understandably being tired of ahistoric, short-term, financially oriented studio executives and childish fans, Lucas had accomplished his task “as a man and a filmmaker.”[6] He could literally afford to spend the rest of his working life playing in pure creativity without regard to commercial roadblocks.

It will be others’ task to try to narrow the distance between that realm and that of the bottom-line-oriented studios. This is perhaps the challenge—the true bottom line: namely, how to tweak the studios’ business model so creativity has enough room to breathe. Part of the solution could involve the increasing ease of filmmaking on the cheap, enabled by technological advances in equipment such as digital cameras and in distribution (e.g., the internet rather than theatres), as well as by an over-supply of actors. Young people in particular have taken to watching movies on a laptop or iPad. Any resulting downward pressure on price could affect the costs of even the blockbusters, such that actors making $20 million or more per film could be a thing of the past. As of the end of the first decade of the twenty-first century, the cost structure in Hollywood had all the distortions of an oligopoly (even monopoly), with the result that movie ticket prices were too high for two hours of movie experience. Given the constriction that naturally comes with high prices, the industry itself could expand in terms of viewers and financially viable genres of film were the underlying cost structure deflated by competition from the low end.

In retiring to make films “on the fly,” Lucas was once again ahead of the curve in orienting himself to the more fluid, less risk-averse “art house” world of filmmaking. While traditional studios and theatres will not contort themselves to fit it, the industry itself should look more diverse in 2020—running from high-priced “Avatar”-like 3D IMAX “experiences” to more films at a lower price downloadable on an iPad. Looking even further out, I would not be surprised if “films” in virtual reality make traditional movie theatres obsolete. I would not expect the studio executives who were not even willing to hear Lucas out to be among the trailblazers. In an industry like cinema, good far-sighted vision should be, and ultimately is, rewarded even if today’s bottom line is in the driver’s seat.


1. Bryan Curtis, “George Lucas Is Ready to Roll the Credits,” The New York Times, January 17, 2012.
2. Ibid.
3. Ibid.
4. Ibid.
5. Ibid.
6. Ibid.

Wednesday, April 27, 2011

Computer Technology Revolutionizing Industries: Books and Films

Crude oil was first drilled in 1859 in northwestern Pennsylvania (not in the desert of the Middle East). It was not long before oil lamps became ubiquitous, lengthening the productive day for millions beyond daylight hours. Just fifty or sixty years later, as electricity was beginning to replace the lamps, Ford’s mass-produced automobile was taking off, providing an alternative use of crude oil. For those of us alive in the early decades of the twenty-first century, electric lighting indoors and cars on paved roads have been around as long as we can remember. As a result, we tend to assume that things will go on pretty much as they “always” have. Other than for computer technology, the end of the first decade of the 21st century looks nearly indistinguishable from the last thirty or forty years of the last century. As the second decade of the 21st century began, applications based on computer technology were reaching a critical mass in terms of triggering shifts in some industries that had seemingly “always” been there.  Books, music and movies were certainly among the fastest moving, perhaps like the dramatic change in lighting and cars beginning a century and a half before with the discovery of crude oil.


The full essay is at "Computer Technology Revolutionizing Industries."