Spoiler Alert: These essays are best read after viewing the respective films.

Monday, February 27, 2017

Virtual Reality: Not Coming to a Theatre Near You

Virtual reality may be coming your way, and when it hits, it could hit big, seemingly all at once. The explosions of the personal computer and the cell phone provide two precedents. “Technologists say virtual reality could be the next computing platform, revolutionizing the way we play games, work and even socialize.”[1] Yet anticipating virtual reality merely as the next computing platform does not do the technology justice. I submit that it could revolutionize “motion pictures.” Even though the impact on screenwriting and filmmaking would be significant, I have in mind here the experience of the viewer.

Whereas augmented reality puts “digital objects on images of the real world,” virtual reality “cuts out the real world entirely.”[2] As a medium for viewing “films” (film itself already being nearly antiquated by 2017), virtual reality could thus cut out everything but a film’s story-world. The suspension of disbelief could be strengthened accordingly. The resulting immersion could dwarf what is possible in a movie theatre. Already, as applied to video games, “such full immersion can be so intense that users experience motion sickness or fear of falling.”[3] Imagine being virtually in a room in which a man is raping a woman, or a tiger is ready to pounce, or is eating its prey, which happens to be a human whom you’ve virtually watched grow up. The possible physiological impacts on a viewer immersed in stressful content would present producers with ethical questions concerning how far it is reasonable to go, with the matter of legal liability not far behind, or in front. Watching, or rather participating in, a film such as Jurassic Park could risk a heart attack.

On the bright side, the craft of light and storytelling made virtual could enable amazing experiences that are simply not possible without virtual reality being applied to film. To be immersed on Pandora in a nighttime scene of Avatar, for example, would relegate even the experience of 3-D in a theatre to second place. The mind would no longer need to block out everything in the visual field except the large rectangle at a distance in front. In short, the experience of watching a film would be transformed such that what we know as going to a movie would appear prehistoric, like travelling by horse to someone who drives a sports car.



[1] Cat Zakrzewski, “Virtual Reality Comes With a Hitch: Real Reality,” The Wall Street Journal, February 24, 2017.
[2] Ibid.
[3] Ibid.

Monday, April 7, 2014

So Ends an Era: Classic Hollywood Cinema (1930-1950)

With the deaths of Shirley Temple on February 10, 2014, and of Mickey Rooney (Joe Yule) two months later, the world lost its last two major (on-screen) living connections to the classic Hollywood cinema of the 1930s and 1940s. The similarly clustered deaths of Ed McMahon, Farrah Fawcett, and Michael Jackson during the summer of 2009 may have given people the impression that the celebrity world of the 1970s had become history in the new century.

"Mickey Rooney" as "Andrew Hardy," flanked by his on-screen parents. Together, these three characters give us a glimpse of family life in a bygone era. Even in the 1940s, Andy Hardy's father may have been viewed as representing still another era, further back and on its way out. (Image Source: Wikipedia)


Lest we lament too much the loss of these worlds, as per the dictum of historians that history is a world lost to us, we can find solace in the actors’ immortality (and perhaps immorality) on screen. However, in the fullness of time, by which is not meant eternity (i.e., the absence of time as a factor or element), even films as illustrious or cinematically significant as Citizen Kane, Gone with the Wind, His Girl Friday, The Wizard of Oz, The Philadelphia Story, Dracula, and Mickey Rooney’s Andy Hardy series will find themselves representing a decreasing percentage of films of note, assuming cinema or some continued evolution thereof goes on. As great as ancient plays like Antigone are, the vast majority of Westerners today have never heard of the work (not to mention having seen it). Even more recent plays, such as Shakespeare’s, are not exactly blockbusters at movie theatres.

To be sure, cinema (and “the lower house,” television) has eclipsed plays as a form of storytelling. However, another technological innovation may displace our privileged mode sometime in the future. Virtual reality, for example, may completely transform not only how we watch movies, but also filmmaking itself (e.g., reversing the tendency toward shorter shots and scenes so as not to disorient the immersed viewer). Although the old “black and whites” can be colorized and even restored, adapting them so that the viewer is in a scene would hardly be possible without materially altering the original.

Aside from the decreasing-proportion phenomenon relegating classic Hollywood gems, who’s to say how much play they will get even two hundred years from 2014, not to mention in 2500 years? Even the artifacts that we reckon will “live on” forever (even if global warming has rid the planet of any humans to view the classics) will very likely come to their own “clustered deaths.” We humans have much difficulty coming to terms with the finiteness of our own world and of ourselves within a mere slice of history. As Joe Yule, known on screen as Mickey Rooney, remarked in 2001, “Mickey Rooney is not great. Mickey Rooney was fortunate to have been an infinitesimal part of motion pictures and show business.”[1] Indeed, motion pictures themselves can be viewed as an infinitesimal phenomenon from the standpoint of the totality of history.




[1] Donna Freydkin, “Mickey Rooney Dead at 93,” USA Today, April 7, 2014.

Monday, January 13, 2014

The Television and Cinema Revolution: Virtual Reality and Holograms to Sideline UHD Curves

Ultra-high-definition, or UHD, which refers to television screens sporting at least four times the number of pixels of “mere” high-definition, goes beyond what the human eye can resolve at typical viewing distances. Hence, the potential problem in going beyond “ultra” is moot; we could not visually discern any difference. Even so, I suspect this inconvenient detail would not stop people in marketing from primping the product as “beyond ultra.” One unfortunate byproduct of such inventive marketing may ironically not be visible at all. Specifically, the illusory distractions, or marketing mirages, come with an opportunity cost; that is, being suckered into seemingly exciting innovations can distract us from noticing other technological advances whose applications could alter entire industries and transform our daily lives.
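
As a quick check on that claim, here is a minimal sketch of the arithmetic, assuming the standard consumer resolutions of 1920 x 1080 for high definition and 3840 x 2160 for 4K UHD:

```python
# Pixel counts for full HD (1080p) versus 4K UHD -- standard consumer
# resolutions; "at least four times" follows directly from the arithmetic.
full_hd = 1920 * 1080    # 2,073,600 pixels
uhd_4k = 3840 * 2160     # 8,294,400 pixels

print(uhd_4k / full_hd)  # 4.0 -- exactly four times the pixels
```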

At the International Consumer Electronics Show in January 2014, Sony and Samsung showcased UHD televisions with curved screens. The “more immersive viewing” is possible only on larger screens; John Sciacca, an audio/video installer, reports that at the small-screen level the curved shape “has no advantage.”[1] Accordingly, the curved televisions on display at the electronics show in Las Vegas, Nevada ranged from 55- to 105-inch screens.[2] The verdict there was mixed, the usual suspects playing their expected roles.

The 105-inch UHD television. Paragon of the 21st century?  (Image Source: Forbes)

Kaz Hirai, the CEO of Sony, insisted in an interview that “some people actually like [the curved screen] very much.”[3] He went on to add, “I personally think it’s a great experience because you do have that feeling of being a little bit more surrounded. . . . After you watch curved TV for a while and then you just watch a flat panel, it looks like, ‘What’s wrong with this panel?’”[4] In the midst of this vintage self-promotion, he did admit, as if as an afterthought, that the curved feature is “not for everyone.”[5]

John Sciacca would doubtless call that concession an understatement. “For displays, at least, curved is a ‘marketing gimmick.’ I know that research has gone into finding the ideal curve for the ideal seating distance, but I think that it is still limiting its best results for a much narrower viewing position.”[6] That is, the curved shape can be counterproductive in settings where more than two or three people sit close together, straining to capture the sweet spot of ideal viewing. To be sure, at that “dot” on a room diagram, Sciacca admits that the curved shape on big screens (i.e., 55-inch and up) “has real benefits for front projection as it has to do with how the light hits the screen at different points, and a curve helps with brightness uniformity and geometry.”[7] Granted, but who wants to do geometry in order to figure out where to sit?
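
For the curious, here is a minimal sketch of that geometry; the screen width and seating distance below are my own hypothetical assumptions, not measurements from any showroom:

```python
import math

w = 2.4  # approximate width in meters of a very large screen (assumption)
d = 3.0  # viewing distance in meters for the centered viewer (assumption)

# Flat screen: the edges sit farther from the eye than the center does.
edge_flat = math.sqrt(d**2 + (w / 2) ** 2)
print(f"flat: center {d:.2f} m, edge {edge_flat:.2f} m")  # edge ~3.23 m

# Curved screen whose radius of curvature equals the viewing distance:
# every point on the screen is equidistant from that one centered viewer.
print(f"curved: center {d:.2f} m, edge {d:.2f} m")

# Shift half a meter off-center, though, and the equidistance is lost,
# hence the "much narrower viewing position" Sciacca describes.
```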

Accordingly, the curved screen “seems like a marketing thing,” rather than anything “substantive,” says Kurt Kennobie, the owner of an audio-video store in Phoenix.[8] Kaz Hirai would doubtless disagree, at least publicly. The usual trajectory of public attention would simply follow this debate back and forth, treating it as though it were a riveting tennis match. The fans getting their adrenaline fix would hardly notice the progress being made in a potentially transformative, rather than incremental, technology applicable to television and cinema. This invisible cost of chasing after minor points lies, in other words, in missing the big picture, which in this case involves a developing technology that leaps over the “curved” controversy entirely.

What exactly is this potentially revolutionary technology? It already exists in gaming, which incidentally already merges movie and video-game features, and, here is the key, it is applied there as “virtual reality.” Lest the philosophers among us catch a whiff of metaphysics, the “reality” to which I refer is simply a novel way of viewing visuals such as video games; the viewing is done through a headset.

For example, Avegant’s head-mounted Glyph sports a “virtual retina display.” The device “makes use of a couple million tiny mirrors—similar to the technology found in DLP projectors—to project a 720p [pixel] image straight onto your retina. The result is a stereoscopic 3D display that has almost zero crosstalk and no noticeable pixelation.”[9] In other words, suddenly the “curved vs. flat” debate is obsolete, or trumped, as though by an interloper.
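
As a back-of-the-envelope check on that description, assuming one micromirror per pixel per eye (my assumption; the article gives no such breakdown), the numbers do land in “a couple million” territory:

```python
# One DLP-style micromirror per pixel, per eye, at 720p (assumption).
pixels_per_eye = 1280 * 720     # 921,600 mirrors for one 720p image
both_eyes = 2 * pixels_per_eye  # 1,843,200 -- "a couple million tiny mirrors"
print(f"{both_eyes:,}")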

Moreover, the application of Glyph to watching television and movies could marginalize (though I suspect not eliminate) screens, whether in homes or movie theaters. That’s right, the cinemas would face a stiff headwind, and therefore likely be forced to concentrate on their comparative advantage—the “ultra” big screen. I suspect that experience would survive, for people would probably not want to confine all viewing to one mode. Indeed, even the “virtual reality” means of great immersion might have to contend with an even newer mode of viewing—that of the hologram. 


Needless to say, both media would mean changes to how films are written, shot, and edited. Perhaps a given film would have screen, virtual-reality, and holographic versions, which at least in the case of virtual reality might involve alternative storylines allowing for viewer choice. A person could ask, for example, “How would the narrative play out were this or that character to die rather than live?”
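
To make the idea of viewer choice concrete, here is a minimal sketch of how such branching storylines might be represented; the scenes, choices, and structure are purely hypothetical illustrations, not anyone’s actual format:

```python
from dataclasses import dataclass, field

@dataclass
class Scene:
    title: str
    # Viewer-facing choices, each mapped to the scene it branches to.
    choices: dict = field(default_factory=dict)

# "How would the narrative play out were this character to die rather than live?"
lives = Scene("The character survives; the familiar storyline continues.")
dies = Scene("The character dies; the remaining scenes are re-plotted.")
crisis = Scene("A main character faces mortal danger.",
               {"intervene": lives, "stand back": dies})

def play(start: Scene, picks: list) -> None:
    """Walk one viewer's chosen path through the branching story."""
    scene = start
    print(scene.title)
    for pick in picks:
        scene = scene.choices[pick]
        print(scene.title)

play(crisis, ["stand back"])  # one viewing; repeat with "intervene" for another
```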

With the increasing modes of viewing, the world as we know it in the 2010s could change markedly. In fact, people living in 2100 might look back on 2014 much as people in 2000 looked back on the automobile, telephone, and electric light dramatically changing daily life in the 1910s. The new viewing applications, along with the development of the internet and mobile devices, could distinguish the 2010s in retrospect, and, moreover, the twenty-first century from the previous one. Known in its day as the century of technological change, the twentieth century could yet lose its mantle to a century in which mankind is cured of death, barring accident or natural disaster. Change is relative, which is precisely my point concerning virtual reality and curved screens. Perhaps human perspective is itself curved in a way that only the most transient and titillating image lies in the “sweet spot.”



[1] Mike Snider, “Finding the Sweet Spot of Curved Displays,” USA Today, January 10, 2014.
[2] Ibid.
[3] Ibid.
[4] Ibid.
[5] Ibid.
[6] Ibid.
[7] Ibid.
[8] Ibid.
[9] USA Today, “Editors’ Choice Awards,” January 10, 2014. I should note that I have no financial interest in Avegant or the Glyph. I am merely using this product to make a broader point.

Wednesday, January 18, 2012

“The Great Gatsby” in 3D

It is difficult for us mere mortals to take a step back and view the wider trajectory that we are on. It is much easier to relate today’s innovation back to the status quo and pat ourselves on the back amid all the excitement over the new toy. I contend that this is the case in cinema.

I was enthralled watching Avatar, the film in which James Cameron pushed the envelope on 3D technology on Pandora even as he added the rather down-to-earth element of a biologist who smokes cigarettes. Three years later, his other epic film, Titanic, would be re-released in 3D a century to the month after the actual sinking. As if in a publicity stunt choreographed by Cameron himself, the Costa Concordia had conveniently hit a reef about twenty feet from an island off the coast of Tuscany three months before the re-release. “It was like a scene out of Titanic,” one passenger said once on dry land, perhaps a stone’s throw from the ship.

The question of whether a serious drama, without a fictional planet or a huge accident, can sustain an audience’s tolerance for 3D glasses was very much on the mind of Baz Luhrmann as he was filming his 3D rendition of F. Scott Fitzgerald’s “The Great Gatsby” in 2011. According to Michael Cieply, Luhrmann’s film “will tell whether 3-D can actually serve actors as they struggle through a complex story set squarely inside the natural world.”[1] Cieply reports that the director spoke of using 3D to find a new intimacy in film. “How do you make it feel like you’re inside the room?” Luhrmann asked.[2] This is indeed 3D coming into a state of maturity, past the rush of thrilling vistas and coming-at-you threats. Indeed, for the viewer to feel more like he or she is “inside the room” places the technology on a longer trajectory.

“The Great Gatsby,” for instance, first reached the screen as a silent film in 1926, just a year after the novel had been published. Being in black and white, and silent at that, the film could hardly give viewers the sense of being “inside the room.” Then came the 1949 version directed by Elliott Nugent. A review in the New York Times referred to Alan Ladd’s reversion to “that stock character he usually plays” and to the “completely artificial and stiff” direction. So much for being “inside the room.” Even the 1974 version starring Robert Redford left Luhrmann wondering just who the Gatsby character is. More than 3D would presumably be needed for viewers to feel that they are “inside the room.” Even so, 3D could help, as long as the other factors, such as good screenwriting, acting, and directing, are in line.

So Luhrmann and his troupe viewed Hitchcock’s 3D version of “Dial M for Murder” (1954), the date itself hinting that 3D is not as novel as viewers of “Avatar” might have thought. Watching “Dial M” was, according to Luhrmann, “like theater,” that is, like really being there. Ironically, 3D may proffer “realism” most where films are staged like plays (i.e., could be plays). Polanski’s “Carnage” is another case in point, being set almost entirely in an apartment and hallway. With such a set, a film could even be made to be viewed in virtual reality (i.e., by wearing those gaming headsets). In contrast, moving from an apartment living room one minute to the top of a skyscraper the next might be a bit disorienting viewed in virtual reality. In that new medium, the viewer could establish his or her own perspective on the action and even select from alternative endings (assuming repeat viewings).

In short, 3D can be viewed as “one step closer” to being “inside the room.” As such, the technology can be viewed as a temporary stop on the larger trajectory that potentially includes virtual reality: really having the sense of being inside the room, short of direct involvement with the characters and the ability to move things. Contrasting “Avatar” with “Gatsby” is mere child’s play compared to this. The most significant obstacle, which may be leapt over eventually as newer technology arrives, is perhaps the price-point for 3D. In my view, it is artificially high, and too uniform.

Luhrmann’s budget of $125 million before government rebates is hardly more than that of conventional releases. Even if theatres charge $3 more for 3D films because of the cheap glasses and special projectors, it might be in the distributors’ interest to see to it that such films wind up costing consumers the same as a conventional one shown at a theatre. As an aside, it is odd that films with vastly different budgets have the same ticket price; this suggests windfalls for some productions, which belie claims of a competitive market. In other words, a film of $125 million distributed widely could be treated as a conventional film in terms of the final pricing, and it need not be assumed that theatres would be taking a hit.

Adding more to already-high ticket prices is a model that does not bode well for 3D as a way-station on the road to virtual reality. Of course, technology could leap over 3D if greed artificially chokes off demand for 3D glasses. I for one am looking forward to virtual reality. Interestingly, the filmmakers shooting on the cheap with digital cameras and then distributing via the internet may tell us more about how films in virtual reality might be distributed and viewed than how 3D films are being distributed and priced. People have a way of voting with their wallets (and purses), and other candidates have a way of popping up unless kept out by a pushy oligarch. So perhaps it can be said that, assuming a competitive marketplace, 3D may become a viable way-station on our way to virtual reality on Pandora.
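
To put rough numbers on the pricing argument, here is a minimal sketch; every figure below is a hypothetical assumption of mine, not a number drawn from the industry or from this essay’s sources:

```python
# All figures are invented for illustration only.
base_ticket = 10.00     # conventional 2D ticket price (assumption)
surcharge_3d = 3.00     # typical 3D add-on (assumption)
distributor_cut = 0.55  # distributor's share of the gross (assumption)

# Model A: pass the surcharge on to the consumer.
consumer_pays_a = base_ticket + surcharge_3d                       # 13.00

# Model B: the distributor absorbs the surcharge, so the 3D film
# "winds up costing consumers the same as a conventional one."
consumer_pays_b = base_ticket                                      # 10.00
distributor_nets_b = base_ticket * distributor_cut - surcharge_3d  # 2.50

print(consumer_pays_a, consumer_pays_b, round(distributor_nets_b, 2))
```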


[1] Michael Cieply, “The Rich Are Different: They’re in 3-D,” The New York Times, January 17, 2012.
[2] Ibid.

Wednesday, April 27, 2011

Computer Technology Revolutionizing Industries: Books and Films

Crude oil was first drilled in 1859 in northwestern Pennsylvania (not in the desert of the Middle East). It was not long before oil lamps became ubiquitous, lengthening the productive day for millions beyond daylight hours. Just fifty or sixty years later, as electricity was beginning to replace the lamps, Ford’s mass-produced automobile was taking off, providing an alternative use for crude oil. For those of us alive in the early decades of the twenty-first century, electric lighting indoors and cars on paved roads have been around as long as we can remember. As a result, we tend to assume that things will go on pretty much as they “always” have. Computer technology aside, the end of the first decade of the 21st century looks nearly indistinguishable from the last thirty or forty years of the previous century. As the second decade of the 21st century began, applications based on computer technology were reaching a critical mass in terms of triggering shifts in some industries that had seemingly “always” been there. Books, music, and movies were certainly among the fastest moving, perhaps like the dramatic change in lighting and cars beginning a century and a half before with the discovery of crude oil.


The full essay is at "Computer Technology Revolutionizing Industries."