Even after the century known for its astonishing
technological advances, the human inclination to revert to a childlike state,
innocently going overboard with the new toys proffered by still more advances,
persisted as the twenty-first century gained its own footing. With regard to film,
revolutionary special effects based on computer technology far outstripped any
directorial investment in depth of story, including the characters. Even before
the advent of computer special effects way back in the 1970s, Charlton Heston
starred in Earthquake, a film worthy of note only for the creation of
an “earthquake-like” experience for viewers thanks to surround sound with a lot
of bass. The narrative was bland and the characters were mere cut-outs.
Years later, as part of a course at a local public-access
cable studio, I concocted a music video out of footage the instructor and I had
shot of a salsa band playing in-studio. After too many hours with the
computerized editing machine, I proudly emerged with my new Christmas tree, only
for the instructor, Carlos, to hand the tapes back to me. “Now make one without
going overboard on all the bells and whistles,” he wisely directed. I had
indeed put in just about everything I could find. Back in the small editing
room, I used the fun fades sparingly, as good writers use adjectives.
For years after that course and some experience shooting and
directing public-access programming, I would recall the lesson each time I saw yet another film sporting the newest in film-making technology yet otherwise empty of
substance. James Cameron was a notable exception, centering Titanic (1997) not just on the obvious—the
sinking (by means of a real ship in-studio)—but also on a romance undergirded
by substantial character development. The next film to do justice
both to technological innovation and to depth of characterization, along with a darn
good story, was Cameron’s own Avatar (2009).
That Cameron accomplished such a technological leap in film-making without
sacrificing characterization and narrative says something rather unflattering about
all the technological eye-candy that has brought with it huge cavities in
narrative and characters.
Although Avatar 2 is not slated for release until 2016, James Cameron has already put out a preliminary trailer.
It was not until I saw Gravity
(2013) that I discovered a litmus test for determining whether a
film-making advance has come at the expense of narrative substance. Sandra
Bullock gave such an authentically emotional performance that at one point I
found myself oblivious to the stunning visuals of Earth from orbit. In watching
Avatar, I had become so taken with
Neytiri’s eye expressions that the technologically achieved visuals on Pandora
receded into the background. As a criterion, whether dazzling,
technologically derived visuals recede into the background as emotional investment
in a character reassumes its central place in the foreground of the suspension
of disbelief can separate the “men from the boys” in film-making.
Ultra high-definition, or UHD, which refers to television
screens sporting at least four times the number of pixels of “mere”
high-definition, goes beyond what the human eye can resolve at a typical viewing
distance. Hence, the potential problem in going beyond “ultra”
is moot; we could not visually discern any difference. Even so, I suspect
this inconvenient detail would not stop people in marketing from primping the
product as “beyond ultra.” One unfortunate byproduct of such inventive
marketing may ironically not be visible at all. Specifically, the illusory
distractions, or marketing mirages, come with an opportunity cost; that is,
being suckered into seemingly exciting innovations can distract us from
noticing other technological advances whose applications could alter entire
industries and transform our daily lives.
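To make the pixel claim concrete, here is a quick back-of-the-envelope sketch in Python, assuming the standard 1080p and 4K UHD resolutions (the specific figures are mine, for illustration only, and are not drawn from the article cited below):

    # Rough arithmetic behind "at least four times the pixels" (standard resolutions assumed)
    hd_pixels = 1920 * 1080        # "mere" full HD: 2,073,600 pixels
    uhd_pixels = 3840 * 2160       # 4K UHD: 8,294,400 pixels
    print(uhd_pixels / hd_pixels)  # prints 4.0 -- exactly four times the pixels of full HD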
At the International Consumer Electronics Show in January
2014, Sony and Samsung showcased UHD televisions with curved screens. The “more
immersive viewing” is possible only on larger screens; John Sciacca, an
installer of audio/video, reports that at the small-screen level, the curved
shape “has no advantage.”[1]
Accordingly, the televisions with the curved option on display at the electronics show
in Las Vegas, Nevada, ranged from 55-inch to 105-inch screens.[2]
The verdict there was mixed, the usual suspects playing their expected roles.
[Image: The 105-inch UHD television. Paragon of the 21st century? (Image source: Forbes)]
Kaz Hirai, the CEO of Sony, insisted in an interview that
“some people actually like [the curved screen] very much.”[3]
He went on to add, “I personally think it’s a great experience because you do
have that feeling of being a little bit more surrounded. . . . After you watch
curved TV for a while and then you just watch a flat panel, it looks like,
‘What’s wrong with this panel?’”[4]
In the midst of this vintage self-promotion, he did admit, almost as an
afterthought, that the curved feature is “not for everyone.”[5]
John Sciacca would doubtless call that admission
an understatement. “For displays, at least, curved is a ‘marketing gimmick.’ I
know that research has gone into finding the ideal curve for the ideal seating
distance, but I think that it is still limiting its best results for a much
narrower viewing position.”[6]
That is, the curved shape can be counterproductive in settings where more than
two or three people are sitting close together straining to capture the sweet
spot of ideal viewing. To be sure, at that “dot” on a room diagram, Sciacca
admits that the curved shape on big screens (i.e., 55-inch and up) “has real
benefits for front projection as it has to do with how the light hits the
screen at different points, and a curve helps with brightness uniformity and
geometry.”[7]
Granted, but who wants to do geometry in order to figure out where to sit?
Accordingly, the curved screen “seems like a marketing
thing,” rather than anything “substantive,” says Kurt Kennobie, the owner of an
audio-video store in Phoenix.[8]
Kaz Hirai would doubtless disagree, at least publicly. The usual trajectory of
public attention would simply follow this debate back and forth, treating it as
though it were a riveting tennis match. The fans getting their adrenaline fix
would hardly notice the progress being made in a potentially transformative
rather than incremental technology applicable to television and cinema. This
invisible cost of chasing after minor points lies, in other words, in missing
the big picture, which in this case involves a developing technology that leaps
over the “curved” controversy entirely.
What exactly is this so-called revolutionary technology? It
already exists in gaming, which incidentally already merges movie and
video-game features, and, here’s the key, it is applied
as “virtual reality.” Lest the philosophers among us catch a whiff of
metaphysics, the “reality” to which I refer is simply a novel way of viewing
visuals such as video games. The viewing is provided by a headset.
For example,
Avegant’s head-mounted Glyph sports a “virtual retina display.” The device
“makes use of a couple million tiny mirrors—similar to the technology found in
DLP projectors—to project a 720p [pixel] image straight onto your retina. The
result is a stereoscopic 3D display that has almost zero crosstalk and no
noticeable pixelation.”[9]
In other words, suddenly the “curved vs flat” debate is obsolete, or trumped,
as though by an interloper.
Moreover, the application of the Glyph to watching
television and movies could marginalize (though I suspect not eliminate)
screens, whether in homes or movie theaters. That’s right, the cinemas would
face a stiff headwind, and therefore likely be forced to concentrate on their comparative
advantage—the “ultra” big screen. I suspect that experience would survive, for
people would probably not want to confine all viewing to one mode. Indeed, even
the “virtual reality” means of great immersion might have to contend with an
even newer mode of viewing—that of the hologram.
Needless to say, both modes
would mean changes to how films are written, shot, and edited. Perhaps a given
film would have screen, virtual-reality, and holographic versions, which at least
in the case of virtual reality might involve alternative storylines allowing
for viewer choice. A person could ask, for example, “How would the narrative
play out were this or that character to die rather than live?”
With the increasing modes of viewing, the world as we know
it in the 2010s could change markedly. In fact, people living in
2100 might look back on 2014 much as people in 2000 looked back on the
automobile, telephone, and electric light dramatically changing daily life in
the 1910s. The new viewing
applications, along with the development of the internet and mobile devices,
could distinguish the 2010s in retrospect, and, moreover, the twenty-first
century from the previous one. Known in its day as the century of technological
change, the twentieth century could yet lose its mantle to a century in which
mankind is cured of death, barring an accident or natural disaster. Change is
relative, which is precisely my point concerning virtual reality and curved
screens. Perhaps human perspective is itself curved in such a way that only the most
transient and titillating image lies in the “sweet spot.”
[1] Mike Snider, “Finding the Sweet Spot of Curved Displays,” USA Today, January 10, 2014.
[2] Ibid.
[3] Ibid.
[4] Ibid.
[5] Ibid.
[6] Ibid.
[7] Ibid.
[8] Ibid.
[9] “Editors’ Choice Awards,” USA Today, January 10, 2014. I should note that I have no financial interest in Avegant or the Glyph. I am merely using this product to make a broader point.