Visual Effects: Morphing to Digital

George Lucas was never satisfied with the technical tools of filmmaking. Suddenly, he had the money and the power to change them. Crucially, he was willing to share these new tools with other filmmakers at a price, and generally a very high one.

Using electronic tools to edit or alter images was one thing. Creating the images was another, far harder job. Old-school analog TV didn’t have nearly enough detail to stand comparison with movies. Not even close. Optical scientists ran experiments to determine just how much television would have to improve to equal the appearance of film. The difference in visual resolution between the two was so great that it was hard even to measure it meaningfully. It would be like trying to reach the Moon by climbing a higher tree. Or so it seemed at the time. That verdict would change.

The first crude digital moving images were made in the mid-Sixties, by computers at Bell Laboratories and by the BESM-4 system in Russia. In 2001: A Space Odyssey, a few seconds of (supposedly) computer animation on a monitor show a dish antenna. In actuality, the tiny sequence was hand-drawn; real digital animation barely existed yet. But by the time Star Wars was made a decade later, it did. The plan of the attack on the Death Star is illustrated on a small screen with simplified wireframe animation. At the time, that primitive “machine drawing” was state of the art.

The last major use of this technique in motion pictures was in the trailer and the opening credits of The Black Hole (1979). It was an impressive-looking send-off for the green grids of wireframe; they’d done their job in VFX history. From this point on, graphics pioneers like Utah’s Evans & Sutherland, and workstation makers like Sun and Silicon Graphics, made the leap to adding textured surfaces to those wireframes, creating solid-looking objects of greater and greater realism. Television, as noted, had more forgiving visual standards than feature films, so some very limited bits of digital animation began to appear in TV commercials.

Disney’s TRON (1982) was hot stuff in its day. Not every scene was digital, but the ones that were attracted lots of attention. Nowadays, the kind of on-the-fly visual calculations that made TRON possible have long since been bested by ordinary home video game consoles, but forty years ago, producing this film required the services of a Cray 1, the pride of Chippewa Falls, WI, then one of the fastest and most expensive computers in the world.

That same year, Star Trek II’s “Genesis Effect” became the first semi-realistic computer-generated landscape in a feature film, benefiting from graphics software that exploited a reinvigorated mathematical field: fractals. It was created by a special Lucas-owned digital unit that was legally distinct from ILM, a unit that would soon be given a name of its own: Pixar.
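
For the curious, here’s a minimal sketch of the fractal idea in Python. It’s the one-dimensional “midpoint displacement” scheme, purely illustrative and nothing like the production code: each pass halves every segment and jitters the new midpoints by ever-smaller random amounts, which is what gives fractal terrain detail at every scale.

    import random

    def midpoint_displace(left, right, roughness=0.5, depth=8):
        """Build a 1-D fractal ridge line: halve every segment and
        nudge each new midpoint by a random amount that shrinks
        at each finer level of detail."""
        heights = [left, right]
        scale = 1.0
        for _ in range(depth):
            scale *= roughness
            refined = []
            for a, b in zip(heights, heights[1:]):
                mid = (a + b) / 2 + random.uniform(-scale, scale)
                refined += [a, mid]
            refined.append(heights[-1])
            heights = refined
        return heights

    ridge = midpoint_displace(0.0, 0.0)
    print(len(ridge), "sample points")  # 2**8 + 1 = 257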

The next step was a fully digital character. That happened in 1985 with the Stained-Glass Knight, who appeared for about half a minute in Young Sherlock Holmes. Since it was a fantasy being, the then-still-major shortcomings of digital didn’t matter. The same went for James Cameron’s The Abyss (1989): its mysterious “water snake” was a purely digital invention that looked realistic enough to be treated as a real object. It was a new hallmark in effects. But it was, like the others, a fundamentally unreal creature that an audience couldn’t compare to anything in the real world. Digital had proven able to do fantasy. Reality was still out of reach.

As the Eighties ended, the vast majority of VFX was still done on film all the way. So were the movies in general. Up through the end of the century, they were still film-based. The jazzy new effects were computer-based, but every Hollywood production still originated on 35mm Eastman color negative. The final product still shipped around the world in heavy steel cans.

Gradually, the Society of Motion Picture and Television Engineers revisited the question of how much finer an electronic image would have to be to equal the perception of standard 35mm movies. It turned out that with digital, the jump was not nearly as unattainable as it had once seemed. Those Fifties experiments had compared television to original 35mm slides. But no movie audience ever sees original film; at the very best, they see a copy of a copy of a copy, and at each step there are microscopic line-up errors.
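
A back-of-the-envelope way to see the damage: one photographic rule of thumb combines the resolution of chained imaging stages by adding reciprocals. The Python below uses purely illustrative figures, not SMPTE’s measurements.

    def combined_resolution(stages):
        """Rule of thumb for chained imaging stages:
        1/R_total = 1/R_1 + 1/R_2 + ...  (R in lines of resolution)."""
        return 1 / sum(1 / r for r in stages)

    # Illustrative figures only: an original negative duplicated through
    # successive printing generations, each stage assumed to resolve
    # about as well as the original stock.
    original = 4000
    for generations in range(1, 5):
        print(f"{generations} stage(s): ~{combined_resolution([original] * generations):.0f} lines")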

By 1991, in Terminator 2, James Cameron was using digital to produce a liquid-metal robot in human form. It wasn’t quite photorealistic, but it was getting there. (The most amazing and terrifying effect in T2 is the Hiroshima-like scene of the nuclear blast, but it doesn’t quite fit here because it was a mixture of techniques.) This pioneering “morphing” technology was also used that year, most strikingly, in a Michael Jackson music video, Black or White, whose startling person-to-person transformations come across as a surprisingly early ad for identity fluidity.
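
Structurally, a morph is two operations at once: each image is geometrically warped toward the other’s shape while the pixels cross-dissolve. Here is a minimal Python sketch of that skeleton; the warp functions are hypothetical stand-ins, since real morphing systems derived them from animator-placed control points.

    import numpy as np

    def morph_frame(src, dst, warp_src, warp_dst, t):
        """One frame of a classic morph at blend fraction t in [0, 1]:
        warp each image partway toward the other's geometry, then
        cross-dissolve the warped pair."""
        a = warp_src(src, t)        # source pulled toward the target's shape
        b = warp_dst(dst, 1.0 - t)  # target pulled back toward the source's shape
        return (1.0 - t) * a + t * b

    # Identity "warps" as stand-ins so the sketch runs end to end; a real
    # morph would interpolate mesh or feature-line geometry instead.
    identity = lambda img, t: img
    frame = morph_frame(np.zeros((4, 4)), np.ones((4, 4)), identity, identity, 0.5)
    print(frame)  # a halfway dissolve: every pixel 0.5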

It was Jurassic Park, two years later, that became, in effect, the Jazz Singer of digital character animation: a hit that would forever change audience expectations of what the movies could do. Audiences in 1993 were wowed; they’d never seen anything approaching this level of magic. Steven Spielberg bet his whole movie on the effects being believable, and won. The value of CGI was proven. Hollywood never looked back.

Compare the digital visual effects of 1995’s Apollo 13 with their rough Reagan-era equivalent, the film-based effects of 1983’s The Right Stuff. They’re both films that get a great response from a packed theater. The often-improvised, relatively low-tech techniques used in The Right Stuff are slightly funky and impressionistic, more or less matching the director’s tone, irreverent if not outright snarky. The digital effects of Apollo 13 are flawless and precise, resurrecting long-gone visions like the majesty of a Saturn V launch. (Apollo 13 also used some practical effects and models.)

Periodically, filmmakers do set themselves the challenge of doing things with minimal or no CGI. Oppenheimer is a contemporary example, but even from the very beginning, there have always been a few films, like Francis Coppola’s Bram Stoker’s Dracula (1992), that made a big point of not taking the easy way out, of staying old school.

The pre-Eighties field of visual effects was run by old guys with lots of experience but little formal technical education, men who had apprenticed twenty years earlier on finicky equipment that was ancient even then. There were only so many old FX guys and old optical printers to go around; that bottleneck was one reason Lucas wanted to change the system. Nineties CGI brought new flexibility to VFX production. When Independence Day had too many effects shots for one company, the producers simply split the rush work among several FX houses. There was no custom equipment; the graphics workstations were leased in bulk, and the software was a standard package used across the whole industry.

Over the following years, motion picture special visual effects companies hired thousands of computer-trained graphic arts graduates, often right off campuses. The Star Wars and immediate post-Star Wars world of special effects was dominated by Boomers; the digital revolution brought Gen X into power in a big way. And for the first time, the numbers of women and men on movie tech credits began to equalize.

After setting HDTV standards, SMPTE got around to filmless movie theaters. They determined more than 25 years ago that theatrical digital video of roughly 4K (in today’s terms) would do the job. When silent movies gave way to sound, almost every piece of filmmaking equipment had to change, and audiences saw and heard the difference immediately. But seventy-plus years later, when film gave way to digital at both ends of the movie pipeline, it was barely noticeable to most people. Visual effects no longer had to be converted to film. They just joined the normal digital editing workflow.

By the turn of the century, movies like Gladiator used digital to do the things that only it could do, like aerial helicopter-style shots of crowded ancient Rome at its height. (It also brought star Oliver Reed, who died during production, back to life for one necessary scene.) Overuse of CGI has been a recurring complaint for nearly a quarter century since, but for better or worse, 21st-century audiences responded to the lavish spectacle that visual effects enabled. Studios backed the Cinematic Universes because people paid to see them.

George Lucas had wanted his prequel trilogy to be digital from start to finish, but when he began shooting, there were still few top-quality digital cameras, and the theaters weren’t ready in 1999. By the time of the next film in 2002, though most theaters still ran film, there were already a sizeable number of digital screens. George could justly boast in ads that a DVD copy of Star Wars: Episode II – Attack of the Clones was “a perfect clone.” From the point of view of copyright security, that boast would become a two-edged sword.

Since CGI pasted a reasonable facsimile of Oliver Reed’s face, it has been pressed to do more. When Paul Walker died in a car crash in 2013, his scenes in Furious 7 were completed by CGI that was vastly more capable than it had been 13 years earlier.

It’s never wise to bet against the wizardry of visual effects. That earlier post about dealing with the mortality of actors mentioned the decades-old SF concept of “synthespians,” actors reanimated by, uh, animators. Actors are now de-aged, sometimes pretty convincingly, and dead ones have briefly been brought back to life, usually not quite as convincingly. Yet. Nonetheless, it’s no small deal to Hollywood, currently embroiled in simultaneous strikes of writers and actors, both partly motivated by fear of artificial intelligence. It’s real enough to them.

Visual effects has become a high-technology field, but its creative directors and business managers have been caught flat-footed by AI. VFX houses are going to have a tough time staying in business if all a director has to do is say, “Okay, ChatGPT, a view of Saturn from its moon Titan” and it appears on the screen instantly. With AI entering the writing arena as well, it’ll be able to do the whole job, dreaming up stories, drawing the backgrounds, composing the music, filling in realistic moving and acting images of background extras in whatever numbers are needed, and even doing the acting.

What’ll that do to Hollywood’s workforce? I’d like to be able to say: only a human can truly know what it is to be a human, so real artists will never be replaced for the big stuff, the prestige shows and films. For many of the simpler tasks, though, like illustrated textbook lessons, corporate training, and documentaries, people won’t be replaced (well, not completely, and certainly not at first) so much as undercut by less soulful but cheaper alternatives. If that succeeds, wins acceptance, and makes money, they’ll try synthe-soaps, probably in Spanish-language daytime TV markets first. How soon could that happen? Not long ago I would have said at least a generation or two away; in practical terms, 15 to 30 years. Now? A guess: at this rate, fewer than five.

In little more time than that, your smart TV, or even a smartphone might well be a “thin client” that accesses the network-enabled capability of making up its very own waste-of-time entertainment, its own basic cable quality shows and movies, on the fly, tailored to your tastes, starring anyone whose image service you subscribe to—or starring you. We’ve come a long, long way since Georges Méliès took a trip to the Moon.

As alluring as some of that might be to some people, my bet is it won’t take over everything. Completely machine-generated entertainment might very well have a dazzling start, then hit a limit of popular interest and acceptance, the way disco, polyester, and Atari did.

Hollywood has been re-learning a painful, periodic lesson, one that slaps it in the face every couple of decades or so: this is a business of hits. Artists have hunches; they make bets that others will find the same things interesting that they do. Software can’t do that.

This post concludes a three-part series about the history of special visual effects in the movies. Here are Part 1 and Part 2.

  1. Judge Mental Member
    Judge Mental
    @JudgeMental

    Somewhere in the in between days, you get The Last Starfighter, which, although they didn’t have time to render in the planned high resolution with complex textures, did introduce having many individual objects moving independently, and with purpose.  Such that even though the enemy ships looked a bit like plastic toys, you got to see what having dozens of pilots each doing their thing simultaneously is really like.  A breakthrough of its own.

    • #1
  2. Gary McVey Contributor
    Gary McVey
    @GaryMcVey

    Judge Mental (View Comment):

    Somewhere in the in between days, you get The Last Starfighter, which, although they didn’t have time to render in the planned high resolution with complex textures, did introduce having many individual objects moving independently, and with purpose. Such that even though the enemy ships looked a bit like plastic toys, you got to see what having dozens of pilots each doing their thing simultaneously is really like. A breakthrough of its own.

    An excellent reference, meaning I agree! The Last Starfighter is a definite marker on the timeline of VFX history. It was pretty well reviewed, too. Its VFX were mostly rendered on a Cray X-MP, giving it an advantage over earlier digital scenes. The X-MP actually appears as a suitably menacing futuristic prop in the background of TRON, but was just introduced in 1982, same as the film, which was actually rendered on a Cray 1.

    I hate to be cynical, but that’s so Hollywood. Look at it from Cray 1’s point of view. They do the rendering on you, they tell you how it’s going to make you a star, and then something newer and hotter comes along and they get all the glamour close-ups. 

    • #2
  3. Percival Thatcher
    Percival
    @Percival

    Gary McVey: Since CGI pasted a reasonable facsimile of Oliver Reed’s face, it has been pressed to do more. When Paul Walker died in a car crash in 2013, his scenes in Furious 7 were completed by CGI that was vastly more capable than it had been 13 years earlier.

    Grand Moff Tarkin would like to have a word with you.

     

    • #3
  4. EJHill Podcaster
    EJHill
    @EJHill

    Percival: Grand Moff Tarkin would like to have a word with you.

    And so we’re coming to the point of the SAG-AFTRA strike. 

    • #4
  5. Red Herring Coolidge
    Red Herring
    @EHerring

    The Lord of the Rings lab where they did special effects is in Wellington, New Zealand. Sadly, the port was closed because of the tropical storm and I lost out. I’m not likely to get another chance to visit it.

     

    • #5
  6. Judge Mental Member
    Judge Mental
    @JudgeMental

    Gary McVey (View Comment):

    Judge Mental (View Comment):

    Somewhere in the in between days, you get The Last Starfighter, which, although they didn’t have time to render in the planned high resolution with complex textures, did introduce having many individual objects moving independently, and with purpose. Such that even though the enemy ships looked a bit like plastic toys, you got to see what having dozens of pilots each doing their thing simultaneously is really like. A breakthrough of its own.

    An excellent reference, meaning I agree! The Last Starfighter is a definite marker on the timeline of VFX history. It was pretty well reviewed, too. Its VFX were mostly rendered on a Cray X-MP, giving it an advantage over earlier digital scenes. The X-MP actually appears as a suitably menacing futuristic prop in the background of TRON, but was just introduced in 1982, same as the film, which was actually rendered on a Cray 1.

    I hate to be cynical, but that’s so Hollywood. Look at it from Cray 1’s point of view. They do the rendering on you, they tell you how it’s going to make you a star, and then something newer and hotter comes along and they get all the glamour close-ups.

    I would love to see a version rendered in the way originally planned.

    • #6
  7. Randy Weivoda Moderator
    Randy Weivoda
    @RandyWeivoda

    Gary McVey: In little more time than that, your smart TV, or even a smartphone might well be a “thin client” that accesses the network-enabled capability of making up its very own waste-of-time entertainment, its own basic cable quality shows and movies, on the fly, tailored to your tastes, starring anyone whose image service you subscribe to—or starring you.

    It could be stupendously lucrative if a company who owned the rights to a popular movie could make customized versions that replace the real actors’ faces and voices with the ones the client supplies.  Can’t you see someone saying, “I want a copy of Smokey and the Bandit, but swap me in for Burt Reynolds.  I want my dad to play Buford T. Justice, my brother to be Snowman, and his dog to be Fred.”

    • #7
  8. Judge Mental Member
    Judge Mental
    @JudgeMental

    Randy Weivoda (View Comment):

    Gary McVey: In little more time than that, your smart TV, or even a smartphone might well be a “thin client” that accesses the network-enabled capability of making up its very own waste-of-time entertainment, its own basic cable quality shows and movies, on the fly, tailored to your tastes, starring anyone whose image service you subscribe to—or starring you.

    It could be stupendously lucrative if a company who owned the rights to a popular movie could make customized versions that replace the real actors’ faces and voices with the ones the client supplies. Can’t you see someone saying, “I want a copy of Smokey and the Bandit, but swap me in for Burt Reynolds. I want my dad to play Buford T. Justice, my brother to be Snowman, and his dog to be Fred.”

    I wanna be Godzilla!

    • #8
  9. Mad Gerald Coolidge
    Mad Gerald
    @Jose

    Gary McVey: With AI entering the writing arena as well, it’ll be able to do the whole job, dreaming up stories, drawing the backgrounds, composing the music, filling in realistic moving and acting images of background extras in whatever numbers are needed, and even doing the acting.

    I’m not following the AI kerfuffle closely, but the Chat GPT writing I’ve read sounds like it is very generic, edited by a committee, and does not have any distinguishing features. I expect that would be because all these things are true; Chat GPT is trained by looking at vast quantities of text, and it is logical that its output is average. That also applies to much of current media output, but I don’t think AI will replace the very best writers.

    I’m inclined to think the same thing would apply to music composition.  Regarding realistic imagery, I just don’t know.

    • #9
  10. Judge Mental Member
    Judge Mental
    @JudgeMental

    Mad Gerald (View Comment):

    Gary McVey: With AI entering the writing arena as well, it’ll be able to do the whole job, dreaming up stories, drawing the backgrounds, composing the music, filling in realistic moving and acting images of background extras in whatever numbers are needed, and even doing the acting.

    I’m not following the AI kerfuffle closely, but the Chat GPT writing I’ve read sounds like it is very generic, edited by a committee, and does not have any distinguishing features. I expect that would be because all these things are true; Chat GPT is trained by looking at vast quantities of text, and it is logical that its output is average. That also applies to much of current media output, but I don’t think AI will replace the very best writers.

    I’m inclined to think the same thing would apply to music composition. Regarding realistic imagery, I just don’t know.

    I think you can improve that aspect by specifying a style of writing.  For example, I saw a story written in the style of Dr. Seuss that was spot on.

    • #10
  11. Mad Gerald Coolidge
    Mad Gerald
    @Jose

    Gary McVey: As the Eighties ended, the vast majority of VFX was still done on film all the way. So were the movies in general. Up through the end of the century, they were still film-based.

    During Operation Desert Storm in 1991, the USAF sent a Combat Camera team to the theater with analog still cameras (maybe Sony Mavicas?) and a satellite dish.  They were able to transmit the images back to the states very quickly for what was, to my knowledge, the first time imagery wasn’t dependent on shipping physical media.  I was not impressed by the image quality, but the speed was impressive.

    Immediately after Desert Storm, Operation Provide Comfort, based in Turkey, was started to help all the Kurds that Saddam Hussein had driven into the mountains. The satellite dish was moved to Incirlik Air Base, but once there the AF technicians were never able to connect to the satellite. I don’t know if it was human error or a technology problem, but it was a downer after the success of the previous operation.

    Edit: The cameras were described as “still video”.

    • #11
  12. Miffed White Male Member
    Miffed White Male
    @MiffedWhiteMale

    Mad Gerald (View Comment):
    During Operation Desert Storm in 1991, the USAF sent a Combat Camera team to the theater with analog still cameras (maybe Sony Mavicas?) and a satellite dish.  They were able to transmit the images back to the states very quickly for what was, to my knowledge, the first time imagery wasn’t dependent on shipping physical media.  I was not impressed by the image quality, but the speed was impressive.

    I’m guessing that was the inspiration for the SNL bits where Al Franken was the “roving correspondent” at the political conventions in 1992(?)  where he was running around the convention floor with a backpack and a satellite dish on his head.

     

    Edit:  Nope, quick google search shows that was 1988.  So maybe SNL inspired the military for Desert Storm…

    • #12
  13. Mad Gerald Coolidge
    Mad Gerald
    @Jose

    Gary McVey: After setting HDTV standards, SMPTE got around to filmless movie theaters. They determined more than 25 years ago that theatrical digital video of roughly 4K (in today’s terms) would do the job.

    I think the first movie I saw from a digital projector was V for Vendetta (2005).  A co-worker had talked about how the quality was high enough to be indistinguishable from film, but I was very skeptical. 

    During my USAF career I worked closely with photographers.  I watched them transition from film only, to digital only.  I had seen the differences in the finished products, and I could, usually, detect and analyze the difference between film and digital.

    So I watched the movie and sure enough, I couldn’t detect any digital deficiencies. It was very impressive.  During the end credits of that movie, or maybe another soon after, I saw that it was set at 4K resolution.  So that question was answered. 

    At the beginning of the digital photography transition, I was always apprehensive that the overall quality would decline.  That has happened in other areas, such as poorer audio quality over “telephone” connections.  But I’ve been pleased that TV and Cinema have maintained and improved the experience.

    • #13
  14. Gary McVey Contributor
    Gary McVey
    @GaryMcVey

    Red Herring (View Comment):

    The Lord of the Rings lab where they did special effects is in Wellington, New Zealand. Sadly, the port was closed because of the tropical storm and I lost out. I’m not likely to get another chance to visit it.

     

    Peter Jackson is not only a fine director and effects expert, but he’s been one of the leaders of using digital techniques to make faded, jumpy old film footage steadier and much more like it looked a century ago. As far as I’m concerned, They Shall Not Grow Old was a masterpiece. 

    • #14
  15. Gary McVey Contributor
    Gary McVey
    @GaryMcVey

    Judge Mental (View Comment):

    Randy Weivoda (View Comment):

    Gary McVey: In little more time than that, your smart TV, or even a smartphone might well be a “thin client” that accesses the network-enabled capability of making up its very own waste-of-time entertainment, its own basic cable quality shows and movies, on the fly, tailored to your tastes, starring anyone whose image service you subscribe to—or starring you.

    It could be stupendously lucrative if a company who owned the rights to a popular movie could make customized versions that replace the real actors’ faces and voices with the ones the client supplies. Can’t you see someone saying, “I want a copy of Smokey and the Bandit, but swap me in for Burt Reynolds. I want my dad to play Buford T. Justice, my brother to be Snowman, and his dog to be Fred.”

    I wanna be Godzilla!

    It is indeed a business opportunity! And it’s no idle dream. 

    • #15
  16. Percival Thatcher
    Percival
    @Percival

    Mad Gerald (View Comment):

    Gary McVey: With AI entering the writing arena as well, it’ll be able to do the whole job, dreaming up stories, drawing the backgrounds, composing the music, filling in realistic moving and acting images of background extras in whatever numbers are needed, and even doing the acting.

    I’m not following the AI kerfuffle closely, but the Chat GPT writing I’ve read sounds like it is very generic, edited by a committee, and does not have any distinguishing features. I expect that would be because all these things are true; Chat GPT is trained by looking at vast quantities of text, and it is logical that its output is average. That also applies to much of current media output, but I don’t think AI will replace the very best writers.

    I’m inclined to think the same thing would apply to music composition. Regarding realistic imagery, I just don’t know.

    Think about the last N first-run Hollywood movies you saw in a theater. How many of them rose above generic Movie? The one big advantage to AI “talent” is that it won’t blurt out something insanely stoopid at press opportunities that honk off large swaths of your audience.

    I’m kind of surprised that Bob Iger hasn’t arranged to have Rachel Zegler held incommunicado in a safehouse in Pacoima by now.

    • #16
  17. Gary McVey Contributor
    Gary McVey
    @GaryMcVey

    Percival (View Comment):

    Mad Gerald (View Comment):

    Gary McVey: With AI entering the writing arena as well, it’ll be able to do the whole job, dreaming up stories, drawing the backgrounds, composing the music, filling in realistic moving and acting images of background extras in whatever numbers are needed, and even doing the acting.

    I’m not following the AI kerfuffle closely, but the Chat GPT writing I’ve read sounds like it is very generic, edited by a committee, and does not have any distinguishing features. I expect that would be because all these things are true; Chat GPT is trained by looking at vast quantities of text, and it is logical that its output is average. That also applies to much of current media output, but I don’t think AI will replace the very best writers.

    I’m inclined to think the same thing would apply to music composition. Regarding realistic imagery, I just don’t know.

    Think about the last N first-run Hollywood movies you saw in a theater. How many of them rose above generic Movie? The one big advantage to AI “talent” is that it won’t blurt out something insanely stoopid at press opportunities that honk off large swaths of your audience.

    I’m kind of surprised that Bob Iger hasn’t arranged to have Rachel Zegler held incommunicado in a safehouse in Pacoima by now.

    Funny that you should mention his name, Percival. Bob Iger is one of the “stars” of my very next post, The Powers That Be–the TV Show. 

    • #17
  18. Gary McVey Contributor
    Gary McVey
    @GaryMcVey

    Miffed White Male (View Comment):

    Mad Gerald (View Comment):
    During Operation Desert Storm in 1991, the USAF sent a Combat Camera team to the theater with analog still cameras (maybe Sony Mavicas?) and a satellite dish. They were able to transmit the images back to the states very quickly for what was, to my knowledge, the first time imagery wasn’t dependent on shipping physical media. I was not impressed by the image quality, but the speed was impressive.

    I’m guessing that was the inspiration for the SNL bits where Al Franken was the “roving correspondent” at the political conventions in 1992(?) where he was running around the convention floor with a backpack and a satellite dish on his head.

     

    Edit: Nope, quick google search shows that was 1988. So maybe SNL inspired the military for Desert Storm…

    In 1957, that was considered a really small TV camera. Picture that (maybe a little smaller) on a shoulder-rest. That’s how political reporters covered the nominating conventions right from the floor of the hall. TV crews referred to the setup as the “creepie peepie”. 

    • #18
  19. Gary McVey Contributor
    Gary McVey
    @GaryMcVey

    Mad Gerald (View Comment):

    Gary McVey: After setting HDTV standards, SMPTE got around to filmless movie theaters. They determined more than 25 years ago that theatrical digital video of roughly 4K (in today’s terms) would do the job.

    I think the first movie I saw from a digital projector was V for Vendetta (2005). A co-worker had talked about how the quality was high enough to be indistinguishable from film, but I was very skeptical.

    During my USAF career I worked closely with photographers. I watched them transition from film only, to digital only. I had seen the differences in the finished products, and I could, usually, detect and analyze the difference between film and digital.

    So I watched the movie and sure enough, I couldn’t detect any digital deficiencies. It was very impressive. During the end credits of that movie, or maybe another soon after, I saw that it was set at 4K resolution. So that question was answered.

    At the beginning of the digital photography transition, I was always apprehensive that the overall quality would decline. That has happened in other areas, such as poorer audio quality over “telephone” connections. But I’ve been pleased that TV and Cinema have maintained and improved the experience.

    The first digital movie I saw in a theater was The Count of Monte Cristo. Like you, I had enough experience to be a critical viewer, and I was impressed.

    The trick, of course, is to have so many scan lines that you don’t see them, paradoxical as that sounds. Large screen TV looked lousy in the old pre-HD days because there weren’t enough lines to blur together. Although we did have a technique called “spot wobble”. Sounds elegant already, right? Spot wobble was a very high frequency, low-amplitude signal applied to the magnetic coils that sweep the electron beam from side to side and up and down. It “wobbled” the scan lines just raggedly enough to blur them together.

    Sounds crazy, but it worked–up to a point. 
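
    For the curious, here’s a toy numerical sketch of the idea in Python; the figures are made up for illustration, not real deflection-coil engineering.

        import numpy as np

        # Toy model: the beam's vertical position on each scan line is
        # modulated by a small, very fast sine wave during the sweep,
        # smearing the line just enough to hide the gap to its neighbors.
        samples_per_line = 720   # horizontal samples in one sweep
        amplitude = 0.4          # wobble, as a fraction of the line pitch
        cycles_per_sweep = 200   # wobble oscillations per line

        x = np.arange(samples_per_line)
        wobble = amplitude * np.sin(2 * np.pi * cycles_per_sweep * x / samples_per_line)
        for line in range(3):
            y = line + wobble    # nominal line position plus wobble
            print(f"line {line}: beam sweeps y = {y.min():.2f} .. {y.max():.2f}")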

    • #19
  20. Mad Gerald Coolidge
    Mad Gerald
    @Jose

    Gary McVey (View Comment):

    Mad Gerald (View Comment):

    Gary McVey: After setting HDTV standards, SMPTE got around to filmless movie theaters. They determined more than 25 years ago that theatrical digital video of roughly 4K (in today’s terms) would do the job.

    I think the first movie I saw from a digital projector was V for Vendetta (2005). A co-worker had talked about how the quality was high enough to be indistinguishable from film, but I was very skeptical.

    During my USAF career I worked closely with photographers. I watched them transition from film only, to digital only. I had seen the differences in the finished products, and I could, usually, detect and analyze the difference between film and digital.

    So I watched the movie and sure enough, I couldn’t detect any digital deficiencies. It was very impressive. During the end credits of that movie, or maybe another soon after, I saw that it was set at 4K resolution. So that question was answered.

    At the beginning of the digital photography transition, I was always apprehensive that the overall quality would decline. That has happened in other areas, such as poorer audio quality over “telephone” connections. But I’ve been pleased that TV and Cinema have maintained and improved the experience.

    The first digital movie I saw in a theater was The Count of Monte Cristo. Like you, I had enough experience to be a critical viewer, and I was impressed.

    The trick, of course, is to have so many scan lines that you don’t see them, paradoxical as that sounds. Large screen TV looked lousy in the old pre-HD days because there weren’t enough lines to blur together. Although we did have a technique called “spot wobble”. Sounds elegant already, right? Spot wobble was a very high frequency, low-amplitude signal applied to the magnetic coils that sweep the electron beam from side to side and up and down. It “wobbled” the scan lines just raggedly enough to blur them together.

    Sounds crazy, but it worked–up to a point.

    Crazy and paradoxical – yeah.  I used to contribute a lot of graphic still imagery for video productions, such as bullet text and emblems.  They tended to look “jagged” and ugly on video.  The solution was to run a “Gaussian blur” on everything.  By blurring and softening the graphics, they appeared sharp and smooth on playback.  “Spot wobble” sounds very similar.

    • #20
  21. Gary McVey Contributor
    Gary McVey
    @GaryMcVey

    In the Fifties experiments to determine how much better film visual resolution was, it appeared to be at or near the limits of measurement at that time, approximately 35,000-40,000 lines of resolution. It might have been higher, they thought. But they made some testing errors. 

    In the Eighties they found that the real experimental value of human eyesight was closer to 8000-9000 lines of resolution. 8K. That didn’t really show up the old tests that badly–they were within an order of magnitude, anyway–and it was indeed beyond what a TV system could handle. But it wasn’t way-out-in-the-clouds crazy anymore. And then when they discovered that only 4K was required to fool the eye, it was suddenly reachable. 

    • #21
  22. Judge Mental Member
    Judge Mental
    @JudgeMental

    Mad Gerald (View Comment):

    So I watched the movie and sure enough, I couldn’t detect any digital deficiencies. It was very impressive.  During the end credits of that movie, or maybe another soon after, I saw that it was set at 4K resolution.  So that question was answered. 

     

    The first one I recall seeing that was shot digitally was 28 Days Later.  There are a fair amount of somewhat grainy, desaturated shots in that, but I can’t say whether that’s tech, or cinematography.

    • #22
  23. Gary McVey Contributor
    Gary McVey
    @GaryMcVey

    Judge Mental (View Comment):

    Mad Gerald (View Comment):

    So I watched the movie and sure enough, I couldn’t detect any digital deficiencies. It was very impressive. During the end credits of that movie, or maybe another soon after, I saw that it was set at 4K resolution. So that question was answered.

     

    The first one I recall seeing that was shot digitally was 28 Days Later. There are a fair amount of somewhat grainy, desaturated shots in that, but I can’t say whether that’s tech, or cinematography.

    It’s hard to tell nowadays–is this odd gloomy effect the result of a weirdly cloudy-bright day, or did the cinematographer do it, or was it the post-production colorists? Call me old fashioned, but I don’t think it should have to take the consensus opinion of five highly paid technicians to determine that a baseball diamond is green and a bottle of orange soda is orange. 

    • #23
  24. Gary McVey Contributor
    Gary McVey
    @GaryMcVey

    I’m not sure I got this across in the OP, but the thing about George’s tinkering isn’t just that he improved the tools of the craft, but that he was smart enough to figure out how to make it profitable. Mostly. A side note:

    It wasn’t just Industrial Light and Magic. (As noted in the post, Pixar essentially became ILM’s digital sub-division, before it was sold off to Steve Jobs.) Lucas started other spinoff tech companies: for theatrical exhibition, the THX theater alignment program, famously featuring THX level sound, and for post production, EditDroid, filmless digital editing.

    As brand names, THX and ILM have lasted for most of half a century. EditDroid had a more mixed record. As an idea, it was one of the best of a new wave of electronic editing solutions, helping advance the art. As a business, it wilted. It became a standard of TV show editing but lost out to other companies for feature films. Former EditDroid execs point fingers at George Lucas, who went right on editing with 35mm film, ignoring his own creation. EditDroid’s costly, dedicated equipment of the ‘80s was eventually superseded by far cheaper, off-the-shelf personal computer-based systems of the ‘90s.

    • #24
  25. Mad Gerald Coolidge
    Mad Gerald
    @Jose

    It’s good to hear that George Lucas contributed more than Jar Jar Binks to the industry!

    • #25
  26. Miffed White Male Member
    Miffed White Male
    @MiffedWhiteMale

    Gary McVey (View Comment):

    In the Fifties experiments to determine how much better film visual resolution was, it appeared to be at or near the limits of measurement at that time, approximately 35,000-40,000 lines of resolution. It might have been higher, they thought. But they made some testing errors.

    In the Eighties they found that the real experimental value of human eyesight was closer to 8000-9000 lines of resolution. 8K. That didn’t really show up the old tests that badly–they were within an order of magnitude, anyway–and it was indeed beyond what a TV system could handle. But it wasn’t way-out-in-the-clouds crazy anymore. And then when they discovered that only 4K was required to fool the eye, it was suddenly reachable.

    So if I buy a 4k disk player, I shouldn’t need to upgrade again in the future?

     

    • #26
  27. Gary McVey Contributor
    Gary McVey
    @GaryMcVey

    Miffed White Male (View Comment):

    Gary McVey (View Comment):

    In the Fifties experiments to determine how much better film visual resolution was, it appeared to be at or near the limits of measurement at that time, approximately 35,000-40,000 lines of resolution. It might have been higher, they thought. But they made some testing errors.

    In the Eighties they found that the real experimental value of human eyesight was closer to 8000-9000 lines of resolution. 8K. That didn’t really show up the old tests that badly–they were within an order of magnitude, anyway–and it was indeed beyond what a TV system could handle. But it wasn’t way-out-in-the-clouds crazy anymore. And then when they discovered that only 4K was required to fool the eye, it was suddenly reachable.

    So if I buy a 4k disk player, I shouldn’t need to upgrade again in the future?

     

    Once it reaches 8K, I think we’re done. After that, it becomes a question of making the 8K screens autostereoscopic (no glasses 3D), making them even larger and even cheaper. 

    • #27
  28. Gary McVey Contributor
    Gary McVey
    @GaryMcVey

    Remaining areas for development are brightness and color rendition, for both home TV and theatrical. It took a while for digital screens to catch up with the best of film projection. Color rendition has been slower to improve, though it clearly has. A top of the line digital system can now equal human eye response to color.

    On a colorimetry chart, sensitivity to color fidelity looks like a tilted triangle, or the United Federation of Planets’ A-shaped logo. Television color rendition was a stubby subsection nestled in the chart’s center. Digital color rendition now extends beyond the edges of the human triangle, meaning that if Netflix sells subscriptions on Arcturus, presumably they won’t get any complaints about the color.
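
    To make the chart concrete: checking whether a color fits inside a display’s gamut is just a point-in-triangle test on the chromaticity diagram. A small Python sketch using the published Rec.709 (HDTV) primaries:

        # Rec.709 (HDTV) primaries as (x, y) chromaticity corners: R, G, B.
        REC709 = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]

        def inside_gamut(p, tri):
            """Point-in-triangle test: p is inside if it sits on the same
            side of all three edges (signed-area / cross-product check)."""
            def cross(o, a, b):
                return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
            signs = [cross(tri[i], tri[(i + 1) % 3], p) >= 0 for i in range(3)]
            return all(signs) or not any(signs)

        print(inside_gamut((0.3127, 0.3290), REC709))  # D65 white point: True
        print(inside_gamut((0.7000, 0.2900), REC709))  # red beyond Rec.709: False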

    • #28
  29. Locke On Member
    Locke On
    @LockeOn

    Gary McVey (View Comment):

    Red Herring (View Comment):

    The Lord of the Rings lab where they did special effects is in Wellington, New Zealand. Sadly, the port was closed because of the tropical storm and I lost out. I’m not likely to get another chance to visit it.

     

    Peter Jackson is not only a fine director and effects expert, but he’s been one of the leaders of using digital techniques to make faded, jumpy old film footage steadier and much more like it looked a century ago. As far as I’m concerned, They Shall Not Grow Old was a masterpiece.

    That was amazing, but there have been points where Jackson seemed to spend more love on the tech than the story *cough* Hobbit *cough*.  Like too little story scraped over too much film (classical ref!).

    • #29
  30. Locke On Member
    Locke On
    @LockeOn

    Gary McVey: After setting HDTV standards, SMPTE got around to filmless movie theaters. They determined more than 25 years ago that theatrical digital video of roughly 4K (in today’s terms) would do the job.

    The limits of HDTV were a compromise due in part to what bit rate could be crammed into a 6 MHz bandwidth using the silicon and compression technology of the time, that being what was offered on the new digital cable systems. A great leap forward from the curse that was NTSC, but not good enough for big screen theater. True digital theater had the advantage of being able to dispense with the band-limited pipe, with either a dedicated network connection or delivery on fixed digital media.

    Source: I was involved in some early ‘interactive TV’ projects and trials. The Apple rep to the HDTV committee worked down the hall and was a friend, so I periodically got an earful.

    • #30