Altered Images: Smooth Sorcery


In 1984, I was one of a lucky handful of people from the LA film festival who were invited to pay a secret visit to Douglas Trumbull’s anonymous, windowless special effects stage and laboratory. Located in an industrial area down at the marina, it was many miles from the movie studios that paid Trumbull to create magic for films like Columbia’s Close Encounters of the Third Kind (1977) and Warner Bros.’ Blade Runner (1982). Trumbull had called us there for a dramatic demonstration of his new process, Showscan, which he claimed would grow into much more than a technical improvement, becoming a profoundly deeper artistic and psychological experience than moviegoing had ever known.

It sounded, I have to say, like an absurdly pretentious claim. But for weeks after that screening, whenever any of us got together, I heard Twilight Zone-like stories of those hypnotic Showscan moments merging into our own memories and dreams. Thirty-seven years ago, we were among the first to experience the peculiar, uncharted subconscious world of HFR: high frame rate moving images.

Showscan’s main feature was that it ran through a camera and a projector at 60 frames a second, two and a half times the speed of normal sound film. At that speed, viewers agreed, everything in front of the cameras became strangely hyper-real: crisper, more immediate, and somehow more real-seeming than reality itself.

Today, Showscan itself is gone, the company long bankrupt, but high frame rate is a feature, if a controversial one, of some of the most elaborate fantasy and science fiction films, and an often-overlooked setting on most new TV sets. Yet we still don’t really know much about how or why HFR affects our perceptions at such a deep level, or what the long-term, mind-numbing effects of this new intensification of media might be.

Back in 1984, Doug Trumbull’s iconic, dystopian images of the far-off year of 2019 had already entered film history, as had the project that first made his reputation as a special effects wizard: the Stargate sequence of 2001: A Space Odyssey (1968). The lights went down and the screening began.

The screening consisted of selected scenes from his film Brainstorm (1983), presented the way he originally intended them to be seen. The stripped-down premise is this: a married pair of North Carolina scientists on the verge of splitting up is part of a medical instrumentation team that has discovered how to directly record and play back thoughts. Even while the earliest prototype is still a laboratory secret, it’s used and misused to record the actual, total experience of everything from sex to death.

The detailed plot of Brainstorm isn’t relevant to our theme, but the filming and presentation technique was. The original intention was to alternate dramatic scenes shown in regular 70mm at normal speeds, interspersed with the supposed “tapes” of actual human thought. This required switching to an alternate projector running at 60 frames a second, using Showscan’s subliminal effects to make the mind of the viewer perceive those minutes of the movie as being somehow realer than real.

This World’s Fair-like gimmick could have worked, and Trumbull did film it that way, but equipping theaters for dual-speed projection proved too expensive an experiment. The film was already in trouble, pushing the limits of what could be done to fix a story whose leading lady, Natalie Wood, died in mid-production. Brainstorm was released to theaters with its Showscan scenes reduced in size and speed to regular proportions. Trumbull left feature filmmaking and went on to make special effects-laden short films for theme park attractions like Back to the Future—The Ride.

The lack of industry support for 60 fps was, naturally, a disappointment to Eastman Kodak, which had looked forward to selling two and a half times as much film. But the idea came back in a different form when high definition television became a practical reality. Japan’s analog HDTV system, seemingly on the brink of world conquest, was struck down like Godzilla by a small laboratory in San Diego, whose all-digital system became the universally adaptable HDTV you have in your home today. As an afterthought, the standards included the later possibility of HFR, high frame rate video.

There’s little or no motion blur in HFR. We’re subconsciously used to a bit of blur in artificial media like film, though it isn’t as prominent a factor in live television. In Catch-22 (1970), the most intense scenes on the bomber are filmed with a short shutter opening, clipping and “freezing” the most horrifying moments of the film in an almost stroboscopic way, heightening the reality by artificial but very effective means. This is a technique well known to still photographers but rarely used in feature films.
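
The “short shutter opening” is easy to quantify: a film camera’s rotary shutter exposes each frame for (shutter angle ÷ 360°) of the frame interval, so narrowing the angle freezes motion. Here is the arithmetic as a quick sketch; the 45° figure is illustrative, not Catch-22’s actual camera setting.

```python
# Shutter-angle arithmetic behind the "short shutter opening" look.
# A rotary film shutter exposes each frame for (angle / 360) of the
# frame interval, so exposure time = (angle / 360) / fps.
def exposure_seconds(fps, shutter_angle_deg):
    return (shutter_angle_deg / 360.0) / fps

print(exposure_seconds(24, 180))  # 0.0208... = 1/48 s, the classic film-blur look
print(exposure_seconds(24, 45))   # 0.0052... = 1/192 s, crisp, near-stroboscopic motion
```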

By the time 21st-century flatscreens were capable of HFR, the movies were back in the game, with Peter Jackson one of the leaders of new experiments in the medium. “Films” and television are now made with roughly the same kind of digital equipment.

Jackson’s cheerleading for HFR (The Hobbit was made at 48 fps) was influential. Ang Lee’s making films like Gemini Man at 120 fps is more controversial, because more filmgoers, especially not-so-young ones, complain of the “soap opera effect”, a paradox: the more real it looks and feels, the more obvious it is that it’s fake. If you see even a conventionally made film like Star Wars: A New Hope on an HDTV at high frame rates, you are seeing the same movie you’ve always known, yet it’s strangely different. Instead of being immersed in the Star Wars story, you’re aware of watching three young American actors spending the summer of 1976 horsing around on an English sound stage.

Younger viewers seem to have less resistance to HFR. They haven’t spent decades growing accustomed to the way movies have traditionally looked in theaters. Ricochet member @misthiocracy, who knows pro cinematography, suggests that HFR’s effects on human consciousness may not be built-in but acquired: subconscious training in distinguishing between fiction and reality. Fellow member @hankrhody, the best explainer of scientific mystery on Ricochet (and just about anywhere), speculates that the human mind stitches together moments of perception in such a way as to create an artificial continuity. That sounded reasonable, which in this context means “Damn, that’s good. I bet I could steal that idea”.

There’s an arena of film where that you-are-there sensation is riveting, not distracting, and that’s nonfiction film. I don’t (necessarily) mean documentaries, whose narration and selective point of view would show up under HFR as being as artificial as fiction, but the unedited raw footage that documentaries are made from.

The 16mm movie cameras that accompanied Apollo to the Moon ran at only 12 frames a second, to save film and weight. That’s slower, more flickery, and blurrier than old-time silent film, which ran at least 16 fps and usually more. When the lunar film was processed, it was double-printed to run at a normal, if stuttering, 24 frames a second. Today, using modern graphic technology, we can do a lot better than that. We can insert four artificially created in-between frames between each pair of original frames, giving those Apollo films a vividness and an immediacy they never had before. At sixty frames a second, you are no longer even looking at old film, but through an invisible window at something live and unique, something real: mankind’s few precious moments on the Moon.
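
What “inserting in-between frames” means in practice: here is a minimal sketch, assuming naive linear blending between neighboring frames. The restoration tools actually used on footage like this synthesize their in-betweens with motion-compensated (optical flow) interpolation, which avoids ghosting on fast movement, but the frame arithmetic is the same: four new frames per original turns 12 fps into 60.

```python
# A minimal sketch of frame-rate interpolation by linear blending.
# Production tools use motion-compensated (optical flow) interpolation
# instead; plain blending ghosts on fast motion. Frames are uint8 arrays.
import numpy as np

def interpolate(frames, factor=5):
    """Insert (factor - 1) in-between frames per original,
    e.g. factor=5 turns 12 fps footage into 60 fps."""
    out = []
    for a, b in zip(frames[:-1], frames[1:]):
        a32, b32 = a.astype(np.float32), b.astype(np.float32)
        for i in range(factor):
            t = i / factor  # t = 0 reproduces the original frame
            out.append(((1 - t) * a32 + t * b32).astype(np.uint8))
    out.append(frames[-1])  # keep the final original frame
    return out
```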

It’s not crazy to wonder if direct, immediate access to alternative reality would make the whole canon of the dramatic arts up through now, from Sophocles to Cecil B. De Mille to Aaron Sorkin, seem strange, remote, too old-fashioned to matter, like the way we now think of musty 1890s stage melodrama, or soap opera on old-time radio, or quaint Forties musicals where farmhands spontaneously burst into song. Altering the whole perception of “watching something” in ways we didn’t anticipate, creating movies with a reality so heightened they don’t feel like movies anymore: if this phenomenon somehow turned against us, would we even know the difference in time?

Altered Images (aside from being the name of an ’80s band) is a short series of posts revealing how motion picture images can be altered, with a specific emphasis on changing reality. 


There are 66 comments.

  1. Douglas Pratt Coolidge
    Douglas Pratt
    @DouglasPratt

    Awesome. Thank you. IMAX fits in there somewhere, right? I remember seeing “The Dream is Alive” at the IMAX theatre in the Air & Space Museum in the late Eighties. It was amazing, but the film blew through the camera so fast they had to cut every two or three minutes.

    • #1
  2. Gary McVey Contributor
    Gary McVey
    @GaryMcVey

    Douglas Pratt (View Comment):

    Awesome. Thank you. IMAX fits in there somewhere, right? I remember seeing “The Dream is Alive” at the IMAX theatre in the Air & Space Museum in the late Eighties. It was amazing, but the film blew through the camera so fast they had to cut every two or three minutes.

    Yep, IMAX runs at 48 frames a second. In effect, IMAX is the Showscan that made it. It’s still 70mm film, but it runs through the projector sideways. 

    • #2
  3. Gary McVey Contributor
    Gary McVey
    @GaryMcVey

    Re: Brainstorm: What seems to have interested Trumbull—it certainly interested me—was the nerdish romance of product development, the Modern Marvels feel of a story arc that begins with mind sampling devices that looked like hair dryers hooked up to a roomful of rack equipment and a recorder that used reels of massive iridescent tape. It was soon reduced to the size of a desk, and then made suitcase-portable, although still heavy and bulky. Finally, in a triumph of industrial design, it was an elaborate earpiece with a wireless link to a briefcase, and could pump raw thoughts over plain old telephone lines. Exaggerated, sure, to an almost comical degree; but similar to the rapid development and miniaturization of technology that was commonplace by 1984.

    • #3
  4. Mark Camp Member
    Mark Camp
    @MarkCamp

    For me: New, scientifically mysterious, utterly fascinating info.  How much more of this stuff do you (and misthiocracy) know?

    • #4
  5. Gary McVey Contributor
    Gary McVey
    @GaryMcVey

    Mark Camp (View Comment):

    For me: New, scientifically mysterious, utterly fascinating info. How much more of this stuff do you (and misthiocracy) know?

    Misthi knows plenty, I’ll tell you that. He even owns one of the more advanced 16mm movie cameras, a Canon Scoopic, which marks him as a real moneybags connoisseur. 

    • #5
  6. namlliT noD Member
    namlliT noD
    @DonTillman

    Gary McVey:

    Showscan’s main feature was that it ran through a camera and a projector at 60 frames a second, two and a half times the speed of normal sound film. At that speed, viewers agreed, everything in front of the cameras became strangely hyper-real: crisper, more immediate, and somehow more real-seeming than reality itself.

    Today, Showscan itself is gone, the company long bankrupt, but high frame rate is a feature, if a controversial one, of some of the most elaborate fantasy and science fiction films, and an often-overlooked setting on most new TV sets. Yet we still don’t really know much about how or why HFR affects our perceptions at such a deep level, or what the long-term, mind-numbing effects of this new intensification of media might be.

    The dots (pixels) we see on computer displays are fixed in position, shape, and size.

    But on film, the dots are called grains, and they’re distributed somewhat randomly in position, shape, and size.

    I always thought that the big win for HFR film was not so much in watching things move around fast, but rather that it averages out the film graininess (position, shape, and size), effectively multiplying the limited resolution of the medium. And the graininess goes by so rapidly that the eye isn’t able to perceive it in the usual way.

    Somebody want to do an experiment?  Try it out on 8mm film, where the grain is really obvious.

    (I think I just made up a new cinematic term: “8mm HFR”.)
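
    Before anyone burns real 8mm stock, the hypothesis is easy to sanity-check in software. Here’s a back-of-the-envelope simulation with made-up numbers, treating grain as per-pixel Gaussian noise (a crude stand-in for real grain, which varies in position, shape, and size): if the eye integrates N frames, the residual grain should fall roughly as 1/sqrt(N).

    ```python
    # Back-of-the-envelope: model each frame as a clean image plus random
    # "grain" (Gaussian noise as a crude stand-in for real film grain).
    # Averaging N frames should cut residual grain by roughly 1/sqrt(N).
    import numpy as np

    rng = np.random.default_rng(0)
    clean = np.full((480, 640), 128.0)  # a flat gray "scene"

    def residual_grain(n_frames, sigma=20.0):
        frames = clean + rng.normal(0.0, sigma, (n_frames, 480, 640))
        return (frames.mean(axis=0) - clean).std()

    print(residual_grain(1))  # ~20.0 : one frame, full grain
    print(residual_grain(2))  # ~14.1 : two frames, ~sigma/sqrt(2)
    print(residual_grain(5))  # ~8.9  : five frames, ~sigma/sqrt(5)
    ```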

    • #6
  7. EJHill Podcaster
    EJHill
    @EJHill

    Everyone remembers the groundbreaking special effects of Star Wars in 1977 but it wasn’t all that different to techniques of the past, especially compared to today’s computer driven images. Lucas was still using matte paintings, which for the uninitiated, are paintings on glass that have holes to shoot live action through.

     

    • #7
  8. Judge Mental Member
    Judge Mental
    @JudgeMental

    EJHill (View Comment):

    Everyone remembers the groundbreaking special effects of Star Wars in 1977 but it wasn’t all that different to techniques of the past, especially compared to today’s computer driven images. Lucas was still using matte paintings, which for the uninitiated, are paintings on glass that have holes to shoot live action through.

     

    The real leap forward was The Last Starfighter, a few years later.

    • #8
  9. namlliT noD Member
    namlliT noD
    @DonTillman

    EJHill (View Comment):

    Everyone remembers the groundbreaking special effects of Star Wars in 1977 but it wasn’t all that different to techniques of the past, especially compared to today’s computer driven images. Lucas was still using matte paintings, which for the uninitiated, are paintings on glass that have holes to shoot live action through.

    I always thought that the groundbreaking part of Star Wars was the robot camera that was programmed to zip around the model by taking the script for the model’s movements, and then calculating the mathematical inverse (camera <=> subject) to drive the robot.

    To me, that’s out-of-the-box thinking.

    • #9
  10. Judge Mental Member
    Judge Mental
    @JudgeMental

    namlliT noD (View Comment):

    EJHill (View Comment):

    Everyone remembers the groundbreaking special effects of Star Wars in 1977 but it wasn’t all that different to techniques of the past, especially compared to today’s computer driven images. Lucas was still using matte paintings, which for the uninitiated, are paintings on glass that have holes to shoot live action through.

    I always thought that the groundbreaking part of Star Wars was the robot camera that was programmed to zip around the model by taking the script for the model’s movements, and then calculating the mathematical inverse (camera <=> subject) to drive the robot.

    To me, that’s out-of-the-box thinking.

    Just in the last week I saw a video showing how they did the cool semi-circular camera pans around freeze frame scenes in The Matrix.  (Like the shot the first time Neo leans over backwards to dodge bullets.)

    They set up dozens of 35mm still cameras on tripods in a semicircle around the subject, raising or lowering the tripods so the viewpoint could also rise or descend, and then wired them together to all take a single synchronized shot (of Keanu hanging in a harness).  Then they spliced them all together, using one still shot for each frame of motion.

    • #10
  11. Gary McVey Contributor
    Gary McVey
    @GaryMcVey

    One thing that Kubrick and Lucas had in common was a dislike of the obvious build-up of graininess inherent in special effects that were made one stage at a time. They each re-invented fairly ordinary, long-known processes but were fanatically meticulous about making them as perfect as possible.

    Kubrick hated blue screen (it wasn’t green yet), the simple way that effects were done in the mid-’60s, so he used plenty of workarounds–teams of ladies who hand-traced the outline of a spacewalking astronaut, front projection (he also hated rear projection), and above all, exact numerical control of repeated takes that allowed one piece of film to be exposed one picture element at a time, and processed only after the final “layer” was in place. 

    Lucas made a hard, sensible compromise. He used blue screen. But, like Stan the Man, he made the most out of an old process, raising it from hack quality. And George used Kubrick’s microscopically exact, perfectly repeatable camera moves–say, filming an X-wing, then rewinding the film in the camera and doing the whole shot again, this time with only the Millennium Falcon, then a third pass, this time only for the tiny lights and fake “windows” on the Falcon. 

    • #11
  12. HankRhody Freelance Philosopher Contributor
    HankRhody Freelance Philosopher
    @HankRhody

    Gary McVey: The detailed plot of Brainstorm isn’t relevant to our theme, but the filming and presentation technique was. The original intention was to alternate dramatic scenes shown in regular 70mm at normal speeds, interspersed with the supposed “tapes” of actual human thought. This required switching to an alternate projector running at 60 frames a second, using Showscan’s subliminal effects to make the mind of the viewer perceive those minutes of the movie as being somehow realer than real.

    I kept waiting for continuation of this story. “What was difficult and expensive to manage with analogue projectors was accomplished by a team of bored students at the Orange County Film Tech School (Go OCFTS!) where a team of students spliced the movie together digitally, allowing this effect to be viewed by any dork with a YouTube connection.” 

    It seems like we’d be able to run that experiment.

    • #12
  13. Gary McVey Contributor
    Gary McVey
    @GaryMcVey

    Judge Mental (View Comment):

    EJHill (View Comment):

    Everyone remembers the groundbreaking special effects of Star Wars in 1977 but it wasn’t all that different to techniques of the past, especially compared to today’s computer driven images. Lucas was still using matte paintings, which for the uninitiated, are paintings on glass that have holes to shoot live action through.

     

    The real leap forward was The Last Starfighter, a few years later.

    That film had an interesting, almost unique business backstory. The initial funding came from investors in movie theaters, the idea being, “Let’s create our own Star Wars, and this time we get to keep most of the money”.  They got away with it, but the studios quietly passed the word that they didn’t like their usual dance partners, the theaters, cutting in on their action. 

    • #13
  14. Clavius Thatcher
    Clavius
    @Clavius

    Fascinating as always.  I remember hearing about this back in the 80s.  It does stand to reason that it would look more realistic.

    I heard a story (perhaps apocryphal) that Spielberg picked a 24 FPS digital camera for a film (I forget which) because “it looked more like a movie.”

    I know that a feature of Sony TVs is that they interpolate between frames to smooth out motion.  This gets more important as screens get bigger and your vision is dominated by the picture.

    Finally, I don’t know why TriStar elected to film Billy Lynn’s Long Halftime Walk (2016) in 120 fps HFR.

    • #14
  15. Judge Mental Member
    Judge Mental
    @JudgeMental

    Judge Mental (View Comment):

    namlliT noD (View Comment):

    EJHill (View Comment):

    Everyone remembers the groundbreaking special effects of Star Wars in 1977 but it wasn’t all that different to techniques of the past, especially compared to today’s computer driven images. Lucas was still using matte paintings, which for the uninitiated, are paintings on glass that have holes to shoot live action through.

    I always thought that the groundbreaking part of Star Wars was the robot camera that was programmed to zip around the model by taking the script for the model’s movements, and then calculating the mathematical inverse (camera <=> subject) to drive the robot.

    To me, that’s out-of-the-box thinking.

    Just in the last week I saw a video showing how they did the cool semi-circular camera pans around freeze frame scenes in The Matrix. (Like the shot the first time Neo leans over backwards to dodge bullets.)

    They set up dozens of 35mm still cameras on tripods in a semicircle around the subject, raising or lowering the tripods so the viewpoint could also rise or descend, and then wired them together to all take a single synchronized shot (of Keanu hanging in a harness). Then they spliced them all together, using one still shot for each frame of motion.

    BTW, I saw that in an Oliver Harper Retrospective/Review on YouTube.  If you don’t know these, you should check them out.  He doesn’t mean a review in the sense of whether it’s worth watching.  It’s more a review of every detail of the movie.  Script, development, director, producer, casting, special effects, score, and a deep dive on story line.  They run 15 to 45 minutes.  Worth your time.

    • #15
  16. Gary McVey Contributor
    Gary McVey
    @GaryMcVey

    HankRhody Freelance Philosopher (View Comment):

    Gary McVey: The detailed plot of Brainstorm isn’t relevant to our theme, but the filming and presentation technique was. The original intention was to alternate dramatic scenes shown in regular 70mm at normal speeds, interspersed with the supposed “tapes” of actual human thought. This required switching to an alternate projector running at 60 frames a second, using Showscan’s subliminal effects to make the mind of the viewer perceive those minutes of the movie as being somehow realer than real.

    I kept waiting for continuation of this story. “What was difficult and expensive to manage with analogue projectors was accomplished by a team of bored students at the Orange County Film Tech School (Go OCFTS!) where a team of students spliced the movie together digitally, allowing this effect to be viewed by any dork with a YouTube connection.”

    It seems like we’d be able to run that experiment.

    Trumbull is still alive. I’m actually a bit surprised that Paul Allen or some other thrillionaire hasn’t offered to restore the Showscan footage to its original form. 

    But not that surprised, because ol’ Doug was kinda weak in the script department; great ideas, not such a great sense of  dramatic development. There’s a dishonestly written, obnoxious subplot that involves the sinister US military grasping to take over the revolutionary new technology to read minds and (it’s hinted) torture prisoners. The film didn’t need it, but Trumbull obviously felt he was reaching for “relevance”. 

    • #16
  17. Percival Thatcher
    Percival
    @Percival

    Excellent again, Gary.

    It brings to mind the great holy wars between the analog vs digital audio aficionados. A war once fought out in the letters sections of Stereophile, Hi-Fi, Audiophile, EE Times, the parking lots of Radio Shacks …”Your sound is tinny!” “Your ‘warmth’ is just finely tuned distortion!” – and it never really ended, not really, just faded away into individual listening rooms worldwide. In some cavern back in the hills, a robed figure bathed in the faint glow of his tube amp gently lowers the diamond-tipped stylus onto Miles Davis’ “Kind of Blue” atop a Technics SP-10 and smiles.

    De gustibus non est disputandum.

    • #17
  18. SkipSul Inactive
    SkipSul
    @skipsul

    It’s not like 60fps is even out of reach of people on a budget now.  The latest ostensibly “stills” cameras by Sony and Canon can shoot 4k120, or 8k30, or 8k60 (with some restrictions until you get to the actual cinema grade cameras).  Quite a lot of YouTube amateur content is shot at at least 1080-60FPS, with more and more at 4k60 as the standard.  YouTube camera folks insist that the 60fps footage does better, even on small smartphone screens, because at 60fps it feels more intimate and lifelike.  Of course, unlike movie makers, they’re not trying to obscure reality, or even edit well – jump-cuts are considered normal, whereas cleanly stitched cuts are seen as being rather fake, and authenticity online is hard won but easily lost.

    60fps, as you note, has a tendency to feel fake when you’re aiming at fiction because you can see the lighting and the soundstage, and the camera shadows, and all the other stuff that isn’t supposed to break the 4th wall.  It is definitely a challenge for modern cinematographers.  We upgraded to a 4k screen at home when COVID hit (our old one was dying), and we showed the kids Jurassic Park. I remember seeing this on the big screen when it came out in ’93, and being impressed.  On this much smaller but faster screen though, the effects were unconvincing, the dinosaurs made of rubber, and their movements obviously mechanical.  It was akin, for me, to watching early Dr. Who episodes.  Other films that weren’t trying so hard, though, lost nothing – but then they weren’t effects-driven in the first place.

    • #18
  19. Mark Camp Member
    Mark Camp
    @MarkCamp

    Percival (View Comment):

    Excellent again, Gary.

    It brings to mind the great holy wars between the analog vs digital audio aficionados. A war once fought out in the letters sections of Stereophile, Hi-Fi, Audiophile, EE Times, the parking lots of Radio Shacks …”Your sound is tinny!” “Your ‘warmth’ is just finely tuned distortion!” – and it never really ended, not really, just faded away into individual listening rooms worldwide. In some cavern back in the hills, a robed figure bathed in the faint glow of his tube amp gently lowers the diamond-tipped stylus onto Miles Davis’ “Kind of Blue” atop a Technics SP-10 and smiles.

    De gustibus non est disputandum.

    As Jack Benny might have said, “Now, that’s intellectual.”

    (When he puts his mind to it, P. is among the best of our writers, at least in a sprint like this.  Probably doesn’t have any wind, though.)

    [EDIT: My sense of humor sometimes confuses my fellow Ricochetti; The “Probably…” was supposed to be funny.]

    • #19
  20. Hoyacon Member
    Hoyacon
    @Hoyacon

    Percival (View Comment):

    Excellent again, Gary.

    It brings to mind the great holy wars between the analog vs digital audio aficionados. A war once fought out in the letters sections of Stereophile, Hi-Fi, Audiophile, EE Times, the parking lots of Radio Shacks …”Your sound is tinny!” “Your ‘warmth’ is just finely tuned distortion!” – and it never really ended, not really, just faded away into individual listening rooms worldwide. In some cavern back in the hills, a robed figure bathed in the faint glow of his tube amp gently lowers the diamond-tipped stylus onto Miles Davis’ “Kind of Blue” atop a Technics SP-10 and smiles.

    De gustibus non est disputandum.

    You rang?  It’s a Rega table though.

    • #20
  21. Aaron Miller Inactive
    Aaron Miller
    @AaronMiller

    Video game graphics are commonly measured by both resolution and framerate. 

    High resolution alone can break immersion by revealing details the audience isn’t meant to notice. But a high resolution (such as is required for huge movie theater screens) can be counteracted by a low framerate. In other words, a high framerate alone does not break immersion, but can when it clarifies a high resolution. 

    Video games often simulate film effects like film grain, motion blur, depth of field, lens flare, and chromatic aberrations. Sometimes such trickery is for dramatic effect, but often it is to blend staged elements together because a collection of models similarly looks too neat to be real. A major advancement from one generation of games to another is the amount of details and clutter that fill a scene well beyond essential objects and elements. 

    A selling point of recent gaming hardware by Microsoft and Sony is the power to play many games at 60 FPS (frames per second). But 60 FPS isn’t the limit. Some games can be played on high-end PCs at 120 FPS. Still today, expansive and richly detailed “open world” games more commonly aim for a stable 30 FPS.

    That 30 FPS framerate remains acceptable to most gamers. But 60 FPS feels better for a medium defined by interactivity.

    There are debates and imperfect studies concerning typical “framerates” of human vision — the natural experience and upper practical limits. But the eyes and brain go about vision very differently than cameras or computers. I would say only that the leap in framerate is like the leap from 480p to 720p and finally 4K or 8K resolutions: the value of the new standard is best understood when you return to the old standard after being spoiled. 

    • #21
  22. Mark Camp Member
    Mark Camp
    @MarkCamp

    Aaron Miller (View Comment):

    Video game graphics are commonly measured by both resolution and framerate.

    High resolution alone can break immersion by revealing details the audience isn’t meant to notice. But a high resolution (such as required for huge movie theater screens) can be counteracted by a low framerate. In other words, a high framerate alone does not break immersion, but can when it clarifies a high resolution.

    Video games often simulate film effects like film grain, motion blur, depth of field, lens flare, and chromatic aberrations. Sometimes such trickery is for dramatic effect, but often it is to blend staged elements together because a collection of models similarly looks too neat to be real. A major advancement from one generation of games to another is the amount of details and clutter that fill a scene well beyond essential objects and elements.

    A selling point of recent gaming hardware by Microsoft and Sony is the power to play many games at 60 FPS (frames per second). But 60 FPS isn’t the limit. Some games can be played on high-end PCs at 120 FPS. Still today, expansive and richly detailed “open world” games more commonly aim for a stable 30 FPS.

    That 30 FPS framerate remains acceptable to most gamers. But 60 FPS feels better for a medium defined by interactivity.

    There are debates and imperfect studies concerning typical “framerates” of human vision — the natural experience and upper practical limits. But the eyes and brain go about vision very differently than cameras or computers. I would say only that the leap in framerate is like the leap from 480p to 720p and finally 4K or 8K resolutions: the value of the new standard is best understood when you return to the old standard after being spoiled.

    Wow. I have never heard of any of this.  So much of it is surprising to a simple engineer, who assumes simple, linearly independent, monotonic, intuitively obvious value functions. Faster is better, more pixels is better, no interactions between the two, etc. It is so much more subtle than that.

    • #22
  23. Gary McVey Contributor
    Gary McVey
    @GaryMcVey

    Mark Camp (View Comment):

    Percival (View Comment):

    Excellent again, Gary.

    It brings to mind the great holy wars between the analog vs digital audio aficionados. A war once fought out in the letters sections of Stereophile, Hi-Fi, Audiophile, EE Times, the parking lots of Radio Shacks …”Your sound is tinny!” “Your ‘warmth’ is just finely tuned distortion!” – and it never really ended, not really, just faded away into individual listening rooms worldwide. In some cavern back in the hills, a robed figure bathed in the faint glow of his tube amp gently lowers the diamond-tipped stylus onto Miles Davis’ “Kind of Blue” atop a Technics SP-10 and smiles.

    De gustibus non est disputandum.

    As Jack Benny might have said, “Now, that’s intellectual.”

    (When he puts his mind to it, P. is among the best of our writers, at least in a sprint like this. Probably doesn’t have any wind, though.)

    He’s wearing a suit of armor. He can’t do the 500 yard dash in that outfit. 

    • #23
  24. Gary McVey Contributor
    Gary McVey
    @GaryMcVey

    Aaron Miller (View Comment):

    Video game graphics are commonly measured by both resolution and framerate.

    High resolution alone can break immersion by revealing details the audience isn’t meant to notice. But a high resolution (such as required for huge movie theater screens) can be counteracted by a low framerate. In other words, a high framerate alone does not break immersion, but can when it clarifies a high resolution.

    Video games often simulate film effects like film grain, motion blur, depth of field, lens flare, and chromatic aberrations. Sometimes such trickery is for dramatic effect, but often it is to blend staged elements together because a collection of models similarly looks too neat to be real. A major advancement from one generation of games to another is the amount of details and clutter that fill a scene well beyond essential objects and elements.

    A selling point of recent gaming hardware by Microsoft and Sony is the power to play many games at 60 FPS (frames per second). But 60 FPS isn’t the limit. Some games can be played on high-end PCs at 120 FPS. Still today, expansive and richly detailed “open world” games more commonly aim for a stable 30 FPS.

    That 30 FPS framerate remains acceptable to most gamers. But 60 FPS feels better for a medium defined by interactivity.

    There are debates and imperfect studies concerning typical “framerates” of human vision — the natural experience and upper practical limits. But the eyes and brain go about vision very differently than cameras or computers. I would say only that the leap in framerate is like the leap from 480p to 720p and finally 4K or 8K resolutions: the value of the new standard is best understood when you return to the old standard after being spoiled.

    Fascinating stuff, Aaron, and thanks! It’s true that we don’t have exact criteria for finding a point of diminishing returns, but I’d gently push back that we can approximate where the response slope flattens markedly for the vast majority of viewers. There’s a threshold effect above 60 fps. Gamers are a specialized market that can detect 120. (Naval aviators and USAF pilots can distinguish important patterns, like the profile of enemy jets, in 1/220th of a second, but that’s an even more specialized field). With resolution, 8K really is close to human limits, unless TV screens in the home become 50 feet wide. 

    • #24
  25. Gary McVey Contributor
    Gary McVey
    @GaryMcVey

    SkipSul (View Comment):

    It’s not like 60fps is even out of reach of people on a budget now. The latest ostensibly “stills” cameras by Sony and Canon can shoot 4k120, or 8k30, or 8k60 (with some restrictions until you get to the actual cinema grade cameras). Quite a lot of Youtube amateur content is shot at at least 1080-60FPS, with more and more at 4k60 as the standard. Youtube camera folks insist that the 60fps footage does better, even on small smartphone screens, because at 60fps it feels more intimate and lifelike. Of course, unlike movie makers, they’re not trying to obscure reality, or even edit well – jump-cuts are considered normal, whereas cleanly stitched cuts are seen as being rather fake, and authenticity online is hard won but easily lost.

    60fps, as you note has a tendency to feel fake when you’re aiming at fiction because you can see the lighting and the soundstage, and the camera shadows, and all the other stuff that isn’t supposed to break the 4th wall. It is definitely a challenge for modern cinematographers. We upgraded to a 4k screen at home when COVID hit (our old one was dying), and we showed the kids Jurassic Park. I remember seeing this on the big screen when it came out in ’93, and being impressed. On this much smaller but faster screen though, the effects were unconvincing, the dinosaurs made of rubber, and their movements obviously mechanical. It was akin, for me, to watching early Dr. Who episodes. Other films that weren’t trying so hard, though, lost nothing – but then they weren’t effects-driven in the first place.

    You and @clavius are the real experts in digital imaging here. (Hell, when Clavius images a star field, he uses actual stars. But then, the studio lot he works on was once known for being the home of “more stars than there are in heaven.”) It is remarkable how much (formerly) still cameras have entered professional filmmaking in the past generation. 

    • #25
  26. Clavius Thatcher
    Clavius
    @Clavius

    Gary McVey (View Comment):

    SkipSul (View Comment):

    It’s not like 60fps is even out of reach of people on a budget now. The latest ostensibly “stills” cameras by Sony and Canon can shoot 4k120, or 8k30, or 8k60 (with some restrictions until you get to the actual cinema grade cameras). Quite a lot of Youtube amateur content is shot at at least 1080-60FPS, with more and more at 4k60 as the standard. Youtube camera folks insist that the 60fps footage does better, even on small smartphone screens, because at 60fps it feels more intimate and lifelike. Of course, unlike movie makers, they’re not trying to obscure reality, or even edit well – jump-cuts are considered normal, whereas cleanly stitched cuts are seen as being rather fake, and authenticity online is hard won but easily lost.

    60fps, as you note has a tendency to feel fake when you’re aiming at fiction because you can see the lighting and the soundstage, and the camera shadows, and all the other stuff that isn’t supposed to break the 4th wall. It is definitely a challenge for modern cinematographers. We upgraded to a 4k screen at home when COVID hit (our old one was dying), and we showed the kids Jurassic Park. I remember seeing this on the big screen when it came out in ’93, and being impressed. On this much smaller but faster screen though, the effects were unconvincing, the dinosaurs made of rubber, and their movements obviously mechanical. It was akin, for me, to watching early Dr. Who episodes. Other films that weren’t trying so hard, though, lost nothing – but then they weren’t effects-driven in the first place.

    You and @clavius are the real experts in digital imaging here. (Hell, when Clavius images a star field, he uses actual stars. But then, the studio lot he works on was once known for being the home of “more stars than there are in heaven.”) It is remarkable how much (formerly) still cameras have entered professional filmmaking in the past generation.

    It is interesting that you can have too many, too-small pixels in astrophotography.  Long focal length telescopes work better with larger pixels, because you don’t want the star spreading across too many pixels, which reduces sensitivity.  Conversely, you want the pixels small enough that they are smaller than the stars, so the stars aren’t blocky. 1-2 arcseconds/pixel is the target.
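
    The arithmetic behind that target is the standard plate-scale rule of thumb (206,265 arcseconds per radian); the numbers in this sketch are hypothetical, not anyone’s actual rig:

    ```python
    # Plate scale: arcseconds of sky per pixel. The 206.265 constant comes
    # from 206,265 arcsec per radian, with pixel pitch in microns and
    # focal length in millimeters. Example numbers are hypothetical.
    def plate_scale(pixel_um, focal_mm):
        return 206.265 * pixel_um / focal_mm

    print(plate_scale(4.5, 800))   # ~1.16"/px, inside the 1-2"/px target
    print(plate_scale(4.5, 2800))  # ~0.33"/px, oversampled on a 2800mm scope
    ```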

    For solar system imaging (Moon, planets, Sun), one best uses a video camera.  Light sensitivity isn’t the issue; it is noise from the atmosphere blurring the image.  But if you take a couple-thousand-frame sequence, you can find a couple hundred good frames which, when averaged together, can make stunning images.
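
    Roughly what the stacking programs automate, as a simplified sketch assuming OpenCV and a planetary video: score every frame for sharpness, keep the best few percent, and average them. The real tools (Registax, AutoStakkert) also align each frame sub-pixel and reject outliers (sigma clipping), and the filename here is hypothetical.

    ```python
    # A simplified "lucky imaging" stack: keep only the sharpest frames
    # of a planetary video and average them. Real stackers also align
    # frames sub-pixel and use outlier rejection such as sigma clipping.
    import cv2
    import numpy as np

    def stack_best_frames(video_path, keep_fraction=0.10):
        cap = cv2.VideoCapture(video_path)
        frames, scores = [], []
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            frames.append(gray.astype(np.float32))
            # Variance of the Laplacian: a common one-number sharpness score.
            scores.append(cv2.Laplacian(gray, cv2.CV_64F).var())
        cap.release()
        keep = max(1, int(len(frames) * keep_fraction))
        best = np.argsort(scores)[-keep:]  # indices of the sharpest frames
        return np.mean([frames[i] for i in best], axis=0).astype(np.uint8)

    # stacked = stack_best_frames("saturn.avi")  # hypothetical filename
    ```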

    • #26
  27. SkipSul Inactive
    SkipSul
    @skipsul

    Clavius (View Comment):
    It is interesting that you can have too many, too small pixels in astrophotography. Long focal length telescopes work better with larger pixels because you don’t want the star spreading across too many pixels which reduces sensitivity. Conversely, you want the pixels small enough so that they are smaller than stars so that the stars aren’t blocky. 1-2 arcseconds/pixel is the target.

    The larger the pixels are, the more light-sensitive they are.  I’ve got a Sony A7RIII, which is a 42MP rig, but it has far better dynamic range than a Sony A7RIV at 62MP.  And neither can touch the Sony “S” series, which are only 12MP but can practically see in the dark.

    That being said, I wish I had that 62MP beast when I could pick up Saturn, simply because a greater density of pixels on the sensor would have yielded a better image.  Saturn, Jupiter, and even Mars are so big and so close that a higher density would have, I think, made up for only having an effective 800mm (400mm focal length plus a 2x converter).  But I’m still learning, and Ohio is so overcast so often that it might be months between attempts for me.

    • #27
  28. Clavius Thatcher
    Clavius
    @Clavius

    SkipSul (View Comment):

    Clavius (View Comment):
    It is interesting that you can have too many, too small pixels in astrophotography. Long focal length telescopes work better with larger pixels because you don’t want the star spreading across too many pixels which reduces sensitivity. Conversely, you want the pixels small enough so that they are smaller than stars so that the stars aren’t blocky. 1-2 arcseconds/pixel is the target.

    The larger the pixels are, the more light-sensitive they are. I’ve got a Sony A7RIII, which is a 42MP rig, but it has far better dynamic range than a Sony A7RIV at 62MP. And neither can touch the Sony “S” series, which are only 12MP but can practically see in the dark.

    That being said, I wish I had that 62MP beast when I could pick up Saturn, simply because a greater density of pixels on the sensor would have yielded a better image. Saturn, Jupiter, and even Mars are so big and so close that a higher density would have, I think made up for only having an effective 800mm (400mm focal length plus a 2x converter). But I’m still learning, and Ohio is so overcast so often that it might be months between attempts for me.

    Take a movie instead of a still shot and use Registax / AVI Stack / AutoStakkert or some other frame stacking software.  Believe me, it is like magic.

    And that’s not bad; you can see Saturn.  But here is Saturn with a web cam, almost 14 years ago:

    [Image: Saturn, March 31, 2007. C-11 @ f/40; stacked and sharpened with Registax using sigma clip; color balance and noise reduction in PixInsight; final processing in Photoshop.]

    • #28
  29. SkipSul Inactive
    SkipSul
    @skipsul

    Clavius (View Comment):
    Take a movie instead of a still shot and use Registax / AVI Stack / AutoStakkert or some other frame stacking software. Believe me, it is like magic.

    I’ll look at those.  I haven’t been doing much more than doinking around with what I have so far – nothing serious.  

    • #29
  30. Aaron Miller Inactive
    Aaron Miller
    @AaronMiller

    Gary McVey (View Comment):
    There’s a threshold effect above 60 fps. Gamers are a specialized market that can detect 120. (Naval aviators and USAF pilots can distinguish important patterns, like the profile of enemy jets, in 1/220th of a second, but that’s an even more specialized field). With resolution, 8K really is close to human limits, unless TV screens in the home become 50 feet wide. 

    In part, framerate is offset in gaming by input lag. Combined with “response time” and other factors, this is the delay between when the player communicates his intent on a controller (joystick, mouse, keyboard, etc.), when the game communicates that input to the TV, and when the TV displays the consequent action. 

    Input lag is measured in milliseconds. It’s a testament to the human brain that such a minute measure of time as 15 milliseconds can be significant. Probably, as reflexes slow with age, a gamer’s experience of input lag also changes. 

    Training might also factor in. Perhaps a fighter pilot responds quicker with practice not only because he can assess a situation quicker but also because his brain has adapted to a need for less mitigation between assessment and response. 

    For most games, input lag is negligible or else colors the experience in a way that does not impact player performance. Games it does impact are commonly referred to as “twitch” or reflex games. Primary among these are FPS (First-Person Shooter) games and vehicle (racing or aerial combat) games. But “side-scrollers” and “platformers” also benefit. 

    In competitive multiplayer gaming, Internet latency is yet another complicating factor in player performance. 

    There is a significant hardware market for gamers and TV enthusiasts who want the best. But most gamers are as content with 30 FPS as moviegoers are content with HD, I think. 
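
    To put those milliseconds in scale, here’s a toy lag budget with made-up numbers (none of these are measured values), compared against the 16.7 ms frame time at 60 FPS:

    ```python
    # Toy input-lag budget with made-up numbers, summing the stages
    # described above: controller -> game -> display.
    FRAME_MS_60FPS = 1000 / 60  # ~16.7 ms per frame at 60 FPS

    lag_ms = {
        "controller_poll": 4,  # hypothetical controller polling delay
        "game_logic": 8,       # hypothetical half-frame of simulation
        "tv_processing": 15,   # hypothetical TV delay with game mode off
    }
    total = sum(lag_ms.values())
    print(f"{total} ms = {total / FRAME_MS_60FPS:.1f} frames at 60 FPS")
    ```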

    • #30