Of Myths and Math

Several people have asked me to expound a bit on digital projection and use math to refute the claims others are making. I have been reluctant to do it because:

a) everyone hates math in blogs and
b) I already think I go on about digital too much.

But that said, I’m going to do what was requested (now several months ago). And now, I’ll get the inevitable hate mail that “you hate digital! You’re a luddite, we hate you, move with the times.” And once again, I point out that I don’t hate digital at all. I don’t like cheap digital that’s passed off as perfection. And the new projectors are cheap digital. We were so enamored of the idea that we could save money that we jumped in head-first before the technology was ready.  (I point out, to those newbies, that I did the restoration of King of the Kongo in digital, and then it went out to film.)

Now, I always encourage you to disbelieve me. After all, people call me stupid and wrong all the time, especially on Facebook. (Facebook is the great open pasture where everyone is wrong and no one is convinced about anything.) I carefully referenced everything here, so you can look things up. Even though I may be stupid and wrong, do you really think all these links are stupid and wrong, too? Well, judging by some political polls, a lot of you do. But I digress.

Let’s hit the biggest myth first:

MYTH: Digital projection is really better than film already (or at least almost as good) and the only people who don’t like it are elitist whiner punks, the same ones who didn’t like CDs over vinyl.

MATH: This is wrong. It’s demonstrably wrong, and it all comes down to sampling. A digital signal is sampled ( http://en.wikipedia.org/wiki/Sampling_(signal_processing) ). The sampling rate of a CD is 44.1kHz (44,100 samples per second). Under the Nyquist sampling theorem ( http://en.wikipedia.org/wiki/Nyquist–Shannon_sampling_theorem ), the highest frequency that can be reliably reproduced on a CD is half of that: 22,050Hz.

The highest frequency an unimpaired human ear can hear measures in at about 20,000Hz.

THIS MEANS THAT IF THERE IS SAMPLING DISTORTION IN A CD, THEN YOU CAN’T HEAR IT. If your dog complains that he doesn’t like the sound of a CD, then you should listen to him. And if he does that, then you must be Dr. Doolittle. (Please, no singing.)

At least this is true for the time-based (temporal) sampling. There are good arguments that the limited dynamic range causes problems with things like hall ambience, but those arguments mostly come from elitist whiner punks. (I’m kidding, but not by a lot… CD technology is mathematically pretty sound.)
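If you want to poke at this yourself, here’s a minimal sketch (mine, using NumPy; nothing in the original argument depends on it) of the Nyquist limit at work: a 20,000Hz tone, the top of human hearing, comes back intact at the CD’s 44.1kHz rate, while the same tone sampled at a hypothetical 32kHz rate folds down to a false, aliased frequency.

```python
# A minimal Nyquist demonstration (illustration only, not from the post).
import numpy as np

def dominant_frequency(tone_hz, sample_rate, seconds=1.0):
    """Sample a pure tone, then report where its energy lands in the spectrum."""
    t = np.arange(int(sample_rate * seconds)) / sample_rate
    samples = np.sin(2 * np.pi * tone_hz * t)
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]

print(dominant_frequency(20000, 44100))  # ~20000 Hz: below Nyquist, captured faithfully
print(dominant_frequency(20000, 32000))  # ~12000 Hz: above Nyquist, aliased to 32000 - 20000
```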

Now let’s apply the same reasoning to digital imaging. This time, let’s not even bother with the limits of human sight, the way we leaned on the limits of human hearing for audio. Let’s just ask digital to be as good as film. How’s that for fair? Have we ever measured the resolution of film?

Well, sure we have. And I’ll even be extra fair. I’ll go back to the 1980s when we first did this, back when film had lower resolution than it has now. How much nicer can I be?

Back in the 1980s, there was a groundbreaking movie made called Tron. It was the first film to make extensive use of computer graphics. The makers of Tron wanted to make sure that their generated images didn’t show “jaggies,” also known as stair-stepping. That’s where you can see the pixels in the output device, which in this case was film ( http://en.wikipedia.org/wiki/Jaggies ).

So, they tested their system, and they discovered that they needed to run 4000 lines of resolution before you couldn’t see the jaggies. Don’t believe me? Let’s look at another source:
https://design.osu.edu/carlson/history/tree/magi.html

I’ve actually seen the machine they used to do this. It’s at Blue Sky Studios now.  Here is a picture of it:

[Photo: the machine, now at Blue Sky Studios]

Now, 4000 lines is what you need for a native digital image, meaning an image that started life digitally rather than being scanned from an outside source. If you’re sampling the analog world, as with a camera or a scanner, Nyquist’s rule says you need twice that: 8000 lines. You wanna know why they’re scanning Gone With the Wind at 8K? Now you know.
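Here’s the same bookkeeping as a tiny sketch (my own illustration; the 4000-line figure is the one from the Tron-era tests above): a native digital image needs about 4000 lines, and sampling an analog original like film needs twice that.

```python
# Nyquist bookkeeping for scanning film (illustration only).
NATIVE_LINES_NEEDED = 4000  # lines before jaggies disappear, per the Tron-era tests

def scan_lines_needed(native_lines):
    """Sampling an analog source needs twice the resolution of the source."""
    return 2 * native_lines

print(scan_lines_needed(NATIVE_LINES_NEEDED))  # 8000 -- hence the 8K scans
```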

So you’d expect today’s digital projectors to be about 4000 lines if they’re as good as film, right? Let’s see what the specs are.  This is the list for the Digital Cinema Package (DCP), which is the standard for motion picture digital projection.

http://en.wikipedia.org/wiki/Digital_Cinema_Package

There are two formats used in DCPs. 2K and 4K. 2000 lines and 4000 lines, right?

DCP 2K = 2048×1080
DCP 4K = 4096×2160

That’s 2048 pixels wide (columns) by 1080 pixels high (lines) and 4096 pixels wide (columns) by 2160 pixels high (lines).

OK, so wait, that means 2048 pixels WIDE by 1080 LINES, right? So the Tron rule says we should be seeing 4000 lines, and we’re seeing 1080? And the 4K top-end projectors, the ones not many theaters use, are running 2160????

So 2K is a big lie. It’s 2K horizontal, not vertical. It’s really 1K.

That’s about half the resolution the label implies, and barely a quarter of the 4000 lines that film delivers.

Don’t blame me. Blame the math.

Oh, and you know how Quentin Tarantino is always complaining about digital projection being “TV in public?” http://www.digitalspy.com/movies/news/a441960/quentin-tarantino-i-cant-stand-digital-filmmaking-its-tv-in-public.html

Well, what’s HDTV? Don’t believe me; see the specs here:

http://en.wikipedia.org/wiki/1080p

Wikipedia says it’s 1920×1080. But wait a second: 2K DCP, used in theaters all over the world, is 2048×1080. HDTV is almost identical to 2K theatrical projection.

Quentin Tarantino is right: Digital film presentation is TV in public, almost literally. Sure the screen is bigger, but that only makes the pixels show up more.  (We can argue about a lot of other things Tarantino says, but the math is behind him on this one.)
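To put the numbers from the last few paragraphs side by side, here’s a trivial sketch (mine, not from any spec sheet) comparing vertical lines against the roughly 4000-line benchmark the Tron tests established for film.

```python
# Vertical resolution vs. the film benchmark (illustration only).
FILM_BENCHMARK_LINES = 4000  # lines needed before jaggies vanish (Tron-era tests)

formats = {
    "DCP 2K (2048x1080)": 1080,
    "DCP 4K (4096x2160)": 2160,
    "HDTV 1080p (1920x1080)": 1080,
}

for name, lines in formats.items():
    share = lines / FILM_BENCHMARK_LINES
    print(f"{name}: {lines} lines, {share:.0%} of the film benchmark")
```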

MYTH: Even though you just showed it isn’t as sharp, it looks better in my theater than the 35mm did, so you’re still wrong.

MATH: The digital projectors look nicer because the 35mm projectors in your old theater were junky, maladjusted, and old. They were run by projectionists and technicians who didn’t care about adjusting things correctly.  Sometimes there hadn’t been a technician in the theater in decades.  No, that isn’t a joke.

Further, for many years now, Hollywood has been churning out prints made from something called DIs: Digital Intermediates. These are film prints made from digital masters, and almost all of them are made at 2K (1080 lines). Is it any wonder that a soft print run through an 80-year-old projector with greasy lenses doesn’t look as good as a new digital projector showing the same thing? (Digital intermediates started around 2000; the first major film to use one was O Brother, Where Art Thou?)

Try projecting a REAL 35mm print, made from good materials, especially an old print or a first-rate digital one. Then compare that to digital projection. It’s not even close.

I projected a 35mm print of Samsara a few years ago, and at first I thought there was something wrong with it. Why was it so sharp? It looked like an old Technicolor print. The answer: digital imaging at 6K, originated on 65mm film. Worth seeing.

MYTH: There’s nothing to misadjust on digital projectors, so they’re going to be more reliable than the 35mm projectors.

MATH: I know projector repairmen, and they tell me the digital projectors break down more often. I don’t have a lot of measurable math here, because it’s still early, but I’ve seen these sensitive projectors break down very often, and the lamps often turn green before they fail. And since there’s no projectionist in the booth most of the time, there’s no one to report arc misfires, dirty lenses, etc.

Oh, and the projector is a server on the internet, with a hard drive in it. Computer users will tell you that the internet never crashes, and further, hard drives are 100% reliable. I was working in a theater once where the movie stopped running because someone in Los Angeles accidentally turned off the lamp. (Since the projector was on the internet, some schmo accidentally shut off the wrong projector. Nothing we could do about it.)

Digital projectors can be out of focus, they are sensitive to popcorn oil, and they have reflectors that are sensitive and need to be replaced. Don’t think that digital means reliable.

MYTH: 35mm prints just inherently get beaten up, so they don’t look good even after a few days.

MATH: Dirty projectors and careless projectionists cause most of the print damage you will ever see. Hollywood has had a war on projectionists for about 50 years, and now they’ve killed them off. For the last 35 years, most projectionists have been minimum-wage workers with little-to-no training. They do double-duty on the film and in the snack bar.

These are known in the trade as popcorn monkeys. Please blame them for most print abuse.

MYTH: The credits on digital films look sharper than they do on film, so that means that digital is sharper, no matter what you say.

MATH: Digital imaging is prone to something called aliasing ( http://en.wikipedia.org/wiki/Aliasing ). Aliasing means just what you’d think it might: due to sampling problems, the signal you end up with is different from the one you started with. It goes under an alias. This gets really technical, but you remember the old days, when video games had big, chunky pixels? (Hipsters, you won’t remember this, but if you’re the typical hipster who doesn’t think anything worth knowing happened before your birth, then you won’t be reading this anyway.) Remember this kind of blocky image:

[Image: blocky, undersampled Lincoln portrait (lincblock)]

This image is undersampled (meaning that we should have more pixels in it than we do). The blockiness is spatial aliasing, which means that we are getting a different signal out than we put in! Normally, we should filter this until the blocks go away, because in the math world those blocks are high-frequency information that is bogus and not part of the signal (remember Nyquist?).

If we do the recommended filtering, it should look more like this:

[Image: filtered, softer Lincoln portrait (lincsoft)]

This picture more accurately represents the input signal, although it’s blurry, and that’s OK, because the undersampling lost us the high-frequency information (the sharp details) in the image.

Now, I’ve already shown you that the digital image is undersampled, but let’s take a look at credits. Instead of Lincoln, let’s take a look at an undersampled F:

[Image: blocky, undersampled F (fblock)]
Now, wait, that looks a lot better than Lincoln, right? If we filter it so we get the actual image we should have been sampling, it should look something like this:

[Image: properly filtered, softer F (fsoft)]
But, wait, I hear you cry: the blocky undersampled Lincoln looked bad, and the blocky undersampled F looks better than the properly filtered F. WHY IS THAT?

That’s because the blockiness of the undersampling just happens to favor the correct look of the F. In other words, we’re getting a distorted signal out, but the distortion gives us a more pleasing image! Credits will look sharper on digital projection, because they don’t do the proper edge filtering. This is why a lot of people complain they can see the pixels in the projection. (You can see the pixels in Lincoln, too, before the filtering.)  If you did the proper filtering, you wouldn’t see the pixels, but then the credits would look softer again.

Now, I picked the ideal case with the F, where every part of it was a vertical or horizontal line. The worst-case scenario is an S.

An undersampled S of the same scale as the F looks like this:

[Image: blocky, undersampled S (sblock)]
But with proper filtering, it looks like this:

[Image: properly filtered, softer S (ssoft)]

Your brain filters this out when you’re watching credits; you tend to latch onto the vertical and horizontal edges, like the ones in the F, because those are the cues the brain reads. This is also why filmmakers are now favoring sans-serif fonts: they render better at low resolution.

So the credits aren’t sharper. It’s an illusion caused by undersampling and your brain. And I showed you with a minimum of math. YAY!
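If you want to see the tradeoff yourself, here’s a minimal sketch (mine, NumPy only; the letter images above weren’t generated this way) of the two choices: naive decimation keeps the blocky, aliased look, while averaging each block before decimating (a crude low-pass filter) gives the softer but more faithful result.

```python
# Undersampling with and without a crude anti-aliasing filter (illustration only).
import numpy as np

def make_diagonal_edge(size=64):
    """High-resolution test image: white on one side of a diagonal edge, black on the other."""
    y, x = np.mgrid[0:size, 0:size]
    return (x > y).astype(float)

def decimate(image, factor):
    """Naive undersampling: just keep every Nth pixel (this is what aliases)."""
    return image[::factor, ::factor]

def filter_then_decimate(image, factor):
    """Crude anti-aliasing: average each factor-by-factor block, then sample."""
    h, w = image.shape
    trimmed = image[:h - h % factor, :w - w % factor]
    return trimmed.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

high_res = make_diagonal_edge()
print(decimate(high_res, 8))              # hard 0/1 staircase: the jaggies
print(filter_then_decimate(high_res, 8))  # graded values along the edge: softer, truer
```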

Fun with signal processing!!!

MYTH: OK, Mr. Boring Math Guy, I still think the digital stuff looks better. Can you show me a comparison of what you see in digital projection vs. what it should look like?

MATH: Why yes! Thank you for asking!!!

What I notice most about digital projection is that the apparent sharpness has been boosted with edge enhancement (an unsharp mask), which makes edges look more pronounced. It also causes edging artifacts (called ringing) that I find obnoxious.
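Here’s a minimal sketch (my own illustration, NumPy only) of why aggressive edge enhancement produces that ringing: an unsharp mask adds back the difference between the signal and a blurred copy of it, and around a hard edge that difference overshoots on both sides.

```python
# Unsharp masking overshoots around a hard edge (illustration only).
import numpy as np

edge = np.concatenate([np.zeros(10), np.ones(10)])  # a clean step edge

def unsharp_mask(signal, radius=2, amount=1.5):
    """Boost apparent sharpness by adding back (signal - blurred signal)."""
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)  # simple box blur
    blurred = np.convolve(signal, kernel, mode="same")
    return signal + amount * (signal - blurred)

sharpened = unsharp_mask(edge)
print(sharpened[5:15].round(2))
# Dips below 0 just before the edge and overshoots above 1 just after:
# those halos are the edging artifacts (ringing) around titles and hard outlines.
```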

Further, the green response is compromised rather a lot. Most digital projection I see today represents all grass as an annoying neon green. It can’t seem to represent a range of colors. We’re also getting an overly red look to compensate for the strange greens. Let’s take an ideal case: this is a Kodak color sample:

[Image: Kodak color sample, as it should look (kodak real)]

Now, I’ve exaggerated this to make it more obvious for you, but here’s what I see in most digital projection:

[Image: the same Kodak sample, simulating typical digital projection (kodak fake)]

Notice we’re seeing almost no green in the avocado, the grapes look dead, the flesh tones are too red, the whites are AAAAALMOST blown out, and we’re seeing edge artifacts from over-sharpening.

This, folks, is why I miss film. We could do digital and do it well, but we’re not.

And, you ask, why is it that we just don’t use more pixels, use better color projection, so we don’t have to do this?  It’s because more pixels = more hard drive space = bigger computer needed = more cost.  Since Hollywood is in love with cheap, they’re not going to do it right until right is cheap.
