Of Myths and Math

Several people have asked me to expound a bit on digital projection and use math to refute the claims others are making. I have been reluctant to do it because:

a) everyone hates math in blogs and
b) I already think I go on about digital too much.

But that said, I’m going to do what was requested (now several months ago). And now, I’ll get the inevitable hate mail that “you hate digital! You’re a luddite, we hate you, move with the times.” And once again, I point out that I don’t hate digital at all. I don’t like cheap digital that’s passed off as perfection. And the new projectors are cheap digital. We were so enamored of the idea that we could save money that we jumped in head-first before the technology was ready.  (I point out, to those newbies, that I did the restoration of King of the Kongo in digital, and then it went out to film.)

Now, I always encourage you to disbelieve me. After all, people call me stupid and wrong all the time, especially on Facebook. (Facebook is the great open pasture where everyone is wrong and no one is convinced about anything.) I carefully referenced everything here, so you can look things up. Even though I may be stupid and wrong, do you really think all these links are stupid and wrong, too? Well, judging by some political polls, a lot of you do. But I digress.

Let’s hit the biggest myth first:

MYTH: Digital projection is really better than film already (or at least almost as good) and the only people who don’t like it are elitist whiner punks, the same ones who didn’t like CDs over vinyl.

MATH: This is wrong. It’s demonstrably wrong, and it’s all about sampling. A digital signal is sampled ( http://en.wikipedia.org/wiki/Sampling_(signal_processing) ). The sampling rate of a CD is 44.1kHz (44,100 samples per second). Under the Nyquist–Shannon sampling theorem ( http://en.wikipedia.org/wiki/Nyquist–Shannon_sampling_theorem ), the highest frequency a CD can reliably reproduce is half of that: 22,050Hz.

The highest human hearing, for an unimpaired individual, measures in at about 20,000Hz.

THIS MEANS THAT IF THERE IS SAMPLING DISTORTION IN A CD, THEN YOU CAN’T HEAR IT. If your dog complains that he doesn’t like the sound of a CD, then you should listen to him. And if he does that, then you must be Dr. Doolittle. (Please, no singing.)

At least this is true for the time-based (temporal) sampling. There are good arguments about the dynamic range causing problems with things like hall ambience and so on, but those arguments are usually made by elitist whiner punks. (I’m kidding, but not a lot… the CD technology is mathematically pretty sound.)
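
For the curious, the fold-over that Nyquist describes can be sketched in a few lines of Python. This is my own illustration, and the 25kHz test tone is an arbitrary choice:

```python
# A sampled system cannot tell a tone of frequency f from f + k*fs, or from
# its mirror image about the Nyquist frequency (fs/2). This computes the
# frequency you'd actually get back after sampling a pure tone.

def alias_frequency(f, fs):
    """Apparent frequency of a pure tone f (Hz) after sampling at rate fs (Hz)."""
    f = f % fs                # sampling can't distinguish f from f + k*fs
    return min(f, fs - f)     # ...or from its mirror about fs/2

fs = 44_100                          # CD sampling rate
print(alias_frequency(1_000, fs))    # 1000  -> below Nyquist, comes back unchanged
print(alias_frequency(25_000, fs))   # 19100 -> above Nyquist, folds back down
```

Anything under 22,050Hz comes through untouched; push past Nyquist and the tone masquerades as a lower one. That fold-over is exactly the distortion a properly made CD never has to hand to your (human) ears.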

Now let’s do the same thing for digital imaging. We won’t even bother with the limits of human sight, the way we used the limits of human hearing for audio. Let’s just ask digital to be as good as film. How’s that for fair? Have we ever measured the resolution of film?

Well, sure we have. And I’ll even be extra fair. I’ll go back to the 1980s when we first did this, back when film had lower resolution than it has now. How much nicer can I be?

Back in the 1980s, there was a groundbreaking movie called Tron. It was the first film to make extensive use of computer graphics. The makers of Tron wanted to make sure that the generated images didn’t show “jaggies,” also known as stair-stepping. This is where you can see the pixels in the output device, which in this case was film ( http://en.wikipedia.org/wiki/Jaggies ).

So, they tested their system, and they discovered that they needed to run 4000 lines of resolution before you couldn’t see the jaggies. Don’t believe me? I’ve actually seen the machine they used to do this. It’s at Blue Sky Studios now.

Now, 4000 lines is what’s needed for a native digital image, one that started life digitally rather than being scanned from an outside source. If you’re sampling the analog world, as with a camera or a film scanner, you’d need to follow Nyquist’s rule and use 8000 lines. You wanna know why they’re scanning Gone With the Wind at 8K? Now you know.

So you’d expect today’s digital projectors to run about 4000 lines if they’re as good as film, right? Let’s see what the specs say for the Digital Cinema Package (DCP), the standard for motion picture digital projection.


There are two formats used in DCPs: 2K and 4K. That’s 2000 lines and 4000 lines, right?

DCP 2K = 2048×1080
DCP 4K = 4096×2160

That’s 2048 pixels wide (columns) by 1080 pixels high (lines) and 4096 pixels wide (columns) by 2160 pixels high (lines).

OK, so wait: that means 2048 pixels WIDE by 1080 LINES, right? The Tron test says we should be seeing 4000 lines, and we’re seeing 1080? And the 4K top-end projectors, which not many theaters use, are showing 2160????

So 2K is a big lie. The 2K is horizontal, not vertical. Vertically, it’s really 1K.

That’s half the resolution the label promises, and barely more than a quarter of the 4000 lines the Tron test says film delivers.
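
The arithmetic is short enough to put in a few lines of Python. The 4000-line figure is the Tron test from above; the rest are the DCP specs:

```python
# Lines of vertical resolution, per the DCP spec, against the ~4000 lines
# the Tron tests found were needed to match film.
film_lines = 4000          # native resolution needed to hide jaggies (Tron test)

dcp = {"2K": 1080, "4K": 2160}
for name, lines in dcp.items():
    print(f"DCP {name}: {lines} lines = {lines / film_lines:.0%} of film")

# And Nyquist again: scanning film (sampling an analog source) needs double.
print("Scanning resolution needed:", 2 * film_lines, "lines")
```

That prints 27% of film for 2K and 54% for 4K, plus the 8000-line scanning figure from the Gone With the Wind example.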

Don’t blame me. Blame the math.

Oh, and you know how Quentin Tarantino is always complaining about digital projection being “TV in public”? http://www.digitalspy.com/movies/news/a441960/quentin-tarantino-i-cant-stand-digital-filmmaking-its-tv-in-public.html

Well, what’s HDTV? Don’t believe me, check the specs: Wikipedia says it’s 1920×1080. But wait a second: 2K DCP, used in theaters all over the world, is 2048×1080. HDTV is almost identical to 2K theatrical projection.

Quentin Tarantino is right: digital film presentation is TV in public, almost literally. Sure, the screen is bigger, but that only makes the pixels show up more.  (We can argue about a lot of other things Tarantino says, but the math is behind him on this one.)

MYTH: Even though you just showed it isn’t as sharp, it looks better in my theater than the 35mm did, so you’re still wrong.

MATH: The digital projectors look nicer because the 35mm projectors in your old theater were junky, maladjusted, and old. They were run by projectionists and technicians who didn’t care about adjusting things correctly.  Sometimes a technician hadn’t set foot in the theater in decades.  No, that isn’t a joke.

Further, for many years now, Hollywood has been churning out prints made from something called DIs: Digital Intermediates. These are film prints made from digital masters, and almost all of them are made at 2K (1080 lines). Is it any wonder that a soft print run through an 80-year-old projector with greasy lenses doesn’t look as good as a new digital projector showing the same thing? (Digital intermediates started around 2000; the first major film to use one was O Brother, Where Art Thou?)

Try projecting a REAL 35mm print, made from good materials, whether an older photochemical print or one struck from a first-rate digital master. Then compare that to digital projection. It’s not even close.

I projected a 35mm print of Samsara a few years ago and thought there was something wrong with it. Why was it so sharp? It looked like an old Technicolor print. The answer: it was originated on 65mm film with digital imaging at 6K. Worth seeing.

MYTH: There’s nothing to misadjust on digital projectors, so they’re going to be more reliable than the 35mm projectors.

MATH: I know projector repairmen, and they tell me the digital projectors break down more often. I don’t have much measurable math yet, because it’s early, but I’ve seen these sensitive projectors break down very often, and the lamps often turn green before they fail. And since there’s no projectionist in the booth most of the time, there’s no one to report arc misfires, dirty lenses, and the rest.

Oh, and the projector is a server on the internet, with a hard drive in it. Computer users will tell you that the internet never crashes, and further, hard drives are 100% reliable. I was working in a theater once where the movie stopped running because someone in Los Angeles accidentally turned off the lamp. (Since the projector was on the internet, some schmo accidentally shut off the wrong projector. Nothing we could do about it.)

Digital projectors can be out of focus, they’re sensitive to popcorn oil, and they have delicate reflectors that need to be replaced. Don’t think that digital means reliable.

MYTH: 35mm prints just inherently get beaten up, so they don’t look good even after a few days.

MATH: Dirty projectors and careless projectionists cause most of the print damage you will ever see. Hollywood has had a war on projectionists for about 50 years, and now they’ve killed them off. For the last 35 years, most projectionists have been minimum-wage workers with little-to-no training. They do double-duty on the film and in the snack bar.

These are known in the trade as popcorn monkeys. Please blame them for most print abuse.

MYTH: The credits on digital films look sharper than they do on film, so that means that digital is sharper, no matter what you say.

MATH: Digital imaging is prone to something called aliasing ( http://en.wikipedia.org/wiki/Aliasing ). Aliasing means just what it sounds like: due to sampling problems, the signal you end up with is different from the one you started with. It goes out under an alias. This gets really technical, but you remember the old days, when video games had big, blocky pixels? (Hipsters, you won’t remember this, but if you’re the typical hipster who doesn’t think anything worth knowing happened before your birth, then you won’t be reading this anyway.) Remember this kind of blocky image:

This image is undersampled (meaning that we should have more pixels in it than we do). The blockiness is called spatial aliasing, which means that we are getting a different signal out than we put in! Normally, we should filter this until the blocks go away, because in the math world the blockiness is high-frequency information that is bogus and not part of the signal (remember Nyquist?).

If we do the recommended filtering, it should look more like this:

This picture more accurately represents the input signal. It’s blurry, and that’s OK, because the undersampling lost us the high frequencies (the sharp details) in the image.

Now, I’ve already shown you that the digital image is undersampled, but let’s take a look at credits. Instead of Lincoln, let’s look at an undersampled F:

Now, wait, that looks a lot better than Lincoln, right? If we filter it so we get the actual image we should have been sampling, it should look something like this:

But, wait, I hear you cry: the blocky undersampled Lincoln looked bad, and the blocky undersampled F looks better than the properly filtered F. WHY IS THAT?

That’s because the blockiness of the undersampling just happens to favor the correct look of the F. In other words, we’re getting a distorted signal out, but the distortion gives us a more pleasing image! Credits look sharper in digital projection because the proper edge filtering isn’t done. This is why a lot of people complain that they can see the pixels in the projection. (You could see the pixels in Lincoln, too, before the filtering.)  If you did the proper filtering, you wouldn’t see the pixels, but then the credits would look softer again.
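
You can watch this happen in a few lines of Python. This is a one-dimensional sketch of my own; the 16-sample “edge” stands in for the stroke of a letter:

```python
# Downsampling a hard edge two ways: naive decimation (no filtering, like
# unfiltered credits) versus averaging first (the proper low-pass filtering).
signal = [0] * 6 + [255] * 10        # a black-to-white edge, 16 samples

def decimate(s, n):
    """Keep every n-th sample with no filtering: sharp, but aliased."""
    return s[::n]

def filtered_decimate(s, n):
    """Average each block of n samples first (a crude low-pass filter)."""
    return [sum(s[i:i + n]) // n for i in range(0, len(s), n)]

print(decimate(signal, 4))           # [0, 0, 255, 255]   razor-sharp edge...
print(filtered_decimate(signal, 4))  # [0, 127, 255, 255] softer, but faithful
```

The naive version looks crisper, but exactly where the edge lands is at the mercy of how the pixel grid happens to line up with the original. That phase-dependent error is the aliasing; the filtered version trades the false crispness for an honest (if blurrier) rendering of the input.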

Now, I picked the ideal case with the F, where every stroke is a vertical or horizontal line. The worst-case scenario is an S.

An undersampled S of the same scale as the F looks like this:

But with proper filtering, it looks like this:

Your brain filters this out when you’re watching credits: you tend to latch onto the vertical and horizontal edges, as in the F, because those are the cues we read by. This is also why filmmakers now favor sans-serif fonts; they render better at low resolution.

So the credits aren’t sharper. It’s an illusion caused by undersampling and your brain. And I showed you with a minimum of math. YAY!

Fun with signal processing!!!

MYTH: OK, Mr. Boring Math guy, I still think that the digital stuff looks better. Can you show me a combination of what you see in digital projection vs. what it should look like?

MATH: Why yes! Thank you for asking!!!

What I notice most about digital projection is that the apparent sharpness has been boosted with edge-enhancement filtering (unsharp masking), which makes edges look more pronounced. It also causes edging artifacts (called ringing) that I find obnoxious.
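
Here’s a sketch of that ringing in Python. I’m using a simple unsharp mask as a stand-in for whatever edge enhancement a given projection chain actually applies (that part is my assumption); the overshoot past pure black and pure white is the “ring”:

```python
# Unsharp masking: sharpened = original + amount * (original - blurred).
# On a clean edge, the result overshoots the original range: that's ringing.

def unsharp(s, amount):
    """1-D unsharp mask using a 3-tap box blur (signal edges clamped)."""
    n = len(s)
    blurred = [(s[max(i - 1, 0)] + s[i] + s[min(i + 1, n - 1)]) / 3
               for i in range(n)]
    return [s[i] + amount * (s[i] - blurred[i]) for i in range(n)]

edge = [0, 0, 0, 255, 255, 255]       # a clean black-to-white edge
print(unsharp(edge, 1.5))
# the samples on either side of the edge swing below 0 and above 255
```

Those out-of-range values get clipped back to black and white on display, leaving a dark halo on one side of the edge and a bright one on the other. That halo is what I’m calling obnoxious.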

Further, the green response is compromised rather a lot. Most digital projection I see today renders all grass as an annoying neon green; it can’t seem to represent a range of greens. We also get an overly red look to compensate for the strange greens. Let’s take an ideal case: this is a Kodak color sample:

kodak real

Now, I’ve exaggerated this to make it more obvious for you, but here’s what I see in most digital projection:

kodak fake

Notice we’re seeing almost no green in the avocado, the grapes look dead, the flesh tones are too red, the whites are AAAAALMOST blown out, and we’re seeing edge artifacts from over-sharpening.

This, folks, is why I miss film. We could do digital and do it well, but we’re not.

And, you ask, why is it that we just don’t use more pixels, use better color projection, so we don’t have to do this?  It’s because more pixels = more hard drive space = bigger computer needed = more cost.  Since Hollywood is in love with cheap, they’re not going to do it right until right is cheap.

News Flash: I Don’t Hate Everything Digital

I keep getting asked this question, so I suppose I have to answer it.

“Why is it that you hate everything digital?”

Here’s the short answer:  I don’t.  What follows is the longer answer.

Before I start, I know that I’ll be called on the carpet as a luddite, anti-digital idiot.  This is inaccurate.  The Dr. Film pilot was shot and edited digitally, right on a hard drive… only a few seconds of it was ever on digital tape.  My background is in Electrical Engineering, and I used to write digital imaging programs that would make your eyes glaze over.  I welcome digital technology, but I use film, too.  They both have strengths and weaknesses, and I think that throwing out film is a mistake.

I can best describe my reaction to the digital revolution with an analogy.  A good friend of mine once refused to go to a fast-food Mexican restaurant with me.  “I hate that stuff,” he said.  A few months later, he suggested going to a Mexican restaurant.  “I thought you hated that stuff,” I said.

“No,” he said.  “I just hate cheap Mexican food, especially when it’s passed off as the real thing.”

I was just in attendance at a premiere showing of a DVD.  This was supposed to be a high-class, dress-up affair.  The projection was inexcusable.  It was set the way 95% of all DVD projectors are set, at maximum brightness, so that the white levels bloom and clip, leaving anything bright looking either hopelessly angelic or like a rejected effect from Star Trek: The Motion Picture.

I sat calmly and gritted my teeth as I watched the projector’s brightness overload.  Fortunately, most of the footage was shot indoors, because all of the outdoor material looked awful.  I sat and stewed for an hour as I watched a good documentary marred by a guy who set up the projector without knowing what he was doing.

This is the digital that I do hate, and I hate it not because it’s digital, but because it looks bad.  We’re sold this bunkum about its being state-of-the-art, and yet it would look better on a TV screen. Now, mind you, I’m talking about a standard-resolution DVD, not a Blu-Ray.  And I am in a good position to complain because I have run film in that very venue and it looked a whale of a lot better than their presentation did tonight.

I’ll make a few points here and then back away:

  1. Standard-resolution DVDs aren’t intended for large-screen projection and seldom look good unless projected on the very best equipment.  It’s easy to mis-adjust the projectors and blow out the whites.  NTSC video has just 525 scan lines, and only about 480 of them are visible picture.  (Sorry about the math, but more lines = sharper picture.  That’s all you need to know.)
  2. Projected Blu-Ray (1080 lines) can look very good, and if it was sourced from good materials (usually film elements), it can look better than many 16mm prints and some 35mm prints.
  3. Many proud Blu-Ray owners tell me that their images are always better than (or at least as good as) 35mm film.  I can’t argue with your perception.  What I will do is cite a measurable statistic: Blu-Ray uses 1080 lines.  In theaters, the high-end digital projectors that will replace 35mm film are 4K (4096×2160, double Blu-Ray’s line count).  That’s right.  Cost-conscious Hollywood studios think they need 4K to replace 35mm.  Don’t you think they would all use cheaper 1080-line Blu-ray projection if they thought they could get by with it?
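
The ladder in those three points is easy to lay out in a few lines of Python. The 4000-line film figure comes from the Tron discussion in the first half of this post; the rest are published specs:

```python
# Vertical resolution of common delivery formats, measured against film.
film_lines = 4000                    # what the Tron tests found film needs

formats = {
    "DVD (NTSC, visible)": 480,
    "Blu-ray / HDTV":      1080,
    "DCP 2K":              1080,
    "DCP 4K":              2160,
}
for name, lines in formats.items():
    print(f"{name:22s} {lines:4d} lines ({lines / film_lines:.0%} of film)")
```

A standard DVD sits at about 12% of film’s resolution, and even Blu-Ray is only about a quarter of the way there, which is why “just as good as film” doesn’t survive contact with the numbers.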

Even though it’s demonstrably not true, people tell me that a standard DVD is “just as good as film.”  I heard those very words this weekend.

People are serving me Taco Bell projection and telling me it’s just as good as authentic Mexican.  It isn’t.  Good digital is fine.  Third-rate digital is not only annoying, but it also makes good films look bad.

If good digital is out there, then why do I tirelessly advocate film? Well, for starters, a lot of really great material isn’t on Blu-Ray, DVD, or 4000-line digital.  Much of it never will be.  I also think projected film has a beautiful, rich quality missing in all but the best digital presentations.  If you’re careful and picky about prints (and few are pickier than I am), then you can find nice, sharp materials that are sometimes better than what was used as a source for the DVDs.

Much of the point of the Dr. Film show is to give people an opportunity to see rare materials that are not easy to find in the marketplace.  My live shows are intended as a way to see rare films in a theatrical venue, with an audience, as they were intended to be seen.

I am fully aware that film projection will eventually go the way of the steam engine.  It won’t happen as fast as some say, because most movies are still shot on 35mm, and archival preservation still takes place on 35mm.  I don’t mind being compared with a guy who fixes a steam engine.  Diesel engines have no romance.  I think we need to be able to see movies shown on film for as long as we can.  I am not in a rush, as most places are, to throw out all my film and replace it with digital copies (partly because I can’t!).

I know lots of theaters that are gleefully ripping out their 35mm projectors and then running only third-rate DVDs, mis-adjusted, at sizes never intended for that use.  They all say the same thing:  “It’s just as good.”  I will continue to rail against this, because it’s wrong.

It isn’t “just as good.”  It isn’t even good.  In the mad rush to get cheaper and easier projection, we’ve thrown quality out the window.  I hope I’m not the only one who notices it.