Of Myths and Math

Several people have asked me to expound a bit on digital projection and use math to refute the claims others are making. I have been reluctant to do it because:

a) everyone hates math in blogs and
b) I already think I go on about digital too much.

But that said, I’m going to do what was requested (now several months ago). And now, I’ll get the inevitable hate mail that “you hate digital! You’re a luddite, we hate you, move with the times.” And once again, I point out that I don’t hate digital at all. I don’t like cheap digital that’s passed off as perfection. And the new projectors are cheap digital. We were so enamored of the idea that we could save money that we jumped in head-first before the technology was ready.  (I point out, to those newbies, that I did the restoration of King of the Kongo in digital, and then it went out to film.)

Now, I always encourage you to disbelieve me. After all, people call me stupid and wrong all the time, especially on Facebook. (Facebook is the great open pasture where everyone is wrong and no one is convinced about anything.) I carefully referenced everything here, so you can look things up. Even though I may be stupid and wrong, do you really think all these links are stupid and wrong, too? Well, judging by some political polls, a lot of you do. But I digress.

Let’s hit the biggest myth first:

MYTH: Digital projection is really better than film already (or at least almost as good) and the only people who don’t like it are elitist whiner punks, the same ones who didn’t like CDs over vinyl.

MATH: This is wrong. It’s demonstrably wrong. It’s all about sampling. A digital signal is sampled (http://en.wikipedia.org/wiki/Sampling_(signal_processing) ). The sampling rate of a CD is 44.1kHz (that is, 44,100 samples per second). Under the Nyquist sampling theorem ( http://en.wikipedia.org/wiki/Nyquist–Shannon_sampling_theorem ), the highest frequency that can be reliably reproduced from a CD is half of that, about 22,000Hz.

The highest human hearing, for an unimpaired individual, measures in at about 20,000Hz.

THIS MEANS THAT IF THERE IS SAMPLING DISTORTION IN A CD, THEN YOU CAN’T HEAR IT. If your dog complains that he doesn’t like the sound of a CD, then you should listen to him. And if he does that, then you must be Dr. Dolittle. (Please, no singing.)

At least this is true for the time-based (temporal) sampling. There are good arguments about the dynamic range causing problems with things like hall ambience etc, but these arguments are often for elitist whiner punks. (I’m kidding, but not a lot… the CD technology is mathematically pretty sound.)
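
If you want to check the arithmetic yourself, here’s a tiny Python sketch. It only uses the numbers already quoted above; the 20,000Hz figure is the usual round number for unimpaired hearing.

```python
# Nyquist check for CD audio: a sampled signal can represent frequencies
# up to half its sampling rate, and nothing above that.
CD_SAMPLE_RATE_HZ = 44_100        # Red Book CD sampling rate
HUMAN_HEARING_LIMIT_HZ = 20_000   # rough upper limit for unimpaired hearing

nyquist_limit_hz = CD_SAMPLE_RATE_HZ / 2   # 22,050 Hz

print(f"CD Nyquist limit: {nyquist_limit_hz:,.0f} Hz")
print(f"Headroom above human hearing: {nyquist_limit_hz - HUMAN_HEARING_LIMIT_HZ:,.0f} Hz")
# Everything a human can hear fits below the Nyquist limit,
# so the time-based sampling isn't losing anything audible.
```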

Now let’s apply the same reasoning to digital imaging. This time, let’s not even bother with the limits of human sight, the way we used the limits of human hearing for audio. Let’s just make digital as good as film. How’s that for fair? Have we ever measured the resolution of film?

Well, sure we have. And I’ll even be extra fair. I’ll go back to the 1980s when we first did this, back when film had lower resolution than it has now. How much nicer can I be?

Back in the 1980s, there was a groundbreaking movie made called Tron. It was the first film that made extensive use of computer graphics. The makers of Tron wanted to make sure that the images they generated didn’t show “jaggies,” also known as stair-stepping. This is where you can see the pixels in the output device, which in this case is film ( http://en.wikipedia.org/wiki/Jaggies ).

So, they tested their system, and they discovered that they needed to run 4000 lines of resolution before the jaggies stopped being visible. Don’t believe me? Let’s look at another source:
https://design.osu.edu/carlson/history/tree/magi.html

I’ve actually seen the machine they used to do this. It’s at Blue Sky Studios now.  Here is a picture of it:

IMG_7999

Now, 4000 lines are needed for a native digital image, one that started life digitally, not something you are scanning from an outside source. If you’re sampling an analog world, like with a camera or a scanner, you’d need to follow Nyquist’s rule and use 8000 lines. You wanna know why they’re scanning Gone With the Wind at 8K? Now you know.
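
Here’s that Nyquist doubling as a two-line sketch, using the Tron-era 4000-line figure from above:

```python
# Nyquist applied to scanning: to capture analog detail equivalent to N lines,
# you have to sample at 2N lines.
FILM_DETAIL_LINES = 4000              # the Tron-era "no visible jaggies" benchmark
required_scan_lines = 2 * FILM_DETAIL_LINES
print(f"Required scan: {required_scan_lines} lines -- i.e., an 8K scan")
```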

So you’d expect today’s digital projectors to be about 4000 lines if they’re as good as film, right? Let’s see what the specs are.  This is the list for the Digital Cinema Package (DCP), which is the standard for motion picture digital projection.

http://en.wikipedia.org/wiki/Digital_Cinema_Package

There are two formats used in DCPs: 2K and 4K. That’s 2000 lines and 4000 lines, right?

DCP 2K = 2048×1080
DCP 4K = 4096×2160

That’s 2048 pixels wide (columns) by 1080 pixels high (lines) and 4096 pixels wide (columns) by 2160 pixels high (lines).

OK, so wait, that means 2048 pixels WIDE by 1080 LINES, right? So the Tron 4K rule says we should be seeing 4000 lines and we’re seeing 1080? Or the 4K top-end projectors, which not many theaters use, are giving us 2160????

So 2K is a big lie. It’s 2K horizontal, not vertical. It’s really 1K.

That’s about half the resolution that they should be running.

Don’t blame me. Blame the math.
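
Here’s the comparison as a little sketch, using the 4000-line Tron benchmark and the DCP line counts quoted above:

```python
# DCP vertical resolutions vs. the 4000-line film benchmark.
FILM_BENCHMARK_LINES = 4000

dcp_formats = {
    "DCP 2K (2048 x 1080)": 1080,
    "DCP 4K (4096 x 2160)": 2160,
}

for name, lines in dcp_formats.items():
    print(f"{name}: {lines} lines = {lines / FILM_BENCHMARK_LINES:.0%} of the benchmark")
# 2K delivers 27% of the benchmark; even 4K delivers only 54%.
```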

Oh, and you know how Quentin Tarantino is always complaining about digital projection being “TV in public?” http://www.digitalspy.com/movies/news/a441960/quentin-tarantino-i-cant-stand-digital-filmmaking-its-tv-in-public.html

Well, what’s HDTV? Well, don’t believe me, see the specs here:

http://en.wikipedia.org/wiki/1080p

Wikipedia says HDTV is 1920×1080. But wait a second: 2K DCP, used in theaters all over the world, is 2048×1080. HDTV is almost identical to 2K theatrical projection.

Quentin Tarantino is right: Digital film presentation is TV in public, almost literally. Sure the screen is bigger, but that only makes the pixels show up more.  (We can argue about a lot of other things Tarantino says, but the math is behind him on this one.)
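
And here’s the “TV in public” claim as plain arithmetic:

```python
# Total pixel counts: HDTV vs. 2K theatrical DCP.
hdtv_pixels = 1920 * 1080      # 2,073,600
dcp_2k_pixels = 2048 * 1080    # 2,211,840

extra = (dcp_2k_pixels - hdtv_pixels) / hdtv_pixels
print(f"2K DCP has only {extra:.1%} more pixels than your living-room HDTV")  # about 6.7%
```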

MYTH: Even though you just showed it isn’t as sharp, it looks better in my theater than the 35mm did, so you’re still wrong.

MATH: The digital projectors look nicer because the 35mm projectors in your old theater were junky, maladjusted, and old. They were run by projectionists and technicians who didn’t care about adjusting things correctly.  Sometimes there hadn’t been a technician in the theater in decades.  No, that isn’t a joke.

Further, for many years now, Hollywood has been churning out prints made from something called DIs, or Digital Intermediates: film prints struck from digital masters. Almost all of these masters are made at 2K (1080 lines). Is it any wonder that you project a soft print through an 80-year-old projector with greasy lenses and it doesn’t look as good as a new digital projector showing the same thing? (Digital intermediates started in 2000 or so, the first major film using them being O Brother Where Art Thou? )

Try projecting a REAL 35mm print, made from good materials, especially an old print or a first-rate digital one. Then compare that to digital projection. It’s not even close.

I projected a 35mm print of Samsara a few years ago and I thought there was something wrong with it. It looked like an old Technicolor print. Why was it so sharp? Because it originated on 65mm film and was finished digitally at 6K. Worth seeing.

MYTH: There’s nothing to misadjust on digital projectors, so they’re going to be more reliable than the 35mm projectors.

MATH: I know projector repairmen, and they tell me the digital projectors break down more often. I don’t have a lot of measurable math, because it’s early yet, but I’ve seen the sensitive projectors break down very often, and the lamps often turn green before they fail. Since there’s no projectionist in the booth most of the time, there’s no one to report arc misfires, dirty lenses, etc.

Oh, and the projector is a server on the internet, with a hard drive in it. Computer users will tell you that the internet never crashes, and further, hard drives are 100% reliable. I was working in a theater once where the movie stopped running because someone in Los Angeles accidentally turned off the lamp. (Since the projector was on the internet, some schmo accidentally shut off the wrong projector. Nothing we could do about it.)

Digital projectors can be out of focus, they are sensitive to popcorn oil, and they have reflectors that are delicate and need to be replaced. Don’t think that digital means reliable.

MYTH: 35mm prints just inherently get beaten up, so they don’t look good even after a few days.

MATH: Dirty projectors and careless projectionists cause most of the print damage you will ever see. Hollywood has had a war on projectionists for about 50 years, and now they’ve killed them off. For the last 35 years, most projectionists have been minimum-wage workers with little-to-no training. They do double-duty on the film and in the snack bar.

These are known in the trade as popcorn monkeys. Please blame them for most print abuse.

MYTH: The credits on digital films look sharper than they do on film, so that means that digital is sharper, no matter what you say.

MATH: Digital imaging is prone to something called aliasing ( http://en.wikipedia.org/wiki/Aliasing ). Aliasing means just what you think it might: due to sampling problems, the signal you end up with is different from the one you started with. It goes under an alias. This gets really technical, but you remember the old days, when they used to have big pixels in video games? (Hipsters, you won’t remember this, but if you’re the typical hipster who doesn’t think anything worth knowing happened before your birth then you won’t be reading this anyway.) Remember this kind of blocky image:

lincblock
This image is undersampled (meaning that we should have more pixels in it than we do). The blockiness is called spatial aliasing, which means that we are getting a different signal out than we put in! Normally, we should filter this until the blocks go away, because in the math world this is high-frequency information that is bogus and not part of the signal (remember Nyquist?).

If we do the recommended filtering, it should look more like this:

lincsoft
This picture more accurately represents the input signal, although it’s blurry, and that’s OK, because the undersampling lost us the high frequencies (sharp details) in the image.
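
If you’d like to make your own Lincoln-style comparison, here’s a minimal sketch using the Pillow imaging library; the file name is just a placeholder for whatever photo you have lying around, and any image tool with both nearest-neighbor and filtered resizing will show the same thing.

```python
# Undersampling demo: shrink a photo with no filtering (aliased, blocky when
# enlarged again) vs. with a proper low-pass filter (softer, but honest).
from PIL import Image

SMALL = (32, 32)     # brutally undersampled
BIG = (512, 512)     # blown back up so the pixels are easy to see

photo = Image.open("portrait.jpg").convert("L")   # placeholder file name; grayscale like the Lincoln example

# Point sampling: nothing filters out the high frequencies, so they alias into blocks.
blocky = photo.resize(SMALL, Image.Resampling.NEAREST).resize(BIG, Image.Resampling.NEAREST)

# Filter first (the Lanczos kernel acts as a low-pass filter), then sample.
soft = photo.resize(SMALL, Image.Resampling.LANCZOS).resize(BIG, Image.Resampling.LANCZOS)

blocky.save("blocky.png")   # the "lincblock" look
soft.save("soft.png")       # the "lincsoft" look
```
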
Now, I’ve already shown you that the digital image is undersampled, but let’s take a look at credits. Instead of Lincoln, let’s take a look at an undersampled F:

fblock
Now, wait, that looks a lot better than Lincoln, right? If we filter it so we get the actual image we should have been sampling, it should look something like this:

fsoft
But, wait, I hear you cry: the blocky undersampled Lincoln looked bad, and the blocky undersampled F looks better than the properly filtered F. WHY IS THAT?

That’s because the blockiness of the undersampling just happens to favor the correct look of the F. In other words, we’re getting a distorted signal out, but the distortion gives us a more pleasing image! Credits will look sharper on digital projection, because they don’t do the proper edge filtering. This is why a lot of people complain they can see the pixels in the projection. (You can see the pixels in Lincoln, too, before the filtering.)  If you did the proper filtering, you wouldn’t see the pixels, but then the credits would look softer again.

Now, I picked the ideal case with the F, where every part of it was a vertical or horizontal line. The worst-case scenario is an S.

An undersampled S of the same scale as the F looks like this:

sblock
But with proper filtering, it looks like this:

ssoft
Your brain filters this out when you’re watching credits: you tend to latch onto the vertical and horizontal edges, like the ones in the F, because those are the cues your brain reads. This is also why filmmakers are now favoring sans-serif fonts, because they render better at low resolution.

So the credits aren’t sharper. It’s an illusion caused by undersampling and your brain. And I showed you with a minimum of math. YAY!
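
And if you’d like a little more math after all, here’s a numpy sketch of the same idea, with a vertical bar standing in for an F stroke and a diagonal stripe standing in for part of an S (both shapes are just made-up test patterns). The aliased bar prints as one crisp column while the honestly filtered bar smears across two soft ones; the aliased diagonal turns into a hard staircase while the filtered one gets the in-between grays your eye reads as smooth.

```python
import numpy as np

HI, FACTOR = 128, 8          # high-res grid size and the downsampling factor
LO = HI // FACTOR            # 16x16 "projected" resolution

# Two high-res binary test images: a vertical bar (the F stroke stand-in) and
# a diagonal stripe (the S curve stand-in).
bar = np.zeros((HI, HI))
bar[:, 42:54] = 1.0          # deliberately not aligned to the 8-pixel sample grid

diag = np.zeros((HI, HI))
for r in range(HI):
    diag[r, max(0, r - 6):r + 6] = 1.0

def show(title, img):
    # Print a low-res image as ASCII: '#' = solid, '+' = partial, '.' = empty.
    print(title)
    for row in img:
        print("".join("#" if v > 0.8 else "+" if v > 0.2 else "." for v in row))
    print()

for name, img in (("vertical bar (F-like)", bar), ("diagonal stripe (S-like)", diag)):
    aliased = img[::FACTOR, ::FACTOR]                                  # point sampling, no pre-filter
    filtered = img.reshape(LO, FACTOR, LO, FACTOR).mean(axis=(1, 3))   # box filter, then sample
    show(f"{name} -- point sampled (aliased)", aliased)
    show(f"{name} -- filtered, then sampled", filtered)
```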

Fun with signal processing!!!

MYTH: OK, Mr. Boring Math guy, I still think that the digital stuff looks better. Can you show me a combination of what you see in digital projection vs. what it should look like?

MATH: Why yes! Thank you for asking!!!

What I notice most about digital projection is that the apparent sharpness has been boosted with edge enhancement (an unsharp-mask-style filter), which makes edges look more pronounced. This also causes edging artifacts (called ringing) that I find obnoxious.
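
Here’s a minimal Pillow sketch of that kind of over-sharpening; the file name is a placeholder, and the numbers are deliberately cranked way up so the halos are impossible to miss. It is a simulation of the effect, not a claim about exactly which filter any particular mastering house uses.

```python
# Over-sharpening demo: an aggressive unsharp mask exaggerates edges and
# leaves bright/dark halos (ringing) hugging both sides of them.
from PIL import Image, ImageFilter

frame = Image.open("frame.png")   # placeholder: any still with clean, hard edges

gentle = frame.filter(ImageFilter.UnsharpMask(radius=2, percent=80, threshold=3))
harsh = frame.filter(ImageFilter.UnsharpMask(radius=5, percent=400, threshold=0))

gentle.save("frame_gentle.png")   # mild edge enhancement
harsh.save("frame_ringing.png")   # look for the halos hugging every edge
```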

Further, the green response is compromised rather a lot. Most digital projection I see today represents all grass as an annoying neon green. It can’t seem to represent a range of colors. We’re also getting an overly red look to compensate for the strange greens. Let’s take an ideal case: this is a Kodak color sample:

kodak real

Now, I’ve exaggerated this to make it more obvious for you, but here’s what I see in most digital projection:

kodak fake

Notice we’re seeing almost no green in the avocado, the grapes look dead, the flesh tones are too red, the whites are AAAAALMOST blown out, and we’re seeing edge artifacts from over-sharpening.

This, folks, is why I miss film. We could do digital and do it well, but we’re not.

And, you ask, why is it that we just don’t use more pixels, use better color projection, so we don’t have to do this?  It’s because more pixels = more hard drive space = bigger computer needed = more cost.  Since Hollywood is in love with cheap, they’re not going to do it right until right is cheap.
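
Here’s a rough back-of-the-envelope sketch of what the pixels cost in storage. The assumptions (uncompressed 12-bit RGB, 24 fps, a two-hour feature, an 8K height of 4320 lines) are mine for illustration, not the DCP’s actual compressed data rates, but the scaling is the point: every doubling of resolution quadruples the bill.

```python
# Rough uncompressed storage per two-hour feature at various resolutions.
# Assumptions for illustration only: 12 bits per channel, 3 channels, 24 fps,
# no compression.
BITS_PER_PIXEL = 12 * 3
FPS = 24
RUNTIME_S = 2 * 60 * 60

formats = {
    "DCP 2K ": (2048, 1080),
    "DCP 4K ": (4096, 2160),
    "8K scan": (8192, 4320),
}

for name, (w, h) in formats.items():
    terabytes = w * h * BITS_PER_PIXEL / 8 * FPS * RUNTIME_S / 1e12
    print(f"{name}: about {terabytes:,.1f} TB uncompressed")
```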

Digital is Over There! It’s Only a Matter of Sampling!

Bruce Lawton made me aware of an article in the New York Times that I found highly annoying.  It was highly annoying because it was inaccurate.  It reflects the complete misunderstanding of what “digital” means in the media and public.  In short, the public and media seem to believe this:

“Digital imaging processes are a modern miracle and are a complete replacement and upgrade from older technologies.  All digital images are perfect by their nature and will never degrade or become outdated.”

This is simply not true.  I hate to burst your bubble.  A closer summation would be this:

“Digital imaging is a miraculous tool that allows us to do things that were previously impossible to accomplish.  It can produce very high-quality, though not perfect, reproductions of its source images.  Its biggest drawback is that formats become outdated quickly and most digital storage devices have short shelf lives.”

Now, once again, I’ll draw criticism from the masses: “You hate anything digital!  You’re a luddite!  You’re clinging to an outdated technology like film!  Get with the modern program!”

Once again, this is not true.  I use digital imaging all the time.  I think it’s great.  I did digital restorations for the Buster Keaton picture Seven Chances.  I am doing a digital restoration on King of the Kongo.  But I still believe in film.  Film doesn’t get computer viruses, hard drive crashes, or incompatible software upgrades.

I have film, actual film stock, manufactured in 1926 that is still projectable in modern projectors and plays fine.  I have digital images from 1991, carefully saved and copied,  that are incompatible with any modern program.

What would you think of a library that had a book from 1991 that you couldn’t read anymore?  Not because it was damaged in some way, but rather because they couldn’t figure out how to open it. You’d say they were crazy.  You’d be right.

I’m going to refute the New York Times article point by point, but first I have to lay out some groundwork.  Fear not, technophobes. I’ll try to make it as clear as possible and minimize all the math.  It really is pretty simple, but for some reason, people want to believe in the miracle part of it instead of the truth.

In the early 1980s, Disney made the first real computer feature.  It took years to complete, but it was called Tron, released in 1982.  Tron was made with a bank of computers each with less computing power than your iPhone.  Your old iPhone.  Yeah, that slow one.

Tron is not notable for many dramatic triumphs (after all, it’s basically The Wizard of Oz set inside a computer), but for cinema, it was a real breakthrough.  Disney experimented with various resolutions.  Now, before you get all paranoid about a scary word like resolutions, let me explain.  It simply means how many pixels (little squares, like the ones you see in the image above) are used in the image.

Higher resolution = more pixels = smaller squares = sharper image.  In television, this is also measured in lines, which is the number of horizontal lines in the TV picture.  You know how people keep trying to sell you 1080p HDTV?  Well, standard definition was 525 lines, and HDTV is 1080.  Again, more lines = more pixels = sharper image.  See?  Simple!

Disney knew that they would have to output their computer graphics to 35mm film in some way.  There was no digital projection at the time.  They were very concerned about “stair-stepping.”  This is an effect also called aliasing.  Don’t be scared.  Look at the picture above.  You notice that it’s made of little squares?  Omar Sharif’s collar isn’t a collar, but it’s a jagged set of white lines.  You went to plot something that was supposed to be a line and you ended up with a jagged representation instead.  It’s aliased because the thing you tried to plot isn’t what you got!

Disney’s people discovered that they could see aliasing on most images until they put the resolution at 4000 lines.  This has been the “gold standard” of digital imaging for years.  Well, almost.  Tron had a limited color palette because of the software and hardware of the time.  This made jagged lines easier to spot.  As we were able to represent more colors and shades, we discovered that we could drop the resolution to 2000 lines, and it still looked pretty good… just a little blurry to some people. Remember, this is for material generated by the computer, not something scanned from an outside source.

In engineering parlance, 4000 lines = 4K, 2000 lines = 2K, and HDTV at 1080 lines makes almost exactly 1K.

I have to introduce one last concept.  It’s called the Nyquist Sampling Theorem.  I know, it’s an engineer thing.  Nyquist is a law of digital sampling.  It says that if you are scanning an analog signal (like a piece of film), the minimum rate you can use, so that you get no significant loss of data, is twice the highest frequency present in the source.

Oh, no.  The mathophobes are dying now.  Please don’t.  That simply means if you’re scanning a 4K image, you need to scan it at 8K or else you’ll get a picture blurrier than it should be.  For a 2K image, you scan at 4K.

Now, we can tackle this article.  Take a deep breath.

Error 1:

“(Lawrence of Arabia was shot in 65 millimeter — nearly twice the width of a 35-millimeter frame — so its negative had to be scanned in 8K, creating 8,192 pixels across each line. But it is still referred to as a 4K scan because it has the same density of pixels, the same resolution across 65 millimeters that 4K has across 35 millimeters.)”

This is a very poor way of explaining the concept.  They’re saying that this means they’re scanning more lines because the negative is bigger, not because they’re scanning more lines per inch of film.

And, guess what?  By Nyquist, and by Disney’s research, they’re undersampling (blurring) the negative.  Now, I don’t blame them, and it’s probably “good enough,” and very expensive to do more, but let’s at least start on a level playing field.

Errors 2-3:

“When Lawrence was last restored, in 1988, some of these flaws could be disguised by ‘wetgate printing,’ a process of dousing the print in a special solution. But the new restoration has no prints. The film’s digital data are stored on a hard drive, about the size of an old videocassette, which is inserted into a 4K digital projector. In short, the problems would now have to be fixed.”

Wetgate printing is still used.  It’s simple enough.  You take the negative (not the print) and soak it gently in a fluid (some archives use dry-cleaning fluid) that fills in the scratches on the clear film base.  That fluid evaporates by the time the film hits the takeup reel.  Similar processes can be used in scanning.  If it wasn’t done that way in this case, then it means more work for the people retouching the images.

The new restoration has no prints.  SO WHAT?  That has nothing to do with what you’re talking about and is a diversion from the point.  Wetgate has to do with the scanning or printing the negative, not projection. Note to the sticklers out there: yes, we can use wetgate transfers on prints, if that’s all we have, but that is not what is happening here.

Error 4:

“Luckily, there have been dramatic advances in digital-restoration technology in just the last few years. New software can erase scratches, clean dirt and modify contrast and colors not just frame by frame but pixel by pixel. In the old days (circa 2006), if you wanted to brighten the desert sand in one scene because it was too dark, you’d have to brighten the sky too. Now you can brighten the sand — or even a few grains of the sand — while leaving everything else alone. And in those days there was a limited palette for restoring faded colors. Today’s digital palettes are much vaster.

“In one sense, this restored Lawrence might look better than the original. Because of the film stock’s exposure to the desert’s heat, some of its photochemical emulsion dried and cracked, resulting in vertical fissures. ‘Some were just a few pixels wide,’ Mr. Crisp said, ‘but some scenes had hundreds of them, filling as much as one-eighth of the frame.’”

The way this is written implies that there were shooting errors that caused exposure problems with things being too dark or too bright.  It further implies that Grover Crisp and his co-workers are going in and haplessly changing things to suit their own artistic eye, not that of director of photography Freddie Young or director David Lean.

I have a lot of respect for Grover Crisp, and I know he’s not doing that.

Lawrence of Arabia was shot on Eastman color stock that was very unstable (it was especially bad from 1958-63.)  The colors fade unevenly, and brightness fades unevenly.  What they are actually doing, despite the way the article is written, is to match the colors with the way some of the old Technicolor reference prints look (Technicolor prints don’t fade, but they are 35mm and 2-3 generations down from the negative).  This is restoration, not willy-nilly artistry.  There are certain colors that will be almost entirely gone (especially blues and greens).

Error 5:

“Sony went to so much trouble to create not just this release but also a new archive for the ages. Film degrades; digital files of 0’s and 1’s do not. In the coming years, new software might allow still better restorations. But the technicians making them can work from the 4K scan. They won’t have to go back to the negative.”

This is just crazy on a lot of levels:

  1. Robert Harris made a nice duplicate negative in 65mm, on color-stable stock, for the 1980s restoration.   At the time he made it, there were already a number of unrecoverable scenes and missing bits.  This article makes it seem that Harris’ work is now outdated and rather trivial.  Nothing could be further from the truth.  Harris and director David Lean worked together to save Lawrence of Arabia, and without them, Lawrence would be less than it is today.
  2. Ones and zeroes don’t degrade.  Hard drives do.  These are spinning media that are subject to magnetic fields, ball bearing problems, heat, cold, and probably the most fatal problem, stiction.  A drive with stiction has had its spinning platter stick to the read head (much like a sticky record album sticking to the needle).  If it sticks too hard, then the drive can’t spin, and the disk is ruined.
  3. Ones and zeroes don’t degrade, but file formats aren’t forever.  Neither are disk drive formats.  Had Lawrence of Arabia been restored digitally in 1989, the results could have been saved on 5.25” floppy disks, and no one could read them today.
  4. Scanners are wonderful and they get better every day.  I’d bet that if the film is stored well, it will hang together well enough to survive until better scanners come along so that it can be scanned and improved again.

This same thing happens often with other “restorations.”  Gone With the Wind and The Wizard of Oz were shot in 3-strip Technicolor, which produces three extremely stable black-and-white negatives.  These are a pain to reproduce, so they got “restored” in the 60s to “modern” Eastman color stock.

Whoops, the restoration faded in a few years.  No trouble.  They reprinted it again, with better technology, in the 1970s.  They went back to the black-and-white negatives, which were still around.

Whoops, that restoration faded too.  No trouble.  Another restoration was done in the 1980s.  Guess how?  From the black-and-white negatives.

Oh, wait, they got a better way to reproduce the film and make the alignment sharper?  Back to the negatives.

And they needed to re-scan to make a Blu-ray (well, this time, they did an 8K transfer, which is what the Nyquist sampling theorem says we should do for such a film).  Gee, they went back to the negatives.

The moral of the story: save the negatives for as long as you can because they seem to get used a lot for restorations.

Error 6:

“Between the detective work and lots of video improvement (before the days of digital), it took Mr. Harris 26 months to restore the movie — 10 months longer than it took David Lean to make it.”

The preservation work Harris did on Lawrence of Arabia was on film.  He didn’t use video improvement.  There was no video that would do the work.

Error 7-8:

“Its life in home video has been spotty as well. The first DVD, in 2001, was made from a badly done HD transfer: colors were way off, contrasts too bright or dim. A redo, two years later, was much better, but the dirt and scratches were cleaned up by a ham-fisted process called ‘digital noise resolution’ — the easiest and, for some problems, the only technique available at the time, but it softened the focus and dulled detail.”

I am not sure, and it’s not really worth looking up, but I doubt that the DVD was made from an HD (High Definition) transfer in 2001.  It’s technically possible, but it’s unlikely.  It was probably done from a standard definition transfer, which would also account for the color drift, since the color gamut on standard definition television is pretty limited.

I have no idea what “digital noise resolution” is.  I suspect that what he means is “digital video noise reduction” (also DVNR), which is an automated process to remove scratches and other imperfections from films.  Cartoon aficionados have been bemoaning this for years.  DVNR is still used, fairly often in fact, but it can be done gently or in a ham-fisted way that the author describes.

“A forthcoming Blu-ray Disc of the film, out Nov. 13, fixes all those problems, in part because it’s Blu-ray but more because it’s mastered from the same 4K restoration as the theatrical release.”

Is the mere fact that something is Blu-ray some way of saying it’s anointed with a perfection not yet seen?  Blu-rays, DVDs, films, and videos can all look great or terrible depending on how they are handled technically.

The overarching thing that the author misses (and that others are not missing) is that this digital restoration is not archival no matter how much we would like it to be.  I’m on mailing list after mailing list from archives in a panic about how to store things so that they will last.

I was at the Library of Congress recently seeing the process of the entire run of Laugh-In being copied from 2” tape, a format now long obsolete, to something now (we hope) more permanent.

At the same visit, I saw a roll of film made in 1893 by the Edison people.

Which of these is archival?

The Library of Congress still uses, and intends to use, 35mm film for archival storage.  They haven’t found anything to beat it yet.  They are keeping Kodak and Fuji from shutting down the manufacturing lines.  Other archives demand film, too.  It just holds up better.

That doesn’t mean digital doesn’t have its place.  It’s just that digital isn’t the magic panacea that cured the world’s ills.

It’s a tool, just like anything else.