The Degradation of the Cinematic Experience
This isn’t one of those rose-colored-glasses “back in my day” laments but a genuine technical explanation of the degradation of the cinematic experience, focusing on image resolution and the impact it has.
Growing up in the ’80s and ’90s, some of my fondest memories were of going to the cinema to watch a film for the first time on the “big screen”. It was often seen as an event, and as a family, we’d dress up to go and see The Last Emperor (is it just me or has this film been criminally buried over the years?) with a drive down to Chadstone Shopping Center to the Hoyts Cinema. I’ve only seen the film once, but images from that film were forever etched in my mind.
For convenience’s sake, I’ll pretend I didn’t grow up in Australia where we use PAL for Standard Definition Television (SDTV), and I’ll go by the US NTSC standards.
Watching something on broadcast television meant you saw an image at 720 x 480 pixels. Video Home System (VHS) tapes were even worse, roughly equivalent to 333 x 480 pixels. Images were beamed to you on a Cathode Ray Tube (CRT) television that was usually 27 inches or less, in a “boxy” 4:3 aspect ratio. Most families had one main larger television and several smaller 15-inch televisions around the house.
Here is an actual 720 x 480 pixel image, the same as SD resolution. This was love, this was life.
We don’t often reminisce on old sports games from the SD era because they just look and sound like crap.
These days, the standard for broadcast television is High Definition Television (HDTV), which is either 720p or 1080i and is over twice the resolution of SD.
I remember getting my first HD television around 2007 and being mesmerized as I watched ice hockey, a sport I couldn’t care less about, look absolutely amazing and fluid, all in stunning HD resolution. This was the future.
Coming back to the original point: the lines get a little blurred because of analog standards, but essentially, the best we had at home was plain old SDTV.
When the world consisted of analog cinema projectors (prior to 2005), the standard for projecting in cinemas was 35mm film. There’s no apples-to-apples comparison with digital resolution since there are varying factors in the digitizing process, but the general expert1 opinion is that 35mm film is roughly equivalent to 4K or higher digital resolution. This means about 4096 x 2160 pixels. 4K TVs have a similar resolution of 3840 x 2160.
Historically, this juxtaposition of sitting with your family crowding around your home tube television to watching a film at your favorite mall multiplex and having your mind blown was the cinematic experience. It had a “wow” factor because not only was the screen bigger, but you were also experiencing 25x more resolution. You had a reason to leave the house early, line up at the box office, fight for the best seats, and pay for expensive popcorn and drinks. I won’t cover the additional selling points, such as films not coming to video rental stores until months later.
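The “25x” figure checks out if you compare total pixel counts, using the digital-equivalent resolutions cited above (35mm film has no exact pixel count, so treat the 4K figure as a rough stand-in):

```python
# Pixel-count arithmetic behind the "25x more resolution" claim.
# Resolutions are the digital equivalents cited in the article.
sd = 720 * 480           # NTSC SD broadcast
film_4k = 4096 * 2160    # rough 4K digital equivalent of 35mm film

print(f"SD pixels:      {sd:,}")          # 345,600
print(f"35mm (~4K) px:  {film_4k:,}")     # 8,847,360
print(f"Ratio:          {film_4k / sd:.1f}x")  # 25.6x
```

So the multiplex screen wasn’t just physically bigger; it carried roughly 25 times the pixel information of the television at home.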
Now here’s the secret that they don’t tell you about. When the multiplexes went digital, starting in the mid-2000s, projectors were installed with 2K resolution. To this day, most digital cinemas project at 2048 x 1080 pixels, roughly the equivalent of HDTV! The same quality you watch at home! You might even be surprised to know that most IMAX cinemas actually use 2K xenon digital projectors, introduced in 2008, a phenomenon commonly known as LieMax2.
For most viewers, the cinematic experience has gotten worse in the last 20 years. We’ve gone from a 4K-equivalent 35mm analog film projection to 2K digital projections in cinemas. Many of us have an equivalent or better viewing experience at home. If you have a 4K television and you watch a film in 4K, you’re more than likely watching it at a higher resolution than if you paid to go to a multiplex! That doesn’t make for a good value proposition or a reason to go out and pay for overpriced concessions.
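The same pixel arithmetic makes the home-advantage point concrete, comparing a consumer 4K UHD panel against a standard 2K digital cinema projector:

```python
# Home 4K UHD TV vs. a typical 2K DCI cinema projector.
uhd = 3840 * 2160     # consumer "4K" UHD television
dci_2k = 2048 * 1080  # standard 2K digital cinema projector

print(f"4K TV pixels:     {uhd:,}")      # 8,294,400
print(f"2K cinema pixels: {dci_2k:,}")   # 2,211,840
print(f"Home advantage:   {uhd / dci_2k:.2f}x")  # 3.75x
```

In raw pixel terms, the living-room screen now delivers nearly four times what the multiplex does, an exact inversion of the situation in the 35mm era.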